The Gong Project: Driven by Data

Andrew Baltrus
Indeed Engineering
Oct 7, 2019 · 5 min read

In the first week of my new job, fresh out of school, I didn’t expect to make changes to a page with millions of monthly visitors. But that’s what new college hires do at Indeed: dive in head first. My experiment with a prominent part of the Indeed job search taught me the importance of data in driving product decisions.

A new Indeedian bangs a gong to signal the completion of their first experiment

I’m Andrew. I’m an associate product manager (APM) with a passion for building products that affect real people’s lives.

Starting product management at Indeed has been a whirlwind. Only a few months ago, I relied on Indeed to search for jobs. Now I’m a builder rather than a user, making a difference with Indeed’s powerful platform and tools. Every day at Indeed, I can see the impact of my work: we help people get jobs.

University after college

Normally, when you start work as a new-grad hire, your company places you on a team. Indeed is a little different. They place us in a program called Indeed University. This 12-week summer program teaches us about the data-driven strategies that make Indeed tick. At Indeed University, we experiment with existing products and build new ones. Think of it as a start-up incubator or idea factory powered by Indeed.

I’ve always been interested in solving complicated problems like the ones Indeed faces every day. When I joined the company, I expected the most interesting work to come from planning and building. But, in fact, experimenting has been the most exciting experience. In our first week at Indeed University, they handed us the keys to Indeed’s search page and let us take it for a test drive. Allowing us to tinker with such an important piece of the core business shows how much Indeed trusts the people they bring on board. Indeed calls this first project the gong project because it’s customary for new hires to ring a large gong when we deploy our first code (and, in this case, our first experiment).

The experiment

The beauty of the gong project is that you can learn by doing the moment you begin your job. The Indeed University gong project is typically an A/B test in which you make a small change on Indeed's search page and see the impact it has on job search metrics. My teammate, Mo Chen, and I had the idea to modify part of Indeed's job description page. We saw how Indeed asks job seekers to sign up for job alerts, and we believed the language could be more engaging.

The existing text on the mobile job description page reads: “Get email updates for the latest jobs in New York, NY”

We thought it would be interesting to draw in users with a question. We hypothesized that this language would encourage job seekers to sign up for alerts:

Our new prompt asks a question: “Want to receive email updates on the latest developer jobs in Austin, TX?”

We used two of Indeed’s internal tools to test our hypothesis:

  • CrashText enables small text changes to live Indeed pages without deploying a full application.
  • Proctor facilitates A/B testing by letting us vary the page content we deliver to different users and log the results. (Indeed made Proctor available open source in 2013.)

Together these tools made it simple for us to compare our text with the existing text. How many users signed up in response to each?
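The core mechanic behind an A/B test like ours is deterministic bucketing: each visitor is consistently assigned to either the control or the test group, so they see the same version of the page on every visit. I don't know Proctor's internals, but the idea can be sketched with a simple hash-based assignment. The function and bucket names here are illustrative, not Proctor's actual API.

```python
import hashlib

def assign_bucket(user_id: str, test_name: str,
                  buckets=("control", "question_text")) -> str:
    """Deterministically assign a user to an A/B test bucket.

    Hashing the (test, user) pair means the same user always lands in
    the same bucket, so their experience stays consistent across visits,
    while different tests split users independently.
    """
    digest = hashlib.md5(f"{test_name}:{user_id}".encode()).hexdigest()
    index = int(digest, 16) % len(buckets)
    return buckets[index]

# The same user gets the same bucket every time:
bucket = assign_bucket("user-123", "job_alert_prompt")
assert bucket == assign_bucket("user-123", "job_alert_prompt")
```

In a logging-based setup, each page view would record the assigned bucket alongside the outcome (did the user sign up?), which is the raw data the A/B analysis runs on.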

Learning from results

In the first few days, we saw that our experiment affected the number of sign-ups. What a rush! We were so excited.

But then we learned: early in an experiment it’s common to see extreme results. Indeed’s practice is to gather at least a week of data before reaching any conclusions. We looked closer and saw that the experiment hadn’t yet included many users. We had only a few sign-ups, and thus our confidence interval was 30% wide. It was way too early to say we made a significant improvement.

Early A/B test result showing a 29.34% increase in sign-ups (and the bounds of our confidence interval)

So we did what all Indeed engineers and product managers do when they experiment: we waited. Would continued testing corroborate our initial finding? We worked on other projects to give the experiment time to collect more data.

We returned to the experiment a few weeks later to discover that we had stronger evidence. Our change on the mobile job description page increased job alert sign-ups by 28.29%. And, importantly, our confidence interval shrank to a width of 11%. Now we could celebrate! Our A/B test was a success. With time, we learned that the question-based prompt spoke to mobile job seekers more effectively than the existing text.

A/B test result showing a 28.29% increase in sign-ups (and the bounds of our confidence interval)

Data shows how we’re helping

What’s exciting about live user A/B testing is that we see measured improvement from our ideas and our work. I felt privileged to have that experience my very first week on the job. The trust that Indeed showed by letting us experiment with live search pages was an empowering welcome. It also gave me a taste for the data-driven decision making that is so important at Indeed.

The gong project was also a rewarding educational opportunity. I learned not to get too excited over positive results too early in an experiment. I now know that valuable experimentation takes time. To actually roll out a change like ours to all Indeed users, we would need more data. Indeed tests every change because we know that the services we provide are too important to build upon assumptions. Experimentation and testing ensure that we’re doing what’s right for the people who matter most: job seekers.
