Fired Up On My Strava Internship

This is the second post in Strava’s Intern Blog Series, where we give interns the opportunity to talk about the projects that they’ve been working on.


My name is Annie, and I’m a rising senior at Stanford University studying Symbolic Systems, which integrates computer science, psychology, and other fields. I’m especially interested in health psychology and human-computer interaction in health technology, plus running is one of my favorite things to do. Strava obviously incorporates all of these academic and personal interests, making it an absolutely awesome place to be!

Pictures from two of the amazing runs I’ve gotten to go on with Strava friends.

At Strava, I work on the growth team, which focuses on increasing new-user registration and activation, with the goal of growing Strava’s active user base. The unofficial mantra of the growth team is that we’re “FIRED UP!!!!”, which means we are an enthusiastic and energetic group. One way we exercise this enthusiasm is by being, as one of us put it last week, a “mean, lean, testing machine.” In other words, one of our focuses as a team is to rapidly create and iterate on A/B tests.

The growth team’s fired-up-on-growth emoji.
The growth team’s fired-up-on-growth hat.

A/B testing is extremely cool. With just a few lines of code and some help from both a third party (Apptimize) and our own internal experiments and cohorting infrastructure, we can measure how a simple copy change affects the number of users who connect to Strava through Facebook, or how giving a product tutorial during onboarding changes the activation rate of new users.
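
To make that concrete, here’s a minimal sketch of how deterministic cohorting for a test like this often works. The function and variant strings are hypothetical illustrations, not Strava’s or Apptimize’s actual API:

```python
import hashlib

def assign_variant(user_id: int, experiment: str, variants: list[str]) -> str:
    """Deterministically bucket a user into one variant of an experiment.

    Hashing (experiment, user_id) together means each user always sees
    the same variant, and different experiments shuffle users independently.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# Hypothetical copy test on the Facebook connect button:
label = assign_variant(
    user_id=12345,
    experiment="facebook_connect_copy",
    variants=["Connect to Facebook", "Find your friends on Facebook"],
)
```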

A/B testing was especially exciting for me to dive into because I came in with a background doing academic research. At Stanford, I’ve worked as a research assistant for the past three years in the Mind and Body Lab, doing psychological research. After working there for so long, I’m accustomed to the glacial pace of academic research — everything has to go through the Institutional Review Board before you can start a study, and after you’ve spent months recruiting and running each participant through the study, the publishing process can take over a year.

At Strava, things work differently. We don’t hesitate to run multiple experiments at the same time, and our tests can be thought up and executed within twenty minutes. For example, during my second week, I asked a coworker if we had experimented with the copy of our “Connect to Facebook” buttons at all. These buttons encourage users to connect their Facebook accounts and phone contacts so that we can use their social graphs to connect them to Strava’s social network. “Nope,” she said, “why don’t you run a test on it?”

The test took about twenty minutes to set up, and it increased the number of people who click on that button by eight percent. This is a great example of two things: the attitude the growth team has around testing, and the way Strava encourages interns to take ownership of the things they feel matter in the product.

The copy test I ran on these two connect cells.
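
For a result like that eight-percent lift, the standard check is a two-proportion z-test. The post doesn’t include the raw click counts, so the numbers below are invented purely for illustration:

```python
from math import sqrt, erf

def two_proportion_z_test(clicks_a: int, n_a: int, clicks_b: int, n_b: int) -> float:
    """Two-sided p-value for a difference in click-through rates."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    pooled = (clicks_a + clicks_b) / (n_a + n_b)          # rate under H0
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))    # normal-CDF tail

# Invented counts: control converts at 5.0%, variant at 5.4% (an 8% relative lift).
p_value = two_proportion_z_test(clicks_a=500, n_a=10_000, clicks_b=540, n_b=10_000)
```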

This was one extremely simple A/B test, but things can get more complicated too. Over the past few weeks, we’ve brainstormed, designed, and rolled out four A/B tests on a version of one screen in the app. These tests are complicated for lots of reasons — they test multiple changes made on the same screen, they alter code that is not often changed, and they require a lot of moving parts to be coordinated all at once between designers and engineers.
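
One common way to keep several tests on the same screen from stepping on each other is to make them mutually exclusive, so each user is enrolled in at most one. The post doesn’t describe Strava’s infrastructure at this level of detail, so this is just an illustrative sketch:

```python
import hashlib

def exclusive_experiment(user_id: int, layer: str, experiments: list[str]) -> str:
    """Route each user to exactly one experiment within a shared 'layer',
    so tests that touch the same screen never overlap for a single user."""
    digest = hashlib.sha256(f"{layer}:{user_id}".encode()).hexdigest()
    return experiments[int(digest, 16) % len(experiments)]

# Four hypothetical tests on the same onboarding screen:
active = exclusive_experiment(
    user_id=12345,
    layer="onboarding_screen_v2",
    experiments=["copy_test", "layout_test", "cta_color_test", "tutorial_test"],
)
```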

One unexpected realization we encountered during this process was that testing cadence (how long you need to run a test in order to reach a statistically significant result) can be different from engineering cadence (how much time it takes to write the test and get it ready to roll out). Balancing the two forces us to make a lot of decisions that weigh speed against the reliability of what we learn.
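
As a back-of-the-envelope illustration of testing cadence (all rates and traffic numbers here are assumptions, not Strava’s): the sample size needed to detect a small lift at 95% confidence and 80% power can dwarf the twenty minutes it took to build the test.

```python
from math import ceil

def per_variant_sample_size(base_rate: float, relative_lift: float,
                            z_alpha: float = 1.96, z_power: float = 0.84) -> int:
    """Approximate users needed per variant for a two-proportion test
    at 95% confidence (z_alpha) and 80% power (z_power)."""
    delta = base_rate * relative_lift
    p_bar = base_rate + delta / 2                  # average rate across arms
    return ceil(2 * (z_alpha + z_power) ** 2 * p_bar * (1 - p_bar) / delta ** 2)

n = per_variant_sample_size(base_rate=0.05, relative_lift=0.08)   # ~48,000 users
days = (2 * n) / 20_000   # ~5 days if roughly 20k eligible users/day (assumed)
```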

Of course, academic research and A/B testing within a tech company do share a lot: both are iterative, continuous processes of making observations, developing hypotheses, operationalizing concepts, writing tests, then gathering findings and improving your understanding with each cycle. In both cases, the process is satisfying and exciting.

Big thanks to Varun Pemmaraju, who organized the Intern Blog Series all the way from the Himalayas, and to my technical mentor Jason van der Merwe and manager Evelyn Cordner.