The Who and the How: Using Cohort Analysis and A/B Testing To Increase Conversion

Kevin Wang
Earnest Product Management
Jun 21, 2016 · 5 min read

Do you want your company to achieve more [user, volume, revenue] growth and do it cost-effectively? Of course you do. Improving conversion is one of the few ways to achieve this outcome. Every company with a website or app has a conversion funnel, usually consisting of a series of stages. Uber’s rider funnel uses an acquisition channel to convince a person to download the app, create an account, and link a credit card, and finishes when they request their first ride. Earnest’s conversion funnel involves an acquisition channel, landing page, rate estimate, application submission, and loan decision, and finishes when a client signs their loan. We’ve used cohort analysis and A/B testing, separately and together, to greatly improve our conversion rate at every step of the funnel.
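To make the funnel concrete, here is a minimal sketch of a stage-by-stage conversion calculation. The stage names and counts are hypothetical, loosely modeled on the Earnest funnel described above:

```python
# Hypothetical counts of users reaching each stage of a funnel
# like Earnest's (numbers are illustrative, not real data).
funnel = [
    ("landing page", 10000),
    ("rate estimate", 4000),
    ("application submitted", 1200),
    ("loan decision", 900),
    ("loan signed", 600),
]

def step_conversion(funnel):
    """Conversion rate of each stage relative to the previous stage."""
    return {
        stage: round(count / prev_count, 3)
        for (_, prev_count), (stage, count) in zip(funnel, funnel[1:])
    }

print(step_conversion(funnel))
# {'rate estimate': 0.4, 'application submitted': 0.3,
#  'loan decision': 0.75, 'loan signed': 0.667}
```

Step-by-step rates like these tell you where the funnel leaks; the end-to-end rate (600/10000 = 6% here) is what the cohort tables below track over time.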

The Who

Use cohort analysis when you have distinct groups of users you want to understand. You can segment users based on geography, time, or other user characteristics. At Earnest, we use cohort analysis to understand portfolio delinquencies, pricing, and conversion over time.

We think of conversion rate as a KPI and report on it weekly. Examining conversion rate on a consistent basis allows you to address issues that arise before they can have an adverse impact on your business.

*Example Data — excludes acquisition channel conversion

Table-stakes conversion analysis tracks cumulative conversion based on the month of initial page visit. (Be careful if you’re cohorting on time: cohorting by application month instead would tell a very different story.) In this example, we can see that our funnel shows some incremental improvements from January through April, then a significant improvement in May that has been sustained through June. Like any good analytical PM, we’d want to know what might have caused this spike.
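Cohorting by month of initial visit can be sketched in a few lines. The visit records below are hypothetical; in practice each record would come from your analytics pipeline:

```python
from collections import defaultdict

# Hypothetical records: (month of initial page visit, converted within 30+ days)
visits = [
    ("2016-01", True), ("2016-01", False), ("2016-01", False), ("2016-01", False),
    ("2016-02", True), ("2016-02", True), ("2016-02", False), ("2016-02", False),
    ("2016-05", True), ("2016-05", True), ("2016-05", True), ("2016-05", False),
]

def conversion_by_cohort(records):
    """Cumulative conversion rate keyed by the month of initial page visit."""
    totals = defaultdict(int)
    conversions = defaultdict(int)
    for month, converted in records:
        totals[month] += 1
        if converted:
            conversions[month] += 1
    return {m: conversions[m] / totals[m] for m in sorted(totals)}

print(conversion_by_cohort(visits))
# {'2016-01': 0.25, '2016-02': 0.5, '2016-05': 0.75}
```

The key decision is what you cohort on: grouping the same conversions by application month instead of visit month would assign users to different buckets and change the picture.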

Two possibilities are that one of our channels improved conversion significantly or our mix of channels changed. Both of these hypotheses can be tested with cohort analysis.

This is a little hard to read, right?
Make the tradeoff of using less data to show the result you care about

Focusing on just the changes from April to May, we observe that Sources 1 and 3 clearly improved. This is hard to read when we use the same chart type as before; we get a clearer picture by showing only the cumulative conversion (30+ days) data points. Based on this, we would next investigate the channel mix, hoping to see more volume from Sources 1 and 3 in May, while also investigating what caused the drop-off in conversion from Source 2.
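The month-over-month comparison above amounts to a simple delta on the 30+ day conversion figures per source. The numbers here are hypothetical, chosen to mirror the pattern in the chart (Sources 1 and 3 up, Source 2 down):

```python
# Hypothetical 30+ day cumulative conversion by (month, source).
conv = {
    ("Apr", "Source 1"): 0.20, ("Apr", "Source 2"): 0.25, ("Apr", "Source 3"): 0.18,
    ("May", "Source 1"): 0.30, ("May", "Source 2"): 0.21, ("May", "Source 3"): 0.27,
}

def deltas(conv, before, after):
    """Month-over-month change in conversion for each source."""
    sources = sorted({s for (_, s) in conv})
    return {s: round(conv[(after, s)] - conv[(before, s)], 4) for s in sources}

print(deltas(conv, "Apr", "May"))
# {'Source 1': 0.1, 'Source 2': -0.04, 'Source 3': 0.09}
```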

The How

The How happens to be the band name of a The Who tribute band.

Silicon Valley has successfully evangelized A/B testing nearly to death. Traditionally, A/B testing has been narrowly applied to layout, text, image, and flow changes. In contrast, I believe that the focus of A/B testing should encompass all the different ways your users interact with your company.

Source: vwo.com — The typical A/B test

There’s nothing wrong with limiting your initial set of tests to changing button colors or going from a three-column layout to a two-column one. However, unless you have a dedicated growth team that is committed to testing, the big wins dwindle quickly, leaving only small wins and big redesigns. These should and will remain useful strategies in the toolbox of any growing company, but they’re not enough.

You’ll notice that earlier, when I described A/B testing, I did not talk about interacting with the website; I said it concerned how users interact with the company. That’s because Earnest is more than just a website or a mobile app. There are many moving pieces in running an online lending institution. We’ve become data-informed and data-tested on both operational and technical aspects of doing business. We have applied the principles of A/B testing to pricing, to the channels we use to communicate and respond to our clients, and to how we organize and process applications.

If you are not testing new processes and non-client-facing changes, you are leaving growth on the table.

For example, we’ve considered whether to ask applicants for additional information using free-form questions or using pre-defined language. The former provides us flexibility to ask very specific questions whereas the pre-defined questions have been optimized for internal processing. What matters in the end is how quickly we can arrive at an accurate decision. We can identify the “A” (free-form) and the “B” (pre-defined) of the test as well as the metric (time to decision). We tried both tactics for a couple of weeks and regrouped with data that validated one method as more effective than the other. Changes like this are not traditional A/B tests, but these changes show up in the number of people who decide to take an Earnest loan and love the process.
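The structure of that test can be sketched as follows. The time-to-decision samples are hypothetical, as is the outcome; the point is just that an operational process has an "A", a "B", and a metric, exactly like a button-color test:

```python
from statistics import mean

# Hypothetical hours-to-decision samples gathered during the test window.
free_form = [52, 61, 48, 70, 55, 66]      # "A": free-form questions
pre_defined = [40, 44, 38, 50, 42, 46]    # "B": pre-defined language

def compare(a, b):
    """Compare the two arms on the chosen metric: time to decision (lower wins)."""
    ma, mb = mean(a), mean(b)
    return {"A_mean_hours": round(ma, 1), "B_mean_hours": round(mb, 1),
            "winner": "B" if mb < ma else "A"}

print(compare(free_form, pre_defined))
```

With samples this small you would also want a significance check before declaring a winner, but the mechanics of the comparison are the same.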

The WhoHow

^This band name is on the house.

How can you use A/B testing and cohort analysis in conjunction with one another? There’s awareness and then there’s augmentation. Sometimes, you may find confounding results in your cohort analysis if you are unaware of A/B tests that have been implemented. You may need to exclude test subjects or split them out as a separate group during analysis to control for the impact of particularly successful or negative tests.

You can augment your A/B testing with cohort analysis. This is especially important once your company has users or customers from diverse backgrounds. For instance, an A/B test might show you that you improved conversion by 32% (great job!), but upon further analysis, it improved conversion of users coming in from paid advertising by 50% while decreasing organic conversion by 10%. The blended number is still positive, but using cohorts gives you a far more detailed picture and, in this case, would allow you to tailor two different experiences for these two groups (serving the winning variant only to paid traffic), thereby improving conversion not by 32% but by 35% instead!
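A minimal sketch of that segment breakdown, with hypothetical counts chosen to reproduce the figures above (+50% paid, -10% organic, +32% blended):

```python
# Hypothetical counts: (conversions, visitors) per (segment, variant).
results = {
    ("paid", "A"): (350, 5000), ("paid", "B"): (525, 5000),
    ("organic", "A"): (150, 5000), ("organic", "B"): (135, 5000),
}

def lift(results, segment):
    """Relative conversion lift of B over A within one cohort."""
    ca, na = results[(segment, "A")]
    cb, nb = results[(segment, "B")]
    return round((cb / nb) / (ca / na) - 1, 3)

def overall_lift(results):
    """Relative lift of B over A with all cohorts blended together."""
    ca = sum(c for (_, v), (c, _) in results.items() if v == "A")
    na = sum(n for (_, v), (_, n) in results.items() if v == "A")
    cb = sum(c for (_, v), (c, _) in results.items() if v == "B")
    nb = sum(n for (_, v), (_, n) in results.items() if v == "B")
    return round((cb / nb) / (ca / na) - 1, 3)

print(overall_lift(results))       # 0.32  -> the headline "+32%"
print(lift(results, "paid"))       # 0.5   -> paid traffic actually gained 50%
print(lift(results, "organic"))    # -0.1  -> organic traffic lost 10%
```

Serving B only to paid traffic and keeping A for organic would give 525 + 150 = 675 conversions against a 500-conversion baseline, which is the 35% figure: the per-cohort view is what reveals that option exists.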

It’s critical for your business to understand who your users are and how they are using your product. Neither of these tools will tell you why — that relies on intuition and research. By using cohort analysis and A/B testing in conjunction with some understanding of The Why (also a band name, but not related to The Who or The How), your team should be able to make proactive choices and retroactive analyses that make drastic improvements to how well your users interface with your company.

Kevin is a Product Manager at Earnest. He’s interested in growth and the different avenues to achieve it.

Earnest is a San Francisco-based technology company building a modern bank for the next generation. Our Product Management team is a nascent group of technical, entrepreneurial, jacks and jills of all trades, and we are actively hiring!
