How to improve your app metrics in weeks with Play Store Experiments
It’s no secret that distribution is key for app developers. No matter how great your app experience, if you can’t encourage people to download, you have no users to show it to.
The Play Store has long been a developer-friendly platform. We are able to update pretty much anything in our listings without having to reship an app (which… ahem… is not the case for all mobile stores). Recently, Google has done app developers an even bigger favor by enabling us to optimize those listings with an Experiments tool.
Needless to say, the Dashlane team jumped on this feature right away. Here is a walkthrough of our experience with the Play Store Experiments tool so far, including tips on what to try and what to avoid.
Why improving your Play Store page should be a priority
First of all, let me try to explain the value Play Store optimization can bring to your business (for those of you who are already convinced, feel free to skip to the next section).
Let’s say you spend $1,000 on marketing to get 1,000 visitors to your Play Store app page. Of course not all of them are going to download your app. And some of those who do will never open it. Those who do might not create an account. And last but not least, a big proportion of those who make it that far will stop using your app after 5 minutes. So your funnel might look something like this:
You can try to optimize onboarding to boost retention at the bottom of this funnel. Note that to do that, you will probably need to build an A/B testing framework (or pay for an existing service such as Apptimize or Optimizely), run different tests, and implement the winners. It will take some time before you see any results. If at the end of that process you are able to boost retention by 10%, your funnel might look something like this:
Alternatively, you could start by optimizing your Play Store listing. This is a different game entirely. It requires work, but no developer resources or implementation, and Google has provided you the framework to do it for free. If you are just as successful here and manage to improve visitor-to-download rates by 10%, here's what your funnel will look like:
With less effort you could finish with better results. This has a huge impact on your entire funnel. Take a look at the impact of these two different approaches on the dollars spent to acquire a retained user:
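To make the arithmetic concrete, here is a minimal sketch of the comparison. The conversion rates below are hypothetical placeholders (the article's funnel figures are illustrative, not Dashlane's real numbers); the point is that a 10% relative lift at the top of the funnel yields the same number of retained users as a 10% lift at the bottom, for far less implementation effort:

```python
# Hypothetical funnel: all rates are illustrative, not real Dashlane numbers.
SPEND = 1000.0    # marketing dollars
VISITORS = 1000   # Play Store page visitors that spend buys

def retained_users(install_rate, signup_rate, retention_rate):
    """Walk visitors down the funnel and return the count of retained users."""
    installs = VISITORS * install_rate
    signups = installs * signup_rate
    return signups * retention_rate

# Baseline funnel: 30% install, 50% create an account, 40% stay active.
base = retained_users(0.30, 0.50, 0.40)            # 300 -> 150 -> 60 users

# Option A: onboarding work gives a 10% relative lift on retention.
opt_a = retained_users(0.30, 0.50, 0.40 * 1.10)    # 300 -> 150 -> 66 users

# Option B: listing work gives a 10% relative lift on installs.
opt_b = retained_users(0.30 * 1.10, 0.50, 0.40)    # 330 -> 165 -> 66 users

for name, users in [("baseline", base), ("onboarding +10%", opt_a),
                    ("listing +10%", opt_b)]:
    print(f"{name}: {users:.0f} retained users, "
          f"${SPEND / users:.2f} per retained user")
```

Both options land at the same cost per retained user in this toy model, but option B also puts more people into the funnel, which makes every downstream test faster to run.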
Of course this is a simplified view that does not take into account factors such as the intent of your new users, but one thing is certain: the more people you get in the funnel the better optimization tests you can run afterwards.
What we learned from our first Play Store Experiments
We don’t claim to be Play Store scientists — but here’s what we’ve discovered in running our experiments — the successes and the failures.
Test 1: The icon
Google released Experiments just after we launched the redesign of our Dashlane app following Google Material design guidelines. At that moment we hadn’t touched our app icon, making it the perfect candidate for our first test. We set our regular icon (control) against a materialized makeover and a brand new design that we thought might better build trust in our app.
Both new icons performed better than the control, with a slight win for the trustworthy shield design, which got us 3.8% more installs. Interestingly, the gap between our two test variants was small: a Material-style look clearly matters more to our visitors than the specific design. As soon as we read the final results, we were able to switch the icon for all Play Store visitors instantly.
Test 2: The video (the failed test)
When I said that you don’t need an analytics team to use Experiments I meant it — our tests are planned, run and read directly by product marketing. But that doesn’t mean that anything goes, and we’ve made some mistakes too.
We thought that video would be an effective way to explain our app to Play Store visitors. This might seem like a no-brainer, but in past website tests we’ve found that video can actually harm conversion.
We ran our experiment with three test cells:
Already spotted what went wrong here? Yes, we made too many changes at once to understand what had an impact on the metrics. We made some basic (wrong) assumptions: we focused on making our feature graphic 'compatible' with the video, and took the opportunity to update our tagline with copy that was performing better on other channels. We did not think that changing the tagline alone could affect the metrics independently of the video, but it actually harmed them, on top of making the results unreadable.
Anyway, the flexibility of the tool meant that we could easily revert our listing and launch a new test.
Test 3: The video (for real)
This time we prepared our test cells to get to the heart of the matter: was it the change to our feature graphic design or copy which had a negative effect on conversion?
Good news for us (zero implementation doesn't mean zero work): our new video was a winner, attracting 4% more downloads than the control. Taken together, 'Test 2: The video' and 'Test 3: The video (for real)' show that copy mattered more than the visuals. Once we reverted the tagline, the control image performed slightly better than the one featuring the new design, a stark difference from the disappointing metrics the new copy brought us in the first test. With the messaging left alone, we could see that the video variant was the most popular, and we rolled it out universally on the Play Store.
Test 4: The tagline
It is really hard to predict how people will react to product messaging, so in this experiment we gave our tagline the dedicated attention it deserves. We thought that by describing the product benefit in clear, jargon-free language, we would motivate more people to download the app. We use some of this language in our video and elsewhere in product marketing, so Experiments offered a good opportunity to test how real people respond.
This test confirmed that the tagline has a real impact, but it didn't give us a quick win over our control: we lost 6% in downloads with the new copy. Remember, even if you aren't focused on optimizing your page, you should always avoid making blind changes. You may end up harming conversion without ever realizing why.
A word of advice (and a wish list for Google)
This is no A/B testing 101 course (there are plenty of great resources available on the web), but I will leave you with some key takeaways from what we’ve learned in our team:
- The Play Store allows you to test all elements of your listing in any combination, but that doesn’t mean you should. If there’s any uncertainty on the conclusions you will be able to draw when your test is over, strip it back down and keep it simple.
- When it comes to product positioning there is no place for gut feelings. By now you are so biased towards your app that your instincts are the last ones to trust. If you find yourself or a colleague saying “but we know that…” without any proof — challenge it, and test anyway.
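One more piece of the "test, don't trust your gut" mindset is knowing whether a lift is real or noise. The Play Store console does this statistics for you, but for intuition, here is a sketch of the standard two-proportion z-test you could apply to any conversion experiment. All the visitor and install counts below are made up; they merely illustrate a roughly 3.8% relative lift like our icon test:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z-statistic for the difference between two conversion rates.

    conv_a/n_a: conversions and visitors in the control cell.
    conv_b/n_b: conversions and visitors in the variant cell.
    """
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled rate under the null hypothesis that both cells convert equally.
    p = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical counts: control icon vs. shield icon (~3.8% relative lift).
z = two_proportion_z(9000, 30000, 9342, 30000)
print(f"z = {z:.2f}")  # |z| > 1.96 roughly means significant at the 95% level
```

The same lift that is convincing at 30,000 visitors per cell would be inconclusive at 10,000, which is one reason to let an experiment run its full course before declaring a winner.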
We’re happy with this tool, and today we are constantly running experiments on the Play Store; I don’t see any reason why that would stop.
Although the tool is already pretty awesome, it does not mean we cannot wish for more. There are two things we hope to see soon in the Developer Console. First, we would like visibility of not only our download numbers, but of the visits to our listing. And second, we would love to track Experiment test cells all the way into our app. In a nutshell: which visitors, seeing which versions of our Play Store, become the most active users?
While we wait, this is a great tool and I encourage you to spend some time playing with it. If you liked this post, please recommend it. If you’re using Play Store Experiments, what are you testing? What’s working? Share your lessons learned.
Oh and if you need a great Password Manager, feel free to have a look at Dashlane on the Play Store (that will help us run more A/B tests :) )