Just because it’s Google doesn’t mean it’s right

Alex Austin
Aug 11, 2015


I love data and the stories that can be told with it. Probably too much. I actually look forward to the weekends before Branch’s bi-monthly board meetings, where I can sit down for 48 hours straight and synthesize the story of our business through visualizations and plots.

Unfortunately, data can be a very destructive tool when used inappropriately, such as in Google’s latest ‘experiment’ around mobile web interstitials for Google+ on iOS. In case you hadn’t seen it, here is a link to the short release. The post has generated quite a bit of traffic, so it’s only fair to evaluate its validity.

To help explain what was wrong, let me first give a quick summary of the boiled-down experiment so that we’re all on the same page:

Problem statement: 69% of users abandon the site on the interstitial, and 9% of people click to the app, with 22% continuing on to the mobile site.

Hypothesis: The current interstitial is turning people off

Experiment: Turn the interstitial off

Measured response: -2% total iOS installs, +17% 1-day active users on the mobile site

The author concludes from this data that the interstitial should be eliminated, and the internet rejoices. Here’s the top comment:

“In other words nobody likes app download interstitials, so stop.”

Now, personally, I hate these generic interstitials (so I’m fine with removing them), but this is a really poor experiment. The conclusion simply doesn’t follow from the collected data. The measured responses are second- and third-order proxies of the things that would be directly influenced by this change, and they could be moved by any number of totally unrelated events (a popular Google+ post during the experiment, etc.). For example, how many of the total iOS installs were actually driven by the interstitial? It appears to be a very small portion of the App Store traffic, so small that the noise from other install sources could easily have drowned out any signal from the experiment.
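
To see how easily that signal gets drowned, here’s a back-of-envelope sketch in Python. Every number in it is a hypothetical assumption of mine (install volume, interstitial share, daily variance), not Google’s data; the point is the shape of the problem, not the magnitudes.

```python
# Back-of-envelope: all numbers below are assumed for illustration.
daily_installs_total = 100_000   # assumed installs per day from all sources
interstitial_share = 0.03        # assumed share driven by the interstitial
daily_variance = 0.05            # assumed day-to-day swing in total installs

signal = daily_installs_total * interstitial_share
noise = daily_installs_total * daily_variance

print(f"Installs the interstitial could possibly move: {signal:,.0f}")
print(f"Typical daily swing in total installs: +/- {noise:,.0f}")
# Under these assumptions, even deleting the interstitial outright moves
# total installs by less than a normal daily swing, so 'total iOS installs'
# cannot isolate the interstitial's effect.
```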

The most offensive aspect of this whole post is that the experiment is not tied to any business goal whatsoever. How does Google+ measure the success of the product? From the way it’s written, it seems the team believes that 1-day unique page views on the mobile site are equivalent in value to an app install on iOS. At Branch, I work with thousands of apps to convert mobile web traffic into app users, and I can guarantee you that no one in their right mind considers mobile page views and app installs equivalent.
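
To put numbers on that, here’s the trade implied by the funnel stats quoted at the top, worked out per 100 interstitial impressions. The 69/9/22 split is from Google’s post; everything else is just arithmetic.

```python
# Per 100 interstitial impressions, from the post's own numbers:
abandon, install, continue_web = 69, 9, 22   # sums to 100

# If the interstitial disappears and those visitors land on the mobile
# site instead, the rough per-100-impression trade is:
installs_at_risk = install               # up to 9 installs lost
extra_web_sessions = abandon + install   # up to 78 extra web sessions

print(f"Installs at risk: up to {installs_at_risk} per 100 impressions")
print(f"Extra mobile-web sessions: up to {extra_web_sessions} per 100 impressions")
# Whether that trade is a win depends entirely on the relative value of
# an install vs. a web session, which the post never establishes.
```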

To do this experiment correctly, you first need to understand what your goals are. In general, most businesses have found that mobile web users do not retain or convert at anywhere near the rates of native app users. Ask anyone who has both a mobile site and an app, and you will hear the same thing. This is why these annoying interstitials appeared in the first place: everyone is trying to convert that crappy mobile web traffic into high-value app users, no matter the cost. For some apps, the end goal is for users to complete a purchase; for others, it’s for them to upload a photo. In other words, for this experiment to work, you have to pick a down-funnel metric to compare.

Second, your experiment needs to compare only traffic that originated from a source that would have seen the interstitial. You can’t just look at the total number of key events that occurred that day, since those can originate from many sources; you need attribution back to the distinct channel you’re testing. Simply put, you’ll want to compare these two numbers (a code sketch of the comparison follows the list):

1. Baseline conversion from a page view that showed the interstitial to your key metric on the mobile site + the mobile app

2. Conversion from a page view without the interstitial to that same key metric on the mobile site + the mobile app
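
Here’s a minimal sketch of that comparison in Python. The Session schema and the “google_search” channel name are my own stand-ins, not anything from Google’s post; the point is that both sides are restricted to the tested channel and measured against the same down-funnel metric.

```python
from dataclasses import dataclass

@dataclass
class Session:
    source: str       # acquisition channel, e.g. "google_search" (hypothetical)
    variant: str      # "interstitial" or "no_interstitial"
    converted: bool   # did this user reach the key down-funnel metric?

def conversion_rate(sessions: list[Session], variant: str, channel: str) -> float:
    """Conversion to the key metric, restricted to the channel under test."""
    cohort = [s for s in sessions if s.variant == variant and s.source == channel]
    if not cohort:
        return 0.0
    return sum(s.converted for s in cohort) / len(cohort)

# Apples to apples: same channel, different treatment.
# baseline = conversion_rate(log, "interstitial", "google_search")
# test     = conversion_rate(log, "no_interstitial", "google_search")
```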

Lastly, there needs to be a time component to this experiment so that you can compare the value of a user over time. It’s common knowledge that native app users retain better, so you need to measure the total impact on retention for each split.
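
A sketch of that time dimension, again with an assumed event schema (the first_seen and activity structures below are illustrative): compute day-N retention per split and per surface instead of counting a single day’s page views.

```python
from datetime import date, timedelta

def day_n_retention(first_seen: dict[str, date],
                    activity: set[tuple[str, date]],
                    cohort_day: date, n: int) -> float:
    """Share of users first seen on cohort_day who were active n days later."""
    cohort = [user for user, seen in first_seen.items() if seen == cohort_day]
    if not cohort:
        return 0.0
    target = cohort_day + timedelta(days=n)
    returned = sum((user, target) in activity for user in cohort)
    return returned / len(cohort)

# Run this per split (interstitial vs. no interstitial) and per surface
# (mobile web vs. app) at n = 1, 7, 30 to see where users actually stick.
```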

At Branch, we created a full-stack, trackable, deep-linking smart app banner that lets companies run these kinds of experiments and measure the direct impact on their traffic and their business. We provide this service for free, so my suggestion would be to run the experiment yourself. Do not use the data in Google’s blog post to make any business decisions.


Alex Austin

I was constructed from remnants of HAL 9000 and Optimus Prime. I’m currently working on Branch links @branchmetrics to help mobile devs with app discovery.