iOS 9 adoption and Mixpanel

Related: Apple Topic Page, and a previous post on iOS and Android adoption patterns, as well as two earlier posts (1, 2) on Android versions in use.

Last week, following the release of iOS 9 by Apple, Mixpanel (along with other analytics firms) began releasing data relating to the pace of adoption of iOS 9. That data suggested that iOS 9 was being adopted more rapidly than iOS 8, and also that it had reached around 30% by the end of the day on Saturday. Then, this morning, Apple issued a press release about the new iPhones, but in passing noted that iOS 9 was now on more than 50% of devices, based on data from Saturday, September 19th. That’s a fairly sizable discrepancy, and it made me want to dig into the numbers to understand what was going on.

Note: I’ve reached out to both Mixpanel and Apple about this, and I will update this post as warranted once I hear back from them. As of right now, the analysis below doesn’t include any additional information from them beyond what they’ve put out publicly.

A word on methodologies

It’s worth starting with a quick statement of methodologies. Apple’s goal is to give developers a sense of the operating system versions their target audience is using, and so its measurement is based on devices hitting the iOS App Store (Google, incidentally, does the same thing). It generally seems to pick a specific single day, usually a Monday, and measures which versions of iOS the devices hitting the App Store are running.

Mixpanel, on the other hand, provides analytics to app developers to help them understand engagement around their apps, but in the process also collects lots of data on which operating systems the users of those apps are running. As with any analytics software of this kind, the picture will always be incomplete, but the bigger the base of devices, the more likely it is to be representative, and Mixpanel’s is fairly big at this point.

Mixpanel is generally very close from October to August

With that note on methodologies as context, the first thing to note is that Mixpanel is generally very good at approximating Apple’s own numbers for iOS adoption, even though their methodologies are different. For the period from October 2014 to August 2015, Mixpanel’s numbers generally tracked within about 4% of Apple’s own numbers for iOS 7 and iOS 8 adoption. Interestingly, Mixpanel tends to estimate higher usage for the latest version, and lower usage for previous versions, than Apple does.

September seems to be more of a problem

However, even though Mixpanel’s data tracks closely to Apple’s for most of the year, it tends to be quite a bit off the mark in September, immediately after the release of new iOS versions, at least for the last two years. The chart below shows the difference between Apple and Mixpanel’s adoption rates for iOS 7, 8, and 9. Negative numbers mean that Mixpanel’s rate is lower than Apple’s, while positive percentages mean Mixpanel’s numbers are higher.

Mixpanel and Apple iOS adoption rate differences

The chart shows several things that are worth noting:

  • As I mentioned, the margin of “error” (I’ll explain the quotation marks later) is generally under 5%, though as you can see it grows steadily from late October 2014 to September 2015.
  • However, the discrepancy between the two figures is much more significant for two dates — September 21, 2014, and September 19, 2015 — which happen to be the dates immediately after the launches of the new versions for the last two years. In both cases, Mixpanel’s adoption rate for the new version of iOS was far lower than Apple’s, in contrast to the usual pattern during the year.
  • The discrepancy quickly shrank last year — by early October the two numbers were very close again. We don’t know yet what will happen this year, of course.
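To make the chart’s sign convention concrete, here’s a minimal sketch of the difference calculation. All the adoption figures below are illustrative placeholders, not actual numbers from Apple or Mixpanel:

```python
# Hypothetical adoption rates (fraction of devices on the newest iOS
# version) on two dates; these numbers are illustrative only.
apple = {"2015-09-19": 0.52, "2015-10-05": 0.57}
mixpanel = {"2015-09-19": 0.33, "2015-10-05": 0.55}

# Difference as plotted in the chart: Mixpanel minus Apple, so a
# negative value means Mixpanel's adoption rate is below Apple's.
diff = {date: mixpanel[date] - apple[date] for date in apple}

for date, delta in sorted(diff.items()):
    print(f"{date}: {delta:+.1%}")
```

With these placeholder numbers, the large negative gap just after launch shrinks to a small one a couple of weeks later, which is the pattern the chart shows for last year.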

What explains the September discrepancy?

I’ve carefully avoided describing Mixpanel’s data as faulty above — I did use the term margin of error once, but carefully put “error” in quotation marks. And that’s because even Apple’s own numbers aren’t necessarily accurate in reflecting the devices actually in use. For developers, knowing what mix of devices hits the App Store is of course more important and relevant than knowing the total mix of devices out there. But it’s not necessarily an accurate picture of what people are using across the broad base of devices. Mixpanel’s data may actually be more representative of the actual base of devices in use, but there’s no way to know for sure; ultimately, it is also measuring something other than true adoption rates across the base.

So, having framed this as a discrepancy or difference rather than an error, what explains why the numbers are so close from October to August, and yet so far apart immediately after a new iOS version launches? Here are a few possibilities:

  • Apple’s numbers, which reflect App Store visits, are unduly skewed early on by the influx of recent updaters looking for apps that take advantage of new features — e.g. content blockers, multitasking, and in-app search in iOS 9. Once the initial rush has subsided, App Store visitors return to looking more like the overall base. Since we don’t have detailed day-by-day data from Apple, it’s hard to tell how quickly this effect wears off, but I suspect it may be a big part of the answer.
  • It’s possible that Apple’s numbers, which are global in nature, are more representative of true trends than Mixpanel’s, which may skew towards (or away from) particular countries. As a result, if users in China or other major markets which Mixpanel may not track as closely update iOS more quickly, Apple’s numbers might capture that whereas Mixpanel’s wouldn’t. As download rates catch up in other regions, the discrepancy would work its way through over time. I don’t know enough about Mixpanel’s data to know how much of an issue this is, but it might be a secondary factor.
  • Apple’s regular iOS adoption data is usually captured on a Monday, whereas its post-launch data for the last two years was captured at the weekend (a Sunday last year, and a Saturday this year). It’s possible that the mix of devices in use — and especially those hitting the App Store — on the weekend is different from those in use on a Monday, but this is unlikely to account for much of the discrepancy, especially since the Mixpanel data was collected on the same day.

Another thing that’s worth noting is that other sources of iOS adoption data tend to agree more with Mixpanel at this point than with Apple’s numbers — both Fiksu and David Smith’s Audiobooks app data tend to suggest adoption closer to Mixpanel’s than Apple’s, for example. So, either all these methods suffer from the same “problem” or Apple’s data is actually unrepresentative of the true base of devices out there, especially in these first few weeks. Until I hear more from either company, it’s going to be hard to know which is the case. But it’s certainly worth viewing Mixpanel’s data (and any other third-party data) in this context in the future, especially when it comes to the period immediately after a new version of iOS comes out.
