Why analytics teams fail

Eric Seufert
8 min read · Apr 11, 2019


This article was originally published on Mobile Dev Memo

That analytics is an integral function for consumer mobile tech companies is a fairly uncontroversial notion: the sheer size of the market for apps and the relative ease of app distribution (putting aside discovery issues) mean that reaching scale on mobile can involve usage by hundreds of millions or even billions of people. With a potential user base that large, optimizing a product based on actual user feedback rather than product manager intuition becomes an organizational imperative.

Put another way, when an app developer is able to collect massive amounts of data from real users via their always-connected devices, that data and the company’s ability to interpret it for the sake of optimizing the product are a substantive competitive advantage. Combine this reality with the fact that freemium is the dominant business model on mobile, and the analytics function for app developers can be seen as almost symbiotic with the product development function.

That said, analytics teams at mobile consumer tech companies very frequently fail. By fail, I mean they settle into a non-optimal workflow with product teams that more or less subverts their ability to produce value and directly increase revenue. For mobile app developers, and especially freemium app developers, the analytics function is not a cost center but a revenue driver: when analytics teams aren’t given the freedom and authority to focus on projects and initiatives that generate revenue, they often wither away into support roles that bring very little benefit to the companies that employ them.

Three common reasons that analytics teams fail are: 1) lack of agency and authority, 2) lack of investment into infrastructure, and 3) improper placement in the organization (in terms of reporting lines and accountability). Before fleshing out these thoughts, though, the term “analytics team” should be defined in more detail. In this article, “analytics team” refers not only to the team responsible for analytics architecture and tools development (e.g., the data collection and storage backend, the dashboard interface, query tools, etc.) but also to the team that conducts product analysis (e.g., analysts, researchers) and builds data tools (e.g., data scientists). In practice, these roles aren’t always staffed on the same team (I spell out the distinction between the various analytics-related roles in this article), but for the purposes of this article they’re all grouped together under the umbrella term “analytics team”.

Lack of agency and authority

If the analytics function is thought of as a value driver for the business, then it makes sense that it should be given the authority to prioritize tasks and define its own roadmap. But this is often not the case: in many companies, the analytics function is forced into a “reactive” role with respect to product development, in which new features are added to the product or a new product is launched and the analytics team is left to prioritize the measurement and support of those changes.

This is problematic for a number of reasons. The first is that, in a data-driven organization, the analytics team should have an important voice in deciding which features or products are developed in the first place (based on analysis and testing). If the analytics function is seen as removed from this process, that is, from using data to determine how the product can best be improved, then much of its time will be tied up in measuring the impact of things it wasn’t able to influence (and thus it will react to product changes rather than drive them). Without the agency to set its own agenda and roadmap, the analytics team will be occupied by backward-looking analysis of product changes that have already been made rather than forward-looking analysis that might inspire new ones (and thus the “data-driven” company becomes the “intuition-driven” company with a lot of data).

The second reason is more subtle but nonetheless significant: when analytics teams are not empowered to prioritize their own tasks, ad-hoc analysis from the product teams becomes their entire remit (or at least a very significant portion of it). By ad-hoc analysis, I mean the types of requests that analytics teams receive from product teams for information that may be interesting to know but is merely descriptive: things like simple counts of users in a certain geography, average session times for users that joined from a specific marketing channel, or the average age of users that have spent some amount of money in the app. These types of numbers are trivia, not insight, and I call this work “first-order analysis”: answers to “what” questions (e.g., what is the size of our user base in South America?) that describe the product as it exists today but can’t help to improve it.
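To make the distinction concrete, here is a minimal sketch of what first-order, purely descriptive queries look like in practice. It assumes a hypothetical user-level dataset with columns like country, acquisition_channel, session_minutes, age, and lifetime_spend; none of these names come from the article, and the point is only that each answer is a single descriptive number.

```python
# A sketch of first-order ("what") analysis: descriptive counts and averages.
# Assumes a hypothetical CSV export of user-level data; all column names are illustrative.
import pandas as pd

users = pd.read_csv("users.csv")

# "What is the size of our user base in South America?"
south_america = ["AR", "BO", "BR", "CL", "CO", "EC", "PY", "PE", "UY", "VE"]
sa_user_count = users[users["country"].isin(south_america)].shape[0]

# "What is the average session time for users acquired from a specific channel?"
avg_session_minutes = users.loc[
    users["acquisition_channel"] == "paid_social", "session_minutes"
].mean()

# "What is the average age of users who have spent money in the app?"
avg_payer_age = users.loc[users["lifetime_spend"] > 0, "age"].mean()

print(sa_user_count, avg_session_minutes, avg_payer_age)
```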

The “second-order analysis” projects, those that answer “why” and “how” questions, are the ones that drive product improvements, and they should occupy the vast majority of the analytics team’s time. But when an analytics team isn’t empowered to say “no” to requests (and thus prioritize its own backlog of tasks), second-order analysis projects are almost always sidelined in favor of first-order tasks that are deemed more immediately pressing (“How many users per day are using the new feature we just pushed in the last release?”). First-order requests almost always beget more first-order requests (“That’s an interesting result — I wonder if it’s the same in country X, can you check?”), and so these small, immediate tasks balloon into days-long back-and-forth query-and-report sessions.
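By contrast, a second-order project starts from a hypothesis and ends with a recommendation. The sketch below assumes a hypothetical A/B test of a redesigned onboarding flow with illustrative numbers; it shows the shape of that work: state an assumption about why retention should improve, test it, and let the result inform the product roadmap.

```python
# A sketch of second-order ("why"/"how") analysis: testing a hypothesis about a
# product change. The experiment, counts, and variant names are illustrative only.
import numpy as np
from scipy import stats

# Hypothesis: the redesigned onboarding flow improves day-7 retention.
control_retained, control_exposed = 4_120, 20_000
variant_retained, variant_exposed = 4_390, 20_000

p_control = control_retained / control_exposed
p_variant = variant_retained / variant_exposed

# Two-proportion z-test using the pooled standard error.
p_pooled = (control_retained + variant_retained) / (control_exposed + variant_exposed)
se = np.sqrt(p_pooled * (1 - p_pooled) * (1 / control_exposed + 1 / variant_exposed))
z = (p_variant - p_control) / se
p_value = 2 * (1 - stats.norm.cdf(abs(z)))

print(f"control {p_control:.2%} vs variant {p_variant:.2%}: z={z:.2f}, p={p_value:.4f}")
```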

This is the kiss of death for an analytics team: highly skilled analysts don’t want to be running simple SQL queries or doing data entry into a Google Doc for eight hours per day, and they don’t stay long in positions that require them to do so. Unless an analytics team is able to push back against requests that aren’t hypothesis-driven (that is, requests that aren’t accompanied by at least some set of assumptions as to why the work might provide value), the team will fail.

Lack of investment into infrastructure

This point is somewhat of a corollary to the first. When analytics teams aren’t given the resources to build a robust infrastructure that the entire organization can use to conduct first-order analysis (answer the “what” questions), that analysis falls to them and ends up occupying a significant portion of their time. Simple trivia should be retrievable from some sort of analytics interface, be it a dashboard, an off-the-shelf analytics tool, or even a relational database. When this isn’t possible, the analytics team is tasked with the type of menial work that is better suited to machines: simple counts, averages, sums, etc. Humans are better equipped than machines for creative tasks and critical thinking, and so organizations should leverage them for that kind of work; machines are better equipped than humans for quick calculation, and so they should be tasked with that.

For the analytics team to provide real value and drive revenue growth, it needs to be focused on the second-order analysis that can produce product changes and new features: analysis of trends, testing of hypotheses, etc. But the ability to focus on this type of work is only afforded when first-order analysis is either automated (encapsulated in some front-end tool that the rest of the organization can access) or removed from the purview of the analytics team altogether.

And so here, infrastructure doesn’t only mean tools and back-end systems that allow data to be accessed by the entire organization; it also means training for the people in the organization who consume first-order analysis, so that they can retrieve the data themselves. It’s not unreasonable to expect a product manager or marketer at a consumer technology company to be able to run simple SQL queries; if training is needed to accommodate that, then the analytics team should invest its time in providing that training (or convince company management to invest in external training programs).

That said, the analytics team is responsible for providing the underlying platform from which this data can be queried, and in most cases this is not only a considerable undertaking but also somewhat mundane and uninteresting work (e.g., building a script to normalize user data and aggregate it into relational database tables). But the work that depends on this data existing and being accessible is important, or at least very often urgent, and avoiding the investment required to bridge accessible, actionable data with the people who need it simply means putting the analytics team on a SQL treadmill from which it might never be able to step off.
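As a rough illustration of that mundane but necessary plumbing, the sketch below normalizes a hypothetical raw event log and aggregates it into a relational table that dashboards and ad-hoc SQL can query directly; the file names, column names, and SQLite target are all assumptions, not a prescription.

```python
# A sketch of the unglamorous infrastructure work described above: normalize raw
# event data and aggregate it into a relational table the organization can query.
# File, table, and column names are hypothetical.
import pandas as pd
import sqlite3

conn = sqlite3.connect("analytics.db")

# Load raw, denormalized event logs (one JSON object per line).
events = pd.read_json("raw_events.json", lines=True)

# Normalize: clean country codes, parse timestamps, drop malformed rows.
events["country"] = events["country"].str.strip().str.upper()
events["event_time"] = pd.to_datetime(events["event_time"], errors="coerce")
events = events.dropna(subset=["user_id", "event_time"])

# Aggregate into a daily-active-users table keyed by date and country.
dau = (
    events.assign(date=events["event_time"].dt.date)
    .groupby(["date", "country"])["user_id"]
    .nunique()
    .reset_index(name="dau")
)

# Write the aggregate where dashboards and ad-hoc SQL queries can reach it.
dau.to_sql("daily_active_users", conn, if_exists="replace", index=False)
conn.close()
```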

Improper placement in the organization

Given the importance of analytics in the product development process for mobile apps, it makes sense that the analytics team is closely aligned with (but independent of) the product team. If the analytics team reports directly into the product unit, then its independence and authority (and empowerment to say “no”) are undermined; likewise, if the analytics team is too removed from the product development function, then it runs the risk of becoming a distant “task black hole” into which requests are sent and never seen again.

The product and analytics teams need to work closely but independently; the analytics team should be able to receive fully-formed requests from the product team and prioritize them (or reject them) at its discretion. This is often accomplished with a configuration that sees analysts embedded with the product teams but reporting into a Head of Analytics.

Regarding to whom the Head of Analytics reports, there are many workable designs but only a few that are almost universally unproductive. As a general rule, the more intimately familiar the Head of Analytics’ manager is with mobile product management and the broader app economy, the more that person will value analytics and leave the team to its own devices. It can be awkward to have the Head of Analytics report into, for instance, the CFO, because that person may have trouble evaluating the importance of the analytics team’s work (and thus is less likely to create political cover for the team in disputes). On the other hand, when the Head of Analytics reports to the CTO, solutions to problems tend to take the form of proprietary tools and analytics products even when off-the-shelf options are available.

The specific configuration and seating arrangement matter less than the analytics team’s ability to make prioritization decisions independently and to focus on the second-order analysis that drives product decisions and generates real value. Without that independence, both to prioritize its own work and to make investment decisions around infrastructure and toolsets, the analytics team runs the risk of failing. And, again, given the absolute necessity of thorough analysis and muscular analytics infrastructure for app developers, a failed analytics team can lead to a failed company.


Eric Seufert

Quantitative Marketer. Author of Freemium Economics (Elsevier 2014). Blogger at Mobile Dev Memo. All views strictly my own.