Action Analytics

A framework for data analytics best practices, team structure, and more.

Brandt Belson
Geek Culture


Analytics is essential in making sense of ever-growing mountains of data. Data analysts risk getting lost in this increasingly treacherous terrain without a destination and direction. Action Analytics is a framework that provides this clear destination and direction, namely to change actions. While this may sound simple at first, it leads to a number of useful and uncommon best practices for data analysts and decision makers.

Thought Experiment

Imagine analytics done “perfectly”: every split of data ready before anyone even asks; user behavior modeled with high accuracy; dashboards updated in real time; beautiful one-off reports prepared for investors; revenue projected with almost no error; ad attribution and CAC calculated to the penny. Although this sounds wonderful, all of the foregoing could be true and yet the company could be better off laying off everyone who made it happen. If every dashboard, analysis, and report confirms what was already known or suspected, what was the impact and value? None and zero. Had none of it been done, the same actions would have been taken (without paying for expensive engineers and analysts!). The real, and only, value of analytics is when it influences actions.

The Goal is to Change Actions

The more nuanced value of analytics is: over time, shift actions that lead to lower value toward actions that lead to higher value.

For those mathematically inclined (this is an analytics article, after all!), the net value of analytics is:

Net Value = (value of the result of the action taken) − (value of the result of the default action without analysis) − (cost of analysis)

Summed over many opportunities for action, higher is better. A few takeaways from this equation are below.

  • Confirming the default action has zero gross value, so the net value is negative after subtracting the resources spent on analytics.
  • Some analyses will lead to no change in action, and some will. This probability is averaged out when repeatedly applying the equation.
  • For high-value or high-stakes actions, e.g., pivoting a startup (maybe from B2C to B2B), the chance of influencing the action could be low but still worth doing an analysis. This can be justified if over many similar high-stakes analyses, at least a few are influenced so that in sum the net value is positive.
  • It’s not the number of actions changed, but the difference in value, that matters. Changing many low-stakes actions, or over-optimizing or over-analyzing high-stakes actions where the default was already very good, may not amount to a positive net value.
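As a toy illustration, the equation can be summed over a set of analysis opportunities. All values below are invented for the sketch; in practice they would come from estimates of business impact.

```python
# Toy sketch of the net-value equation, summed over several opportunities
# for action. All values below are invented for illustration.

def net_value(value_of_action_taken, value_of_default_action, cost_of_analysis):
    """Net value of one analysis, per the equation above."""
    return value_of_action_taken - value_of_default_action - cost_of_analysis

# Three analyses confirm the default action (each a small loss), while one
# changes a high-stakes action (a large gain), so the sum is positive.
opportunities = [
    (100, 100, 5),   # default confirmed: -5
    (100, 100, 5),   # default confirmed: -5
    (100, 100, 5),   # default confirmed: -5
    (500, 200, 20),  # action changed:  +280
]
total = sum(net_value(*o) for o in opportunities)
print(total)  # 265
```

This mirrors the takeaway above: most individual analyses can have a negative net value as long as the occasional changed high-stakes action more than pays for them.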

Ben Horowitz offers an insightful question: “What’s something you know that no one else knows?”. For this article, we restate the question as follows: “What’s something analytics knows that no one else knows?”.

In keeping with the broader point of this article, the abstractions so far are only as valuable as the actions they change. Next are ways these insights lead to definitive actions one can take to get the most value from data and analytics.

Analysis Before Actions — “What if this vs. that?”

Before performing an analysis, it is helpful to map how different results lead to different actions. In its purest form this is a decision tree, but it’s usually unnecessary to be so formal. A great start is asking, “What if we find this vs. that?”. Such a simple question should be easy to answer, but it often isn’t!

If the relevant result is a single number, then before doing any analysis, ask: “What if the number is on the low end of a reasonable range vs. the high end? What would each mean, and what different actions would result?”. Sometimes this thinking reveals that the result is irrelevant because no different actions would follow. An added benefit is that such “what if” thinking often uncovers the most crucial reasons for taking one action over another and what other analysis is worth doing, so a decision can be reached faster.

Avoid comically pointless analysis.

In addition to avoiding unnecessary analyses and uncovering what actually matters, a third benefit is the analyst understands what different actions could come from different results. Such an understanding guides choices on the technical details of the analysis, resulting in more relevant results and fewer costly back-and-forths with the final decision maker.

Thus, mapping results to actions with something like a decision tree, even an informal one, before doing an analysis provides these three main benefits.

  1. Determining if the analysis has potential to change actions.
  2. Uncovering a different, better analysis that could lead to changes in actions.
  3. An improved analysis due to a better understanding by the analyst of how the results lead to different actions. This knowledge guides the analyst’s choices when working with complex data.
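This pre-analysis mapping can be as small as a function from possible results to planned actions; if every branch returns the same action, the analysis is not worth running. A minimal sketch, where the metric name and the 2% threshold are hypothetical:

```python
# Hedged sketch: map each possible result to a planned action *before*
# running the analysis. The metric and the 2% threshold are hypothetical.

def planned_action(usage_rate: float, threshold: float = 0.02) -> str:
    """Pre-decided action for each possible value of the result."""
    if usage_rate >= threshold:
        return "fix the feature"
    return "remove the feature and route users to support"

# Walk the reasonable range of results. If all branches agreed, the
# analysis would change nothing and should be skipped.
for rate in (0.01, 0.04):
    print(f"{rate:.0%} -> {planned_action(rate)}")
```

The point is not the code itself but the discipline: writing the branches down forces the “What if this vs. that?” question to be answered before any data is pulled.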

As always, there are exceptions. Some high-level analyses may not be tied to a specific decision or action, but are used as part of many decisions. An example is the gender breakdown of customers. For these, the process above can be skipped because the results influence many actions in a partial way.

Another possible, but uncommon, exception is when there’s a known problem but all of the possible actions are not yet known and will be informed by the analysis. Such an open-ended analysis can be long and costly. It should at least start with a specific goal and a few possible actions, even if more could be discovered. It’s also best to have a sense of the possible upside to avoid spending more time on it than it’s worth.

Example

A simple example is a decision to fix or remove a way for users to update payment methods. The feature has lots of problems and most users don’t use it. An analysis of how often it is used seems reasonable, right? Maybe not. Thinking through the “What if” question reveals that even if the feature is hardly ever used, it can’t be removed, because no alternative way for users to update their payment methods exists yet. The analysis would have been irrelevant, with a negative net value. Before any analysis, options other than removing the feature need to be identified.

Continuing, say the “What if” question leads to a pre-decision: above 2% usage, fix the update-payments feature; below 2%, remove it and have users call for support. The analyst is asked to find the percentage of users who update their payment method, a very specific and simple request. The analyst does so without knowing why, and calculates (number of users who updated) / (total number of users), which is 1%. The troublesome feature is removed, yay! Yay? No, no yay. There’s no happy ending. The company is growing quickly, and most users are new, with no need to update their payment method yet. Had the analyst understood how results mapped to actions, they would have focused only on users who joined over a year ago, which would have shown that 4% updated their payment method. A costly mistake was made, and no one’s happy.
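The analyst’s mistake can be made concrete with a small sketch. The user records and dates below are fabricated to reproduce the 1% vs. 4% figures from the example:

```python
from datetime import date, timedelta

# Fabricated user records reproducing the example: 100 users total,
# 25 of whom joined over a year ago, and 1 tenured user who updated.
today = date(2021, 6, 1)
users = (
    [{"joined": today - timedelta(days=30), "updated": False}] * 75
    + [{"joined": today - timedelta(days=400), "updated": False}] * 24
    + [{"joined": today - timedelta(days=400), "updated": True}]
)

# Naive rate over all users: dominated by new users who have had no
# reason to update a payment method yet.
naive_rate = sum(u["updated"] for u in users) / len(users)

# Rate over users with at least a year of tenure, matching the
# action the decision actually depends on.
tenured = [u for u in users if (today - u["joined"]).days > 365]
tenured_rate = sum(u["updated"] for u in tenured) / len(tenured)

print(f"{naive_rate:.0%}, {tenured_rate:.0%}")  # 1%, 4%
```

The denominator is the whole mistake: the same data supports opposite actions depending on which population the analyst, knowingly or not, chooses.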

Analysis After Actions — Second guessing is good!

Data analytics applied to previous actions could be called “second guessing”, and is a type of QA (quality assurance). Engineers aren’t perfect, and so products are tested by QA; data analytics plays a similar role for decision makers. It’s important to have the right team structures, incentives, and mindsets in place so that data analysts can perform this important role.

QA works closely with, but has some autonomy from, engineers in order to check their work. The same is true for analytics and decision makers. Unlike when testing hardware or software where the right behavior is mostly objective, decision making is more subjective. Further, decision makers are the most powerful people at a company. This subjectivity and power means analysts need a balance between collaboration with and autonomy from decision makers to avoid bias and mistreatment if their results go against decisions. One way to achieve a balance is to not have analysts report directly to the person whose decisions they will be influencing. Instead, they can be on a cross-functional team with the decision maker to encourage collaboration while reporting to an analytics manager.

In the best case, an effective analytics team and decision makers make each other better by being collegially adversarial, like QA and engineers. Other functions would look to analytics to help them make choices, do early experiments, and guide high-stakes decisions. However, inevitably, an effective data analytics team will occasionally upset decision makers whose job it is to make good decisions.

Further, an analytics team should also independently prioritize analyzing previous actions by their business net value. In short, they should validate assumptions and second guess previous actions as more data becomes available, conditions change, etc. It’s human nature that decision makers may not want this to happen, and so autonomy is needed. As Herbert Simon observed, decisions are necessarily made with incomplete information and heuristics. When more data and analytics can be brought to a previous decision, a better alternative may be exposed.

Example

Returning to the previous example, say that a decision was made to support several payment methods so that every user can pay with as few obstacles as possible. “We will never lose a willing buyer’s sale! We live and die by conversion rates”, was exclaimed. A year later, one of the payment services seems to be disproportionately problematic.

The Action Analytics data analytics team hears about the problems, and decides to investigate if it is worth shutting down the troublesome payment service. Note this requires two things: first for the team to be aware of the problem, and second the autonomy to independently prioritize to investigate it when the original decision maker is not interested in revisiting it.

The result shows that a small percentage of users use the payment service, and it’s hypothesized that nearly every new user could use another payment option so no sales would be lost. An A/B test hides the payment service, and the conversion rates are the same. Yay! It’s removed, saving the company substantial resources and headaches, and, ultimately, the original decision maker is pleased and respects the process.
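One way to check “the conversion rates are the same” is a two-proportion z-test on the A/B results. A hedged sketch, with fabricated sample sizes and conversion counts:

```python
import math

# Hedged sketch: two-proportion z-test for the A/B test above.
# Sample sizes and conversion counts are fabricated for illustration.

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """z statistic for the difference between two conversion rates,
    using the pooled standard error."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Control shows the payment service; the variant hides it.
z = two_proportion_z(conv_a=500, n_a=10_000, conv_b=498, n_b=10_000)
print(abs(z) < 1.96)  # no detectable difference at the 5% level
```

In practice a library routine (e.g., a proportions z-test from a stats package) would be used, along with a pre-registered sample size; this sketch only shows the shape of the check.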

Effective Data Analysts

Data analysts with an ability to understand the broader picture and reasons behind an analysis are the most effective because they are better able to use their results to change actions. In particular, when doing analysis after actions, an analyst needs to independently assess which assumptions and previous actions are likely to be shown to be sub-optimal under their fresh data-driven scrutiny.

A newish trend is to distinguish data scientists from decision scientists, where the former are seen as more technical and the latter as more action- and business-oriented, using the mathematical results from the data scientists. While the function of decision science is a positive step toward Action Analytics, this distinction between decision scientist and data scientist is counterproductive. Data scientists doing data analytics should be expected to understand the actions and decisions informed by their results. There is, of course, a range of skills among data scientists: some may be stats wizards while others are better at picking the right action to investigate. A high-functioning team should have a mix of both, without splitting the roles and responsibilities. There are a few reasons data analysts should have a mix of skills from both data and decision science.

  1. As data-driven thinking is now common, every decision maker should know a little decision science.
  2. With few exceptions, data scientists should be decent decision scientists as well. The process of analytics is highly iterative. One may set out to solve a broad question, but the path there is unknown and a lot of judgement calls are needed along the way. At such junctures, it’s ideal if the analyst has enough context and skill for most judgement calls (which can be reviewed and reversed later, in the uncommon cases that it’s needed).
  3. It’s rare that the stats wizard is uniquely able to have a large impact on ROI anyway. If an analysis is so complicated and nuanced that it takes such a mathematical genius to solve it, then it’s likely better than another action only by a razor thin margin. For all but the largest companies, such margins are not worth investigating and resources are better spent elsewhere. Even the largest companies, one would hope, would have higher ROI actions available when the ROI is only slightly positive following a very complicated analysis.

Small startups, in their aspiration to be data-driven, often hire an analyst among their first 5–10 employees. This is usually a mistake. At that size, the amount of data is small, simple, and heavily biased by early adopters, one big client, etc. No sophisticated analysis is needed or worthwhile; often no more than simple checks of big, obvious trends in averages are needed or even possible.

(Image credit: https://xkcd.com/2400/)

Further, there are many other justifiable strategic reasons for actions that go directly against the data, further lowering the net value of analysis. Most PMs and engineers are more than capable of the simple analyses worth doing, and an analyst would be underutilized.

Caveat: Actions & Decisions are not all data

There are times where there’s good reason to overrule what the data may say. For one, a healthy skepticism about problems with any analyses and biases is in order. As the quote goes, “There are three types of lies: lies, damned lies, and statistics”. Generalizing, there’s always some uncertainty, which should be estimated when it can be. As Nate Silver pointed out, “We abhor uncertainty, even when it is an irreducible part of the problem we are trying to solve”. It’s not always possible to eliminate uncertainty.

There are also practically immeasurable quantities like morality or long-term impacts. Facebook, by optimizing for metrics including time spent in app and profit, caused harm to their users’ wellbeing as well as creating (or at least hastening) backlash toward the targeted-ad business model they, and other tech giants, are built on (Congressional hearings on Russia’s interference with the 2016 election, #deletefacebook, Netflix’s “The Social Dilemma”, Atlantic article, Washington Monthly article). Sacrificing the metrics a bit in exchange for morality and long-term sustainability would have been the better action, but it was not taken. All of this is to say that human judgement should overrule analytics in some cases, especially when human elements or unobservable considerations are crucial.

When overruling the recommended action from an analysis, it should be done consciously with clear reasons that are fully explained to everyone involved. Ideally, those reasons will be revisited and checked later.

Summary

This article itself follows the format of Action Analytics — follow the results of an analysis through to the ultimate ends of recommended actions. Generally speaking, forks in the road that lead to “insights” without actions are possible seeds for future analyses that do lead to actions, but in themselves they are just interesting, “cool”, useless fool’s gold. Some Action Analytics best practices are below.

  1. Before starting an analysis, ask, “What if the result is this vs. that?” to map the possible results to different actions.
  2. Analytics should function as QA for decision makers and second guess previous actions and test assumptions.
  3. Analysts should be given autonomy from decision makers by not directly reporting to them. They need more autonomy than QA does from engineering.
  4. Data analysts need context, and so should work closely with others. One way is being on cross-functional teams.
  5. Effective analysts understand the bigger picture and how to have the most impact on a change in actions resulting from their work, even at the expense of some technical skill. When hiring, preference should be given to these candidates. Analysts should focus on improving these skills as well.
  6. Early stage startups should generally not hire an analyst.
  7. There are sometimes very good reasons to overrule data-driven results. In such cases, it’s important to clearly state them to everyone involved, and to later analyze if those reasons were valid to possibly take different actions.

There are more implications to an action-oriented mindset, and those presented here are only a start. In the same way a software organization may say they follow an agile process for development, my hope is many organizations and data analytics teams will adopt Action Analytics.
