3 Ways to Restore Your Trust in Data

Conor Fitzpatrick
New Markets Insights
5 min read · Sep 24, 2018

Mark Twain once said, “There are three kinds of lies: lies, damned lies, and statistics.” While statistical methods have certainly progressed since the late 19th century, Twain touches on a question that persists to this day: what truth can we extract from the outputs of our increasingly sophisticated statistical models? We suggest three approaches to getting clear results from complicated data.

Look for causality in context to identify sources of doubt. When weighing their options for an important strategic decision, managers often look to their competitors and addressable customers for guidance on what to do. They look at past data to try to figure out what worked and what didn’t.

If we step back and reexamine the context and circumstances that surround our subjects’ decisions, we can uncover paths to a solution. In strategic analysis, we assume that every strategic decision has a positive or negative effect on a company’s financial success. Because we tend to read success as evidence of good strategy, this can lead us to conclude that any strategy a financially successful company like Google deploys is inherently a good one.

But sometimes when companies are successful and have been growing for some time, they can afford risky and perhaps unprofitable investments that they couldn’t make if they were cash-strapped. That is, Google is probably not profiting very much from its seemingly inexplicable investment in the meal-replacement venture Soylent. This unorthodox move is much more likely to be a result of Google’s success than a driver of that success. Even if you observe a strong relationship between a company’s financial success and a particular strategy, do not base your decisions on it without further testing. Misreading the context and circumstances that drive your subjects’ decisions is one of the most serious pitfalls in quantitative analysis of strategy and financial performance. More than most other factors, context should inform the methods of analysis that researchers choose.

The same is true of customer analysis for product development. Radical product ideas — like Colgate’s flimsy, single-use “Wisp” toothbrush — are often laughed out of conference rooms on the basis that product-improvement customer survey data don’t support the idea. But customers’ stated preferences only reach as far as the market of familiar solutions they already know (i.e., the existing market shapes the preferences customers can express). The Wisp was like nothing else in the market, and it met a customer need that had gone unaddressed. That’s why Colgate’s single-use toothbrush was far more of a hit than the customer survey data predicted — it is impossible to predict the success of something that does not yet exist.

These two examples demonstrate problems with determining causal relationships — problems that erode the reliability of analysis for strategic decisions. Fortunately, we have models that can resolve them.

Find a model that preempts and fixes misattributions of causality in the data. It’s often easier to find problematic assumptions than it is to find methods that produce accurate estimates, but fortunately there are a few statistical tools to help. The gold standard of simple and sound statistical analysis — the randomized controlled trial that compares treated subjects to control subjects — is used in clinical trials to test new drugs’ efficacy. But how can we replicate this method when analyzing and interpreting different firms’ strategic data?
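Before turning to that question, it helps to spell out the baseline logic we are trying to imitate. The toy Python simulation below (every number is made up for illustration) shows why random assignment lets a plain difference in means stand in for a causal effect.

```python
# Toy simulation of a randomized controlled trial. All figures are
# hypothetical; the point is only to show the logic of random assignment.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=42)

n_subjects = 200
# Randomly assign half of the subjects to the "treatment" group.
assignment = rng.permutation(n_subjects) < n_subjects // 2

# Hypothetical outcome: a noisy baseline plus a true effect of +1.5
# for treated subjects only.
true_effect = 1.5
outcomes = rng.normal(loc=10.0, scale=3.0, size=n_subjects) + true_effect * assignment

treated = outcomes[assignment]
control = outcomes[~assignment]

# Because assignment was random, the two groups differ only by chance and by
# the treatment itself, so the difference in means estimates the causal effect.
estimated_effect = treated.mean() - control.mean()
t_stat, p_value = stats.ttest_ind(treated, control)
print(f"Estimated effect: {estimated_effect:.2f} (p = {p_value:.3f})")
```

The hard part in strategy research is that we cannot randomly assign companies to strategies, which is exactly the gap the next method tries to fill.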

There are a few ways to recreate a control subject — e.g., a version of Google that, for reasons unrelated to its finances, never invested in Soylent — in your analysis, so that you can discover how much value a strategy has really added to the companies you are researching. One such method is the “synthetic control method,” which builds a weighted combination of the companies most similar to the one you want to analyze, to approximate how it would have performed without the strategy. This method has been deployed in a wide range of analyses, from social science research finding that late-’80s tobacco excise tax hikes reduced California cigarette sales (and emphatically not that California’s distaste for tobacco caused the tax hikes!), to the strategic analysis in Chris Zook’s Beyond the Core. The method uses linear regression techniques and can be deployed with a few lines of code and a downloadable module in statistical programs such as R and Stata.

Synthetic control group analysis of the effect of California’s anti-cigarette Proposition 99 on California’s cigarette sales, from Abadie et al. (2010). Note that “synthetic California’s” cigarette sales decreased after Prop 99, but by less than actual California’s did. While California’s cigarette-reduction policy was clearly successful here, a comparison of Google against its “control Google” before and after the Soylent strategy would, hypothetically, have shown no effect.
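If you would rather see the mechanics than install a package, here is a minimal sketch of the same idea in Python, using entirely made-up numbers. It searches for non-negative donor weights that sum to one and best reproduce the treated company’s pre-strategy trajectory, then reads the post-strategy gap as the estimated effect.

```python
# Minimal synthetic-control sketch. Rows are time periods, columns are "donor"
# companies similar to the treated one. All figures are hypothetical.
import numpy as np
from scipy.optimize import minimize

donor_outcomes = np.array([
    [100.0, 120.0, 95.0],   # pre-strategy period 1
    [102.0, 118.0, 97.0],   # pre-strategy period 2
    [105.0, 121.0, 99.0],   # pre-strategy period 3
    [107.0, 119.0, 104.0],  # post-strategy period 1
    [110.0, 122.0, 108.0],  # post-strategy period 2
])
treated_outcomes = np.array([104.0, 106.0, 109.0, 108.0, 107.0])
n_pre = 3  # periods before the strategy was adopted

def pre_period_gap(weights):
    """Mean squared gap between the treated company and the weighted donors, pre-strategy."""
    synthetic = donor_outcomes[:n_pre] @ weights
    return np.mean((treated_outcomes[:n_pre] - synthetic) ** 2)

n_donors = donor_outcomes.shape[1]
result = minimize(
    pre_period_gap,
    x0=np.full(n_donors, 1.0 / n_donors),
    bounds=[(0.0, 1.0)] * n_donors,                                # weights stay non-negative
    constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}],  # and sum to one
)
weights = result.x

# The synthetic control is what the treated company might have looked like
# without the strategy; the post-strategy gap is the estimated effect.
synthetic = donor_outcomes @ weights
effect = treated_outcomes[n_pre:] - synthetic[n_pre:]
print("Donor weights:", np.round(weights, 2))
print("Estimated post-strategy effect per period:", np.round(effect, 2))
```

Published implementations typically add refinements such as matching on multiple predictors and placebo tests, but the counterfactual logic is the same: compare the company that made the bet to a blend of comparable companies that did not.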

If your company is looking for ways to back up a bet on an offering that doesn’t exist in the marketplace, as Colgate was trying to do with the Wisp, the Jobs to be Done method of customer analysis is the right approach. When a company asks customers directly how a product should be improved, respondents will usually limit their answers to preexisting measures of success and ignore totally new or different possibilities. But when you use a Jobs-focused approach and ask customers questions like “When is the last time you thought, ‘I wish I had a toothbrush’?” you can develop a new understanding of their success criteria. When the customer says “after lunch at work, to get kale out of my teeth,” a single-use disposable toothbrush that doesn’t even require a trip to the bathroom starts to sound like a very good idea!

When doubt overtakes your analytical process, identify the source of that doubt and find or build a model, like the ones above, that resolves it.

Understand that analysis is never perfect, and improve it with trial, error, and mixed methods. Data analysis will never hand you an exact answer to a strategic question, let alone guarantee that the answer is correct. Once you’ve found methods that work, make sure that you’re neither over- nor under-confident in applying your findings to your strategic decisions.

To avoid over- and under-confidence, every analysis you carry out should include the following:

· A list of assumptions on which your analysis is contingent

· A list of caveats explaining the problems that remain in the analysis

· And, most importantly, a list of problems that your data analysis has solved

This will keep your strategic decisions rooted in the data. Mark Twain, ever the skeptic of statistics, would turn in his grave.

Conor is a Senior Associate at New Markets Advisors.
