You know what worries me? When people tell me they ‘believe’ something to be true. I’m even more worried when people tell me they ‘know’ something to be true. The problem with believing and knowing something is that the person has stopped looking for contrary or supporting evidence to challenge or validate what they know.
They know it, so why question it?
I am much more interested when someone tells me that they ‘reckon’ something. They don’t know for sure, but they have a hunch, or an intuition, or an educated guess. They reckon customers want an easy way to manage household bills in one place. They figure that £5 is the right price point for this app. They guess that stakeholders are likely to fund another round based on certain success measures.
These assumptions are strong opinions that are weakly held. That is, the person with the opinion has enough confidence to act on it, but is also well aware that their view is merely an estimate, highly contingent on new information. If every customer says the household bill management idea is worthless, they rapidly shift their view. If sales are glacial at £5, they recognize they were wrong.
The ability to act effectively on a ‘reckoning’ (what I refer to as a ‘hypothesis’) is something that I have observed to be strongly correlated with success in uncertain environments, where little is actually known. Which of course aptly describes our Incubation phase at BCG Digital Ventures.
During an Incubation phase, we are trying to bring an idea to life and a product to market while knowing next to nothing about how the product will be received by customers. We haven’t a clue about the unit economics of what will make the venture successful, about the stakeholder reaction to the thing we’re building, or about the features it must have and value it must offer in order to achieve product/market fit.
Under those conditions, we have to be able to operate under assumptions until presented with better evidence, at which point we reassess and make lightning-fast course corrections to operate under the new assumptions. We have to reckon what our customers will think, we have to guesstimate our unit price, we have to figure what features are crucial. Our starting point is to identify these starting hypotheses and then to test them and start learning.
Starting hypotheses are great, but that’s all they are — a start. In order to gain the information that we need to do effective development, we need a system. The process of acquiring the needed information is the foundation of Hypothesis-Driven Development.
That process has four steps:
- Observations
- Hypotheses
- Tests
- Success Measures
Hypothesis-Driven Development starts with an Innovation or Validation Sprint. During this sprint, the whole team is generating observations. Our test customers love the household bill management idea! Wait, they love it if it's free but they hate it if it costs £5! Wait, they only like it if it supports all of their various bills!
From those observations, we generate new hypotheses. We started out with “Customers will love a household bill management app.” Now we throw that out and replace it with “Customers will love a free household bill management app.” We develop tests to probe the hypothesis. Are we on the right track? What are the actual parameters? Is it actually about the cost? Or are we framing the price in an unpalatable way? We then establish success measures which guide us in deciding whether the hypothesis was valid.
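To make the loop concrete, here is a minimal sketch of a hypothesis record with an explicit success measure. Every name, metric, and threshold here is my own illustration, not part of any BCGDV tooling; the point is only that a hypothesis becomes testable once it carries a measurable pass/fail criterion.

```python
from dataclasses import dataclass, field

@dataclass
class Hypothesis:
    """One testable 'reckoning', paired with a measurable success criterion."""
    statement: str       # e.g. "Customers will pay £5 for bill management"
    metric: str          # what we measure during the test
    threshold: float     # success measure: minimum average to call it valid
    observations: list = field(default_factory=list)  # raw test results

    def record(self, value: float) -> None:
        """Log one observation from a test."""
        self.observations.append(value)

    def validated(self) -> bool:
        """Valid only once we have evidence and it clears the threshold."""
        return bool(self.observations) and (
            sum(self.observations) / len(self.observations) >= self.threshold
        )

# Usage: probing the (hypothetical) £5 price point via trial-to-paid conversion.
h = Hypothesis(
    statement="Customers will pay £5 for household bill management",
    metric="trial-to-paid conversion",
    threshold=0.10,
)
h.record(0.02)        # glacial sales: evidence against the hypothesis
print(h.validated())  # False: throw it out and form a new hypothesis
```

The design choice worth noting is that the success measure is fixed before the test runs, so "were we right?" is a mechanical check rather than a debate after the fact.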
Every Innovation sprint yields a huge pile of hypotheses, assumptions, and unknown quantities — “HAUQs” for short. These HAUQs can be sized, analyzed for cross-dependencies, and prioritized for the team to work on.
Devising tests and success measures for these HAUQs is an investment of team resources. We measure the investment of those resources in terms of discrete units of value, or “DUVs.” The conversion of HAUQs into DUVs makes Hypothesis-Driven Development a trackable and measurable system to communicate the amount of uncertainty left in a venture, and shows at a glance how ‘risky’ launching a given venture is likely to be.
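One minimal way to picture that tracking (the field names and scoring below are my own illustration, not a BCGDV standard) is a backlog in which each HAUQ carries a relative size and a resolved flag, so the uncertainty remaining in a venture can be read off at a glance:

```python
from dataclasses import dataclass

@dataclass
class HAUQ:
    """A hypothesis, assumption, or unknown quantity from an Innovation sprint."""
    description: str
    size: int            # relative size/uncertainty, e.g. 1-8
    resolved: bool = False

def remaining_uncertainty(backlog: list[HAUQ]) -> float:
    """Fraction of total sized uncertainty still unresolved: a rough risk gauge."""
    total = sum(h.size for h in backlog)
    unresolved = sum(h.size for h in backlog if not h.resolved)
    return unresolved / total if total else 0.0

backlog = [
    HAUQ("Customers want all bills in one place", 4, resolved=True),
    HAUQ("£5/month is an acceptable price", 3),
    HAUQ("Stakeholders will fund another round", 8),
]
print(f"{remaining_uncertainty(backlog):.0%} of uncertainty remains")  # 73%
```

A single number like this is obviously a simplification, but it is the kind of at-a-glance signal the HAUQ-to-DUV conversion enables: a launch decision can reference how much of the venture's sized uncertainty has actually been retired.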
We aren’t measuring how fast the team is going, or how many features they’re defining, or how many commits they are making, or how many P&L forecasts they’re creating. What we’re measuring is how quickly and how much the team is learning about their product, their market, their solution, their competitors, and the venture itself.
Thinking this way empowers us to move the focus away from features or functionality. We can move the conversation past ‘velocity’ and towards what really matters: Launching a business to market with its key challenges addressed and a good understanding of its place in the world. If we operate in this way long enough, imagine the dataset we will create about how ventures are built, tested, and launched. We may even be able to start to predict outcomes based on patterns we have seen before.
I shared this approach to venture development with a VP at an organization with 10,000 engineers. She said “I love this idea; it’s the way everybody thinks but nobody actually works.”
Intuitively, this is a powerful and effective approach. It is no easy task to operationalize it in our everyday work. But it is worth it; implementing Hypothesis-Driven Development in our ventures at all stages helps teams explore their hypotheses, and focuses their attention on how much we’re learning, not on how fast we’re delivering features.
I don’t expect any of this to be new to you. You probably know all of this already, but perhaps haven’t systematically implemented it into your ventures. I would love for us to make it happen.
What do you reckon?