Why traditional idea evaluation fails

Jacob de Lichtenberg
Product Leadership & Practice
4 min read · Jan 18, 2017


This article is part of a series about experimenting to evaluate ideas. Here I focus on a real-life example showing how dangerous a lack of experimenting can be when you are trying to figure out whether something is a good idea. What follows is by far the most traditional way of evaluating ideas.

The risky validation path without experiments

While working in an innovation unit at a production and delivery company, we were developing ideas for new business areas. One specific idea was to create a new product using the existing production facilities. Let's call the idea PUK. Briefly, PUK was a delivery concept for both eCommerce stores and consumers, where consumers could receive groceries and parcels in a single delivery, so the idea was meant to benefit the consumer in two ways.

PUK was initially evaluated internally by different committees in the company, which is standard procedure in many companies. We could have run a kill experiment, but instead we did something blasphemous in the eyes of the experimental approach: we asked random people! And even worse, we hired a survey company to ask people if they would use PUK, so we didn't learn anything from users ourselves. According to TNS-Gallup, we were in luck, and PUK was among the top concepts TNS-Gallup had ever tested.

Asking people is a very unreliable source of information, but we took the result as confirmation that we should build the product. On top of that, the survey was very expensive: it cost $40,000 to run.

The story could stop here, but I want to emphasize the problematic aspects of this approach.

A business case for project PUK. Every tiny number in the sheets is an assumption for the business case (shown small because they contain confidential information).

Building a business case

In this risky approach, another important step in deciding whether something is a good idea is building a business case. This is something we also did in the very early stages of the project.

The biggest problem with building a business case this early is that you don’t know very much. You don’t know how much you will sell, how much it will cost to develop, and how much it will cost to run. This means you have to fill knowledge gaps with assumptions.
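To make the danger of stacked assumptions concrete, here is a tiny illustrative sketch. All of the numbers and parameter names are hypothetical and have nothing to do with the actual PUK figures: when each of a handful of guessed inputs shifts by a plausible margin, the same business-case formula swings from a healthy profit to a clear loss.

```python
# Hypothetical illustration of a business case built from guessed inputs.
# None of these numbers come from the PUK project.

def projected_profit(customers, revenue_per_customer, cost_per_delivery,
                     deliveries_per_customer, fixed_costs):
    """Profit = revenue minus variable delivery costs minus fixed costs."""
    revenue = customers * revenue_per_customer
    variable_costs = customers * deliveries_per_customer * cost_per_delivery
    return revenue - variable_costs - fixed_costs

# Optimistic guesses: the idea looks like a healthy business.
optimistic = projected_profit(
    customers=50_000, revenue_per_customer=120.0,
    cost_per_delivery=4.0, deliveries_per_customer=20,
    fixed_costs=1_000_000)

# Each input off by a plausible margin: the "good idea" becomes a loss.
pessimistic = projected_profit(
    customers=25_000, revenue_per_customer=90.0,
    cost_per_delivery=6.0, deliveries_per_customer=20,
    fixed_costs=1_500_000)

print(optimistic)   # 1,000,000 in profit
print(pessimistic)  # 2,250,000 in losses
```

The formula never changed; only the guesses did. That is why a business case built before you have behavioral data mostly measures your own optimism.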

If you look at the business-case sheets above, you'll see the huge number of assumptions we made for PUK (the highlighted areas). Some assumptions came from computer simulations, others from our best knowledge; the growth numbers in particular were based on desktop research. In short, we were making a lot of guesses.

The small numbers above the graphs show that we asked for six million dollars for this idea, so it was important for the company (and for us) that the idea succeed.

Making sure the idea was good

So how did we verify that the idea was good? Did we rely entirely on the TNS-Gallup survey? Yes, kind of…! After we had received the approval and the funding (based on a survey, committee decisions, and what most people in the company thought), we never ran a kill experiment. We simply relied on the survey data.

We did, however, go out and talk to eCommerce stores, but in the same non-committal way: we never asked them to buy (asking for a commitment is a principle of the experimental approach).

The conclusion

You probably guessed it. The concept does not exist today. We could have been lucky and hit the jackpot, but it would have been based on guesses since we had no behavioral data.

Part of the project was building a new route optimization tool that could work across products, so our efforts were not entirely wasted. But the concept itself never succeeded.

In hindsight, we should have run a kill experiment, without asking anyone. Running kill experiments is important for features, but for entirely new concepts it is imperative.

Read more about experiments in our Experiments publication.
