Tackling Assumption Monsters by Embracing Ambiguity and Curiosity

This article is based on a 50-minute talk. The purpose of the talk was to explore the problems Design Thinking can solve, but also to reflect on why a process that has been around for 30-odd years, and that seems like common sense, isn’t that common.

“How do we know we’re creating the right thing?”

This is something I’ve asked myself throughout my design career. I don’t know. I’ve always been curious. I’ve been driven to find out.

Rules, processes, methods… they are all tools. They are all solutions. Before looking at solutions, let’s look at the problems. There are problems when you try to create the right thing in any size or type of company.


Corporates

Bigger companies are slow. But being ‘slow’ isn’t the problem. It is an outcome of numerous real problems. Things like constant juggling, hidden queues of work, lack of strategy and vision, endless debate, structure, lack of delegation… it goes on and on.

Projects move slowly. Big problems go unsolved.

Behind the active work, a backlog of brilliant ideas develops. Ideas are, of course, always brilliant until they meet reality.

So the teams, especially the leadership, are constantly frustrated because they’re so slow to execute. They focus on the slowness problem. Restructuring, setting up innovation labs, moving to agile development, hiring new leaders that can “get shit done”, and trying to create a culture of “execute fast!”.

They’re focused on the wrong thing. They’re not tackling their real problems that cause the slowness. And their backlog is often full of untested, unvalidated ideas.

Tech Companies

Too fast… isn’t quite the right way to put it. But the obsession with building things quickly can lead to a confused approach, a confused idea of what success looks like, or a failure to measure the right things.

Me: “How successful was feature A?”
Product Team: “Customers aren’t complaining about it.”

Introducing the Fail/Meh Rate

As much as 50–75% of development work in a tech company is spent building things that fail to deliver the expected outcome, or that turn out mediocre. Meh.

The more I share this, I’m surprised more people don’t say “No, that’s wrong! You’re fake news! Our team is way better.” Instead, I see nodding or I’m told:

“That’s fine, they’re experimenting in-market and learning from it. It’s helping them establish the right thing to build.”

I agree with the principle. But I refuse to accept that inefficiency.

Early stage tech startups

Driven to build things quickly, they share the same issues as a more established tech company. However, they’re being pushed to build things quickly by mentors. Mentors who are often potential investors.

They’re often looking for any teams that can begin to show market validation and product-market fit. That they may be onto something. Then they’ll invest.

What’s the Fail/Meh Rate for startups?

But they use a validation process right? … ಠ_ಠ

Entrepreneurs walk away from their idea having bootstrapped, burnt their savings, and thrown a codebase in the bin. We should be trying to reduce the fail/meh rate here, because entrepreneurs deserve better.

All three groups need to spend more time figuring out whether they’re creating the right thing.

“How do we know we’re creating the right thing?”

We can reframe this as…

“How do we improve our odds of building the right thing sooner?”

The Ambiguity Reality

Product teams have their backlog of challenges, whether it’s new products/services, features, or any other problem we’re tasked to solve. Let’s look at what a good project/work-process looks like…

The squiggliness represents ambiguity and uncertainty.

This is a good project because the ambiguity is steadily reduced and clarity is established.

People hate the feeling of uncertainty and ambiguity. If not dealt with, it leads to confusion, which is a bad thing.

Ambiguity, however, is not a bad thing. It is simply the fact that there are multiple interpretations and potential outcomes. Closely associated with it is uncertainty, which is not a bad thing either, especially right at the start, when there are so many unknowns.

The way we often deal with ambiguity and uncertainty is to leap to comfortable clarity. Leap to a solution. But…

  • Did we truly understand the problem we’re trying to solve?
  • Did we understand the users/humans involved?
  • Are we solving the right problem?
  • Is our solution actually going to solve the problem?

When we don’t accept that a project has an ambiguity phase and we leap to a solution, we make assumptions and increase the risk that we’re building the wrong thing.

We can pretend we haven’t made assumptions, but these little monsters linger on…

People in the team start to question different assumptions at different times. They see things that make them question the solution. The assumptions create feelings of doubt, ongoing questions and concerns about the project, and they fuel endless debate.

Leaders hate to see uncertainty in their teams. So they will often wade in…

“Okay enough discussion! We’re going to do X.”
Sometimes referred to as HIPPO, when the Highest Paid Person’s Opinion defines the way forward.

The team now have no feelings of uncertainty. There is no ambiguity. Glorious clarity has been supplied. The assumptions are now all the leader’s, and any risk is now on them. The team can move on comfortably.

We know this Opinion/Assumption Based Approach is bad and of course what we want is an Evidence Based Approach. And there’s a popular one…

The process behind Lean Startup is to take your idea, build it, launch it, get it live, in-market, and learn from how customers, users, humans, react to it.

The faster you can get your reps/cycles done, the better.

Most teams I’ve met that say they apply Lean Startup, though, have distorted its key principles. Most aren’t measuring things properly, and therefore aren’t learning.

For many teams, this often turns into every feature being built as an “MVP (Minimum Viable Product) version” and launched, then the team moves on to the next feature.

So many features fail (or are meh), and remain in the product, that it simply becomes baggage. This isn’t technical debt. Technical debt is something you generate by building something useful quickly.

Me: “Can you tell me about this right column, with all these features?”
PM: “No, don’t worry about that, no one uses that.”
Me: “Well… why is it still there?”
PM: “It’ll take a sprint to remove it, and it isn’t a high priority.”

Then there are those that believe the main principle of Lean Startup is to “Fail Fast” and at its worst “Build Fast”. The point has always been to do the whole process fast, specifically to learn… fast.

How do we learn? How do we build evidence to reduce ambiguity and uncertainty, and reduce the guesswork? It’s all about getting the two sides of this little twosome, the product and its users, together.

The best way to learn is to witness these two meeting. This is where it all becomes real. This is where the magic happens.

Agile Development and Ambiguity

Then there’s agile development, which is used to cope with the ramifications of not dealing with ambiguity.

The product backlog is typically full of ideas coated in assumptions.

Agile Development is great. But ‘garbage in, garbage out’ right?

As products become real, teams and product managers react to the uncertainty; it fuels questions and debates. Changes. But being agile means welcoming them. To a point.

But sometimes without a decent feedback loop, people don’t notice the issues. Those products/features are released, and they turn out to be…

But because teams aren’t measuring the right things, it simply lands and the team moves on. If people do realise it failed, rather than exploring the problem, they again leap to plausible sounding solutions and start racing out changes and perceived improvements. Let’s polish this MVP.

Shiny! Watch me dazzle like a diamond in the rough, strut my stuff; my stuff is so shiny!

“How do we know we’re creating the right thing?”

If we know that there is an ambiguity phase that we often ignore. If we know assumptions are the problem. That we are taking on a lot of risk, the risk of building the wrong thing. And that the best way forward is an evidence-based approach. What’s the problem we need to solve…

How might we identify and test assumptions, discover and assemble evidence to reduce ambiguity and risk?


How can we learn the right things… sooner?

Well, there’s a way…

Design Thinking and Human-Centred Design

A lot has been written about Design Thinking. It’s been around a long time. The key thing to understand from its history is that it came about in an effort to separate the design sciences from engineering.

Design thinking is a set of principles for solving problems (reducing ambiguity/figuring out what to build).

Human-Centred Design (HCD) is a framework, based on Design Thinking.

The HCD process is…

1. Empathise & Understand (Unpack the problem)

2. Collaborative design (Create and consider numerous ideas)

3. Refine & Prototype (Reduce the ideas down and prototype one or more)

4. Test with real users (Have the humans interact with the prototype)

5. Iterate, test, repeat (Take what you learn and iterate the prototype and test it again, and again)

6. Deliver (Through testing, it becomes self-evident that you’ve solved any major issues; it’s time to build it for real)
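Steps 5 and 6, iterate until you’re learning less, then deliver, can be sketched as a simple loop. This is only a toy illustration; the round data, findings, and the `plateau` threshold are all hypothetical:

```python
# Toy sketch of the HCD iterate/test loop: keep running prototype-test
# rounds until a round yields little that's new, then deliver.
# The round data, findings, and `plateau` threshold are all hypothetical.

def hcd_loop(test_round, max_rounds=10, plateau=1):
    """Run test rounds until a round yields `plateau` or fewer new findings."""
    findings = []
    rounds_run = 0
    for round_no in range(1, max_rounds + 1):
        rounds_run = round_no
        new = test_round(round_no)  # e.g. usability issues observed this round
        findings.extend(new)
        if len(new) <= plateau:     # learning has tailed off: time to deliver
            break
    return findings, rounds_run

# Hypothetical findings observed in successive test rounds.
observed = {1: ["nav unclear", "copy jargon"], 2: ["icon ambiguous"]}
all_findings, rounds = hcd_loop(lambda r: observed.get(r, []))
print(f"Stopped after {rounds} rounds with {len(all_findings)} findings")
```

The point of the sketch is the stopping condition: you don’t iterate forever, you iterate until each round teaches you less, and then you build for real.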

This is a step by step process for dealing with ambiguity.

This is an evidence-based approach. It’s not new. You can get degrees in it. Hell, there’s even an ISO standard for it.

But let’s look at those steps again, why should you care?

1. Empathise & Understand

Without spending time exploring the problem you risk missing the mark. Without understanding the people you are creating products/services for, you risk creating something they don’t like/want.

2. Collaborative Design

Design shouldn’t be happening behind closed doors. It has to be opened up to a cross-functional team. Rather than designs being presented to the wider team, with their role set up to be “have an opinion”, it tasks them with getting involved, with solving the problem themselves. By getting into the detail, it elevates the conversation. The debate becomes productive.

3. Refine and Prototype

Prototyping simply makes the product real as soon as possible. Why wait?

4. Test with real users

Gather the evidence you need. Test the assumptions behind the prototype. Learn about both the problem and the solution.

5. Keep iterating. Once you’re learning less, it’s time for 6. Deliver/Execute

It’s simply shortcutting the process…

…and learning sooner, what we would otherwise learn later.

These two are going to meet at some point. Why not make it happen this week? Why not today?

…and what you may discover is, there is no point continuing.

Some of the proudest moments of my career were when I helped generate the evidence required to prove an idea was flawed. They were creating the wrong thing. They didn’t understand the problem.

A project died.

Not a single line of code was written.

Reality Check

So if this process is so fantastic, and it’s been around so long, why isn’t it more common?

Well. It is. In some parts of the world, in some professional communities.

But I am always curious about why it isn’t used more. I’ve done talks, I’ve mentored people and teams, I know people that loved the sound of it, so why didn’t they apply it?

Well, I went and asked some of them, and here’s what I heard.

1. What do users know?

Quite a few people respond with “What do users know?” and it’s usually accompanied by the Henry Ford quote…

“If I had asked people what they wanted, they would have said faster horses.”

This is a misunderstanding of the process. It’s not about asking people for opinions. The research phase is about understanding the wider context, how this problem, this job-to-be-done fits into their world. Not their opinions.

In the testing phase, it is simply putting a product/service in front of someone and letting them use it. And observing. How does it work for them?

2. Action and Progress

Teams and leaders have a predilection for action. They favour perceived progress.

Research does not feel like action/progress.

It #$%*en should. But it doesn’t. And for those that say “We know what to build, let’s just build it” even prototyping seems like a waste of time.

They’re trying to answer the question “How can we build things faster?” not “How do we know we’re building the right thing?”

3. The ‘Problem’ Problem

We have a problem. Teams leap to solutions and build the wrong thing.

And we have a solution. We can use a process that allows us to learn, quickly, challenge assumptions, and help improve our odds of building the right thing. Quickly too.

Relevant anecdote…

I worked with an entrepreneur who had an interesting problem he wanted to solve. I helped him and his team gather evidence that it was a significant problem and that their solution would solve it.
However, when customers with the problem were presented with the solution, they didn’t want it.
Through research interviews, it became clear none of them believed they had the problem. They definitely did. But they couldn’t see it.

People have to feel the very real pain of a problem to seek out a solution.

Corporates are focused on the wrong thing and simply assume their backlog is full of brilliant ideas waiting for their chance to shine. Special venture projects (corporate labs) will continue to fail, as cranking out bad ideas faster is never going to be the answer.

Tech companies don’t treat a fail/meh rate of 50–75% as a problem; they often blame their product teams for not having better ideas or for failing to manage changes correctly. Or they simply shrug ¯\_(ツ)_/¯ “we’re experimenting!” … “We’re lean, agile, and this is part of our process”.

And of the entrepreneurs that bootstrap and build, 90–99% are walking away after burning their savings and ¯\_(ツ)_/¯ “I guess it wasn’t meant to be”.

So for a better process to be adopted, you have to spend a lot of time with clients, leaders, teams, and entrepreneurs, helping them realise there is a problem with what has turned into a “build fast” mindset, and that there is a better way of working.

Trying to find a better way of working with these groups, I started learning about and experimenting with…

Design Sprints

Design Sprints are the most digestible form of Design Thinking that I’ve found. The process is essentially an HCD approach compressed into 5 days.

It has no specific up-front research component, which means it is criticised by HCD/UX practitioners. However, it takes a significant portion of HCD and bundles it up in a format that teams tolerate… somewhat.

It also bundles in a bunch of productivity enhancers so it feels like action. It feels like progress.

And this happens in a maximum of 5 days. Once you realise how fast you can do it, you can move faster.

Reality Check… again

Even this solution isn’t being adopted widely. I’ve written up what I learned from running sprints inside corporates, including the challenges and the mistakes I made.

I’ve since continued to run sprints when I’ve been able, I’ve run a dozen sprint training workshops, and mentored a few sprint facilitators.

There are 3 problems I’ve seen…

1. Too fast or too slow?

For corporates, the process always felt too fast, and too soon. Ultimately I realised, you can only run the process after training and coaching. It’s about changing people’s mindsets first.

For tech companies the idea of stopping and putting a valuable team in a room for 5 whole days seems way too slow—especially for the “we know what to build, let’s just build it” crowd.

It takes an experienced sprint facilitator to customise the process, compress it, or expand it. Currently it’s presented as a 5-day process, however, it’s evolving as people use it more and more so it’s becoming flexible enough to be applied anywhere.

2. It’s still not digestible for some

There was a product manager in one of my sprint training sessions who realised the tech company he was working for was not dealing with ambiguity. Assumptions were plaguing the product through development. So he went back and organised their first design sprint. The team were excited to try the new process as they felt the real pain of the problem. I then got this message from him…

Our Design Sprint got cancelled by the CTO today prior to kick off next week. Probably similar stuff you’ve heard before, “we already know what we need to build, product management just need to write it down”, myself and two others tried to use examples of previous projects that have gone off the rails after weeks of development (because they went down the wrong path) but the examples were thrown back in our face for not managing those projects properly.

Again, you can only introduce this process after changing people’s mindsets.

Another local tech company, when they got into strife, essentially fired the majority of their product team.

Remember, “Agile doesn’t have a brain”; it’s the process applied by the product team that gives it direction. Teams may not understand what the problem is, but they do know which team to blame.

3. Wait… you use this, really!?

The other thing that comes up is when people challenge me “Do you really use this at Neighbourly?”. Neighbourly is a very small, lean team. I work there three days a week and two days on other products.

No, I don’t apply a robust HCD process.

No, I’ve never run a full 5-day design sprint.

But I do accept there is an ambiguity phase. I run a HCD Lite approach on some challenges, but the main process I apply is RAT — Riskiest Assumption Testing.

Is it bigger, or is it just closer?

It’s the basis of Lean UX and it’s something being discussed more and more. In the short time I have on each product, I simply focus on the biggest assumptions. If we get _________ wrong, how bad will the consequences be? The small stuff still flows straight into the product.
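That “if we get ___ wrong, how bad will the consequences be?” question can be sketched as a simple prioritisation: score each assumption on how likely it is to be wrong and how bad it would be if it were, then test the riskiest first. The assumptions and 1–5 scores below are entirely made up for illustration:

```python
# Hypothetical sketch of Riskiest Assumption Testing (RAT) prioritisation.
# Each assumption is scored on two axes: how likely it is to be wrong,
# and how bad the consequences are if it is. Test the riskiest first.

from dataclasses import dataclass

@dataclass
class Assumption:
    statement: str
    likelihood_wrong: int  # 1 (unlikely) .. 5 (very likely)
    impact_if_wrong: int   # 1 (minor) .. 5 (project-killing)

    @property
    def risk(self) -> int:
        return self.likelihood_wrong * self.impact_if_wrong

def riskiest_first(assumptions):
    """Order assumptions so the riskiest one is tested first."""
    return sorted(assumptions, key=lambda a: a.risk, reverse=True)

# Made-up backlog of assumptions for illustration only.
backlog = [
    Assumption("Users will pay a monthly subscription", 4, 5),
    Assumption("The onboarding copy is clear enough", 2, 2),
    Assumption("Neighbours want to message each other directly", 3, 4),
]

for a in riskiest_first(backlog):
    print(a.risk, a.statement)
```

The scoring is deliberately crude; the value is in forcing the team to name its assumptions and agree on which one, if wrong, kills the product.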

That’s my current professional challenge: How can I deliver value as fast as possible?

I’m learning.

Embracing Ambiguity

How might we identify and test assumptions, discover and assemble evidence, to reduce ambiguity and risk?

The solution is that everyone needs to stop being uncomfortable with ambiguity. Embrace it. Accept that it is a phase in every project. We have a process of working through it. Use it.

You have to apply the principles of HCD.

You have to identify and test assumptions with real users.

You have to get out of the office and gather evidence that propels you toward building the right thing sooner.

For your wider team to accept it, you have to help them see the real problem in your current way of working.

And you have to do these activities as fast, as efficiently as possible, so it feels like action, it feels like progress.


Finally, let’s do a quick test.

Let’s look at a real problem on Neighbourly.

Neighbourly is a private social network for neighbourhoods. It’s similar to NextDoor or Nabo. When people sign up, we need to verify they live at the address they provide. This is to ensure they are really part of that community.

During the signup process, the user provides their email and home address, real name and then they come to this step…

We ask for their driver’s licence to verify their details by using the New Zealand Transport Agency (a government department) API.

We do not store the information.

We want people to enter their driver’s licence because it’s instant and we can let them onto the site. The alternate method, a verification letter posted to their address takes around 5 days. It also costs us more.
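The two verification paths described above could be sketched roughly like this. Note this is only an illustration of the routing logic: `check_licence` is a hypothetical stand-in for the real NZTA API call, which isn’t shown here:

```python
# Hypothetical sketch of the two verification paths described in the text.
# `check_licence` stands in for the real NZTA API call (not shown here);
# nothing from the licence is stored either way.

from enum import Enum

class Verification(Enum):
    INSTANT = "verified via driver's licence"  # user is let in immediately
    LETTER = "verification letter posted"      # ~5 days, and costs more

def verify_address(has_licence: bool, check_licence) -> Verification:
    """Route a signup to instant API verification or the postal fallback."""
    if has_licence and check_licence():        # instant path preferred
        return Verification.INSTANT
    return Verification.LETTER                 # fallback: post a letter

# Example: a user with a licence that the (stubbed) API confirms.
result = verify_address(True, check_licence=lambda: True)
```

Every signup that falls through to `LETTER` is slower and more expensive, which is why the button choice below matters.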

Here’s the problem…

“A disproportionate number of people select the ‘I don’t have a driver’s licence’ button and opt for the letter method. How can we get more people to enter their driver’s licence?”
[Pause for effect]

What would you do if you were me?

[Let your existing mindset reveal itself]


You may leap to assuming the problem is “you’re asking for too much personal information” and you can easily think of some plausible sounding solutions. This is too common.

Let’s evolve.


The next level of thinking may react to this sort of issue with some A/B tests. No point discussing it at length. Let’s just run some tests. See if we can move the needle.
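An A/B test on the licence step could be evaluated with a standard two-proportion z-test, sketched below using only the standard library. The counts are made up for illustration:

```python
# Illustrative two-variant test on the licence-entry step: did variant B
# lift the proportion of signups who entered a driver's licence?
# Uses only the standard library; the counts below are hypothetical.

import math

def two_proportion_z(success_a, total_a, success_b, total_b):
    """Two-proportion z-test; returns (z, approximate two-sided p-value)."""
    p_a = success_a / total_a
    p_b = success_b / total_b
    pooled = (success_a + success_b) / (total_a + total_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical data: licences entered out of signups shown each variant.
z, p = two_proportion_z(success_a=120, total_a=400, success_b=150, total_b=410)
print(f"z = {z:.2f}, p = {p:.3f}")
```

This is exactly the limit of the data-only approach: it can tell you whether the needle moved, but never why.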


The level above that would be to realise, this is a human decision. Data won’t ever tell us why people are making this decision. What they’re thinking. So let’s go sit next to some people signing up and listen to them.


The level above that would be to realise this process is made up of multiple steps. Every step is a hurdle. It appears this hurdle is too high, but if people were truly sold on the product they’d enter the details. So we need to look at the value proposition, and at all the steps, with real people, and learn as much as possible… then work on a solution.


Above all else, if we embrace ambiguity, and accept we are at the start of the squiggly line…

…then the best response to this challenge is: “I don’t know”.

I don’t know.

I don’t know how to solve this.

I don’t know what the problem truly is.

I also don’t know what process I’m going to apply to find clarity.

What happens when you start a challenge in this way is, you get curious. Curiosity is the most important mindset for tackling ambiguity. The most important thing that comes out of a curious mindset is questions.

By starting every challenge this way you may ask questions that you would never have thought of otherwise.

Even by leaping into a process you’ve narrowed your view; there are questions that you may have missed…

By running out the door to talk to users, you’ve limited it more; they may start biasing you in one direction…

By only being data driven, you’ve drastically limited what you could learn and created blindspots where the real problem and solution may reside…

And if all you do is leap to plausible sounding solutions you’ve drastically increased the risk of creating the wrong thing…