Avoiding Bias in Product Decision-Making: Recognition Heuristic

Improving product outcomes by understanding how we make decisions

Sam Nordstrom
Agile Insider
9 min read · Jun 15, 2020


Photo by Simon Abrams on Unsplash

Product management is centered around informed decision-making. Whereas engineers produce code, designers design, and analysts analyze, product managers primarily produce decisions.

Many folks have espoused different approaches and principles for successful long-term decision-making as a PM. My favorite is what I’ll call “The 60% Rule” from Brandon Chu’s “Making Good Decisions as a Product Manager.”

Chu defines being a good decision-maker as doing two things:

  • Making decisions using the right amount of information
  • Making decisions as quickly as possible

The aim isn’t to get every decision correct, or even to maximize confidence in every decision. Rather, he advocates gathering just enough information to make roughly 60% of decisions correctly. Time is focused on more confidently making the smaller number of decisions that are truly important, while the majority of relatively less-important decisions get made quickly, without too much information.

PMs are required to make numerous decisions every day that can either speed up or slow down progress in a given direction. We want the direction we choose to be the right one, but a tension exists between how quickly we can pick a direction and how confident we can be in its accuracy, due to the information we must gather and synthesize in order to decide.

Gathering all the information required to make a 100% confident decision takes time. The information needed to push confidence from 70% to 99% takes the most time and tends to deliver the least incremental value.

So what’s the cure? Figuring out which decisions are important enough to require the most information and time, and which can be made quickly with less certainty and information.
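
To make the trade-off concrete, here is a back-of-the-envelope sketch. All of the numbers and the confidence curve below are hypothetical illustrations, not figures from Chu's article. It assumes confidence grows with diminishing returns as information-gathering hours accumulate, and compares a slow, high-confidence strategy with a fast, "good enough" strategy under the same weekly time budget:

```python
# Illustrative model of the 60% rule. All numbers are hypothetical.
# Assumption: confidence in a decision grows with diminishing returns
# in the hours spent gathering information.

def confidence(hours: float) -> float:
    """Diminishing-returns curve: 0.5 at zero hours, approaching 1.0."""
    return 1.0 - 0.5 * (0.5 ** hours)  # 0.5, 0.75, 0.875, ...

BUDGET = 40.0          # hours of decision-making time per week
PER_DECISION_HI = 8.0  # hours each if we chase ~99% confidence on everything
PER_DECISION_LO = 1.0  # hours each if we settle for ~75% confidence

def expected_correct(hours_each: float, budget: float = BUDGET) -> float:
    """Expected number of correct decisions under a fixed time budget."""
    n_decisions = budget / hours_each
    return n_decisions * confidence(hours_each)

careful = expected_correct(PER_DECISION_HI)  # few, high-confidence decisions
fast = expected_correct(PER_DECISION_LO)     # many, quicker decisions

print(f"careful: {careful:.1f} correct decisions/week")  # ~5.0
print(f"fast:    {fast:.1f} correct decisions/week")     # 30.0
```

Under these made-up assumptions, the fast strategy produces far more correct decisions per week, which is the intuition behind making most decisions quickly. Real decisions, of course, differ wildly in value, which is exactly why the few important ones still deserve the slow, information-rich treatment.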

All PM decisions fall somewhere along this spectrum of importance and effort, and tools such as prioritization principles and KPIs can guide where and when to focus.

Decision-making and heuristics

I recently read Daniel Kahneman’s Thinking, Fast and Slow. The connection between this model of PM decision-making and Kahneman’s model of System 1 and System 2 thinking was evident.

We, as humans, have two modes of thinking: System 1, which is fast, intuitive, reactive and low effort; and System 2, which is slow, more intentional and more logical.

Here are examples of appropriate tasks for each system:

  • System 1: Determine which of two objects is bigger.
  • System 2: Reason through which outcome is more likely when there are multiple variables at play.

Why do we have two systems?

System 2 takes lots of energy and time, and we’ve evolved to be able to rely on System 1 to help us make faster decisions that, for the most part, serve us well throughout life. What is more interesting is the role a subconscious bias for System 1 thinking can play in leading to illogical decisions, where we really should be employing System 2.

This bias is explained by Kahneman’s theory of heuristics: simple mental processes for quickly forming judgments, making decisions or devising solutions to problems. These processes cause us to approach decisions involving new information by relying on historical information that may or may not be relevant and applicable to the decision at hand.

In other words, we substitute what should be a System 2 decision with a System 1 decision, and as a result, we often commit logical errors and end up with the wrong decision.

There are several categories of heuristics, each of which has its own reasoning shortcuts that can lead to the same type of wrong decisions when used incorrectly:

Recognition: our bias for recognized options over unfamiliar ones

  • Any decision that has been made before will automatically bias you toward replicating it.
  • We need to apply extra-deliberate due diligence in evaluating its merits, and check whether the reasons it worked before still hold now.

Availability: our bias for simple, coherent explanations with clear causes and effects

  • Decisions that advocate for simple explanations and fixes are more appealing and can lead us to ignore other relevant information or a lack of information.
  • Any decision that is based on vivid and relatable customer stories needs to be counterbalanced with the relevant data and base rates.

Representativeness: our bias toward finding relationships between things and using that similarity to make decisions

  • Decisions that rely on some similarity between an option and an outcome need to scrutinize whether that similarity is actually correlated with the outcome we want.

Anchoring and adjustment: our tendency to under-correct for priming or anchoring numbers

  • If provided with an initial estimate, regardless of whether it is relevant, make sure to apply more weight to relevant base rates for your decision, rather than the initial estimate or anchor.

Product decision-making and heuristics

Chu’s criteria for evaluating how important a decision is and how much information and time you should spend on it is a fantastic starting point. I tend to agree with him that “good decision-makers are quick decision-makers.” Time not spent on the majority of fast decisions can be reallocated to the fewer, most important decisions.

However, even with the right allocations of time and information, heuristics can lead to poor decisions among both the important and unimportant ones.

Why?

First: Making fast decisions without complete information, per the 60% rule, inevitably leads to heuristics biasing our judgment every day. Here, we can draw parallels between Kahneman’s System 1 and Chu’s faster, less-important decisions.

Second: Even with the fewer, more important decisions, heuristics can cause us to make unprincipled decisions that subvert the whole 60% model on the decisions that matter most. In the same way Kahneman notes examples of System 1 taking over when System 2 should be at work, heuristics can bias even the big decisions on which we spend more time and effort.

After all, this is their job in our lives outside product management — to help us make decisions or judgments that are mostly right most of the time, quickly and with less mental energy.

So what do we do?

Awareness is a good starting point. Once we are aware, there are a few simple tactics that can help us guard against heuristic-creep.

Here, I’ll dive deeper on the first heuristic: recognition. I’ll offer an example of how it can arise in product decision-making and propose a strategy for combating it. I’ll follow up with deeper discussions on the remaining three heuristics in subsequent articles.

Recognition heuristic

If one option is recognized and the others are not, we will be more likely to pick the recognized choice. This is true even when recognition of one option over the others is not actually correlated with the outcome we want.

We may decide to pick an option due to a few well-publicized examples of other great outcomes from that decision, without understanding A) whether the reasons it worked in those contexts are also true in our context, or B) whether those cases are statistically representative of the broader population we care about.

Recognition serves us well when the decision we have to make is related to how well we recognize the options. For example, when picking between two dishes with the goal of not upsetting your stomach, recognizing one of the options helps us pick it, because our recognition is a good proxy for the dish being safe to eat.

Recognition can lead us into trouble when the decision is not related to how well we recognize the options. For example, picking between two dishes with the goal of getting as much iron and vitamin C as possible: recognizing one of the options tells us nothing about how much iron and vitamin C is in the dish. Instead, we need to place more value on the base rates for which types of food have higher amounts of iron and vitamin C, and in turn, which dish has more of those types of food.
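
The dish example can be sketched as a toy decision procedure. Everything here (the dish names, ingredients and nutrient scores) is invented for illustration; the point is only that a recognition-based rule and a base-rate rule can disagree when recognition is uncorrelated with the goal:

```python
# Toy sketch: choosing a dish by recognition vs. by base rates.
# Goal from the text: maximize iron and vitamin C, a quality that
# recognition tells us nothing about. All data below is made up.

# Rough per-ingredient nutrient "base rates" (invented scores).
NUTRIENT_SCORE = {"spinach": 9, "lentils": 8, "rice": 2, "cheese": 1}

dishes = {
    "familiar_pasta": {"ingredients": ["rice", "cheese"], "recognized": True},
    "unfamiliar_dal": {"ingredients": ["lentils", "spinach"], "recognized": False},
}

def recognition_choice(dishes):
    """System 1: pick the first recognized option."""
    return next(name for name, d in dishes.items() if d["recognized"])

def base_rate_choice(dishes):
    """System 2: score each dish by its ingredients' nutrient base rates."""
    score = lambda d: sum(NUTRIENT_SCORE[i] for i in d["ingredients"])
    return max(dishes, key=lambda name: score(dishes[name]))

print(recognition_choice(dishes))  # familiar_pasta
print(base_rate_choice(dishes))    # unfamiliar_dal
```

The two rules pick different dishes: recognition favors the familiar option, while the base rates favor the unfamiliar one that actually serves the goal.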

Example in product

We are tasked with improving onboarding conversion for our banking app. Customers currently have to enter lots of sensitive information before they can use our services, including their Social Security number and bank account number, which is causing the most drop-off before prospects finish signing up.

Feedback tells us customers are wary of trusting an app on their phone with such sensitive information, and consequently, they shy away from continuing to use it, even though the value seems appealing.

We research a bunch of approaches, and the team narrows in on two options:

  1. Move the sensitive fields to the very end of the funnel, letting customers first see how intuitive our app is and create a deposit using promotional money, which we will complete for them if they finish signing up.
  2. Add a landing screen up-front that explains what information we are about to ask for, why we need it and what promotion the customer will get once they complete sign-up.

We have to pick an option to lead with as our first test.

Option 1 starts to become the most appealing as the team discusses. Everyone can cite one or two examples of apps they have used where they got to use the product before having to enter any sensitive data or payment info. This worked great for our team’s favorite project management app, which only asked for payment after we had created our first tasks and invited team members. By that point, we had already experienced the value and were more incentivized to continue using it. One engineer claims his previous food-delivery company implemented this model and saw a big lift in sign-ups by deferring payment and delivery info until after the customer added items to their cart.

Option 2 is interesting, and our designer is really advocating for it. Customers might respect and appreciate the clarity up-front, mentally preparing them for the tasks ahead. But we can’t reference any examples of the tactic having worked for other products we use.

Everyone votes and lands on Option 1. We decide to run the test.

To our dismay, the new experience merely shifts the same level of drop-off to the new stage in the funnel where a customer can start moving money.

What happened?

The recognition of Option 1 was appealing, but it led us to ignore an important question: Do the reasons it works for project management and food-delivery apps address the same problem our customers face in financial onboarding?

Perhaps project management tools are such that they have to be experienced before would-be customers are willing to pay. But financial products, such as banks and payment systems, face a different problem of establishing trust for customers wary of entering banking information on a mobile app. They may not trust a product that doesn’t require the same sensitive information as their current solution.

Recognition influenced our team’s decision, even though that recognition was not related to the outcome we were trying to drive: increased trust in our product.

How to combat this

First, quickly pressure-test potential decisions by asking:

  • Are the reasons this decision worked in other contexts also true in my context?
  • What are the reasons they may not be true?
  • Are these other contexts representative of all decisions like the one I’m faced with?

Second, watch out for a bias for decisions someone has made before. Just because you built that feature at a previous company or team doesn’t mean it will work now.

What it does mean is that you will be inherently biased toward liking that option over others, so apply a little extra scrutiny to any decision that has been made before.

Staying vigilant against the inherent biases at work in these heuristics will make us better decision-makers and, ultimately, lead to better product outcomes. I’ll follow up with a deeper discussion in subsequent posts on the remaining three heuristics: availability, representativeness, and anchoring and adjustment.
