Sam Goldsober
5 min read · Nov 2, 2022
Photo by Jan Genge on Unsplash

Cognitive Bias — Should We Care and What Can We Do?

I overheard someone say “instincts over everything” on my commute yesterday. My first thought was “that’s got a nice ring to it” and my second was “what a dangerous way to live”. Anyone who has read Kahneman’s work would know that acting on instinct is a System 1 heuristic, subject to error and cognitive bias.

The opposite of this phrase isn’t right either: challenging everything you want to say or do would simply drive you insane. Is there a sweet spot, and is there anything we can actively do to combat cognitive bias?

In short, yes — there are things we can do, and there are some really specific examples on the web for all sorts of circumstances. The purpose of this article, however, is to touch on why any of this matters, whilst suggesting some practical solutions at a high level.

Before continuing I am just going to quickly define some relevant terms for reference and context:

Heuristic — a ‘mental’ shortcut used to reach a conclusion based on previously held information and experiences, which can lead to cognitive bias

(Cognitive) Bias — systematic pattern of deviation from norm or rationality in judgment

Examples of biases:

Availability bias — reliance on immediate examples that come to a given person’s mind when evaluating a specific topic, concept, method or decision

Framing — the manner in which information is presented, which can affect decision making

Anchoring bias — heavy reliance on the first piece of information we are given about a topic

Confirmation bias — tendency to search for, interpret, favour and recall information in a way that confirms or supports a pre-held belief

Groupthink — the tendency to prioritise harmony or conformity within a group over critical evaluation of the decision at hand

Everyone is subject to bias to varying degrees and in various ways as a result of different formative experiences. I am not going to talk about heuristics or biases themselves, but if you want to learn in depth I’d suggest the following books:

Thinking, Fast and Slow — Daniel Kahneman

The Irrational Ape — David Robert Grimes

Designing the Mind — Ryan A. Bush

Or, if you have less time, I found the following articles on Medium interesting:

The question is: given what we know about bias, how do we try to avoid it without it becoming an overwhelming consideration? There are different ways to go about this, but from my own experience I have the following to share.

Firstly, accept that you’ll never be truly bias-free, and that’s OK, because most of the time you can get away with it. We can think of this in the following way:

bias risk × exposure = expected impact

Where:

bias risk = strength of the bias, ranging from 0 to 1

exposure = the impact if the decision/action were to be wrong

In most people’s day-to-day, the exposure is nil, or close enough that the sky will remain in place if whatever it is goes pear-shaped (also a thought pattern for coping with stress, which I won’t go into here but may do in a future article). Even if the bias is so strong that it leads to an incorrect decision, who cares? For example, conforming to the group opinion that bowling this Saturday night would be more enjoyable than the cinema. You’ll probably have a nice time doing either.
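To make the formula concrete, here is a minimal sketch. The 0–1 bias-risk scale comes from the definition above; the 0–10 exposure scale and the example numbers are purely illustrative assumptions of mine.

```python
# A minimal sketch of "bias risk x exposure = expected impact".
# The 0-1 bias-risk scale is from the article; the 0-10 exposure scale
# and the example figures below are illustrative assumptions.

def expected_impact(bias_risk: float, exposure: float) -> float:
    """Bias risk (0-1) multiplied by exposure (cost if the call turns out wrong)."""
    return bias_risk * exposure

decisions = {
    "bowling vs cinema (groupthink)": (0.9, 0.5),   # strong bias, trivial exposure
    "hiring a candidate (anchoring)": (0.4, 9.0),   # weaker bias, high exposure
}

for name, (risk, exposure) in decisions.items():
    print(f"{name}: expected impact = {expected_impact(risk, exposure):.1f}")
```

Even a near-certain bias barely matters when the exposure is trivial, while a modest bias deserves attention when the exposure is high.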

Now that we’ve accepted we don’t need to “sweat the small stuff”, we can address the cases where the exposure in the formula above is high and think about some practical ways to mitigate the risk of cognitive bias. Let’s start with the instinctive responses I referenced in the opening. These often occur in dialogue, where saying the wrong thing could impact a job opportunity, a business relationship or a personal relationship.

If the risk is that a verbal response is not thought through in the heat of an emotionally charged moment, a simple approach is to force yourself to take time before responding. Pausing, even for a split second, and not blurting out the first thing that comes to mind immediately reduces the chance of giving an instinctive response unnecessarily. The pause won’t hurt how your response is received; if anything, you’ll appear to have thought it through more. Sometimes you may still end up saying whatever came to mind first, but creating that delay will filter out some of the rubbish.

In business situations where accuracy can be vital, things get harder. Two scenarios spring to mind that highlight this, both data-related (obviously):

  1. Posing questions to collect data
  2. Data interpretation

The first is subject to framing, anchoring and so on, and it feels like unless you have a Nobel Prize in behavioural economics, you won’t get it right immediately (and even then it still isn’t easy). The second is subject to things like confirmation and availability bias.

I believe everyone should have a reasonable understanding and awareness of the different biases, which in itself will help mitigate some of the risk. For a more practical solution, though, we want to think about the specific risk in each scenario:

Scenario 1 — obtaining skewed data

Scenario 2 — misinterpretation

Something practical for scenario 1 would be to use multiple framings. If the worry is skewed data, ask things in multiple ways; any accidental framing bias will show up when you compare the resulting data sets. You should always aim to word questions openly, of course, but why risk an entire data set on a single framing?

If you are looking to frame things in a certain way, test the wording. Use smaller sub-groups to feel out how the phrasing is interpreted, treating the results as dummy data before performing a wider-scale run. Additionally, always remember that there can be a huge disparity between asking for a verbal response and observing a behavioural response.
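As a rough sketch of the comparison idea, assume responses are simple 1–5 agreement scores; the data and the divergence threshold below are illustrative assumptions, not real survey results.

```python
# Comparing the same question asked under two framings. The responses and
# the 0.5 divergence threshold are made-up, illustrative values.
from statistics import mean

framing_a = [4, 5, 4, 3, 5, 4]  # question worded positively
framing_b = [2, 3, 3, 2, 4, 3]  # same question, worded negatively

gap = abs(mean(framing_a) - mean(framing_b))
print(f"mean A = {mean(framing_a):.2f}, mean B = {mean(framing_b):.2f}, gap = {gap:.2f}")

# If identical questions asked two ways produce very different answers,
# the wording (framing) is likely doing some of the work.
if gap > 0.5:
    print("Framing effect suspected: review the wording before a wider run.")
```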

Keep in mind the formula above. If the risk of bias is low, you probably don’t need to spend more time, effort or money mitigating it.

For scenario 2, a simple approach is to add more minds: have more than one person do the analysis and interpretation. This doesn’t need to be much more; even a single additional, independent analysis (i.e. not a review of a review, but a parallel one) drastically reduces the risk, and the first additional review has the largest impact, with diminishing returns for each reviewer after that.

If everyone is subject to bias to varying degrees, chances are that with enough independent people looking, you mitigate the risk. Again, keep the risk-to-reward in mind.
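A quick back-of-the-envelope illustration of the diminishing-returns point, assuming each independent reviewer has some fixed chance of missing a biased interpretation and that reviewers’ blind spots are independent (both are simplifying assumptions of mine):

```python
# If each independent reviewer misses a biased interpretation with
# probability p_miss, and misses are independent, the chance that
# *everyone* misses it is p_miss ** n.
p_miss = 0.3  # illustrative: each reviewer misses the problem 30% of the time

for n in range(1, 5):
    print(f"{n} reviewer(s): chance the bias slips through = {p_miss ** n:.3f}")

# Going from 1 reviewer to 2 cuts the residual risk from 0.300 to 0.090;
# each extra reviewer after that buys proportionally less.
```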

Closing thoughts

It’s impossible to avoid bias consistently over a prolonged period. At some stage you will fall victim; it’s about making sure that when you do, the exposure is so low it doesn’t matter.

Often, priming yourself going into a scenario where you could be subject to bias is the solution, e.g. interviewing candidates for a new role where you want to focus purely on merit. Other times it’s more complicated, but what’s apparent from experience is that any effort is far better than ignorance, and that starts with awareness.

P.S. check out the Cognitive Bias Codex on Wikipedia, which is a great visual tool for understanding and learning:

https://upload.wikimedia.org/wikipedia/commons/6/65/Cognitive_bias_codex_en.svg
