Confirmation bias: don’t fool yourself

Georg Horn
Varia Blog
5 min read · Feb 9, 2021


Confirmation bias, the tendency to discredit disconfirming evidence or to seek out (actively or subconsciously) primarily confirming evidence, is part of all of us.

When trying to write accessibly about confirmation bias, it is difficult to improve on what Shane Parrish and the Farnam Street team produced here: an excellent write-up on the science, origins, and implications of confirmation bias. I therefore invite you to take a break now and continue reading below once you are done.

Confirmation bias, in one diagram

After reading the Farnam Street article, I hope we can all acknowledge that confirmation bias is a serious issue, and that we all suffer from it every day. Confirmation bias applies to our confrontation with any information, and so in particular to news media, the kind of information we all engage with daily. This area was not addressed specifically in the Farnam Street article, which gives me a chance to expand a little on the topic.

We all know that great feeling of reading an article that speaks exactly to our mind. It's the type of article we share with friends, or even post on LinkedIn with the comment "great article". Ask yourself this: what was the last article you read that made you question or change your beliefs about a topic?

It's not just that we like articles that confirm our viewpoints better; we are also easily fooled about the actual content of an article. It turns out that cues presented alongside news content heavily influence our perception of that content, as multiple studies have shown (e.g. 1, 2). We frame the news differently depending on the source: the exact same text can be judged by the same readers as liberal or as conservative, depending on the news outlet logo placed above it. The linked studies show how the logos of CNN, MSNBC, Fox, or Breitbart, placed next to identical texts, lead readers to vastly different conclusions.

This is a testament to how much attention we pay to the WHO of a message compared to the WHAT. What is being said is usually far more important, but think of an article starting with "President Trump said that…" and how easy it is to read the rest of the message under a heavy bias.

The explanations and studies cited above are the reason why we hide logos, and ideally publisher names, in most places in our products. In our journalistic research product, too, we hide publisher brands by default, because pure content should matter more than cued perception, especially in a professional setting. Note: nobody is safe from confirmation bias; everybody is biased.

One of the biggest problems with the world today is that we have large groups of people who will accept whatever they hear on the grapevine, just because it suits their worldview — not because it is actually true or because they have evidence to support it. The striking thing is that it would not take much effort to establish validity in most of these cases… but people prefer reassurance to research.

Neil deGrasse Tyson

Discrediting information that conflicts with our beliefs has an opposite twin: we also tend to heavily overvalue data that speaks to our convictions, upgrading random data to evidence while blissfully skipping any deeper evaluation. It helps to keep in mind that there is quite a gap between data and evidence, and that true evidence is hard to come by.

The hierarchy of information quality

We have to treat the content we read differently depending on where in the above hierarchy the presented story lies. Alex Edmans has a brilliant talk on this topic (see below). Whenever we see some information, we should ask ourselves whether it can possibly serve as evidence or not, no matter our beliefs, no matter the source. Keep this in mind for all news you read, but these days especially when reading Coronavirus-related headlines.

There is even a neurological explanation for why it feels good to read content that reflects our beliefs. Karl Friston, one of the leading neuroscientists of our time, is famous for his theory of active inference and the free energy principle (you can read up on it here). He argues that we humans (and indeed all living beings) constantly project a model of reality onto our perception of reality. We perform active, permanent inference on our environment through all inputs available to us. The closer the incoming signals match our internal model (perception meets expectation), the greater the cognitive consistency. If there are discrepancies, we experience cognitive dissonance: we have to adjust our model of reality to the new values, and that takes effort. When we focus only on inputs that confirm our existing model, the prediction error is minimized and the required adjustment effort is small. Keep in mind, though: with no error and no adjusting, there is also no learning.
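The asymmetry described above can be sketched as a toy simulation. This is an illustrative caricature, not Friston's actual formalism: the simple error-proportional `update` rule and the learning rate are my own assumptions, borrowed from basic predictive-coding intuition.

```python
# Toy sketch: an agent holds a belief and shifts it toward each
# observation in proportion to the prediction error.

def update(belief, observation, learning_rate=0.5):
    """Return the adjusted belief and the size of the prediction error."""
    error = observation - belief
    return belief + learning_rate * error, abs(error)

belief = 0.0

# A confirming input (close to the current belief) produces a tiny
# error and almost no adjustment: comfortable, but little is learned.
belief, err_confirm = update(belief, 0.1)

# A disconfirming input forces a large adjustment: effortful, but the
# model of reality actually changes.
belief, err_surprise = update(belief, 5.0)

print(err_confirm, err_surprise)
```

The surprising input carries the larger error, and therefore drives the larger model update; seeking only confirming inputs keeps the error (and the learning) near zero.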

Over time this has led to feelings of great satisfaction whenever we are right about something. Yet in all those moments we should recognize that being right usually means learning nothing, or at least less than being wrong would have taught us. So do you want to be right, or do you want to learn?

A beautiful poem by Boris Pasternak speaks to this, stating that we should not differentiate so much between victory and defeat; all that matters is being alive. Alive, which Friston defines as performing active inference.

Another, step by step, will follow
The living imprint of your feet;
But you yourself must not distinguish
Your victory from your defeat.

And never for a single moment
Betray your credo or pretend,
But be alive-this only matters-
Alive and burning to the end.

To sum up, here is what I think helps to deal with confirmation bias:

1. Understand it and be aware of it (its theory and implications)

2. Critically assess the quality of the information you are presented with

3. Critically assess your own mental state when consuming information (do you react in a certain way? why?)

4. Focus on content, not on format or source

5. Question everything and proactively seek out different perspectives (we can help you with that)

What was the last subject you changed your opinion on? When did you last share an article that you disagreed with?

This article was researched and written using Varia Research.
