I’m guilty of it. We all are, to some degree.
I’ve entered situations with preconceived notions of what to expect, either of the event itself or of the people involved, and found my ideas to be quite accurate.
Yes, I thought: This entrepreneur is indeed a snake oil salesman of the highest order. He dresses all in black, dark hair slicked back, and crows about his past conquests with delight befitting an ancient Norse Viking warrior speaking of victories past.
I didn’t need to actually meet the jackal to know he was a jackal. I knew it.
I knew it.
How many times have we all thought or said that phrase?
The idea behind this phenomenon is actually fairly simple: It’s called confirmation bias.
Confirmation Bias: The tendency to process information by looking for, or interpreting, information that is consistent with one’s existing beliefs.
In short, humans actively seek out evidence that will support whatever it is they already think true.
All of us are guilty of this.
In our modern, social media-connected world, "fake news" has become a household term: blatantly false claims presented as news stories, crafted for a specific audience in the knowledge that it will most likely buy into the falsehoods and therefore like and share them.
In terms of evolution and biology, however, confirmation bias is actually supposed to be a good thing. By allowing us to form preconceived notions, our mind became a little more efficient at quickly recognizing threats or, alternatively, positive signs in our environments that might lead us to food, water, or shelter.
It may happen consciously or subconsciously, but it happens, and we have very little control over it.
All of the "isms" thrive on confirmation bias. Racism, sexism: as soon as we are given hints at a young age that some other kind of person is somehow lesser than we are, we spend the rest of our lives watching for clues to back up those assumptions, no matter how wrong they may be.
Psychological conditions such as anxiety are also bolstered by confirmation bias, and conditions like paranoia can function almost entirely as confirmation bias run unchecked.
And then there is the realm occupied by conspiracy theorists and cryptozoologists, who are convinced something exists (like a secret cadre of ultra-wealthy influencers controlling the world or an elusive hirsute distant cousin of humanity wandering the forests of North America) and set out to prove that they are, indeed, right.
Confirmation bias is like a mental virus that is highly transmissible and impossible to cure. Even if you are aware of it, constantly trying to fight it by stamping out your own ignorance, confirmation bias manages to survive because it can live within any concept.
Philosopher and linguist Frederic Friedel published an excellent piece here on Medium describing how he illustrated confirmation bias for his 17-year-old nephew Noah. Friedel rattled off the start of a number series (2, 4, 6…) and asked Noah to work out the rule behind it. Noah immediately concluded that each number must be two more than the last, because that is how most such problems he'd encountered in life had panned out. Friedel explains that the proper use of the scientific method is to begin with a hypothesis and then work to disprove it, rather than to hunt for cases that prove your first conclusion right.
The scientific method, the correct and empirical way to do things, contains six simple and immovable steps:
- Make an observation.
- Ask a question.
- Form a hypothesis (a testable explanation).
- Make a prediction based on the hypothesis.
- Test the prediction.
- Iterate: Repeat the process using your results.
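The difference between confirming and disproving can be made concrete with a toy simulation of Friedel's 2, 4, 6 task (an illustrative sketch of my own, not from the original article; the rule names and test sequences are assumptions). A tester who only checks sequences the hypothesis already predicts never discovers it is wrong; a single deliberate attempt at disproof does.

```python
# Toy model of the 2, 4, 6 task (illustrative sketch, not from the article).
# Hidden rule the experimenter secretly uses: any strictly increasing triple.
def hidden_rule(seq):
    return seq[0] < seq[1] < seq[2]

# Noah's premature hypothesis: each number is two more than the last.
def add_two_hypothesis(seq):
    return seq[1] == seq[0] + 2 and seq[2] == seq[1] + 2

# Confirmation strategy: test only sequences the hypothesis says are valid.
confirming_tests = [(2, 4, 6), (10, 12, 14), (1, 3, 5)]
# Every such test agrees with the hidden rule, so the wrong hypothesis survives.
assert all(hidden_rule(s) == add_two_hypothesis(s) for s in confirming_tests)

# Disconfirmation strategy: probe a case the hypothesis rejects.
falsifying_test = (1, 2, 3)  # not "add two", yet still increasing
assert not add_two_hypothesis(falsifying_test)
assert hidden_rule(falsifying_test)  # accepted by the hidden rule: hypothesis refuted
```

The point of the sketch is the asymmetry: confirming cases can pile up forever without distinguishing "add two" from "any increasing sequence," while one well-chosen counter-test settles the question.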
Perhaps our biggest foible as a species is that we are not wired by nature to follow these steps, and a large percentage of us are never trained to do so. It's a matter of ignorance rather than a lack of intelligence, because the scientific method is nothing if not easy to follow.
It is also arguable whether the method above is without flaw. Most conclusions are drawn solely from the data outcomes of a given study; wouldn't it be more rigorous, in a way, to first examine the fundamental assumptions of the prevailing worldview, before iterating, so that those can be accounted for in the results?
Confirmation bias can be absolutely world-altering when it is embraced on institutional levels.
Take the tobacco and oil industries, both of which have knowingly funded research designed to produce results that burnish their public image:
- Contesting well-documented relationships between various cancers and tobacco use (source).
- Funding research that backs up the climate change denial paradigm (source).
Confirmation bias is essentially a form of self-deception in which we convince ourselves that something is good or right simply because we want it to be, so that it aligns with our existing conceptions.
Unfortunately, this is a systemic problem compounded by an inherited trait: it's natural for us to believe that the way we already understand the world is most likely the right way for it to be.
Confirmation bias begins with the individual, and undoing its negative consequences has to start with the individual as well.
Part of the solution is to actively take the opposite approach: View your ideas as assumptions until you have rigorously researched them with the goal of disproving yourself, and encourage others to do the same — especially children, mentees, your students, and anyone else you might influence.
Rather than continue to live in an egocentric bubble, let’s do our best to seek out any untruths which may be coloring our worldview and correct them.
When the words “I know I’m right” automatically spring to mind or out of our mouths, replace them with “How might I be wrong?”
Thank you for reading and sharing.