
How confirmation bias makes you dumber, and what to do about it.
Various biases color our decision making and cause us to make suboptimal choices, despite what the data might suggest. One of the most insidious is confirmation bias, which reinforces false beliefs and slows down our growth and development. How do we account for that bias and help accelerate our growth?
Bias
Most definitions of the word “bias” have a negative connotation and focus strictly on the bias an individual might feel toward particular groups based on race, gender, etc. Without diminishing the importance of this particular flavor of bias, I’m going to explore a broader, more neutral definition here. I’d like to define bias as a mechanism by which our brains make decisions with incomplete information by extrapolating from our existing beliefs. In short, biases are decision accelerators.
I want to take special care not to whitewash the subject. Often the decisions biases accelerate lead to negative outcomes, like treating two people differently because of gender, race, or social class. I am in no way trying to suggest that racism, sexism, or any of the other -isms are positive or do not exist.
An example of a potentially positive bias that I regularly leverage is favoring action over inaction. When presented with incomplete information and multiple paths forward, my first instinct is to favor the path that results in producing something over the path that results in indecision. I can override this bias, but it takes a conscious choice to do so. While certainly not perfect, I generally consider this bias to be positive, and it typically results in statements like: “… yes, but I’d like to make sure we actually produce something this sprint.”
Confirmation Bias
One particularly useful, but also particularly problematic, bias we all have is the bias toward believing what’s contained within our own heads. Without that meta-belief, life becomes paralyzing and it’s difficult to function: if we can’t believe what’s in our own heads, how can we trust any of our decisions? This bias toward our existing beliefs is called confirmation bias.
Confirmation Bias — the tendency to search for, interpret, favor, and recall information in a way that confirms one’s preexisting beliefs or hypotheses.
Unfortunately, this essential bias has a number of negative side effects. To understand the power of confirmation bias, you needn’t look any further than the proliferation of fake news. When presented with news that confirms our beliefs, we tend to question it a bit less than we question inconvenient news. Media networks are incentivized to give us more of what we like, so they give us more things we agree with. Cut to 2017, and we have poorly vaccinated populations allowing once-eradicated diseases to flourish, and NASA releasing a statement to officially reject the notion that Mars is a secret, child-labor slave colony. Confirmation bias leads to suboptimal outcomes when we make decisions based on false, reinforced beliefs.
Approximating Reality
You’re probably not going to be solving the fake news problem any time soon, but there is another problem that you can solve — making better decisions.
In order to function in the world, we develop mental models that help us predict the outcomes of our actions. As children we learn that fire is hot, and the fear of being burned keeps us from touching fire. As we grow and learn, the “fire → hot → pain → bad → don’t touch” knowledge in our model gets more sophisticated, and we learn to avoid getting into situations where we would cause a fire to exist. We start to think in the abstract and act in groups to create legislation that requires contractors to adhere to building codes and minimize the chances of people getting burned in fires. All of this is made possible by our ever more sophisticated and growing mental models. The better our models, the more effectively we make decisions, and the more we’re able to accomplish.
So, we spend our formative years building vast logical networks in our heads. As children, most of that time is spent filling in the many knowledge gaps as we regularly come across new concepts. We learn that we can take our feet off the ground if we pedal fast enough on our bikes, that it hurts when you fall off a bike, and perhaps in school that objects dropped near sea level on the 3rd planet from Sol accelerate at 9.8 m/s^2 until air resistance brings them to terminal velocity. We also learn that God is great, Obama is the antichrist, and Tom Brady is a cheater, except for those of us who learn that god doesn’t exist outside of Tom Brady’s uniform, and we can’t reconcile that fact with our hatred for Tom’s friend Donald Trump. Also, Yankees suck.
The problem is, these models are imperfect approximations. To learn, you need to improve your model in two ways: by building out the parts that don’t exist and, increasingly as time goes on, by refactoring the parts that are misaligned with reality. Unfortunately, once foundational beliefs are laid down, our friend confirmation bias comes along and makes it much harder to re-learn something in opposition to what we already (think we) know. It’s harder to rebuild a part of your model that’s performing poorly than it was to build it in the first place. Once you’ve developed a fairly sophisticated understanding of the world, learning to overcome this resistance to new information is one of the most effective ways to continue growing, prevent poor decision making, and enable future success.
Overcoming Our False Truths
So, how do we do that? Well, acknowledging the existence of the problem is the first step. It’s difficult to remove a bias and, since belief in our convictions is actually quite a useful tool, we don’t want to remove this one completely. Instead, we want to develop a second, artificial, conscious bias that enables us to act in opposition to our instinct.
If you believe my basic premise here, that we are all affected by confirmation bias, and you agree it can lead to suboptimal decision making, then you should already be incentivized to correct that behavior. In other words, you should be biased against allowing confirmation bias to negatively impact you. You should believe that questioning your beliefs is a good thing, and that’s the belief that we’re going to exploit.
This above all: to thine own self be true
— Bill S.
As a first principle, we’re going to assume that questioning our beliefs is necessary to progress.
When I’m challenged with information that conflicts with my mental model, my first reaction is typically to be defensive. I’m emotionally invested in that belief, and it’s painful to think that, for all these years, I’ve been wrong. But what’s potentially even more painful is being wrong for even longer, and that false belief having a meaningful negative impact on my life. The trick is recognizing when this is happening, and overriding that defensiveness with our new conscious bias.
When presented with information that conflicts with a strongly held belief, we’re going to recognize our bias and put the burden of proof on ourselves to show that our prior belief remains correct.
The data will set you free
I used to say that I didn’t have opinions, just facts. That bravado gets tempered a little after about the fifth time your “someone is wrong on the internet” crusade ends with you realizing that you were the one who was wrong. As an imperfect being with finite time, I can’t help but have opinions. However, that doesn’t mean I shouldn’t aspire to have my beliefs backed by data and facts. This means we’re going to rely on data, we’re going to be skeptical of our facts, and we’re not going to dismiss data that disputes our beliefs without cause.
In the absence of data we tend to use anecdotes as a substitute. Anecdotes are useful in helping illustrate our points, but they are not a substitute for data and should not be used to “prove” your point.
When the data suggests that we’re wrong, we’re wrong. But, data is not the plural of anecdote, and anecdotes are not proof.
But, don’t be paralyzed
A useful concept a colleague introduced to me early in my career was the idea that a dull axe does a poor job of chopping wood, so you have to spend some time sharpening your axe if you want to be productive. But if you spend all of your time sharpening the axe, you’re not going to cut any wood. You should still spend most of your time chopping, but every now and then you have to look up from your work and check your axe.
We are going to spend some time refactoring our beliefs, but the majority of our time is still going to be spent using them as a tool to get things done. Inaction is often worse than picking the 2nd or 3rd best path, and a great way to prevent yourself from getting anything done is to spend all your time re-convincing yourself of the path you’re taking.
Fin.
In summary,
- We have imperfect mental models, which can lead to poor decisions.
- Confirmation bias slows down the process of repairing those models.
- Overcoming confirmation bias is key to unlocking continued learning.
- Being skeptical of our own beliefs will help overcome that bias.
- When data conflicts with your belief, you should have a new belief.
- But, don’t let yourself get paralyzed by unnecessary overthinking.
Did you like this work? Did you hate it? Please feel free to write a response, highlight any points that you feel need further exploration, and recommend this if you’d like others to be able to participate in the discussion.
