The Science Behind Why Your Facebook Friends Ignore Facts
Cognitive Bias and You
I’ve long believed that humans are rational beings, meaning that people use logic and evidence to make decisions and determine what’s true. As it turns out, a wealth of cognitive research shows I was decidedly wrong.
We live in a world where more information is flooding our brains than ever before. Advertisers have long battled for our attention. But now, software developers battle for it too, and then sell it back to the advertisers. The more effectively Google and Twitter and Medium can capture our attention, the more money they make.
If we didn’t filter almost all of the information we perceive, we’d be completely overwhelmed. That’s why our brains use “shortcuts” to pick out the bits of information that are most likely to be useful. And by useful, I don’t necessarily mean true. By useful, I simply mean that the information we keep helps us stay alive and reproduce.
You may find yourself wondering: Why is the world so divided on religion and politics? Why do people support Donald Trump? Or Hillary Clinton? Why can’t I convince my friend to change his mind?
In this article, I share how our brains deal with information overload — and the associated cognitive biases that prevent us from correctly understanding the facts.
1. The Availability Heuristic: We Believe What’s Top of Mind.
The availability heuristic is a mental shortcut that relies on immediate examples that come to mind when judging truth or falsehood. When it comes time to make a decision, we leverage what is already top of mind. We give greater credence to this information and tend to overestimate the likelihood of similar things happening in the future.
This shortcut is helpful in decision-making because we often lack the time or energy to investigate complex issues in greater depth. The availability heuristic allows people to arrive at a conclusion more quickly.
However, like other shortcuts, it can lead us astray. Just because something comes to mind easily doesn’t mean it’s true.
For example, after Donald Trump referred to Hillary Clinton as “crooked”, we were primed to keep interpreting her behavior as crooked. That priming doesn’t tell us whether she actually is crooked; it just means our brains are more likely to reach that conclusion because doing so is easier than evaluating the situation from scratch.
2. Attentional Bias: We Believe What We Pay Attention To.
Attentional bias is the tendency for our conclusions to be shaped by our recurring thoughts. It also predicts that we preferentially allocate attention to threatening stimuli rather than neutral or positive ones.
If you think what you see is the whole story, you’re displaying attentional bias. To arrive at a more accurate conclusion, you also need to consider the things you don’t see.
For example, when someone looks at only one or a few economic data points and then concludes that the economy is strong or that the government is doing a great job, he is forgoing the time and energy necessary to gain a more complete picture.
3. The Illusory Truth Effect: We Believe What’s Repeated.
Repetition is another way that misconceptions can enter our knowledge base. Per the illusory truth effect, repeated statements are easier to process and are subsequently perceived as more truthful than new statements. Our brain spends less time and effort processing information that’s been repeated and takes it as truth simply because it’s familiar.
The reverse is also true: people interpret new information with skepticism and distrust.
Take the topic of nutrition as an example. For decades we’ve been repeatedly told that eating fat is unhealthy. Despite recent studies suggesting the contrary, our diets continue to be high in sugars and processed carbohydrates.
It doesn’t always matter whether what we’re told is a truth or a lie; we’ll believe it as long as it’s repeated enough. It’s the frequency, not just the plausibility, that matters.
4. The Mere Exposure Effect: We Believe What’s Familiar.
Not only does repeated exposure make us more likely to believe something, it also makes us more likely to have a favorable opinion of it. Per the mere-exposure effect, also known as the familiarity principle, we tend to like things more when they’re familiar to us.
Repeated exposure of a stimulus increases perceptual fluency, which is the ease with which information can be processed. Perceptual fluency, in turn, increases positive sentiment. Familiar things require less effort to process and that feeling of ease signals truth.
We are attracted to familiar people because we consider them safe and unlikely to cause harm. We can even adapt to like objectively unpleasant things, such as when long-term prisoners come to miss prison.
Living in an Irrational World…
When trying to make an important decision, have you ever fallen short of considering all the information and possibilities? While we might like to think we take all the facts into consideration, the reality is that we often overlook some of them.
If we fact-checked everything that crossed our senses, we’d be paralyzed. The shortcuts above sometimes lead us to wrong conclusions, but determining what’s true is not always necessary to survive.
Ease trumps truth.
Imagine if a CEO couldn’t trust his marketing team to analyze data. He wouldn’t be able to focus on keeping the company alive and growing it; he’d be stuck in the minutiae of marketing analytics.
Similarly, our brains have to sacrifice accuracy to increase the chances of surviving and reproducing.
Humans across the world have come to different conclusions about important issues like religion and politics. Logically, then, most of them must be wrong. However, most people are not dying as a result of believing in the “wrong” religion.
The fact that we are so divided on the election is further evidence that we are not completely rational. Even if one side were “right,” that would mean about 50% of the population was wrong. If we were even predominantly rational, wouldn’t a more significant portion of the population be on one side?
You might say, “Ah, but the two-party system doesn’t make any sense.” Exactly: we have a two-party system that doesn’t make sense, which is further evidence for my point!
If people were rational, there would be no need for emotional marketing, advertising, or political speeches; we would simply educate people about the facts.
When I first began realizing that people were irrational, I was confused and frustrated. I felt hopeless. I wished things were different. But that only caused anxiety.
Accepting that most people — myself included — are irrational most of the time actually eased the stress. Now I don’t have to wonder about why my Facebook friends believe half the crap the media espouses, why they endorse a political candidate that I don’t agree with, or why they believe in their respective religion.
We live in an incredibly complex world. Familiar information is easier to understand and repeat. You can see how the cognitive biases above help us make decisions faster and therefore stay alive, but not necessarily find truth. In a weird way, being irrational actually seems rational.