How Cognitive Bias Fuels Coronavirus Fake News

Learn the Behavioral Science Behind Poor Judgement

Russ W
May 16, 2020 · 10 min read

America has a severe “infodemic” on its hands: The coronavirus pandemic has led to, arguably, one of the worst proliferations of fake news in history.

There are countless examples, ranging from the dubious (the virus was created as a bioweapon) to the highly misleading (the coronavirus is under control), the medically dangerous (miracle cures, ingesting disinfectants), the outlandish (5G conspiracy theories), the irresponsible (there are no PPE shortages) and much, much more.

The impact has been as wide-ranging as the misinformation. Panic and hysteria. Unnecessary individual exposure and bogus treatments. A monumental burden on healthcare providers. Prison riots and armed protests. Xenophobia, racism and hate crimes. Increases in depression, anxiety and stress. Strain on relationships.

As UNESCO Director Guy Berger explained, “When disinformation is repeated and amplified, including by influential people, the grave danger is that information which is based on truth, ends up having only marginal impact.”

In a valiant (but potentially too-little-too-late) attempt to combat misinformation, the World Health Organization added a mythbusters page to its website and has released shareable coronavirus facts on social media. Meanwhile, a Pew Research Center study revealed that more than 80% of Americans had been exposed to coronavirus fake news on social media.

While the irresponsibility of major media outlets peddling misinformation and its disturbing viral spread across social media networks have been well documented, the question we should really ask is “Why are we willing to believe misinformation?” And why do we resist credible information to the contrary?

“Fear is a very powerful emotion that can drive people to share things or even consider ideas that in normal times they would reject as rubbish,” Marianna Spring, a BBC disinformation and social media reporter, told Esquire Magazine. “Medical myths and speculation about how hospitals are coping are providing answers to people who are clutching at straws.”

During my 15-year career at top public relations agencies, it was my job to navigate the dividing line that separates fact from fiction. Which version of the truth we would like to believe is, in many ways, dictated by our own biases and our willingness to over- or underweight different sources of information.

Here are five relevant examples of cognitive bias and how they have enabled the spread of fake news, followed by advice on how to avoid being duped.


Confirmation Bias: What Do I Want to Believe?

Confirmation bias is one of the most dangerous biases in today’s polarized political climate. I’ll let Nobel Prize-winning psychologist Daniel Kahneman explain. In “Thinking, Fast and Slow,” he notes:

“Contrary to the rules of philosophers of science, who advise testing hypotheses by trying to refute them, people (and scientists, quite often) seek data that are likely to be compatible with the beliefs they currently hold,” writes Kahneman. “The confirmatory bias…favors uncritical acceptance of suggestions and exaggeration of the likelihood of extreme and improbable events.”

In other words, we look for evidence to confirm our pre-existing beliefs and disregard evidence to the contrary (think anti-vaxxers). Once we have formed a belief, we close ourselves off to thoroughly re-evaluating it when we encounter conflicting information.

This is part of the reason that social media can be so dangerous in spreading misinformation. Social media algorithms want to keep us engaged, so they deliver information that we’ve engaged with in the past, particularly information we like or agree with. This effectively forms networks of people who share information that reinforces each other’s beliefs (i.e. echo chambers).

Conformity Effect: Groups Influence Our Judgements

We like to believe that social influence doesn’t control our decisions and beliefs, but many studies have revealed that social groups actually exert a strong influence on our behavior.

In “Being Wrong,” Kathryn Schulz quotes Cornell psychologist Thomas Gilovich, who said “other things being equal, the greater the number of people who believe something, the more likely it is to be true.” Treating consensus as evidence usually serves us well, but it fails badly when the group itself is wrong.

This effect can lead to false beliefs in two different ways. In the 1950s, psychologist Solomon Asch conducted a line-judgment experiment in which a subject was tested alongside a group of actors who had been instructed to select obviously wrong answers. Three-quarters of the test subjects conformed to the group and gave at least one wrong answer, and one in four subjects gave the wrong answer more than half the time.

Alternatively, if an individual has a vested interest in upholding their beliefs (say, to save political face), the individual will cling to false beliefs in the face of all evidence to the contrary. Social neuroscientist Matthew Lieberman has found that social rejection and the loss of status actually activate the same parts of the brain that process physical pain (see his book “Social”).

This effect is particularly strong on social media, where sources and references are often stripped out as information is shared among friends. If we view these friends as part of our “tribe” of likeminded individuals and the information appears plausible, then we tend to accept what they have shared at face value without conducting further independent research.

This topic is explored by Sarah Cavanagh in “Hivemind” and Siva Vaidhyanathan in “Antisocial Media.” The trust that we often allot to our extended social network explains how provocative but misleading information can spread like wildfire over social media networks.

Just consider the last time you were presented with a provocative social media post with no source information. Don’t 25,000 likes seem to give it inherent authority?


Confabulation: We Make Up Explanations on the Spot

If you’ve ever watched a Trump press conference (I avoid them), you can almost see the gears turning in his head when he gets a difficult question. To be clear, I’m not talking about when he tries to wriggle his way out of partisan gotcha questions, but rather a scientific or vaccine development question where it is highly unlikely that he would know the answer.

“An acquaintance once confessed to me that when his spouse contradicts a theory he’s just hatched, he begins spontaneously generating ‘facts’ to support it — even when he realizes that she is right, and he is wrong,” writes Schulz.

Part of my job was to sniff out the BS of corporate executives. Trump’s vague, wishy-washy responses to scientific questions indicate that he is really “guessing” or “ballparking” rather than relying on actual scientific information. And I can tell you that his answers, most certainly, have nothing to do with a “natural ability” passed on from an uncle who taught at MIT.

“People who work with clinical confabulators report that the most striking thing about them isn’t the strangeness of their erroneous beliefs, not even the weirdness of the confabulations they generate to cover them, but rather the fact that these confabulations are uttered as if they were God’s word,” writes Schulz.

Overconfidence: What You See Is All There Is

Kahneman discovered that we often make decisions based only on the evidence immediately before us, which can be a dangerous mistake when a virus is spreading overseas. He called this the “What You See Is All There Is” rule (WYSIATI).

“As the WYSIATI rule implies, neither the quantity nor the quality of the evidence counts for much in subjective confidence,” writes Kahneman. “The confidence that individuals have about their beliefs depends mostly on the quality of the story they can tell about what they see, even if they see little. We often fail to allow for the possibility that evidence that should be critical in our judgement is missing.”

This is likely a significant reason why the United States did not implement more safeguards or begin to make significant domestic policy changes until after the first U.S. death occurred on February 29. In my mind, this New York Times timeline of the coronavirus pandemic makes it clear (alongside overconfident statements made at the time) that the Trump administration did not consider the threat to be real until it was already spreading actively on our own soil. Sometimes “what you see is all there is.”


Availability Bias: Do Examples Come to Mind Easily?

It could be argued that this particular bias is responsible for much of the quarantine protesting in various parts of the country.

In Kahneman’s words, availability bias “substitutes one question for another: you wish to estimate the size of a category or the frequency of an event, but you report an impression of the ease with which instances come to mind.”

In the case of coronavirus in America, many parts of the country have seen few, if any, cases. Compare their experience to that of someone in Manhattan who lives right next to a hospital with a refrigerated trailer holding dead bodies because the city’s morgues are full. The risks of coronavirus are clearly much more “available” for recall in that person’s mind.

In addition, Americans in other parts of the country were misled by leaders and pundits who repeatedly claimed that coronavirus is just like the flu. This is known as the “anchoring effect”: assessments of coronavirus risk become anchored to something familiar, the seasonal flu, which occurs every year and is readily “available” in their minds. Given that the majority of these citizens have never lived through a pandemic before, the flu became their only point of reference.

While this was just a linguistic trick by Trump to downplay the severity of the pandemic and reduce its political and reputational impact on his upcoming presidential campaign, words spoken by leaders in public office do matter. These words have significant public ramifications. The loose and often conflicting language that has been used to describe the coronavirus pandemic not only led to unnecessary Covid exposure across the country but also to cases of people ingesting bleach and other disinfectants.

There are myriad examples of how political leaders leverage cognitive bias to manipulate the views of the masses. Framing is one of the most common, wherein politicians put facts or occurrences within a “convenient” context that happens to slant any judgement in their favor (e.g. 100K coronavirus deaths can “appear to be a positive outcome” alongside the possible death of millions). They ensure that their messages resonate by using bold color contrast (such as red hats with white lettering), simple words in verbally appealing constructions (“bend the curve”) and association with strong emotional concepts (“great” or “winning”). The reality is that many public officials around the world have made quite a number of questionable claims.


How Do We Overcome Our Poor Judgement?

Now that we understand the power of cognitive biases to mislead our beliefs and decisions, what can we do to reduce their influence and prevent them from having a negative impact on our health and well-being?

Here are a few recommendations to keep in mind:

Seek out different opinions and engage devil’s advocates: Actively seeking out a diversity of opinions will help us break out of the echo chambers that only serve to strengthen our confirmation bias.

Different points of view are critical to make well-rounded decisions. Respectful debate with devil’s advocates can enable you to view your beliefs from additional perspectives. While nuanced debate does not typically occur on social media, you can always take conversations offline. Devil’s advocates will also keep you in check if you start down the path of confabulation.

Hear others out: Especially on politically charged topics, which coronavirus has become, we can be so certain of our point of view that we automatically disregard others or lash out emotionally when they disagree with our perspectives. Even if you disagree, turn down the volume and emotion by focusing on your similarities and identifying common ground.

Don’t point fingers in anger: Fear of being wrong and hits to social status are more powerful than you might think. If you have read Jonathan Haidt’s book “The Righteous Mind,” you know that when finger-pointing, belittling and self-righteousness come out, everyone stops listening. The conversation devolves into attack-and-defend mode. If you want your opinion to actually be heard and potentially change another person’s mind, you need to listen respectfully to their opinion and make an attempt to see their POV.

Always check sources: This is a no-brainer, but, in today’s infodemic, one can never be too careful. Are scientific claims backed by valid credentials? Are statements or conclusions underpinned by evidence-based research findings? If you’re skeptical of something, does it really take that much time to click on the next Google search result?

Question the source: Do they have any skin in the game? Do they have political interests to protect and therefore might be framing information to slant your judgement of the evidence? When you’re listening to a political pundit, don’t ever take their word at face value. Do your own homework, go to the “original source” and treat your beliefs as active decisions that are made only after a careful evaluation of the facts.

Hit the brakes: We are all moving a million miles an hour. Be mindful of when you are moving a little too fast. When your decisions become reflexive, the odds are that you’re running on autopilot. Before you forward information on to others, carefully consider the impact that it might have (will they know it’s a joke? Are you making your intentions clear?). Think of yourself as a politician: would you want to be on the front page of the New York Times tomorrow if what you shared turns out to be false?

Advocate for punishments against fake news: Politically, the best thing that lawmakers can do is to enact a policy that punishes the intentional distribution of false information to a broad public audience (at a minimum, punish major news organizations). Germany recently passed a similar law governing social media and hate speech.

And one last warning:

“It is not necessary for propagandists to produce fraudulent results to influence belief. Instead, by exerting influence on how legitimate, independent scientific results are shared with the public, the would-be propagandist can substantially affect the public’s beliefs about scientific facts,” conclude Cailin O’Connor and James Weatherall in “The Misinformation Age.”

###

Written by Russ W
Recovering agency survivor. Writer. Creative. Empath. Human. Hit me up: russellweigandt@gmail.com