Image: Lady Liberty of Fiery Light, created by E. Rosalie | Novel Science

A Study in Humans

How Bias Affects Us Even After We Believe Facts We Dislike and What It Means for Misinformation

We know people often reject ideas that conflict with their worldview, but what happens when they accept them? Do biases still affect us then?

E. Rosalie
Jul 6, 2020


Countless studies have shown that people often reject ideas that conflict with their existing beliefs. How bias affects us once we accept unfavorable information, however, is less clear.

Anti-misinformation campaigns on social media mark stories with “false” or “partly false” alerts on platforms like Twitter and Facebook via third-party fact-checkers. These efforts aim to address our tendency to accept or reject information based on what we already believe.

But what happens after we overcome the ornate mental defenses that resist information we dislike? Do biases affect people once they accept information that conflicts with their worldview, and if so, how?

Dr. Pierce Ekstrom, a psychologist and political scientist, and Dr. Calvin Lai, an expert in psychological and brain sciences, examined the question of what happens after someone accepts information.

The scientific duo explored three critical questions:

Are people more likely to share information that aligns with their political beliefs?

Do people selectively share information that supports their existing beliefs, even when they report believing unfavorable information to be accurate?

How do biases affect what people share when conversing with others who agree versus disagree with their stance?

People in the study read non-partisan briefings on controversial topics like gun control and raising the minimum wage. Ekstrom explained some of what his study found.

“Getting a few individual voters to believe accurate information isn’t enough to ensure everyone has the facts. Voters themselves seem willing to act as gatekeepers, holding back information that clashes with their opinions.”

Participants selectively shared information that supported their views across all topics.

How readily people shared depended on the topic and on whether their conversation partner agreed or disagreed with them. Gun rights provoked the most biased sharing.

People who accepted unpleasant facts still preferentially shared the details and information that affirmed their opinions.

The idea that the misinformation crisis stems solely from people rejecting accurate information neglects other potential drivers of the misinformation epidemic, or “infodemic.”

The causes matter because the crisis will continue if we address only one of its drivers. Even when participants marked ideas as “definitely true,” that did not mean they shared the information with others.

Ekstrom wants people to understand “they might hold back important political information from others.”

Discussions with those who agree versus those who disagree with us generated some unexpected results for Ekstrom and Lai. Liberals shared information with more bias when they conversed with conservatives; among other liberals, selective sharing was less pronounced.

The reverse was true of conservatives, who showed the least preferential sharing with people who held opposing beliefs and more bias with like-minded peers. Other studies have reported the opposite pattern, so we need larger, random samples not drawn from college students or volunteers recruited from other projects.

Ekstrom and Lai noted that popular culture holds that conservatives are the more ideologically rigid, but this result contradicts that view. Still, we should not take the finding as a definitive answer.

One cannot draw firm conclusions from a single study, but it can inform the direction of future research.

“Selective communication may distort the facts available to citizens, but citizens’ opinions are made of more than facts.”

Ekstrom and Lai

A deeper view might trace the differences to the distinct reasons liberals and conservatives discuss politics.

For example, conservatives might hope to persuade others, and liberals might wish to further policies. One study found that people reduced their preferential sharing when they felt understood, which may mean we share selectively to be understood.

Unexpectedly, people shared suspect information less readily. That could mean people do not preferentially share “fake news” when they successfully identify stories as accurate or false.

A 2019 study showed everyday citizens fared well at identifying flagrantly false information, termed “fake news.” Information that seemed suspicious was shared less often, an encouraging finding.

A less comforting discovery:

Participants withheld facts that might have helped others recognize misinformation. We also share information that misrepresents or omits relevant details, so long as a story has some reality-based content and affirms our opinions.

Thinking big-picture, positive change requires individuals as much as politicians to share information ethically. People must recognize their contribution to polarizing our country, which happens when they share skewed stories and selectively pass along information that serves their opinions.

“To the extent that people selectively communicate ideology-consistent information to their political opponents, they run the risk of… exacerbating the mistrust that partisans already feel toward their opponents (Levendusky, 2013).”

- Ekstrom and Lai

These behaviors create an impression of a reality that does not exist. Many partisan media personalities discuss their opposition in ways that do not reflect reality; they refute and argue with a caricature.

Politicians must wield the power of political consensus (and the potential damage that comes from its absence) with the responsibility and maturity their positions demand. Voters must hold politicians who deliver inflammatory and biased statements to account, especially when the bias favors their own shared opinions.

People readily dismiss critique from opponents, meaning those who share our beliefs might be the only people who can hold us to account.

If we allow like-minded people to mischaracterize issues, we become complicit in creating a more polarized tomorrow.

We can learn from the past and change our behavior moving forward. That requires us to own our role in the problem.

Each side imagines itself the “good guy” and the other the “bad guy,” but this is not a story, and you are not a fictional character. If we want a functional society, we won’t get there by pointing fingers at whoever we perceive to be the problem.

Everyone must indict the only person anyone controls: you.



E. Rosalie

Written by

Public health biologist studying at Johns Hopkins | Science writer & artist | Views reflect me alone | Subscribe @ Novel-Science.com

Novel Science

A novel science publication where we strive for a candid perspective, scientific integrity, and writing excellence.
