The Social Psychology of the Backfire Effect: Locking Up the Gears of Your Mind

Angi English · Published in Homeland Security · Apr 30, 2016 · 11 min read

A co-worker and I were sharing stories about going back home and seeing high school friends. We both remarked how awkward it is to visit with people who have not moved emotionally or socially past high school, and we traded thoughts on how to endure those encounters. It is almost as if they are stuck in a time warp, still holding on to old belief systems that no longer serve them well. It turns out there is some social psychology behind this, and it permeates the culture we now live in.

Growing and changing our beliefs over time, honed by life experience, success, and failure, gives each of us a unique perspective. Yet when we rejoin our social groups, there is quite a bit of discomfort if our beliefs conflict with those of other members. It is so uncomfortable that many of us will go to extraordinary lengths to disguise, deny, or temporarily change our beliefs so that we fit into the in-group.

One of my favorite bloggers, Maria Popova at brainpickings.org, spoke to this awkwardness: “It’s a conundrum most of us grapple with — on the one hand, the awareness that personal growth means transcending our smaller selves as we reach for a more dimensional, intelligent, and enlightened understanding of the world, and on the other hand, the excruciating growing pains of evolving or completely abandoning our former, more inferior beliefs as we integrate new knowledge and insight into our comprehension of how life works.”

Growing Pains

Nonetheless, the discomfort of changing and of entertaining new ideas that challenge old ones is necessary for human emotional and social growth. Many folks opt to stay stuck, simply deciding not to incorporate new information and insight, which can lead to a state of self-delusion and dangerous self-righteousness.

This human tendency and cognitive bias is called the backfire effect. Most of us think we are rational beings making rational decisions with all the necessary information to do so. However, David McRaney, in his book You Are Now Less Dumb: How to Conquer Mob Mentality, How to Buy Happiness, and All the Other Ways to Outsmart Yourself, explains the backfire effect simply: “[w]hen your deepest convictions are challenged by contradictory evidence, your beliefs get stronger.”

This helps explain why, in our current polarized social and political environment, facts and scientific evidence seem not to matter: we want to hold firmly to our beliefs even if they are erroneous. It is also one reason people attack the media for putting out new information.

The Backfire Effect: Geese that Grow on Trees

Craig Silverman, editor of RegretTheError.com, wrote in a recent column that “the backfire effect makes it difficult for the press (media) to effectively debunk misinformation. We present facts and evidence, and it often does nothing to change people’s minds. In fact, it can make people dig in even more.” Humans also engage in motivated reasoning, a tendency to let emotions “set us on a course of thinking that’s highly biased, especially on topics we care a great deal about.”

This motivated reasoning, in which emotions reign, is part and parcel of what McRaney writes about this cognitive bias:

“Once something is added to your collection of beliefs, you protect it from harm. You do this instinctively and unconsciously when confronted with attitude-inconsistent information. Just as confirmation bias shields you when you actively seek information, the backfire effect defends you when the information seeks you, when it blindsides you. Coming or going, you stick to your beliefs instead of questioning them. When someone tries to correct you, tries to dilute your misconceptions, it backfires and strengthens those misconceptions instead. Over time, the backfire effect makes you less skeptical of those things that allow you to continue seeing your beliefs and attitudes as true and proper.”

Implications for Homeland Security and Risk Perception

A research study noted by McRaney puts this in perspective when it comes to the ramifications for homeland security and risk perception.

“In 2006, Brendan Nyhan and Jason Reifler at The University of Michigan and Georgia State University created fake newspaper articles about polarizing political issues. For instance, one article suggested the United States found weapons of mass destruction in Iraq. The next said the U.S. never found them, which was the truth. Those opposed to the war or who had strong liberal leanings tended to disagree with the original article and accept the second. Those who supported the war and leaned more toward the conservative camp tended to agree with the first article and strongly disagree with the second. These reactions shouldn’t surprise you. What should give you pause though is how conservatives felt about the correction. After reading that there were no WMDs, they reported being even more certain than before there actually were WMDs and their original beliefs were correct.

They repeated the experiment with other wedge issues like stem cell research and tax reform, and once again, they found corrections tended to increase the strength of the participants’ misconceptions if those corrections contradicted their ideologies. People on opposing sides of the political spectrum read the same articles and then the same corrections, and when new evidence was interpreted as threatening to their beliefs, they doubled down. The corrections backfired.”

Brain Biases

It turns out that the feeling of “knowing what we know” is itself a brain bias: feeling correct or certain about something has neurological roots. A study, “The Neural Basis of Loss Aversion in Decision-Making Under Risk,” shows that when people hold an opinion differing from others in a group, their brains produce an error signal: “A zone of the brain popularly called the ‘oops area’ becomes extra active.” According to Evatt, “[i]f science can shame us into questioning the nature of conviction, we might develop some degree of tolerance and an increased willingness to consider alternative ideas — from opposing religious or scientific views to contrary opinions at the dinner table.”

Dr. Gregory Berns, professor of psychiatry and behavioral sciences at Emory University School of Medicine in Atlanta, Georgia, writes in his book “Iconoclast: A Neuroscientist Reveals How to Think Differently” that this cognitive bias is rooted in fear and anxiety about being ostracized by one’s in-group. The mantra, then, is this: to act more rationally, I must not get swept up in the group’s frenzies of acting, thinking, and feeling, even when the group’s view feels like the truth.

Our brains like to be happy, at least according to David DiSalvo, author of “What Makes Your Brain Happy and Why You Should Do the Opposite.” DiSalvo writes, “[i]f there’s anything that cognitive psychology studies have made clear over the years, it’s that humans can be exceptionally gullible. With a little push, we’re prone to developing false beliefs not only about others but also about ourselves with equal prowess — and the results can be, well, hard to believe. And at the core of many of these false beliefs live false memories.”

The Science of Fear and Exclusion

As you can begin to surmise, the backfire effect is a huge reason why we cannot get anything done in Congress, and why state governments and legislatures deal with the “unintended consequences” of legislation. We have gone to our collective corners and become so locked into our belief systems that we are not willing to budge. The backfire effect plays a leading role in marginalizing any group different from our own. Daniel Gardner, in his book The Science of Fear: How the Culture of Fear Manipulates Your Brain, states, “[w]hen like-minded people get together and talk, their existing views tend to become more extreme. In part, this strange human foible stems from our tendency to judge ourselves by comparison with others. When we get together in a group of like-minded people, what we share is an opinion that we all believe to be correct and so we compare ourselves with others in the group by asking ‘How correct am I?’ Inevitably, most people in the group will discover that they do not hold the most extreme opinion, which suggests they are less correct than others. And so they become more extreme. Psychologists confirmed this theory when they put people in groups and had them state their views without providing reasons why — and polarization still followed.”

So, just as confirmation bias gives us cover to seek out only information that fits our belief system, the backfire effect gives us cover when information blindsides us. Either way, people stick to their beliefs instead of critically analyzing the facts and questioning them. When provided with factual information that challenges one’s belief system, the correction typically backfires and makes one even more suspicious of the information.

The most profound example of this is noted in McRaney’s book, You Are Now Less Dumb: How to Conquer Mob Mentality, How to Buy Happiness, and All the Other Ways to Outsmart Yourself.

He tells the following story:

“In 1976, when Ronald Reagan was running for president of the United States, he often told a story about a Chicago woman who was scamming the welfare system to earn her income. Reagan said the woman had eighty names, thirty addresses, and twelve Social Security cards she used to get food stamps, along with more than her fair share of money from Medicaid and other welfare entitlements. He said she drove a Cadillac, didn’t work, and didn’t pay taxes. He talked about this woman, whom he never named, in just about every small town he visited, and it tended to infuriate his audiences. The story solidified the term welfare queen in American political discourse and influenced not only the national conversation for the next thirty years, but public policy as well. It also wasn’t true. Sure, there have always been people who scam the government, but historians say no one who fit Reagan’s description ever existed. The woman most historians believe Reagan’s anecdote was based on was a con artist with four aliases who moved from place to place wearing disguises, not some stay-at-home mom surrounded by mewling children. Despite the debunking and the passage of time, the story is still alive. The imaginary lady who Scrooge McDives into a vault of food stamps between naps while hardworking Americans struggle still appears every day on the Internet. The mimetic staying power of the narrative is impressive, and stories like this often provide one of the main foundations for the backfire effect. Psychologists call them narrative scripts, stories that tell you what you want to hear, stories that confirm your beliefs and give you permission to continue feeling as you already do. If believing in welfare queens protects your ideology, you accept it and move on. You might find Reagan’s anecdote repugnant or risible, but you’ve accepted without question a similar anecdote about pharmaceutical companies blocking research, or unwarranted police searches, or the health benefits of chocolate.”

The backfire effect helps explain why so many people deny global warming, why the anti-vaccination movement persists, and why beliefs endure that immigrants are rapists, that homosexuals are mentally ill and dangerous, that torture works, and that all Muslims are terrorists, just to name a few. These are errors in our metacognition, our thinking about thinking.

Metacognition: Thinking about Thinking

Metacognition is the term for “thinking about thinking.” Everything I have discussed so far has to do with the lack of deliberate effort to think about what we think. When metacognition is working, people are able to take a step back, examine their erroneous beliefs in light of new facts and scientific evidence, and weigh the best decision; however, this is usually not the case. It is easier to make snap judgments based on your socially constructed belief system. That is part of what Nobel Prize winner Daniel Kahneman calls System 1 thinking. In our current technological and social media culture, one phenomenon in particular serves to facilitate and strengthen the backfire effect: “filter bubbles.”

Filter Bubbles

The backfire effect is ripe for dangerous unintended consequences unless we, as decision makers, and especially those of us in positions of power and authority, challenge our long-held belief systems and cultural ideology. As a culture, we get trapped in filter bubbles.

A filter bubble is the term for a personalized flow of information in which a website algorithm selectively guesses what a user would like to see based on information about that user. Facebook and Twitter create filter bubbles, and other social media platforms do too.
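To make the mechanism concrete, here is a minimal, hypothetical sketch in Python of how such a personalization algorithm might rank a feed. The post data, topic labels, and the rank_feed function are invented for illustration; this is the general idea, not any platform's actual code.

```python
# Hypothetical illustration only: a toy feed-ranking function, not any platform's real algorithm.
# It scores each post by how much its topics overlap with topics the user has engaged with
# before, so the "personalized" feed keeps surfacing content the user already agrees with.

def rank_feed(posts, engagement_history):
    """Return posts ordered by similarity to the user's past engagement."""
    # Count how often the user has engaged with each topic.
    topic_weights = {}
    for topic in engagement_history:
        topic_weights[topic] = topic_weights.get(topic, 0) + 1

    def score(post):
        # A post scores higher the more of its topics the user already favors.
        return sum(topic_weights.get(topic, 0) for topic in post["topics"])

    return sorted(posts, key=score, reverse=True)


# Invented example data: three posts tagged with topics, and a user whose history
# leans heavily toward two of them.
posts = [
    {"id": 1, "topics": ["gun control", "crime"]},
    {"id": 2, "topics": ["climate", "science"]},
    {"id": 3, "topics": ["crime", "immigration"]},
]
history = ["crime", "crime", "immigration"]

# Posts that echo the user's prior interests float to the top; unfamiliar or
# dissenting topics sink, and existing beliefs go unchallenged.
print([p["id"] for p in rank_feed(posts, history)])  # -> [3, 1, 2]
```

The pattern shown here, scoring new content by its similarity to what a user has already engaged with, captures the feedback loop Pariser and McRaney describe: over time, the feed narrows toward material the user already agrees with.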

In digital communities, one can choose to communicate only with people who support one’s personal belief systems, purchase only products that support one’s worldview, or join groups that already confirm one’s biases.

The Internet has unleashed our freedom to express ourselves. McRaney puts a sharp point on this: “[a]s social media and advertising progress, confirmation bias and the backfire effect will become more and more difficult to overcome. You will have more opportunities to pick and choose the kind of information that gets into your head, along with the kinds of outlets you trust to give you that information. Services such as Facebook already employ algorithms that edit the posts from your contacts so that what you see from moment to moment inside their walled gardens is most likely to be something with which you agree. In addition, advertisers will continue to adapt, not only generating ads based on what they know about you, but also creating advertising strategies on the fly based on what has and has not worked on you so far. The media of the future may be delivered based on not only your preferences, but also how you vote, where you grew up, your mood, the time of day or year — every element of you that can be quantified. In a world where everything comes to you on demand, your beliefs may never be challenged.”

Eli Pariser argues powerfully in his TED.com talk that filter bubbles feed the backfire effect, which will ultimately prove to be bad for us and bad for democracy.

Conclusion

From a homeland security perspective, awareness of what the backfire effect is and how it manifests in one’s thinking is paramount. The human brain is very emotional; we would rather feel more and think less. The evolutionary wiring of the brain is such that the emotional brain overpowers the cognitive brain almost every time, and it takes conscious effort to understand this so as not to fall victim to the backfire effect. Citizens look to security and risk professionals to make decisions on their behalf that keep them safe. We have an obligation and a responsibility to challenge long-held beliefs susceptible to the backfire effect, beliefs that could otherwise lead to catastrophic and deadly results.

Angi English has a Master’s in Security Studies from the Naval Postgraduate School’s Center for Homeland Defense and Security and a Master’s in Educational Psychology from Baylor University. English is a Licensed Professional Counselor, Licensed Marriage and Family Therapist and a Certified Part 107 Drone Pilot. She currently is the Chief of Staff at the New Mexico Department of Homeland Security and Emergency Management and lives in Santa Fe, New Mexico.
