Why is it so tricky to change people’s minds?

Tal Cherni
behaviouralarchives
6 min read · Apr 20, 2021

The last few years have seen growing polarization and radicalization in the political sphere, and a rise in conspiracy theories. Be it conspiracies about climate change, the origin of COVID-19, the evil plan behind 5G, the re-emergence of Flat Earthers or the followers of QAnon, more and more people adhere to beliefs and ideas that would have seemed silly only a few years ago.

This rise in what used to be fringe beliefs may have something to do with the observation that belief in conspiracy theories grows during times of crisis. We are indeed in the midst of a global pandemic, but COVID-19 is probably not the only reason. More compelling is the notion that we live in the information era, where almost everyone is connected to the internet and exposed to a vast range of ideas from around the globe. As information becomes more accessible, people find it easier to seek out forums and groups that share their belief system ('echo chambers') than to update their views according to scientific facts. Combine this with the fact that the results one receives on Google are tailored to one's customized profile, and it is little wonder that people don't bother looking for the facts.

Some might think that if only those who follow such conspiracies learned the facts and listened to reason and data, they would come to believe the truth. Yet, as research has taught us, providing facts and data doesn't usually change people's minds. Most people either favour ideas that confirm what they already believe, known as 'confirmation bias', or hold more firmly to their beliefs if they sense that others are trying to change them, known as the 'boomerang effect' (1, 2, 3).

Don’t try to prove the other person wrong, as they might get defensive and entrenched in their beliefs.

Interestingly, even those considered more analytical tend to succumb to such pitfalls. A 2017 study showed that the more analytical and intelligent people are, the better they are at rationalizing and interpreting information however they wish. Ironically, some people might 'take advantage' of their intelligence by looking for flaws and holes in arguments that don't fit their worldview.

It is important to note that, in general, it is not easy to change people's minds once they are set. One reason might be that for some people, belief is intertwined with their whole being and sense of purpose. For such people, changing their opinions means not merely changing their diet or relocating to a different city, but radically changing who they are and throwing away everything they have worked for in life.

In his famous work on 'cognitive dissonance', Leon Festinger described cult members who were faced with the fact that their prophecy had failed to manifest itself. Upon discovering that a divine flood would not consume the earth as they had believed, some moderate group members left the cult, while the most committed believers chose to rationalize why the flood had not appeared, stuck together, supported each other and shared new ideas about how the revelation might yet come true.

In a sense, the self-assuring behaviour among the cult members Festinger described could be thought of as an analogy for how social bubbles and echo chambers work in certain forums. Such theories thrive because participants in these forums only listen to like-minded people. Additionally, the sheer ubiquity of available information allows such groups to survive: for almost every argument there is a counter-argument, supported by a different study that suits the group's belief system.

Speaking of social bubbles, another reason for sticking with one's beliefs has to do with group identity and norms. One might argue that holding on to imperatives considered important by one's social group allows for better social support and a functioning social life. Changing one's mind about the core beliefs that define the group could prove maladaptive. For example, when a member of a group violates a moral imperative or norm, the feedback from group members can be negative, ranging from blame to some form of punishment.

Finally, it is also possible that persistently holding on to one's beliefs is simply a by-product of how our brain weighs new information against what we already know about the world. Often, when we encounter new information that doesn't fit our prior beliefs, it is the new information that is wrong. If a person told you he saw a flying cat, it would be reasonable to conclude that the person is lying or not trustworthy. We develop our beliefs through time and experience, and it is sensible not to overturn them every time we encounter contradictory information. Unfortunately, this also makes it difficult to correct inaccurate beliefs.

The previous notion is supported by a mathematical argument, if we accept that our minds work somewhat like a Bayesian machine (1, 2, 3). For those not familiar with Bayes' theorem, it is an equation that allows us to calculate the probability of an event based on prior knowledge of conditions that might be related to that event.
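In its simplest form, the theorem can be written as:

P(H | E) = P(E | H) × P(H) / P(E)

where H is a hypothesis (a belief), E is a new piece of evidence, P(H) is the prior (how strongly you believe H before seeing E), P(E | H) is how likely the evidence would be if H were true, and P(H | E) is the posterior, your updated belief after seeing the evidence.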

One cool aspect of this formula is that if you set the prior probability of your belief, P(H), to zero (i.e., 'no chance it would ever happen') or one (i.e., 'I am one hundred percent sure it's true'), then no matter what new evidence you feed into the equation, the result stays the same. In a sense, this implies that arguing with stubborn people is a little like talking to a wall. However, it is fair to point out that other scholars think our minds are probably not completely Bayesian, and that people can indeed change their views, no matter how radical, with the right persuasion.
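To see this lock-in effect concretely, here is a minimal sketch of a single Bayesian update for a binary hypothesis. The function and the numbers are purely illustrative, not taken from any of the studies cited above:

```python
# A single Bayesian belief update for a binary hypothesis H
# ("the claim is true") given one piece of evidence E.

def bayes_update(prior: float, p_evidence_given_h: float,
                 p_evidence_given_not_h: float) -> float:
    """Return the posterior P(H | E) from the prior P(H) and the two likelihoods."""
    # Total probability of seeing the evidence at all: P(E)
    p_evidence = (p_evidence_given_h * prior
                  + p_evidence_given_not_h * (1 - prior))
    # Bayes' theorem: P(H | E) = P(E | H) * P(H) / P(E)
    return p_evidence_given_h * prior / p_evidence

# A moderately skeptical prior moves when strong evidence arrives...
print(bayes_update(prior=0.3, p_evidence_given_h=0.9,
                   p_evidence_given_not_h=0.1))   # ~0.79

# ...but a prior of exactly 0 or 1 never moves, whatever the evidence.
print(bayes_update(prior=0.0, p_evidence_given_h=0.9,
                   p_evidence_given_not_h=0.1))   # 0.0
print(bayes_update(prior=1.0, p_evidence_given_h=0.9,
                   p_evidence_given_not_h=0.1))   # 1.0
```

Because the evidence is always multiplied by the prior, a prior of exactly 0 or 1 is a fixed point of the update: no likelihoods, however strong, can pull the belief off the extreme.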

So, what can we do about it?

A good place to start is building trust and empathizing with the other side. The more you try to see the world through their eyes, listen to what they have to say and avoid being judgmental or harsh about it, the greater the chance they will trust you and listen back. Earning a listener's trust helps you form a better connection with them and gives your words more weight than somebody else's. No matter how impressive your CV is, your audience has to trust you before they will listen to what you have to say.

Additionally, it is important to try to find common ground. You don't necessarily have to prove the other person wrong, as they might get defensive and block you out. A better way is to find something you both agree on and start from there. In a 2015 study on ways to persuade vaccine-hesitant parents to vaccinate their children, the authors found that the most effective approach was not to highlight the parents' misinformation about vaccines, but to emphasize something both doctors and parents agree on: the importance of keeping children safe. So it's not about erasing the old way of thinking, but rather adding a new way of thinking about the situation.

In conclusion, when trying to influence others, being right is not always enough. Being empathetic, being trustworthy and looking for common ground are the best ways to get people to listen to you. In general, it is beneficial to keep an open mind. We don't always have all the answers, and the more we accept that, the better we will get at listening to other people, and perhaps, in return, they will listen back.
