Facts Don’t Change Our Fictions

With so much scientific evidence so readily available, it really is troubling that people still doubt the effects of climate change; still hold on so dearly to astrological signs; and still believe that vaccines cause autism. Why have our governments, social organizations and scientific leaders had so much difficulty changing people's beliefs when the facts are right there for everyone to see?

Recent research in cognitive and social psychology has the answers. It paints a clear, albeit deeply concerning, picture of the nature of belief. It demonstrates how perfectly reasonable people can come to believe completely different things even when exposed to the same set of information.

Let's say, for example, that you hold the initial belief that women are bad drivers. Every time you see another woman driving badly, a little voice of approval sounds in your head: 'you see, women are bad drivers', 'of course it's a woman driving'. Every incident you're exposed to reaffirms your belief, further solidifying the stereotype in your mind.

Some of you might be thinking, 'Sure, but so what?' Surely seeing women driving badly is evidence of the fact that women are bad drivers. No problem here, right?

And you would be right, assuming, of course, that as human beings we pay equal and adequate attention to all of the relevant evidence presented to us and analytically integrate that evidence when forming our beliefs.

Herein lies the problem: that is not how the human mind works.

Confirmation Bias:

The first issue arises at the level of attention. Even before encoding and memorization, our beliefs are already being skewed by what information we do and don’t attend to. Once we have formed an initial belief on a particular topic, we have a hardwired tendency to seek out and be more receptive to information that confirms the belief. We, equally, tend to ignore and devalue any information that runs counter to it.

In the case of women being bad drivers, we tend to pay more attention to the recklessness and mistakes made by female drivers than to those made — arguably just as often — by their male counterparts, skewing our judgement in a particular direction. This systematic error of the mind is known as confirmation bias.

If we are only selecting information — at both a conscious and subconscious level — that confirms our initial belief that women are bad drivers, then of course we will think that they are, and find ‘proof’ of this everywhere.

If we really wanted to test the belief that women are bad drivers, we would pay attention not only to the mistakes made by female drivers, but also to those made by male drivers. Adding further context would also be necessary, e.g. collecting instances of both women and men driving particularly well. We would, of course, still be subject to noise (random variation), but at least our confirmation bias would have been effectively countered. With enough data (in this case driving events), we should start to see a reliable signal in the noise, as the sketch below illustrates.
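To make that concrete, here is a minimal Python sketch of what an unbiased tally would look like. The event counts and rates are entirely made up for illustration; the point is simply that you need all four cells (women and men, crossed with good and bad driving) before you can compare rates, rather than a one-sided collection of anecdotes.

```python
import random

# Hypothetical, made-up rates purely for illustration: both groups
# drive badly at the same underlying rate (5% of observed events).
TRUE_BAD_RATE = {"women": 0.05, "men": 0.05}

def observe_events(n_events: int, seed: int = 42) -> dict:
    """Tally every event into one of four cells, not just the ones
    that confirm a prior belief."""
    random.seed(seed)
    counts = {("women", "bad"): 0, ("women", "good"): 0,
              ("men", "bad"): 0, ("men", "good"): 0}
    for _ in range(n_events):
        driver = random.choice(["women", "men"])
        quality = "bad" if random.random() < TRUE_BAD_RATE[driver] else "good"
        counts[(driver, quality)] += 1
    return counts

def bad_rate(counts: dict, group: str) -> float:
    """Proportion of a group's observed events that were bad driving."""
    bad = counts[(group, "bad")]
    total = bad + counts[(group, "good")]
    return bad / total if total else 0.0

counts = observe_events(10_000)
print("women bad-driving rate:", round(bad_rate(counts, "women"), 3))
print("men   bad-driving rate:", round(bad_rate(counts, "men"), 3))
# With enough events the two estimates converge on the same value. A
# confirmation-biased observer who only logs the ("women", "bad") cell
# has no denominators, and therefore no rates to compare at all.
```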

Of course, in reality, deliberately paying attention to such a wide range of data points is impractical and, given our cognitive limitations, highly unlikely — unless you're some sort of Spock-like being.

Disconfirmation Bias:

Unfortunately, confirmation bias is only the beginning of our worries. Another concerning tendency is its bigger and slightly more destructive brother, disconfirmation bias. Unlike confirmation bias — which leads us to be more open to information that confirms our existing beliefs — disconfirmation bias leads us to spend a disproportionate amount of time and energy challenging, debunking and discrediting information that comes to our attention but runs counter to our beliefs.

Information consistent with our preferred conclusion is examined less critically than information and evidence to the contrary. Consequently, significantly less information is required to confirm our beliefs than to alter them.

Peter Ditto, an aptly-named social psychologist, ran a really neat experiment that helps clarify how these two biases come about. To begin with, participants were told that they were taking part in a medical procedure that would test them for a certain disease.

Each participant was given a strip of paper and asked to place it in their mouth to provide a saliva sample. The participants were then asked to dip the saliva-soaked strip into a glass of transparent liquid. Half of the participants were told that if the paper strip stayed the same colour they were clear of the disease, but that if it changed colour they unfortunately had it. The other half were told the opposite.

In reality there was no disease, no dye and no medical experiment. The participants were simply dipping a strip of ordinary paper into a glass of room temperature tap water. What Ditto and his colleagues were really looking at was the difference in how the two groups would react when faced with these ‘facts’. Would there be any evidence of bias relating to the outcome that suited the participants (not having the disease) as opposed to when it didn’t?

The result was an overwhelming yes! To such a degree, in fact, that it even surprised the researchers.

Participants for whom no colour change meant they were clear of the disease dipped the paper strip into the liquid, waited a few seconds before taking it out and effortlessly, without question, indicated that they were finished with the task.

For the participants for whom no colour change meant that they had the disease, the reaction was completely different. Instead of simply taking the paper strip out and indicating that they were finished, they left the paper in for much longer before taking it out, shaking it, dipping it back into the solution, and questioning the researchers and the validity of the entire experiment.

The simplicity and elegance of this experiment nicely captures the way that people go about processing information that is either consistent with or contrary to what they want to believe.

Reasonable Misinterpretation:

What happens when the evidence is so blatantly obvious that it cannot be questioned, challenged or refuted? When an unstoppable force (of evidence) meets an immovable object (a core belief)?

Ideally, when faced with unquestionable evidence, we would try to engage objectively with this information and adjust our beliefs accordingly. Unfortunately, most often this isn’t what happens. Instead of adjusting our core beliefs, we go to extreme lengths to create a coherent story about the world where both the core belief and the evidence can thrive, pushing us even further away from objective reality and the truth.

This tendency was nicely demonstrated by the psychologist Leon Festinger in his infamous infiltration of a Chicago-based cult called the Seekers.

The Seekers believed that they could communicate with aliens, including one called 'Sananda', the apparent astral incarnation of Jesus Christ, who was going to save them from an earth-ending catastrophe on the 21st of December 1954. Festinger wanted to see how the group would react on the day when a belief they had invested so much in — emotionally, psychologically and socially — was unquestionably refuted.

According to Festinger, the day of the 21st came and went without any apocalypse and the group visibly struggled with integrating this information. After several days, however, the rationalizations started to surface. A ‘new message’ had been received from their Christ-like alien friend. Because of their commitment to the cause, the Seekers had saved the Earth!

When faced with evidence that so clearly disproved their beliefs, the Seekers didn't stop to re-assess them in the face of this new information. In fact, quite the opposite happened. After a short period of cognitive tension, grappling with the hard evidence, they manufactured a coherent story that absorbed the unquestionable evidence and pushed them even deeper into their delusions. Festinger summarized his findings well when he said:

“A man with conviction is a hard man to change. Tell him you disagree and he turns away. Show him the facts or figures and he questions your sources. Appeal to logic and he fails to see your point.”

Can we do anything about these biases?

One of the most popular contemporary strategies for dealing with these tendencies revolves around shifting our thinking away from this fast, intuitive and often overconfident type of mental processing — the type that Daniel Kahneman and colleagues call ‘System 1’ — towards the slower, more deliberate and analytical ‘System 2’. If we can get people to apply System 2 thinking more often, then we can help them to form more accurate beliefs about the world.

Unfortunately, as Dan Kahan and colleagues demonstrated in a recent study, when it comes to entrenched beliefs even the slower, more deliberate method has its flaws.

Kahan wanted to understand why public conflict over societal risks, such as climate change and gun control, persists in the face of such compelling and widely accessible scientific evidence. He explored two alternative explanations. The first, the Science Comprehension Thesis (SCT), identifies people's lack of knowledge and reasoning capacity as the source of the problem. The second, the Identity-protective Cognition Thesis (ICT), argues that our main drives revolve around our social relationships and our own self-identity, and that we therefore tend to prioritize information that keeps these intact.

Kahan asked a sample of participants to interpret a data set relating to whether or not a certain skin cream reduced rashes. In addition, he tested each participant's numeracy: the test he used measures an individual's ability and disposition to make use of quantitative information, something that most mathematicians, scientists and analysts score highly on.

Unsurprisingly, he found that people who scored better on the numeracy test performed well, in that they were able to draw more accurate conclusions from the data. This is exactly what you would expect under the SCT.

Kahan then did something very interesting. For the second variation he kept everything the same, except for one important substitution. Instead of the data points referring to skin cream and rashes, he said that they pertained to the effects of an enforced gun-control ban, a deeply polarizing issue for the two major US political parties. He also noted each participant's political position. With just this one small adjustment, the results became polarized and less accurate than in the initial study. This is exactly what you would expect under the ICT, given our tendency to see data in a way that is consistent with our existing beliefs, our identities and those of our 'in-groups'.
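It is worth spelling out what 'drawing an accurate conclusion' demands in this kind of task. As commonly described, participants saw a two-by-two table of outcomes, and the correct reading requires comparing rates within each condition rather than simply picking out the largest raw number. Here is a minimal Python sketch using made-up counts (not the figures from Kahan's study); it also makes the deeper point that relabeling the rows from 'skin cream' to 'gun-control ban' changes nothing about the arithmetic.

```python
from fractions import Fraction

# Hypothetical counts for illustration only (not Kahan's actual data),
# laid out the way such covariance-detection problems are presented:
# each row is a condition, each column an outcome.
table = {
    "treatment": {"improved": 200, "worsened": 80},  # e.g. used the cream / enacted the ban
    "control":   {"improved": 90,  "worsened": 20},  # e.g. no cream / no ban
}

def improvement_rate(row: dict) -> Fraction:
    """Rate of improvement within one condition; the correct inference
    compares these rates, not the largest single cell."""
    return Fraction(row["improved"], row["improved"] + row["worsened"])

treated = improvement_rate(table["treatment"])
untreated = improvement_rate(table["control"])

print(f"treatment improvement rate: {float(treated):.2f}")    # ~0.71
print(f"control   improvement rate: {float(untreated):.2f}")  # ~0.82
print("treatment helped" if treated > untreated else "treatment did not help")
# With these made-up numbers the control condition actually does better,
# even though the treatment row contains the largest raw count. Swapping
# the row labels from 'skin cream' to 'gun-control ban' changes nothing
# in the arithmetic; only the reader's motivation to accept the answer does.
```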

What was not expected, however, was an additional finding that really shook the science world. Unlike in the first experiment, in the politically charged version individuals with higher levels of numeracy did not reach any more accurate conclusions. Instead, Kahan found even higher levels of polarization among those with higher numeracy scores, and therefore less accurate conclusions being drawn from the results!

What is truly fascinating is that this result makes sense from the perspective of the ICT. It would have predicted that highly numerate individuals use their quantitative reasoning abilities selectively, so that their interpretation of the data remains coherent and consistent with their pre-existing political beliefs. In effect, these highly numerate individuals were using their reason to twist and shape their understanding of the evidence, and they were more successful at doing so than those with lower levels of numeracy.

If you find this slightly depressing, don’t be dismayed. You are not the only one… Many scientists, thinkers and writers have expressed similar sentiments. One article even goes so far as to call it “the most depressing discovery about the brain, ever”. While this may be right, it is also incredibly enlightening progress in our understanding of how the mind works. We now know why the facts just aren’t getting through. Through a clearer understanding of the problems, we can begin to develop strategies for solving them.

In my next article I’ll explore the weird and wonderful approaches to belief adjustment. There’s still hope, it just exists within our social relationships rather than with the facts.

For more visit the Gravity Ideas