Bias, Counterevidence, and Arguing on the Internet

Why people hang on to false beliefs and why simply telling people the truth (usually) doesn’t work.

Polina Stepanova
7 min read · Feb 20, 2019

I wonder how much of the six or so hours the average person in the Western world spends on the internet is taken up by arguing.

I should acknowledge that most of it is (probably) done by trolls and people throwing out ridiculous text snippets under the guise of anonymity. Still, one incident sticks out in my mind. I was having an argument with someone on Reddit (who probably fell into one of the above categories of internet persona), in which they claimed that poverty didn’t exist in the UK. It made me wonder how a viewpoint like that could form and survive under the massive weight of counter-evidence.

I started looking into how beliefs form, and how, in the face of evidence proving them wrong, they can stick even harder. I assumed it would come down to the brain being irrational… but I hadn’t expected it to be quite so bad.

Turns out we knew this 150 years ago.

Meet John Stuart Mill: proponent of utilitarianism, advocate for women’s suffrage, supporter of colonialism (a multi-faceted guy; read more about him here).

He stated the following:

So long as an opinion is strongly rooted in the feelings, it gains rather than loses in stability by having a preponderating weight of argument against it. For if it were accepted as a result of argument, the refutation of the argument might shake the solidity of the conviction; but when it rests solely on feeling, the worse it fares in argumentative contest, the more persuaded its adherents are that their feeling must have some deeper ground, which the arguments do not reach; and while the feeling remains, it is always throwing up fresh intrenchments of argument to repair any breach made in the old.

And there are so many causes tending to make the feelings connected with this subject the most intense and most deeply-rooted of all those which gather round and protect old institutions and customs, that we need not wonder to find them as yet less undermined and loosened than any of the rest by the progress of the great modern spiritual and social transition; nor suppose that the barbarisms to which men cling longest must be less barbarisms than those which they earlier shake off.

Those are some long sentences, John. But they reveal something as pertinent today as it was 150 years ago. I may be able to convince you that Thomas Edison didn’t invent the lightbulb (which he didn’t), or that a peanut isn’t actually a nut (seriously). But if we were to broach a debate with emotional significance, such as the causes of a traumatic event, or topics around religion or politics, this effect becomes clearly visible.

Plenty of research has since been conducted in this space. David McRaney, for instance, calls this the “Backfire Effect” and describes multiple studies which show it in action. Another brilliant article right here on Medium describes a 2016 study by Jonas Kaplan that sheds more light on why this happens, not only how.

When your most deeply held beliefs are challenged, “many of the most biologically basic brain systems, those responsible for protecting us, kick into high gear,” Kaplan says. “These are things like the amygdala, which tells you when to be afraid, and the insula, the part of your brain that processes visceral feelings from the gut and tells you things like if you’re encountering food that’s bad for you. We have a strong motivation to defend those sacred values.”

So much of our identity is social, and so many of our social connections are founded on shared beliefs. Ultimately, Kaplan says, most people find it simpler to maintain both their established beliefs and their social circle than to consider a drastic value shift, for reasons that are as practical as they are mental.

So these views are closely tied to our emotions and, moreover, to the people around us. Our beliefs often shape how we perceive ourselves and others, and how we function within our social circle. It makes a lot of sense, then, why changing a “core” belief would be a personality shift of tectonic proportions, and why the brain would want to protect itself against it.

Though we have access to more information than ever before, ending up in an online bubble is also frighteningly easy.

Social media and even news feeds these days are tailor-made experiences. With the click of a button we weed out content we dislike; to keep us consuming, we are served more of what we do like. These algorithms can be useful, but time and time again the recommendation engine rears its uglier head.
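To make that feedback loop concrete, here’s a minimal, purely illustrative sketch in Python. The topic names, the click probabilities and the “recommend more of whatever was clicked” rule are my own assumptions rather than how any real platform works, but even this toy version narrows the feed towards a single topic within a simulated month.

```python
import random
from collections import Counter

# Toy topics and a made-up "favourite" preference; these are illustrative
# assumptions, not data about any real platform.
TOPICS = ["politics", "cooking", "gaming", "science", "music"]

def recommend(click_history, n_items=10):
    """Recommend topics in proportion to how often each was clicked before."""
    counts = Counter(click_history)
    # The +1 keeps a small chance of surfacing topics never clicked yet.
    weights = [counts[topic] + 1 for topic in TOPICS]
    return random.choices(TOPICS, weights=weights, k=n_items)

def simulate(favourite="politics", days=30):
    """Simulate a user who is only slightly more likely to click one topic."""
    clicks = []
    for _ in range(days):
        for item in recommend(clicks):
            click_probability = 0.8 if item == favourite else 0.3
            if random.random() < click_probability:
                clicks.append(item)
    return Counter(clicks)

if __name__ == "__main__":
    random.seed(0)
    # After a simulated month, clicks (and therefore the feed) skew heavily
    # towards the favourite topic, even though nothing was ever blocked.
    print(simulate())
```

The simulated user never excludes anything; the narrowing comes entirely from the loop between what is shown and what is clicked.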

Ending up in a bubble is as easy as leaving YouTube on autoplay, a feature that seems (inadvertently or not) designed to serve you more and more radical content in whichever direction you were already leaning. The anti-vaccine movement is another unfortunate example of the backfire effect in practice. The published study that effectively started the movement has since been retracted and debunked time and time again, yet that doesn’t seem to stop believers in a link between vaccines and autism. Meanwhile, the US is seeing a huge outbreak of measles, in part because of this inability of ours to shift our emotionally rooted views (when it comes to children, it’s about as emotional as it gets, after all).

So if you’re asking “how many more studies and arguments will it take to change their minds?”, the answer may be that none of them will, at least not for the most emotionally involved.

Inside your tech bubble, you may not even be exposed to information that contradicts your beliefs on a daily basis, so the backfire effect may be magnified as a result. I’m theorising here, though. Does anyone want to do a master’s thesis on it? I’ll help out.

Another thing to consider in the tech space is that we’re increasingly reliant on machines to do our calculations and predictions. In criminal law, healthcare and education especially, baking our human biases into algorithms and big data can spell big, big trouble. It’s important to remember that machines are only as unbiased as the data we put in. I wrote about this in a little more detail here.
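To illustrate that point, here’s a tiny, entirely hypothetical sketch (the loan-approval scenario, the groups and the “memorise the majority outcome” model are all made up for this example). A model trained to imitate biased historical decisions simply reproduces the bias; nobody has to program the bias in explicitly.

```python
from collections import defaultdict

# Hypothetical historical decisions: same income band, but group "B" was
# approved far less often. None of this is real data.
historical_decisions = [
    # (group, income_band, approved)
    ("A", "high", True), ("A", "high", True), ("A", "low", True),  ("A", "low", False),
    ("B", "high", True), ("B", "high", False), ("B", "low", False), ("B", "low", False),
]

def train_majority_rule(data):
    """A deliberately crude 'model': memorise the majority outcome per (group, income)."""
    outcomes = defaultdict(list)
    for group, income, approved in data:
        outcomes[(group, income)].append(approved)
    return {key: sum(vals) > len(vals) / 2 for key, vals in outcomes.items()}

model = train_majority_rule(historical_decisions)

# Two applicants with identical incomes get different predictions, purely
# because the historical data treated their groups differently.
print(model[("A", "high")])  # True
print(model[("B", "high")])  # False
```

Real systems are far more sophisticated than this, but the principle holds: if the training data encodes a bias, the model will faithfully learn and repeat it.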

Well, where do we go from here? How do we fix it?

I’ve spent a bit of time getting you worried. But I believe there are definitely ways we can try to combat this bias.

1. See/acknowledge alternative information

This may be hard to do, and it won’t apply to every belief you hold. The social media bubble may also be tighter than you think. An awareness, at the very least, that what you believe may not be true is a good place to start. From there, the next time you see an article or hear someone put forward an alternative view, give it a proper read or listen.

It’s also an interesting mental exercise to think about why you hold a particular belief. Was it taught to you by your parents, or at school? Did it stem from a personal experience? Did you hear someone speak about it and find yourself convinced by their argument?

Relevant XKCD, as always.

2. Stop arguing online (where possible)

Yes, I should follow my own advice here. But more often than not, after an argument online both parties just dig in their heels even deeper, and both are usually left in a worse mood than when they started. Argue in person instead! In a civilised manner, naturally.

3. De-stigmatise getting things wrong

To me personally, this one is very important. We’re afraid of admitting we’re wrong. It makes us vulnerable. We think it’ll make others think less of us. And that may be true in some cases. In the long term, however, I believe the benefits of admitting you’re wrong far outweigh the costs.

Julia Rohrer put this idea into practice by starting the “Loss of Confidence Project”.

It’s designed to be an academic safe space for researchers to declare for all to see that they no longer believe in the accuracy of one of their previous findings. (…) The project is timely because a large number of scientific findings have been disproven, or become more doubtful, in recent years. One high-profile effort to retest 100 psychological experiments found only 40 percent replicated with more rigorous methods.

So, clearly, we get things wrong all the time. But creating a culture of admitting it, recognising that we may be wrong, not penalising others for their mistakes, and focusing on iterative progress could be the key to combatting some of the biases we hold.

Julia talks about “intellectual humility” as a concept whereby we recognise our own shortcomings (and say them out loud) and work to fix our mistakes. I imagine this requires a huge culture shift, but I’m ready to be part of the movement, and I hope, dear reader, that you will be too.

I leave you with one of my favourite quotes.

“The first rule of the Dunning-Kruger club is that you don’t know you’re part of the Dunning-Kruger club”.

I’ll let you google that concept on your own :)

Thanks for reading! Have a great day.
