Why Do Our Relatives Believe Wild Conspiracy Theories?
For many of us, the holiday dinner conversation was more bizarre than usual this year. Some family members told wild tales of violent ‘liberal mobs.’ Others detailed the horrors done to children by ‘conservative Nazis.’ And long-disproven conspiracy theories are making a comeback. Some topics are so ludicrous that it is hard to tell when they are meant to be comical. Was your millennial cousin serious when he professed himself to be a ‘flat earther’? Didn’t Aristotle put that idea to rest more than two thousand years ago? Was everyone laughing with you or at you when you suggested that photos taken from space proved Earth to be round? Is everyone losing their grip on reality? Or are we all playing along with a collective schadenfreude as our ability to truly know anything becomes shockingly feeble? ‘Bizarre’ is an apt word for the current scenario, around our family dinner tables and around the country.
How did we get here?
Your relatives are not entirely to blame for the national conversation taking a turn toward the strange. As a social scientist tracking human behavior through data, I see the current disintegration of truth as the culmination of a years-long perfect storm. It started when the Internet democratized publishing faster than media companies (the fourth estate) could adapt. The problem has only compounded as people have come to rely on social media platforms as their news sources. Throw in disinformation designed to muddy the waters, and it is no surprise that we have lost a consensus on truth. Our family members are just the latest victims of this storm.
Before the internet democratized publishing, most of the information we all read went through a rigorous editorial process. It was checked for structure, clarity, the expertise of sources, veracity of facts, and bias. Because of this, we could read news and magazines with some confidence that the content was accurate, researched, and reasonably unbiased.
But today, a writer (or bot) with no education or publishing experience can write and publish anything with no concern for or knowledge of editorial processes. Readers have no way of knowing if anyone confirmed sources or facts. The writer’s goal might be to spread lies designed to undermine a political opponent, sell something, or further some hidden agenda. There are ways to evaluate articles for these things. But few people know them because we’ve always trusted the editorial process to an independent fourth estate.
The complete breakdown in the editorial process is only the beginning of the mess, though.
The Backfire Effect
After misinformation is published, it spreads — like wildfire. Lies, unfortunately, spread faster than truths, especially if they are sensational. Headlines that spark anger, fear, or other strong emotions get people’s attention, which is why the news media — desperate to maintain readership in a world where no one else follows the rules — write sensational headlines. Less reputable sites go to greater lengths to get you to click on a story. And suddenly nearly every publisher is (at least a bit) caught up in the frantic push to get readers to click and share biased and outrageous content and emotion-based narratives. That’s when the real trouble starts.
Once an idea gets into your mind, you lose track of its source. If that idea lodges while you are forming an opinion, it can take root and grow into a belief. And once an idea has grown into a fully formed belief, you will defend it against ideas, and people, that attack it; confronted with contrary evidence, you may even hold the belief more strongly. Psychologists call this “the backfire effect.” If you share your belief with others, you become part of a tribe, and the benefits you gain (social, personal, and sometimes financial) cement the belief even further.
Social media compounds the problem by creating an echo chamber where it is a simple matter to block anything that opposes this belief and allow in only notions that support it. All you have to do to create this tribal, idea-safe zone is to ‘like’ concepts you agree with and block anyone who makes you angry. The social media sites you use automatically take care of the rest.
Once falsehoods take root in people’s minds, they are difficult to unseat. So it is important to stop misinformation at its source. When people believe things that are wrong, disproven, and clearly designed to be divisive — and these beliefs become part of their worldview — it can lead to extreme conflict.
None of this is completely new. Untruth has been with us throughout human history, sometimes intentional, sometimes accidental. The fourth estate was our culture’s working solution. But not anymore. It is a Wild West of dubious information out there.
Now the feeling of gaslighting we experienced at our holiday dinners is playing out on a global scale. One millennial clinging to a long-disproven theory may seem like a small thing, even a funny anecdote. But he is a symptom of a serious problem across society.
At Goodly Labs, we are working on a solution to what we see as one of the most fundamental challenges to basic human cooperation. We are building tools that allow whole societies to rebuild trust in one another. And we are not alone. There are many of us out here, working hard to create solutions to this enormous problem.
While our allies tackle the problem of doctored photographs and video, and other teams attempt to teach writers, publishers, and citizens to identify falsehoods and vet facts, we are building a way for anyone to annotate the news so that readers can see — before they even bother to click a headline — if what they are about to read is biased, fact-checked, and sourced. It is called Public Editor.
To our volunteer contributors, Public Editor is a fun puzzle, much like doing the crossword. It breaks articles into pieces and disperses those pieces to Internet-based volunteers around the world. When one of our volunteers logs in, she answers a few questions about a small piece of an article, chosen at random for her. It takes about ten minutes and is a bit like a middle-school reading comprehension quiz. She can do this every day, like brushing her teeth, or only when the mood strikes. Once she is done — and thousands of other volunteers like her are also done — Public Editor stitches all their assessments together to produce a rich set of labels that are displayed over each article, pointing out mistakes and their severity.
For a newsreader using our free browser extension, all these labels are like having 40 friends from across the political spectrum checking an article for you and offering their consensus opinion about where and how it is misleading. Public Editor also uses its crowd-sourced annotations to generate an overall credibility score (0 to 100) that readers can quickly check before going deeper into an article. The big content platforms like Google, Facebook, and Twitter can also use these credibility scores in their algorithms, demoting or promoting content based on quality, thereby slowing the spread of misinformation through social media.
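Public Editor's actual pipeline and statistical models are not spelled out here, but the core idea described above, stitching many volunteers' small assessments into consensus labels and a 0-to-100 credibility score, can be sketched roughly as follows. All function names, the quorum threshold, and the 0-to-3 severity scale are illustrative assumptions, not the project's real design:

```python
from collections import defaultdict
from statistics import mean

def aggregate_assessments(assessments, min_raters=3):
    """Group volunteers' ratings by (article span, issue type) and keep
    only labels that a quorum of volunteers weighed in on, averaging
    their severity ratings (0 = no problem, 3 = severe)."""
    by_span = defaultdict(list)
    for a in assessments:
        by_span[(a["span"], a["issue"])].append(a["severity"])
    labels = []
    for (span, issue), severities in sorted(by_span.items()):
        if len(severities) >= min_raters:  # require a quorum of raters
            labels.append({"span": span, "issue": issue,
                           "severity": round(mean(severities), 2)})
    return labels

def credibility_score(labels, max_severity=3):
    """Map consensus labels to a 0-100 score; 100 means no agreed-on
    issues, and heavier average severity pulls the score down."""
    if not labels:
        return 100
    penalty = mean(l["severity"] for l in labels) / max_severity
    return round(100 * (1 - penalty))
```

In this toy version, an article with no quorum-backed issues scores 100, while one whose flagged spans average maximum severity scores 0; a real system would also need to weight raters by reliability and balance them across the political spectrum, as the "40 friends" analogy suggests.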
So, next time you find yourself around a dinner table where the conversation has taken a turn toward the kooky, rest assured: you won’t be alone in caring about what’s true and what’s false as so many of our friends and family become victims of misinformation. Better yet, there’s no need to shout into the void or gnash your teeth. You can open your laptop and do something easier than a sudoku puzzle that makes a real difference. It might save your sanity. And you just might help save us all from the global version of a crazy holiday dinner argument.
Ready to be a part of the solution? We need volunteers. Sign up at PublicEditor.io.