
Why Do Our Relatives Believe Wild Conspiracy Theories?

For many of us, the holiday dinner conversation was more bizarre than usual this year. Some family members told wild tales of violent ‘liberal mobs.’ Others detailed the horrors done to children by ‘conservative Nazis.’ And long-disproven conspiracy theories are making a comeback. Some topics are so ludicrous that it is hard to tell when they are meant to be comical. Was your millennial cousin serious when he professed himself a ‘flat earther’? Didn’t Aristotle put that idea to rest back in the fourth century B.C.? Was everyone laughing with you or at you when you suggested that photos taken from space prove Earth is round? Is everyone losing their grip on reality? Or are we all indulging a collective schadenfreude as our ability to truly know anything grows shockingly feeble? ‘Bizarre’ is an apt word for the current scenario, around our family dinner tables and around the country.

How did we get here?

Before the internet democratized publishing, most of the information we all read went through a rigorous editorial process. It was checked for structure, clarity, the expertise of sources, veracity of facts, and bias. Because of this, we could read news and magazines with some confidence that the content was accurate, researched, and reasonably unbiased.

But today, a writer (or bot) with no education or publishing experience can write and publish anything with no concern for or knowledge of editorial processes. Readers have no way of knowing if anyone confirmed sources or facts. The writer’s goal might be to spread lies designed to undermine a political opponent, sell something, or further some hidden agenda. There are ways to evaluate articles for these things. But few people know them because we’ve always trusted the editorial process to an independent fourth estate.

The complete breakdown in the editorial process is only the beginning of the mess, though.

The Backfire Effect

Once an idea gets in your mind, you lose track of its source. If that idea lodges while you are forming an opinion, it can take root and grow into a belief. And once an idea has grown into a fully formed belief, you will defend it against ideas, and people, that challenge it; attempts to correct it can even strengthen it. Psychologists call this “the backfire effect.” If you share your belief with others, you become part of a tribe, and the benefits you gain — social, personal, and sometimes financial — cement the belief even further.

Social media compounds the problem by creating an echo chamber where it is a simple matter to block anything that opposes this belief and allow in only notions that support it. All you have to do to create this tribal, idea-safe zone is to ‘like’ concepts you agree with and block anyone who makes you angry. The social media sites you use automatically take care of the rest.

Misinformation backfires

None of this is completely new. Untruth has been with us throughout human history, sometimes intentional, sometimes accidental. The fourth estate was our culture’s working solution. But not anymore. It is a Wild West of dubious information out there.

Now, the feeling of being gaslit that we experienced at our holiday dinners is happening on a global scale. One millennial clinging to a long-disproven theory may seem like a small thing, even a funny anecdote. But his existence is a symptom of a serious problem across society.

At Goodly Labs, we are working on a solution to what we see as one of the most fundamental challenges to basic human cooperation. We are building tools that allow whole societies to rebuild trust in one another. And we are not alone. There are many of us out here, working hard to create solutions to this enormous problem.

While our allies tackle the problem of doctored photographs and video, and other teams attempt to teach writers, publishers, and citizens to identify falsehoods and vet facts, we are building a way for anyone to annotate the news so that readers can see — before they even bother to click a headline — if what they are about to read is biased, fact-checked, and sourced. It is called Public Editor.

Public Editor

For a newsreader using our free browser extension, Public Editor’s annotation labels are like having 40 friends from across the political spectrum checking an article for you and offering their consensus opinion about where and how it is misleading. Public Editor also uses its crowd-sourced annotations to generate an overall credibility score (0 to 100) that readers can quickly check before going deeper into an article. The big content platforms like Google, Facebook, and Twitter can also use these credibility scores in their algorithms, demoting or promoting content based on quality, thereby slowing the spread of misinformation through social media.
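To make the idea of a crowd-sourced credibility score concrete, here is a minimal sketch of one way annotations could be aggregated. The label names, penalty weights, and averaging rule are illustrative assumptions, not Public Editor’s actual method.

```python
# Hypothetical sketch: turn crowd annotations into a 0-100 credibility
# score. Labels, weights, and the aggregation rule are assumptions for
# illustration only.

from statistics import mean

# Each annotator flags problems in an article; each label carries an
# assumed penalty weight.
PENALTIES = {
    "unsourced_claim": 10,
    "loaded_language": 5,
    "failed_fact_check": 20,
}

def annotator_score(flags):
    """One annotator's score: start at 100, subtract penalties, floor at 0."""
    penalty = sum(PENALTIES.get(flag, 0) for flag in flags)
    return max(0, 100 - penalty)

def credibility_score(all_annotations):
    """Consensus score: the mean of the individual annotators' scores."""
    return round(mean(annotator_score(flags) for flags in all_annotations))

# Example: three annotators review the same article.
annotations = [
    ["unsourced_claim", "loaded_language"],    # -> 85
    ["unsourced_claim"],                       # -> 90
    ["failed_fact_check", "unsourced_claim"],  # -> 70
]
print(credibility_score(annotations))  # -> 82
```

In practice a real system would also need to weight annotators by reliability and reconcile disagreement, but the basic shape — many independent reviews collapsed into one number — is the same.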

So, next time you find yourself around a dinner table where the conversation has taken a turn toward the kooky, rest assured. You won’t be alone in caring about what’s true and what’s false as so many of our friends and family become victims of misinformation. Better yet, there’s no need to shout into the void or gnash your teeth. You can open your laptop and do something easier than a sudoku puzzle that makes a big difference. It might save your sanity. And you just might help save us all from the global version of a crazy holiday dinner argument.

Ready to be a part of the solution? We need volunteers. Sign up at PublicEditor.io
