“In the past, censorship worked by blocking the flow of information,” wrote author Yuval Noah Harari in Homo Deus. This is unsurprising. Historically, for any tyrannical regime to succeed, whether American slavery or British colonization the world over, it had to block its oppressed people’s access to education and news.

“In the twenty-first century,” Harari writes, “censorship works by flooding people with irrelevant information.”

In the months leading up to the 2016 presidential election, a Gallup poll found that Americans’ overall trust in news was at its lowest since Gallup started the survey shortly after Watergate. “Fake news” was dominating minds and headlines.

Alex and Boris are fake names of fake newsmen who capitalized on this phenomenon. They copied articles that contained inflammatory and outright false information onto their own websites, changed the headlines, and set them loose on Facebook. As Alex told the Guardian, his “primary goal [was] to influence American policy, especially politics.” For Boris, it was just about money. According to a profile in Wired, Boris made $16,000 from just one of his false news websites in four months. That works out to roughly $4,000 a month, more than ten times the $371 per month the average worker in his town makes.

There’s blatantly false news propagated with malicious intent by foreign agents or by men like Boris for hard cash. Then there’s news that is poorly reported, news that is taken out of context, news that is overly biased, and news that a reader simply doesn’t like.

These distinct problems are all given the same label of “fake news.” Together, they erode trust in the media and have real-world consequences. For news publishers or platforms like Google and Facebook to combat these problems, it’s imperative that they first be seen in all their complexity.

What makes false claims so darn shareable and facts so hard to believe? And what part do we, as readers, play in the problem?

The Problem Was Never Purely Algorithmic or Journalistic—It Was Us

Megan Phelps-Roper was five years old the first time she stood in a picket line with her family. She left her dolls in the minivan and picked up a sign she couldn’t yet read: “Gays are worthy of death.”

Megan was born into a family that followed the extremist teachings of the Westboro Baptist Church. As Westboro started to gain international notoriety, Megan spent two decades traveling with its members, picketing anything that fell under the church’s definition of blasphemy, from baseball games to funerals. In her home, Megan says, “life was framed as an epic spiritual battle between good and evil. The good was my church and its members, and the evil was everyone else.”

Cognitive dissonance was first studied in the 1950s by social psychologist Leon Festinger in an observational study of a doomsday cult. Festinger describes cognitive dissonance as the discomfort of holding two conflicting thoughts in mind at the same time. Its strangest property: in the face of contradictory evidence, beliefs often become even more persistent. When the cult’s prophesied doomsday passed without incident, many members doubled down on their belief rather than abandon it.

Cults are an extreme example of an everyday phenomenon. If you’ve ever found yourself unable to stop bingeing YouTube videos even though you should be working toward a project deadline, you’ve experienced cognitive dissonance. The magnitude of cognitive dissonance is directly tied to the importance that you place on the belief being contradicted.

Think about it this way: There are plenty of things about which you haven’t made up your mind, like your choice of career. Then there are beliefs that are core to your identity, like your morals, religious beliefs, and, for many Americans, political affiliation. Under questioning by someone else or even that little voice in your head, your grip on your belief becomes tighter.

Megan sees similarities between the current state of political discourse and the church she ultimately walked away from at age 26: “We’ve broken the world into us and them, only emerging from our bunkers long enough to lob rhetorical grenades at the other camp. We write off half the country as out-of-touch liberal elites or racist misogynist bullies. No nuance, no complexity, no humanity.”

We’re living in a time of hyperpartisanship, where political affiliation is seen as an integral part of personal identity. Party polarization is the highest it’s been in decades (or since 1879, depending on whether you prefer research from Pew or political scientists Keith Poole and Howard Rosenthal).

If we do construct metaphorical bunkers from our core beliefs to keep ourselves safe and separate, it is rare for any of us to leave them and never return. Mostly, we’re content with simply furnishing our bunker with a radio and tuning in to chatter from the outside world every now and then. In a 1967 study, participants were given access to deliberately staticky recordings of both sides of contentious arguments: pro- and anti-Christianity, and smoking does or does not lead to cancer. Researchers observed that when participants listened to information that supported their existing beliefs, they increased the volume and literally tuned in. When the information conflicted with their existing beliefs, they decreased the volume and tuned right out.

Eli Pariser, cofounder of Upworthy, coined the term “filter bubble” to describe the manner in which people form into ideological tribes online, a natural tendency that is exacerbated by algorithms. An added danger is that any ideology, no matter how obscure, is likely to find a tribe in the vastness of the internet. “More than fake news,” Pariser said in a recent interview, “ask whether the truth is being told loudly enough.” But telling the truth loudly doesn’t mean it’ll be heard and believed.

Online, tuning out a contradictory belief is as simple as scrolling right past it, hitting delete, or following only individuals who you trust share your views. And when contradictory points of view do seep into your media diet, they are easily dismissed.

Rachel Botsman, who teaches at Oxford University, researches trust. In her upcoming book, Who Can You Trust?, she argues that there has been a huge shift in the past century in how and whom people trust. Prior to the 1900s, a person was likely to trust a reputed member of their local community. Then, as nationalized institutions took off, people began to place trust in brands. In the internet age, where the rules of conduct are still very much evolving, Botsman argues that trust is moving from institutional, brand-based trust back to individual trust. Now we get into cars with strangers, book rooms in their homes, seek funding from them, and more, all based on individual ratings. A 2016 study from Gallup corroborates Botsman’s findings: in general, Americans are becoming less trusting of the majority of institutions — banks, Congress, organized religion, and the news.

The Media Insight Project digs a little deeper. In a 2017 study, researchers observed participants as they scrolled through links on social networks and chose what to click on, read, and share. The study concluded that online, people are more likely to trust news based on the individual who shared it than on the institution that produced it.

The problem that the news industry faced was never purely algorithmic or journalistic. The problem was also…us.

Despite all this, one individual who gained a significant amount of trust from news readers in the past election season is Washington Post journalist David Fahrenthold. He posted to Twitter his copious handwritten lists of the more than 500 charities he had called while reporting a story on the lack of donations by the Trump Foundation. His reporting won a Pulitzer, and, perhaps more interesting, the tedious and meticulous reporting process he shared won him the trust of a new readership.

There is perhaps a concept worth experimenting with here: a news publisher could build a news literacy project with explainers of how some of its best reporting was done. How was the information collected, verified, and presented? Who did it, and just how meticulous was their process? Bringing transparency to the process of journalism might show a publishing institution as a group of trustworthy individuals.

I Trust My News, Not “The News” and Certainly Not Your News

In the days following the presidential inauguration in January 2017, the White House and Trump himself vehemently claimed that the event had drawn the largest crowd ever, larger even than the crowd that came to see Barack Obama inaugurated in 2009. The media was admonished for reporting that this claim simply wasn’t true. In fact, retorted the media, they could prove it wasn’t true with indisputable photographic evidence.

In the following days, the Washington Post set out to test just how indisputable that photographic evidence really was. The Post showed 1,388 respondents unlabeled images of Trump’s inauguration crowd and Obama’s clearly larger crowd side by side. Half the respondents were asked which president they believed had the bigger crowd at his inauguration. Perhaps unsurprisingly, respondents answered in accordance with their political affiliation, with more than 40 percent of Trump voters responding, incorrectly, that the bigger crowd was at Trump’s inauguration.

The other half of the respondents were asked a simpler question: “Which photo has more people in it?” Even so, more than 15 percent of Trump voters, apparently recognizing which photo showed Trump’s inauguration, claimed that the visibly smaller crowd was the larger one. Only 2 percent of Clinton voters and 3 percent of nonvoters made that mistake.

Tribalism is not a phenomenon that started with Trump. Motivated reasoning, the unconscious tendency to arrange information to fit your desired goals or beliefs, affects liberals and conservatives alike.

Pew released a research study in 2014 that left researchers puzzled. The number of Republicans who said they did not believe in evolution had increased dramatically (up nine percentage points) in just four years, but there was hardly any shift in the numbers for Democrats and Independents. Coverage in liberal media outlets concluded that this meant conservatives were becoming dangerously anti-science, with Paul Krugman of the New York Times writing, “Republicans are being driven to identify in all ways with their tribe — and the tribal belief system is dominated by anti-science fundamentalists.” But there was never any conclusive evidence or context in the original Pew report to support the anti-science claim.

As Dan Kahan, Yale Law School professor and social scientist, impatiently explained in a blog post, this reaction was, first, an example of confirmation bias on the part of liberals. Second, there is no correlation between believing in evolution and actually understanding it.

You cannot look outside your window and see the climate changing. In your 80 years of life, it’s unlikely you’ll see a species evolve noticeably. These are concepts of which you have little firsthand experience and over which you exercise little control. You rely on the expertise of others whom you trust to understand the concepts — a problem here, again, is that people are more likely to trust experts with whom they agree. But saying that you believe in these concepts is a different matter altogether. Belief becomes a part of your identity, Kahan says, a kind of mental shortcut for saying that you belong to a certain team.

A person can understand evolution but say they don’t believe in it because of their greater affiliation to their religion. Or a person can say they believe a picture has more people in it when it doesn’t because their support for their party trumps their need for accuracy in a comparatively meaningless survey.

This becomes even more interesting when we look at this widely circulated and alarming statistic from Gallup: Only 32 percent of Americans say they trust the news.

A report from the American Press Institute digs in deeper. According to that report, only 8 percent of Republicans said they have “a lot of trust” in the news media, but this jumped to 27 percent when asked if they have a lot of trust in their news media. And this trend continues: While 45 percent of Democrats believe that the news media deals fairly with both sides, only 15 percent of Republicans do. These numbers jump to 56 percent and 45 percent for the news media they use most frequently.

In other words, I might trust my news media but not your news media or the news media.

It is perhaps the case that news, real or false, is automatically suspected of bias, mistrusted, or dismissed simply because it comes from across party lines. The reader’s tendency toward motivated cognition makes it hard to accept anything from such sources.

The uniformity in the design of social networks and the lightning speed of distribution doesn’t help either. All links look the same on a news feed, and there is some transference of credibility from the well-sourced news article to the blatantly false news article. Take, for example, Facebook’s effort to add a “disputed” label to links that contain false news as verified by independent sources like PolitiFact and Snopes. A study by Gordon Pennycook on the increase in perceived accuracy of false news with repeated exposure showed that adding a “disputed by third parties” label like Facebook’s made no difference whatsoever to its perceived credibility.

It’ll always be faster to generate false news or spread rumors than it is to verify them. What’s frustrating is that even mentioning them in legitimate news outlets for the purpose of correcting them can backfire.

When Obama was trying to pass the Affordable Care Act in 2009, a political rumor spread that the ACA contained provisions to withhold care from certain citizens: the elderly would have to appear before a panel that would discuss and determine their end-of-life health care options, such as euthanasia.

Within weeks, the 24/7 media coverage of “death panels” increased the rumor’s fluency. In breathlessly repeating the rumor without effective correction, both the liberal and conservative media had given it a new lease on life. If a lie gets repeated often enough, it’s likely to be misremembered as truth, and this falsehood was so sticky that even Obama’s direct refutation of it at a town hall meeting did nothing to change the minds of those who were predisposed to believing it.

There is, however, hope for harnessing the power of partisanship for good. In 2015, MIT professor Adam Berinsky researched the curious effect of repeating a lie until it is believed to be true, also known as the illusory truth effect, which made the death panel rumor so hard to shake. “When I paired the death panel story with a quote debunking the rumor from a Republican who helped draft the end-of-life provisions, respondents — Republicans and Democrats alike — were far more likely to reject the euthanasia rumor. In the real world, these types of corrections from unexpected partisan sources exist, but they are admittedly rare.”

There is perhaps no more serious design flaw in human cognition than our tendency toward motivated reasoning. While there’s no erasing that tendency, there is hope in presenting the world as less red or blue and more purple. What are the parts of a policy that see bipartisan support? Are there politicians from across the aisle who can correct false beliefs?

If Motivated Cognition Is Our Design Flaw, Then Curiosity Is Our Strength

A persistent belief among Western philosophers through the centuries was that man is, above all, a rational being. It was on this assumption that modern society, democracy, and Twitter were built, and soon enough the theory revealed its flaws. From the mid-20th century onward, the totally rational, totally imaginary individual died a death of a thousand paper cuts, as cognitive research showed humans in a less flattering light.

This is not to say that humans aren’t capable of remarkable critical thinking, the kind that helped our species discover electricity, eradicate diseases that threatened us, and fly ourselves straight off our home planet.

But rational, individual thinkers we are not.

In a 2002 experiment, a group of Yale students was asked to self-evaluate their knowledge of how certain objects work. The objects in the experiment are familiar to most of us in the developed and developing worlds: a zipper, a piano key, a flushing toilet, a cylinder lock, a speedometer, and so on. The students reported that they had an excellent understanding of how these objects work: as you hit the accelerator, the needle moves up on the speedometer; as you hit the brakes, the needle moves down.

Then the students were asked to describe in great detail the mechanics of those objects. They failed to do so. As it turns out, there is a lot more to how a speedometer works than just that; it’s only because we see it daily from the corner of our eye that we assume we know all there is to know about it. Researchers Leonid Rozenblit and Frank Keil called this “the illusion of explanatory depth.” They also noted an interesting pattern in the experience of participants: genuine surprise and newfound humility at the limitations of their own knowledge.

People are experts at borrowing from the collective human knowledge, treating it as their own, and adding to it where they can. From the times of the African savannah to the silicon jungle, the sum of human knowledge has increased exponentially over time, but the knowledge of the individual has remained superficial.

Neither you nor I ever think alone. The problem arises, argue Steven Sloman and Philip Fernbach, cognitive scientists and authors of The Knowledge Illusion, in political matters like foreign policy, climate change, and health care. While our opinions on these topics may be strong, our knowledge of the mechanics of these concepts is remarkably limited.

Sloman and Fernbach took the 2002 Yale experiment one step further and asked people to rate their knowledge of political policies they support or oppose, like single-payer health care or foreign policy with Iran. Then they asked people to explain the mechanics of these policies and how, not why, they would work. And they replicated the Yale study’s results: “People often believe they understand what is meant by well-worn political terms like the ‘flat tax,’ ‘sanctions on Iran,’ or ‘cap and trade’ — even when they don’t.”

That’s not a shocker, but the really interesting thing is that after the subjects were asked to explain the policies — explain, not justify — they realized the gap in their knowledge and became more moderate in their views.

More recently, Yale researcher Dan Kahan and his colleagues set out to map the levels of plain old scientific curiosity, not knowledge, among the population, using an extensive survey. They found that there are people on both sides of the political aisle who engage open-mindedly with information just for the awe and surprise that comes from learning something new.

But Kahan also included some survey questions that he expected would lead to politically polarizing answers, such as, “How much risk do you think global warming poses to human health, safety, or prosperity?” Instead of finding yet more evidence that people rely on motivated reasoning in these situations, something curious happened.

For the more curious-minded liberals and conservatives, responses appeared to converge — global warming posed high risk. Conservatives still believed global warming posed lower risk than the liberals believed, but even so, their answers were converging in a time of extreme divergence of views. Kahan then showed respondents sets of politically charged news headlines—some pro-human-caused global warming, some against. Rather than choosing whatever corresponded with their existing beliefs, curious liberals and conservatives just picked the one that surprised them most.

There was an intriguing thread running through both Sloman and Fernbach’s and Kahan and colleagues’ research studies.

Even though these studies were experimental, they showed a crack in humans’ agonizing tendency toward politically motivated reasoning. With less pontification on policies and more explanation of how they may be executed, people moderated their views. With less scientific evidence and more fostering of scientific curiosity, people converged in their views.

Here there’s hope for how the media can present news to its audience: more curiosity, less pontification. It seems almost naive to believe that this could help readers find trust in each other’s news. But the trust issues that hurt the news industry today have some basis in the human psyche. Maybe the solutions do, too.

The problems that the news industry faces today are urgent and daunting. It must not be forgotten, however, that the darker tendencies in human behavior that have led to these problems are not new or exclusive to this time. They are historic and instinctive and must be understood.

Only then will the news industry have any hope of rebuilding reader trust by thoughtfully experimenting with the ways it covers and distributes news.

This series explores how product thinking can help the news industry address the concerns of today’s readers. The first part covered the tricky problem of dwindling trust and hyper-partisanship. The next part will discuss the presentation and delivery of relevant news to underserved communities.