As a society, we are getting worse at distinguishing facts from lies, especially when it comes to politics. And cognitive science research shows us that partisan bias—the standard villain in this story—isn’t really why.
Take for example President Donald Trump’s tweet accusing Rep. Adam Schiff (D-Calif.) of fraudulently making up a statement and “reading it to Congress.” The president was referencing Schiff’s paraphrasing of Trump’s July call with Ukrainian president Volodymyr Zelensky.
Trump fans on Twitter showed off their jujitsu skills in support of the president, suggesting it was Schiff, not Trump, who attempted to extort damaging information from the Ukrainian government. Others wondered why Schiff wasn’t himself being investigated and ultimately removed from office, even though no evidence exists that Schiff was fabricating anything.
Here are a few choice examples—none of these are from small, random accounts; these Twitter users have tens of thousands of followers.
These were all in response to Trump’s original tweet. And the split over issues of basic fact only got worse after that. Mere days after the president’s tweet, only 40 percent of surveyed Republicans thought Trump even mentioned Joe Biden in the call despite Trump himself acknowledging that he did so.
How is it that reality can seemingly shapeshift depending on who is tweeting? How is it that reality no longer seems to be able to furnish us with a shared factual starting point?
The common answer is to blame partisanship. Under that explanation, our existing beliefs shape how we view facts.
There’s certainly some truth to that. It’s called motivated reasoning, and it suggests we prefer information that confirms our existing beliefs. If we believe going in that the Democrats are orchestrating an elaborate hoax, we are likely to regard Schiff’s behavior with extreme suspicion. If instead we believe Schiff, then we’re likely to conclude Trump is the one dealing in false information.
But while partisanship goes some distance toward explaining how we got to a society teeming with alternative facts, the explanation is missing something crucial.
There’s another explanation for why we believe what we believe—a cognitive explanation. According to that explanation, our ability to recognize fake news isn’t affected so much by partisanship as it is by our ability and desire to engage in critical thinking.
A recent study in the journal Cognition argues that cognitive laziness is an instructive explanation for why we embrace fake news and accept false information as factual. The researchers, Gordon Pennycook and David G. Rand, studied the results of 800 participants who read and responded to 30 newspaper headlines. Half of those headlines were factual and half were bunk, and they were evenly divided among those consistent with Democratic priors, those consistent with Republican priors, and party-neutral ones. Participants were asked to evaluate the accuracy of the headlines. They were also given two tests that measured their ability and willingness to engage in cognitive reflection.
The results of the study are startling. Partisanship was only a small factor in determining a person’s ability to distinguish fake news from factual news. What mattered more was the person’s analytical ability.
In other words, if someone has strong critical thinking skills, they are far more likely to reject fake news, regardless of their political affiliation or existing beliefs. This means that people who believe fake news do so mainly because they aren’t thinking, not because they’re thinking in a partisan way. It turns out that cognitive laziness, rather than motivated reasoning, better explains our susceptibility to fake news.
As part of this study, researchers used something called the Cognitive Reflection Test (CRT), which measures the potential for reasoning as a means of solving problems. Individuals who have high CRT scores generally demonstrate high mental abilities and are less likely to be “cognitive misers”—individuals who prefer quick answers and mental shortcuts as opposed to critical analysis.
We all rely on mental heuristics much of the time, and for good reason. It makes sense to take some shortcuts in thinking, at least in certain realms. Imagine having to think fully consciously about every decision you make each time you drive to work — from how much pressure to put on the gas pedal to the exact angle you need to turn your steering wheel.
But people who believe fake news stay in a kind of cognitive autopilot. When presented with facts and lies, they aren’t so much interpreting through motivated reasoning as failing to reflect in the first place. It’s not that they’re thinking in a motivated and thus distorting way; it’s that they’re not thinking at all.
Cognitive misers aren’t necessarily dumb or lacking in information. They simply prefer not to think deeply, especially about certain subjects like politics.
While we should be careful about generalizing beyond Pennycook and Rand’s study, if their work captures something real about what’s going on when we adopt fake news, then the study also gives us interesting ideas about how to combat this problem.
Think about the situation this way. The first explanation—the one blaming motivated reasoning—tells us our brains are busily working to justify our beliefs based on partisanship. A crucial component of that explanation is that our brains are actually doing work. It just happens to be the wrong work, and that misdirected cognitive labor results in our believing false information.
The second explanation, the cognitive laziness model, instead suggests we are listening to news and scanning headlines with the same level of attention we give to background noise.
What makes someone a cognitive miser is an open question, but research does seem to suggest that we are generally aware that we take mental shortcuts. Maybe we fail to think because we don’t have the time, or we are too distracted, or we feel overwhelmed by the informational surplus now at our fingertips. Or maybe it’s lack of practice. Sometimes we simply aren’t motivated enough to expend precious cognitive energy, so we allow our brains to rest.
Amid all the analysis of how to counter disinformation in an age of lies and propaganda, we may be forced to first admit that the reason it works so well is that, cognitively, we’re just plain lazy. Maggie Selner recently wrote in these pages that we’re the antidote to fake news. She has a point: If we want fake news to have less sway, it’s up to us to start thinking.