Eleven takeaways from Knight-supported research on restoring trust in news

As part of its effort to explore the root causes of the current crisis in trust in the media, the Knight Foundation is commissioning a continuing series of white papers from academics and experts. Here’s what we’ve learned so far. Editor’s note: We are updating this piece as we publish new white papers. Last edited July 12, 2018.

Institutional trust is down across the board in American society (with a few notable exceptions, such as the military). But the fall in trust in the media is particularly troubling: it plummeted from 72 percent in 1976 to 32 percent in 2017. There are many reasons for this decline, writes Yuval Levin, but one problem is that the rise of social media has pushed journalists to focus on developing personal brands:

“This makes it difficult to distinguish the work of individuals from the work of institutions, and increasingly turns journalistic institutions into platforms for the personal brands of individual reporters. When the journalists’ work appears indistinguishable from grabbing a megaphone, they become harder to trust. They aren’t really asking for trust.”

[S]ocial media in particular have turned many journalists from participants in the work of institutions to managers of personal brands who carefully tend to their own public presence and presentation.

— from “American journalism as an institution,” by Yuval Levin.

Humans are biologically wired to respond positively to information that supports their own beliefs and negatively to information that contradicts them, writes Peter Wehner. He also points out that beliefs are often tied up with personal identity, and that changing beliefs may put people at risk of rejection from their communities.

In a sense, people see what they want to see, in order to believe what they want to believe. In addition, everyone likes to be proven right, and changing their views is an admission that they were wrong, or at least had an incomplete understanding of an issue.

— from “Why people are wired to believe what they want to believe,” by Peter Wehner.

As people increasingly rely on social media platforms to get information, they are at the mercy of opaque algorithms they don’t control, write Samantha Bradshaw and Philip Howard. These algorithms are optimized to maximize advertising dollars for social media platforms. Since people tend to share information that provokes strong emotions and confirms what they already believe, “[t]he speed and scale at which content ‘goes viral’ grows exponentially, regardless of whether or not the information it contains is true.”

[T]he filtering of information that takes place on social media is not the product of the conscious choices of human users. Rather, what we see on our social media feeds and in our Google search results is the product of calculations made by powerful algorithms and machine learning models.

— from “Three reasons junk news spreads so quickly across social media,” by Samantha Bradshaw and Philip Howard.

The conventional wisdom these days is that we’re all trapped in filter bubbles or echo chambers, listening only to people like ourselves. But, write Andrew Guess, Benjamin Lyons, Brendan Nyhan and Jason Reifler, the reality is more nuanced. While people tend to self-report a filtered media diet, other data show that many people do not engage in political information much at all, instead choosing entertainment over news. But, that doesn’t mean there is no problem. “[P]olarized media consumption is much more common among an important segment of the public — the most politically active, knowledgeable, and engaged. This group is disproportionately visible online and in public life.”

A deep dive into the academic literature tells us that the “echo chambers” narrative captures, at most, the experience of a minority of the public. Indeed, this claim itself has ironically been amplified and distorted in a kind of echo chamber effect.

— from “Avoiding the echo chamber about echo chambers,” by Andrew Guess, Benjamin Lyons, Brendan Nyhan and Jason Reifler.

People may be predisposed to hold on to beliefs that are agreeable to them, but they also are more likely to believe a correction if it comes from a source they think would promote an opposing opinion. However, offering a simple correction alone rarely works. Finally, even when people accept corrections, other studies show that a taint called a “belief echo” persists, by which the false belief continues to affect attitudes.

[P]eople are more likely to believe a correction if it comes from a source for whom it runs counter to personal and political interests.

— from “Why inaccurate political information spreads,” by Jonathan Ladd and Alex Podkul.

There’s no one-size-fits-all way to communicate complicated information, write Erika Franklin Fowler and Natalie Jomini Stroud, but science can help. Different goals require different types of information. If we know people don’t have the time or motivation to pay attention to in-depth information on all issues, then we might encourage the use of endorsements or other cues from trusted sources. If we seek to increase participation, it is helpful to encourage citizens to join groups and to consume like-minded information; but if we want to encourage empathy or deliberation, we need more balanced information that compassionately represents others. Sometimes people learn best through experiences, or by getting issue-oriented information from organizations they trust.

Finding strategies for artfully conveying complex information in ways that break down attention and trust-based barriers represents the most important challenge in our politically tumultuous time. But it’s one we can meet, and science can help.

— from “Let’s face it, communicating facts can be difficult,” by Erika Franklin Fowler and Natalie Jomini Stroud.

Americans don’t know things because they can’t be bothered to know them, the conventional wisdom says. But lack of motivation isn’t the whole story. News stories often cover breaking news without contextual information that supplies basic facts about a particular issue, whether it’s the federal budget or climate change. Experiments show that people can be open to information about complex subjects if it’s provided within the context of a news report.

Emily Thorson, a professor of political science, conducted an experiment in which she provided two versions of a news story about the federal budget, one with contextual information presented in a box and one without. She found that people who read the version with the contextual information, when questioned, reported more accurate information about the budget than those who read the version without it.

[C]ontextual fact-checks can be remarkably successful in correcting misperceptions. In addition, compared to fact-checks of politicians and candidates, they run a smaller risk of creating a partisan backlash.

A three-dimensional filter map can help, too.

We need a way to think about information that goes beyond “agreeable” or “disagreeable.” “I object more fundamentally to the notion that all mass affirmation is always bad, and its corollary, that unwanted or unplanned encounters are always good. In true academic fashion, I will argue that it depends: each can sometimes be good and sometimes bad,” writes Deen Freelon.

Freelon proposes a different way to think about the information that we consume: a three-dimensional “filter map”:

  • Agreeableness. This is the degree to which information fits into our preexisting opinions.
  • Truth value. This is simply whether a given message is true or false.
  • Legitimacy. While difficult to define, this “usually reduces to an opinion’s adherence to widely accepted ethical norms like freedom, equality, fairness and human rights. While there is considerable debate around what kinds of opinions comport with such principles, it can be safely said that crimes like racial discrimination, torture and arbitrary detention definitively violate them.”
Ideally, our media filters would optimize for truth and legitimacy, ensuring that both agreeable and disagreeable content and sources are included (the map’s four blue cells)…. By the same token, false and illegitimate messages would be excluded, again regardless of agreeableness (the four white cells). The conceptual leap I make here is from considering disagreeableness as a virtue in itself, to distinguishing between more and less desirable types of disagreeable content. There are many claims and opinions we should rightly dismiss out of hand, but there are others we should entertain despite disagreeing with them.

— from “Filter bubbles are only part of the problem,” by Deen Freelon.

“Misinformation isn’t new,” write Danielle Allen and Justin Pottle, “and our problem is not, fundamentally, one of intermingling of fact and fiction, not a confusion of propaganda with hard-bitten, fact-based policy. Instead, it’s what we now call polarization, but what founding father James Madison referred to as ‘faction.’”

Madison wasn’t concerned about disagreement in and of itself. Rather, he thought about structural ways to bring people together despite those differences. He advocated for a large republic with a relatively small legislature in which each representative represented a wider variety of groups and individuals.

Societal challenges, such as the disappearance of many local and regional newspapers, a growing concentration of people living in ideological groupings, and the loss of credibility of many colleges and universities among conservatives, have all contributed to undermining “the institutions whose job it is to broker the debate within the citizenry about what different people see as credible or incredible.”

Allen and Pottle suggest a number of strategies to bring Americans together in shared experiences, such as instituting a national service requirement, establishing geographic lotteries for elite learning institutions, and reviving local journalism with philanthropy.

Our problem is the breakdown of institutions that facilitate valid social learning across diverse, disagreeing groups. Historically, the institutions that facilitate social learning, for example newspapers, schools, colleges and universities, have served also as anchors for shared norms of inquiry, including for the aforementioned commitment to honesty, for ideologically diverse populations.

— from “Why James Madison would say our real problem is not misinformation,” by Danielle Allen and Justin Pottle.

The scholarly literature on trust “shows that there are institutional factors that create trust in society and government. This is fortunate, because it means there is room for optimism despite the widely known fact that trust in both society and government is declining in the United States,” writes Kevin Vallier.

Social trust and political trust are closely related, he explains, and can be part of a “positive feedback loop of trust, where political and social trust help contribute to institutions, and institutions contribute further to trust, even when diverse people disagree.”

What would help? Reducing ethnic and racial segregation and encouraging people to reach across the divide. Trust also increases when people feel they are getting a fair deal. So do protecting individual rights, reducing economic segregation, enforcing the rule of law and reducing corruption, and using markets and social insurance to make sure the economy works for all.

Do you need to share strangers’ ideology in order to trust them? Fortunately, no.

— from “Social and Political Trust: Concepts, Causes, and Consequences,” by Kevin Vallier.

One commonly held theory is that it’s the increase in diversity that’s ailing Western countries. This makes some intuitive sense. After all, diversity implies difference — different wants, different needs, and different interests.

But there is another school of thought with a respected pedigree in political philosophy, one that includes James Madison and John Stuart Mill: the idea that difference makes a liberal society stronger. When people are different, they argue their points of view. A liberal state provides a structured way to do this, as well as ways to experiment with new ideas and then debate them. However, the liberal ideal of a contest of ideas and values requires actual engagement between their proponents. That is, the ideal of productive contestation relies on the notion of a public square in which all comers can and do engage with others, challenging their ideas and being challenged in turn. It can’t work if representatives of competing ideas don’t show up.

So, what to do? We need to worry not about the fact of diversity, but about how we encourage diverse people to interact. Some ideas:

  • Companies like Facebook can show people not just what they like, but also what people who are not like them like.
  • Encourage integration in communities by altering the Home Mortgage Interest Deduction so that the deduction varies depending on how diverse a neighborhood is.
  • End the use of local property taxes to pay for public schools, and build school districts in such a way that they cross boundaries of racial segregation.

We need spaces where different factions can engage with each other seriously–where the marketplace of ideas is allowed to operate.

— from “Diversity isn’t what divides us. Division is what divides us,” by Ryan Muldoon.


We welcome you to dig into these white papers and comment. We collect comments and provide them to commissioners on the Knight Commission on Trust, Media and Democracy, to inform them as they develop a report and recommendations on how we can improve our democracy.

By Nancy Watzman