People don’t want help finding the truth? Not so fast.

“You can’t correct fake news!” “People don’t care!” Research findings about misinformation this week have made for somewhat depressing reading, especially if we’re too glib in our summaries.

But I would urge anyone who cares about news and truth not to fall into a pit of despair just yet. Misinformation science is still very young. Many of the results we saw this week were first attempts to answer very large and vexing questions.

That means these results are far from conclusive. And even if we could be certain that these findings are correct, there are still many unanswered questions. Within these gaps may yet lie the path towards addressing our modern information conundrum.

What do these latest findings tell us?

A study by Gordon Pennycook and David Rand found that tags on Facebook showing stories as “disputed by 3rd party fact-checkers” — one of the platform’s most prominent anti-misinformation initiatives — only had a modest effect on readers’ evaluation of the stories’ accuracy. At the same time, the study revealed a serious unintended consequence: When some fake stories were tagged as disputed, participants were more likely to believe the un-tagged stories were true (compared with participants in a control condition, who saw no tagging whatsoever). This side effect was particularly strong among Trump supporters and 18- to 25-year-olds.

It seems that since flags are applied to some, but not all, stories, many people then infer that any story without a flag is true.

So far, so worrying. Still, as Politico points out, there are a number of reasons not to throw our hands up in defeat. The “disputed” tag is just one part of Facebook’s efforts, which also include adding fact-checks to “related articles” modules, shutting down fake accounts (not just Russian ones) and reducing financial incentives for fake news and other misinformation.

Then there’s news literacy. A Facebook spokesman said the platform’s efforts also include “helping people make more informed choices about the news they read, trust and share” — but Facebook has done little such work. (Back in April, they seeded people’s news feeds with PSAs whose tips could be a lot more helpful than they are.)

In fact, so far, very few actors have made any serious attempts to improve adult news literacy. This is frustrating, because the work is crucial. After all, if people so easily assume that stories without a disputed tag must be true, that demonstrates a major gap in understanding — not just in the very specific matter of what fact-checkers are capable of, but in how every member of our citizenry approaches every article they read. The default reading mode needs to be something like “open, but critical” — not “swallowing whole unless I’m told otherwise,” and also not “believing nothing.” (Disclosure: I’m the co-creator of Post Facto, a game designed to teach adults news literacy skills.)

You can grumble that it’s a lost cause, or you can see this as one of the biggest social challenges of our generation, and get to brainstorming solutions. I choose the latter.

Do people want news literacy training?

Oh, but what’s this? A Pew study, also out this week, uses survey data from 3,015 adults to outline a five-fold typology of the modern American, based primarily on a) their level of trust in information sources and b) their interest in learning, especially about digital skills. Pew estimates that about half of U.S. adults fall into the Doubtful and Wary typologies — and these people are relatively uninterested in building their news literacy skills.

What’s more, another 16% of Americans fall into the Confident group — and as their name implies, these highly educated folks don’t feel they need news literacy training. They could well be wrong, given the power of the Dunning-Kruger effect. And while I’m not aware of studies that specifically investigate the correlation between educational level and misperceptions, there’s evidence that high levels of numeracy and science literacy are no protection against misinformation about science.

To quote Eleanor Shellstrop, “Oh, fork.”


But wait! What does the data really say? How did the researchers ask about people’s appetite for news literacy improvement? Well, respondents were asked:

  • How much they think their decision-making would be helped by training on how to use online resources to find trustworthy information
  • How much they think their decision-making would be helped by training to help them be more confident in using computers, smartphones and the internet
  • Whether they “occasionally” or “frequently” need help finding the information they need online.

Yes, I certainly am disappointed that so many Americans answered “not much.” But I’d argue that this handful of questions cannot reveal the complexity of Americans’ relationship with their information needs and challenges. Not only are the questions few in number, but they’re also pretty abstract — it’s possible they just didn’t mean much to the respondents, or really make them think about their particular information challenges.

It’s possible, too, that the word “training” is pretty off-putting. How much would the answer change if we asked whether they’d like “help”? Or “resources”? There are probably people who can’t commit to a formal training program, but would welcome the occasional assistance. Understanding that will help news literacy efforts to meet these people where they are.

Here are some more questions we could ask:

  • Do you always know whether the information you read is true?
  • How often would you say you confidently make that determination?
  • What information would help you to make that determination?
  • How much do you think you’d be helped by more information on news organizations’ practices? On what makes reporting good or bad? By easier access to databases of facts? By help evaluating whether a story is true?… and so on.

A news literacy research agenda

A research agenda for investigating these questions would probably combine quantitative and qualitative approaches. Focus groups could help us understand what questions resonate with people, and what assumptions we’ve made. Surveys and experiments will help to quantify the attitudes that people hold.

By noting the limitations of Pew’s question set, I mean no criticism of their methodology. The researchers were trying to understand the interaction of a number of psychological and sociological factors, without writing a survey so long that participants would tend to drop out.

What I am suggesting, though, is that this is just the beginning of what must be an intense investigation of how people view their own information literacy skills and gaps. In the past decade or so, we’ve started to build a picture of how people process misinformation and corrections. This work is vital, but if we are to craft solutions, we also need to understand people’s attitudes towards their own information literacy, or lack thereof.

That’s why, while I sometimes find myself catching my breath at the sheer scale of the problem before us, I also see a reason to roll up my sleeves and pour that concern into the serious work of research. There is no time to lose.

Cross-posted at tamarwilner.com

Edited intro Sept. 15, to make clear that those quotes aren’t good summaries of the findings.