For online media literacy that works, speed and ease matter

Hygiene checks of online info can be as simple & automatic as hand washing & seatbelt wearing

It is a profoundly erroneous truism, repeated by all copy-books and by eminent people when they are making speeches, that we should cultivate the habit of thinking of what we are doing. The precise opposite is the case. Civilization advances by extending the number of important operations which we can perform without thinking about them. Operations of thought are like cavalry charges in a battle — they are strictly limited in number, they require fresh horses, and must only be made at decisive moments. — Alfred North Whitehead, An Introduction to Mathematics (1911)

My younger daughter is learning to drive right now, and it’s a nerve-wracking process. You realize pretty quickly that what separates you from next week’s car wreck is not really anyone’s amazing driving skills. It’s habits. It’s scanning the crosswalk before making the turn. It’s the mirror and head check before changing lanes.

As Whitehead noted, habits are underrated because when they work they disappear. They reduce cognitive overload to manageable levels, and allow us to focus on more complex tasks. Over time, we identify those complex tasks as the activity we’re engaged in: most people who think they are good at driving cite their fluency or precision, not their automatic and unthinking use of turn signals. But take away the habits, the overload returns, and bad decisions ensue.

The debate around whether media literacy can work to reduce the impact of misinformation has often focused on whether teaching citizens to think more deeply about media can help, or whether people are pursuing truth-neutral strategies for reasons having little to do with knowledge. At the Digital Polarization Initiative we’re interested in something more concrete: how do we reduce the cognitive overload associated with truth-seeking behavior? How can we develop methods that require less thinking when initially encountering information? And will students — once that overload is reduced — fall back less on simplifying frameworks of confirmation bias, tribalism, cynicism, and conspiratorial thinking?

We’ve worked relentlessly to streamline these techniques over the past two years. What follows are some of the techniques we teach our students as they learn to navigate the web.

Is this the right site?

Here’s an example: recent news revealed that Russian-connected entities were trying to spoof conservative sites for possible spear-phishing campaigns. So how do we know if the Hudson Institute site we are on is really the real site? Here’s our two-second scan of the crosswalk:

The steps:

  • Go up to the “omnibar”
  • Strip off everything after the domain name, type wikipedia and press enter
  • This generates a Google search for that URL with the Wikipedia page at the top
  • Click that link, then check in the sidebar that the URL matches.
  • Forty-nine out of fifty times it will. The fiftieth time you may have some work to do.
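For readers who like to see the logic spelled out, the steps above can be sketched in code. This is a minimal illustration, not part of the Digital Polarization curriculum: the function names are hypothetical, and it only does the string work (stripping a URL to its domain, building the omnibar query, comparing against the official URL from the Wikipedia sidebar) — the actual search and the sidebar lookup stay manual.

```python
from urllib.parse import urlparse

def wikipedia_check_query(url: str) -> str:
    """Strip a URL down to its domain and build the query you'd
    type into the omnibar: the bare domain plus 'wikipedia'."""
    domain = urlparse(url).netloc.lower()
    # Drop a leading 'www.' so www.hudson.org reduces to hudson.org
    if domain.startswith("www."):
        domain = domain[4:]
    return f"{domain} wikipedia"

def domains_match(site_domain: str, wikipedia_listed_domain: str) -> bool:
    """Compare the domain you're on against the official URL shown
    in the Wikipedia sidebar, ignoring case and a 'www.' prefix."""
    def strip(d: str) -> str:
        d = d.lower()
        return d[4:] if d.startswith("www.") else d
    return strip(site_domain) == strip(wikipedia_listed_domain)
```

So a link like `https://www.bloomberg.ma/news/some-story` reduces to the query `bloomberg.ma wikipedia`, and `domains_match("bloomberg.ma", "www.bloomberg.com")` comes back `False` — the fiftieth time, when you have some work to do.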

In this case, the URL does match. What does this look like if the site is fake? Here’s an example. A while back a site at bloomberg.ma impersonated the Bloomberg News site. If you arrived at the site via a link and found an important story, you’d want to check the site before you shared it. Here’s what that would look like:

In this case you scroll down and click the link. Checking the associated URL in Wikipedia, you find it is different. It’s not the real site. If you’re lazy (which I am) you might click that Wikipedia link to get to the real site.

What is the nature of this site?

Let’s stick with the Wikipedia technique for a moment, because it’s useful for a few other questions. As an example, let’s take one that got past both a Washington Post reporter and the WaPo fact-checkers a month or so ago. Question: Is this article really by the lead singer of Green Day?

Let’s check:

Again, same process. Now, does this mean that you are 100% sure that it’s not Billie Joe that wrote that article? No — there’s a slight slight chance that maybe somehow the lead singer of Green Day wrote a —

Nah, you know what? It’s not him. Or if it is, the chances are so infinitesimal it’s not worth spending any more time on it. Find another source.

How about this site, and its searing commentary on Antifa and journalists?

Maybe you agree with this article. I don’t, but maybe you do. And that’s okay. But do you want to share from this particular site to your friends and family and co-workers? Let’s take a look!

You can dig into this if you want, and look through the numerous links in that Wikipedia page that support this description. Maybe have a little mini-forum in your head about the differences between white nationalism and white supremacy.

Or maybe — here’s a thought — find a similar article from some other site that hasn’t been called a white supremacist organization by half a dozen mainstream groups. Because no matter what you think of the article, funneling friends and family to a site that has published such sentences as “When blacks are left entirely to their own devices, Western civilization — any kind of civilization — disappears” is not ethical — or likely to put you in the best light.

Is this breaking news correct?

Here’s some breaking news.

More people than you would think believe that the blue checkmark = trustworthy. But all the blue checkmark really does is say that the person is who they say they are, that they are the person of that name and not an imposter.

Your two-second “mirror and head-check” here is going to be to always, always hover, and see what they are verified for. In this case the verification means something: this person works for CNBC.com, a legitimate news site, and she covers a relevant beat here (the White House):

But maybe you don’t know CNBC, or maybe you see this news from someone not verified, or verified but not as a reporter. How will you know whether to share this? Because you know you’re DYING to share it and you can’t wait much longer.

Use our “check for other coverage” technique:

When a story is truly breaking, this is what it looks like. Our technique here is simple.

  • Select some relevant text.
  • Right-click (or Ctrl-click on a Mac) and choose “Search Google for…”
  • When you get to Google don’t stop, click the “News” tab to get a more curated feed
  • Read and scan. Investigate more as necessary.
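If it helps to see the moving parts, here’s the search step as a small sketch. The function name is made up for illustration; it builds the same Google query the right-click menu does and tacks on `tbm=nws`, the (unofficial but widely used) URL parameter that jumps straight to the News tab.

```python
from urllib.parse import quote_plus

def news_search_url(selected_text: str) -> str:
    """Build a Google search URL for a selected snippet of a story,
    landing on the News tab for a more curated feed."""
    # quote_plus handles spaces and punctuation in the selected text;
    # tbm=nws switches the results to Google News
    return ("https://www.google.com/search?q="
            + quote_plus(selected_text)
            + "&tbm=nws")
```

Selecting a headline fragment and feeding it to this function gives you the same page as the manual steps: a News-tab view where you can scan for sources you recognize.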

Scan the stories. If you want to be hypervigilant, scan for sources you recognize, and consider sharing one of the stories featuring original reporting instead of a clickbait rehash or unsourced rumor.

Start with the head checks

Do these techniques fail sometimes? Of course they do! Do they sort all issues into neat true and false columns? Thankfully, no.

But as with Whitehead’s operations, they conserve student effort for the larger problems. They break the relentless cycle of reaction by getting you to read less and learn more before pushing you to make judgments. Given that belief in false news may be more about cognitive laziness than tribalism, reducing the amount of effort required to make such judgments has to be central to any digital literacy effort.

