A group of people tries to solve a mystery in an escape room. Photo by David Hoffman via Flickr (CC BY 2.0)

‘You have to hide the vegetables’ — The need for new directions in misinformation programming

Chris Coward
Center for an Informed Public at UW
Oct 16, 2020


This post is an edited version of a talk the Center for an Informed Public’s Chris Coward delivered at the Washington Library Association’s annual conference on October 9, 2020.

Good afternoon. I’m Chris Coward, senior principal research scientist at the University of Washington Information School, where I am also director of the Technology & Social Change Group (TASCHA), and a co-founder of the Center for an Informed Public (CIP).

One year ago we established the CIP as a university-wide center to study misinformation and translate our research into policy, technology design, curriculum development, and public engagement. We are one of five centers across the country that received start-up funding from the Knight Foundation, and the only one that has made public libraries a prominent focus. We believe that libraries are uniquely positioned to play a critical role in combating misinformation. But how? This question motivates much of my work.

My background is in access to information, digital equity, information and digital literacy, and public libraries. Having immersed myself in misinformation research for a couple of years, I’m now at a point where I can summarize my position on libraries in the following two statements.

First, public libraries have largely been bringing the wrong tools to the fight against misinformation. Second, public libraries have unique assets to combat misinformation; they just need to be deployed differently.

Why this first statement? Harvard’s Connor Sullivan puts it well in his article, “Why librarians can’t fight fake news,” wherein he argues that in the effort to counteract misinformation there has been a tendency for libraries to double down on two pillars of what he calls the library faith — access to quality information and information literacy skills. An example of the first is the 2017 ALA declaration that reads: “Access to accurate information, not censorship, is the best way to counter disinformation and media manipulation.” The latter is exemplified by his quoting from Nicole Cooke’s book, Fake News and Alternative Facts: Information Literacy in a Post-Truth Era: “The bulk of disinformation on the Internet could be combated with basic evaluation skills.”

I think we can all agree that pointing people to quality information is not a viable strategy in and of itself. The second pillar, information literacy, is probably more contentious, and it's where I'd like to focus for a couple of minutes. The claim is that information literacy (IL) is not up to the task. Let's unpack this. Here I summarize four observations of how information literacy-based approaches have not accounted for the ways in which misinformation works.

The first observation concerns directionality. IL largely assumes that the individual is the one seeking information. It’s the individual who learns to locate (search), evaluate and use information. Misinformation, on the other hand, often works in the opposite direction. It’s the information that seeks out the individual. It does this through your social media feeds, and algorithms that prioritize information that draws you in.

Two. IL is largely presumed to be a rational process. That is, if you learn how to search and engage with information critically, you will be a savvy information consumer. Misinformation, on the other hand, is so effective because it exploits psychological and emotional triggers, like anger, confusion, and fear. I have not come across IL-based curricula that address the emotional side of information.

Three. IL treats each engagement with information as a distinct, single instance. What I mean by this is, IL presumes we can effectively evaluate a piece of information at a single point in time. The challenge with misinformation is that the people who want you to change your mind about something know that they need to lead you there gradually, over weeks or months. It's a cumulative approach, and it is typically extremely difficult to detect misinformation in the first piece of information you encounter. Take the issue of vaccinations as an example. The first YouTube video you watch will lay out the evidence of vaccine safety, but also introduce the other side, since good journalism is supposed to do this, right? Since you've clicked on this video, you're marked, and later your feed will highlight another video on the subject, this one starting to raise more doubts about the safety of vaccines. And so on, until several months later you've become an anti-vaxxer. It's the rabbit hole effect.

Number four concerns a contextual assumption. IL is focused on an individual’s engagement with information, as though one does this in isolation. What we need to do is understand the social context of information. Let’s say it’s a classmate who shares the original vaccination YouTube video, because it’s for a class project and everyone is sharing resources they find. Now everyone has viewed it, creating new vectors for a disinformation campaign to spread.

I’ve only offered a cursory overview of what are deeply complex issues that require much more time to discuss. My hope through this first part of my talk has been to illustrate why doubling down on conventional IL approaches is insufficient for addressing misinformation.

Moving from the conceptual to the practical, my research team at TASCHA wanted to learn more about the professional experiences of librarians when dealing with misinformation. Earlier this year we interviewed 12 librarians across Washington State and held a 22-person workshop (before COVID-19), hosted by the King County Library System. This is what we found.

Among this group, most had provided ad hoc support to patrons who had questions about misinformation, but few offered formal programs on misinformation per se. Among those who did, we heard that programs marketed with titles including terms such as “fake news,” “misinformation” or “post-truth” tended to reach only those who were already familiar with and concerned about the problem. It's the preaching-to-the-choir effect.

With regard to their abilities, most librarians said that they felt unprepared to effectively help their patrons. This was driven by a general unfamiliarity with the technologies and tactics of misinformation. How does misinformation spread? How do social media bots work? What are deepfakes? And so on. In this regard, we surmised that these librarians recognized that IL-based approaches alone would not get the job done.

In part, their lack of confidence in their own abilities could be attributed to their nuanced understanding of the problem. They recognized misinformation as a complex issue that traditional programming does not sufficiently address. One librarian made this statement: “[People are] not interested necessarily in hearing anything that challenges their viewpoint; they really want something that supports it.” This illustrates the challenges librarians face when trying to help their patrons use credible information.

We also asked librarians to offer suggestions. First, we heard the need for programs that are respectful, especially as misinformation has become a political and polarizing topic. No one wants to be told they lack the ability to make judgments for themselves, and librarians risk alienating patrons if they come across as preachy or as challenging patrons' beliefs.

Second, they called for misinformation programming that is more obviously relevant to people’s lives. That is, rather than generic programs on, say, detecting fake news, librarians told us it would be more effective to build programs on topics that are of interest in their communities. One suggestion for achieving this was to weave misinformation into existing programs.

Third, we heard the need for programs that address misinformation indirectly, perhaps not even using the word in the title or description. Just the mention of misinformation (or fake news or post-truth) can have the effect of deterring people from joining a program. This was summed up nicely by one librarian who said, “you have to hide the vegetables.”

These and other suggestions are tremendously helpful, since they are based on the interactions of librarians with patrons in their communities.

In my final few minutes I’d like to describe a project we are working on that attempts to incorporate misinformation research (and the limitations of IL-based approaches mentioned earlier) and the insights and suggestions I just mentioned. For my research team, this was a design challenge.

We arrived at a misinformation escape room. What better way to hide the vegetables than inside the puzzles of an escape room? Escape rooms, as many people probably know, are live interactive adventure games in which a team of players (typically four to eight) works cooperatively to solve puzzles in a set amount of time, usually an hour. A worldwide phenomenon, they are also popular in public libraries, where they have been deployed to offer STEM learning, curriculum support in partnership with schools, and information literacy instruction, in addition to their entertainment value.

Together with colleagues at the Information School, students, and a leading designer of escape rooms, we started work in the spring. We interviewed librarians who have hosted escape rooms, and tested different ways of embedding learning goals into the puzzles and the escape room narrative. Due to COVID-19, we decided to first build an online escape room, but it is played live with a librarian serving as the host. Players engage with each other and the librarian over Zoom, and interact with the puzzles through a website. The escape room is designed around a fictional theme, and does not have “misinformation” in the title or description. But it does immerse players in a world of manipulated images and video, doctored figures and charts, and social media influencers. We've set the game at 45 minutes, with a 15-minute debrief. The debrief time is extremely important. Game literature shows that most learning happens when players have an opportunity to reflect on an experience and put it into a non-play context.

The alpha version is now complete, and we will start internal testing this month, with plans to deploy it to a limited number of libraries in the winter to gather patron data on the experience. We are particularly interested to learn whether the escape room achieves any of its emotional learning objectives, such as shifting players' feelings about the consequences of spreading misinformation. If all goes according to plan, the escape room will be available for broad use some time next year.

To sum up, public libraries are a critical institution for addressing misinformation, and librarians, as information professionals, are ideally positioned to step up to this challenge. I don't believe we can rely on conventional information literacy-based approaches, but collectively we have the capacity to develop new programs that can make a real difference.
