Stop Calling It Fake News.

Information disorder is complex, but fixing it starts with calling it what it actually is.

Harvard Kennedy School PolicyCast
Jan 31, 2018

Mis- and disinformation graphic from Information Disorder.

In 2016 and 2017, false information spread like wildfire across social networks, particularly in countries facing elections. While disinformation campaigns are nothing new, the scale of this phenomenon was unprecedented. Soon, it earned a nickname: Fake News. It’s a term that has since been repeated far and wide — most notably by politicians and autocrats seeking to discredit legitimate journalists — but also by the news media itself as it attempted to grapple with the problem.

But if there’s one thing our guests this week beg of you, it’s that you stop saying Fake N***.

On this episode of PolicyCast, we hear from Claire Wardle and Hossein Derakhshan, who recently co-authored a report that breaks down the complex issue of Information Disorder in order to help find productive ways to address it.

Wardle is a research fellow who leads the First Draft initiative out of the Kennedy School’s Shorenstein Center. Derakhshan is a Spring 2018 joint Entrepreneurship Fellow at the MIT Media Lab and Shorenstein Center.

Each week on PolicyCast, Host Matt Cadwallader (@mattcad) explores the ways individuals make democracy work by speaking with the world’s leading experts in public policy, media, and international affairs about their experiences confronting our most pressing public problems.

Transcript

Note: This transcript has only been lightly edited and may contain errors.

Matt: You spend a lot of time in this report establishing and defining terms for the various types of information disorder. Why was that important from the outset?

Claire: Partly because I am an academic by training, and academics love definitions, but really, very early at the beginning of 2017, I saw the term (and I don’t even use it: F, asterisk, asterisk, asterisk, news) being thrown around by politicians, but also journalists and academics, as a shorthand, I thought, or as a weapon, and it just felt like the conversations that we were having subsequently were actually pretty shallow and pretty useless, because we were talking over each other, because everybody meant different things by that term.

Claire: In February I went away and kind of came up with seven different types of mis- and disinformation. It really was very popular as a framework, and I thought, there’s something to this. For this report, Hossein and I have spent a long time thinking about the complexity of the whole ecosystem and saying, we can only really start talking about interventions if we understand what we’re talking about.

Claire: On something like this, some people are throwing in propaganda, some people throw in satire (opposite ends of the spectrum), some of their stuff is genuine but it’s just old and used out of context. We’ve got clickbait headlines as a problem. We’ve had just simple mistakes. There are so many elements to this, and so I think breaking it down and really explaining what we mean by it means that we can have conversations where at least we understand that we’re on the same terms, on the same ground, when we’re having those conversations.

…the bigger reason we shouldn’t be using [the term ‘fake news’] is because it’s being used by politicians around the world as a weapon against a free press, against free speech, as a lazy shorthand for anything they don’t like, but it’s being aimed at the mainstream media and it’s working. Research is now showing that more members of the audience believe that mainstream media are peddling fabricated information.
— Claire Wardle

Matt: Of course, beneath those asterisks are the words that should go unmentioned: fake news. I will apologize for saying it.

Claire: It’s quite all right.

Matt: This is a term that you hear everywhere. It’s used everywhere, from articles (journalists use it) to even academic literature. What is wrong, in your estimation, with the term fake news?

Claire: There are two very key reasons why we shouldn’t be using it. The first is that it’s unhelpful, as I just described. It actually doesn’t get to the heart of this. If we’re talking about fabricated websites created by Macedonian teenagers during the US election, I would struggle to call that news. It’s fake, but it’s not news. Similarly, if we look at a breaking news event, for example, yesterday, when we had the explosion in New York, we saw old imagery circulating. That was genuine imagery. It wasn’t fake, so actually the term is unhelpful.

Claire: But the bigger reason we shouldn’t be using it is because it’s being used by politicians around the world as a weapon against a free press, against free speech, as a lazy shorthand for anything they don’t like, but it’s being aimed at the mainstream media and it’s working. Research is now showing that more members of the audience believe that mainstream media are peddling fabricated information.

Claire: Trust in the mainstream media is at an all-time low as it is, so when journalists say to me, “Can I talk to you about this topic,” and they include that term in the headlines, and I push back and say, “Can we stop using it?” they’ll say, “Yeah, but search engine optimization means that we need the traffic.” Or academics, where I say, “Please don’t use the term,” and they say, “Yeah, but it’s going to help me get funding because it’s the sexy term right now.”

Claire: If we really care about this, we should understand that language matters. Language is powerful, and at the moment, lazy shortcuts, I think, are really damaging. I don’t think we recognize how seriously damaging this term is, and I just hope that in 2018 it will stop being the word of the year, in every country, by every dictionary, and we could start saying, “What do we really mean when we talk about this?”

Matt: That’s, in all honesty, something I’m going to struggle with when we publish this. People will know what fake news is, or at least, even if they don’t know exactly what it is, they understand the idea, whereas saying information disorder, it’s a little bit less … It’s not as hard [crosstalk 00:05:35].

Claire: It’s true, but I say to people, we understand what rumors are, so when I’m writing I talk about rumors, fabricated and manipulated content, propaganda, we do have terms for this. I think it might mean that we need to use more words but I think, as a shortcut, it’s lazy and it’s unhelpful and it’s being weaponized. Yes, it would make our lives easier if we had a shortcut term. We don’t, and so right now I think we just have to be much more clear about what we’re talking about.

Matt: As an alternative, you’ve proposed in your report, these three different types of information. I think you mentioned them before. Misinformation, disinformation, and mal-information. Can you break those down for us?

Claire: Yeah, and this is partly because I would hear lots of people talk about misinformation, and they didn’t really mean misinformation. Misinformation is false information that’s not intended to cause harm, so it’s a mistake. It might be my mom retweeting a picture of a shark during a hurricane, and she doesn’t know that it’s not from this hurricane, but she knows it’s kind of fun and she shares it. That’s misleading and so therefore it’s misinformation.

Claire: Disinformation is also false information, but the person who is creating or sharing it knows that it’s false and is sharing it with the intent to cause harm. We talk about that in a legal sense. Harm is a difficult aspect to define, but in this sense, it’s somebody who knows they’re trying to cause harm.

Claire: The third idea is mal-information, and that’s when it’s genuine information, but it’s revealed to cause harm: for example, revenge porn, or hate speech where, on some level, somebody’s identity is used as the basis of the attack.

Claire: For us, thinking about different types of information and trying to understand the difference between when it’s genuine and when it’s false, and what the intention is in terms of harm, is a way of breaking out those three different types, and we have a nice Venn diagram that describes it.

Matt: Another way that you break this down is you talked about the actors, the people who are involved with this. It’s not just about one individual, one organization, involved with all the stages of this. It’s really a process.

Claire: Absolutely, and this, again, came from a frustration: I spent most of 2017 sitting on panels at conferences where everyone thinks that in 45 minutes you can solve this problem. This was to say, “Okay, we need to break out the agents. They are distinct from the messages, and they are distinct from how those messages are interpreted.” Even if we just focus on the agents for a second: are these individuals? Are these loose connections of individuals? Are they state actors? Are they unofficial actors? What are their motivations?

Claire: We talk about four different types: financial, political, social and psychological. When people say, “What’s the solution to this problem?”, well, if we’re trying to stop the people who are financially motivated, there is a certain set of changes that we could make. Political disinformation involves another set of motivations. If it’s social, it’s another set of motivations again.

Claire: What we say is, for each of these aspects (agents, messages and interpreters) we need to ask questions. For every particular piece of content: who is the agent, what do we know about them, how do we make sense of them? I think, at the moment, we’re seeing this (which goes back to the terminology) as this one big problem that we can solve with a tweak of an algorithm, and actually, we’re not going to get there if we don’t understand all the different elements and all the different types of agents that are involved in this.

Matt: It’s interesting you mention an algorithm. Of course, social media seems to have played a huge role in this, or at least it’s part of the larger narrative about it. Propaganda and false information have been used throughout human history. Is this a particularly new problem or is it just a new format?

Claire: It’s certainly not a new problem, but what technology has done is make it much, much easier and much, much cheaper to manipulate content or to create fabricated content, so an eight-year-old with iMovie can create a pretty sophisticated manipulated video today. It’s very easy to create the manipulated content, and then there are the forms in which it spreads; we’ve never had technology that could spread information like wildfire, remembering that it’s traveling between trusted peers.

Claire: We’ve always had propaganda, but it meant that you were watching a broadcast or listening to a radio broadcast that was coming from somebody else. Now, that atom, that weaponized atom of information, comes from your best friend. That’s what’s changed: the technology has changed. What has always been the case is that, as humans, we’ve always shared rumors.

Claire: We’ve always shared information that we didn’t always know was a hundred percent true. We’ve always had that social connection through information, but what’s changed is the fact that this is being kind of supercharged in a way that nobody saw coming, and certainly for Mark Zuckerberg and the others in Silicon Valley, this is an unintended consequence. They didn’t see their platforms being used in this way.

When we saw those Russian ads that were up on those big cards during the judiciary committee hearings, I think most people were kind of shocked at how unsophisticated they looked, but that’s why they were so sophisticated, because they understand that, actually, as humans, we have these very emotional responses, particularly around strong cultural and identity issues. When people saw those ads, they weren’t stopping to do a reverse image search on those images or to question the messaging.
— Claire Wardle

Matt: One thing that you’ve pointed out is how social media has become a performative act. I think that’s interesting, because when we think about fake news or, sorry, information disorder. Oh, my God. I’ve got to catch myself. When we think about this, we often hear about what the Russians have done to affect the 2016 campaign, for instance, but I get the sense that this is far less about the agents who have political motives, like, perhaps, the Russians, and far more to do with the way that we, as just humans, use social media. We see it … It’s a platform where we have to perform for our friends and family.

Matt: I’m curious, how much of this is about those external agents, I guess, and how much of it is about us, is about how we consume and actively consume?

Claire: It’s a great point, and it’s, again, in our report, we talk about … I talked about three elements, before, the agents, messages and interpreters. We also talk about the three phases, which is the creation, production and dissemination, because Russia could create a whole host of amazing memes. If none of us share them, we haven’t got a problem. When we think about who creates this kind of content and produces it, sometimes it’s hard to really understand the motivations and why they’ve been created, but what happens is it gets shared by people because the people who created it understand emotional response.

Claire: When we saw those Russian ads that were up on those big cards during the judiciary committee hearings, I think most people were kind of shocked at how unsophisticated they looked, but that’s why they were so sophisticated, because they understand that, actually, as humans, we have these very emotional responses, particularly around strong cultural and identity issues. When people saw those ads, they weren’t stopping to do a reverse image search on those images or to question the messaging.

Claire: It absolutely appealed to their existing world views and they shared them like crazy. That’s why, when we think about this, it doesn’t always look sophisticated, but the people who are doing this, those people in St. Petersburg with whiteboards who mapped out those campaigns, they understood psychology in a way that I think, in America, we’ve been kind of shocked by. We were kind of looking for pretty strategic political campaigns in a traditional sense, and they’re not that at all.

Claire: They’re pretty unsophisticated, pretty ugly weapons, but they’re weapons disguised as what look like … kind of pretty silly memes.

Matt: It gets back to this idea that we see communication as broadcast. I’m telling you a truth and you now have that truth, but instead, what we really should be thinking of is communication as ritual.

Claire: Yeah.

Matt: How it’s all about tribalism and, as you said, reinforcing our world view. How do we handle that … I’d love for you to talk a little bit about that, about the ritualistic communication, but also, if the issue here is our built-in tribalism, the things that we use to separate ourselves from others, how can that be fixed?

Claire: I’m nodding because you’ve got to the heart of everything. In our report, we reference James Carey, who was this incredible communications scholar who, back in the 1990s, wrote a book in which he talked about two forms of communication: communication as transmission, which is, from A to B, there’s a clear message, and then ritual forms of communication.

Claire: In our work, it’s very much about understanding that aspect.

Matt: Mm-hmm (affirmative). Okay, so, I don’t want to cut you off but it seems Hossein is here, so we’ll hold off and welcome him. Hossein, how are you? Pleasure to meet you. Thanks so much for coming in.

Hossein: Hey.

Claire: Hossein has a lot to say about this ritual aspect of communication, but to your point about where this leads us: partly, the response to this is that many people have simply said, “We just need more fact checking. We just need more quality information in the ecosystem,” and when we have those conversations, we completely miss this aspect, which is that even those of us who are most educated (and actually, Dan [inaudible 00:14:53] at Yale has done some great work on this), even those of us who are more educated and more literate and more numerate, are actually more likely to do mathematical somersaults to make sense of data so that it reinforces our world view.

Claire: When we’re trying to think about solutions, this is the most difficult aspect, because we have to stop thinking like rational academics, understand the power of emotion, and say, “How do we recognize that this is how people find and consume information, and this is what they share, because they’re performing?” The biggest challenge over the next few years will be getting out of our sense that quality information is the answer.

…before we start thinking we need to feel something, and it’s only after that that we know what to think. Before having any emotions, we don’t know what to think. We are disoriented in many ways. That’s why he says that metaphors determine, to a large degree, the way we relate to the world. That’s how his famous work, Metaphors We Live By, came into existence.
— Hossein Derakhshan

Matt: We’ve been talking about disinformation — specifically about communication as ritual and how we talk to each other and how we use information to enforce our own beliefs about our own tribe, essentially. I’m curious for your take on this.

Hossein: I think that’s the core of the problem now, because everything that everybody is thinking is within the discourse of communication as a transmission between two groups of people, a transmission of a message, but history shows us that this is not the only way that communication works. Increasingly, we face the reality that it’s much more complicated. The way people subscribe to or relate to the factual world is not always factual, because we are very complex creatures and we have hearts as well as minds.

Hossein: We have emotions, and that’s how we relate to things. Actually, it’s very interesting, because one of the people who’s looked at this very carefully and very clearly is George Lakoff, the linguist. He says that before we start thinking we need to feel something, and it’s only after that that we know what to think. Before having any emotions, we don’t know what to think. We are disoriented in many ways. That’s why he says that metaphors determine, to a large degree, the way we relate to the world. That’s how his famous work, Metaphors We Live By, came into existence.

Hossein: Actually, it’s not a very new work, but it has increasing relevance within the American political system. Then, when it comes to communication studies, it’s very important to look at James Carey’s work. He also talks about looking at communication as something beyond the transmission of messages, and then he explains how the emergence of the telegraph made this possible, because before the telegraph, communication and transportation were the same. You couldn’t communicate with anyone in any part of the world, or your country, or your city, without transferring a message using trains or some other transportation system.

Hossein: He says the telegraph separated these two things for the first time, and somehow, I think this distinction has become detrimental to the study of communications, because everybody, especially in the American school of communication studies, has usually, or maybe even always, looked at communication as this transmission.

Hossein: Whereas the ritual aspect, which comes from the real meaning of commune and community and all those things, first of all has very strong religious connotations. It’s about relating to a world view rather than, obviously, the transmission of a message. Basically, what he says (it’s an amazing quote; I’m not sure if Claire mentioned it already) is something along these lines: “When you read a newspaper, it’s as if you’re participating in a ritual, or you are going to church, or you join a club.” This is what is happening now.

Hossein: I think that explains a lot why we are suddenly worried about the post truth era, the spread of disinformation, and basically information disorder.

Matt: It’s easy to see how social media can make this work. As Claire spoke about before, the ease of production is, it’s much easier to produce anything. It’s much easier to share. It’s much easier. You’re seeing what your friends and family are posting, you’re performing for them. You’re both consuming identity stuff and putting forth what your identity is.

Matt: Given all of that, is there something we can do? Short of shutting down social media and making it illegal to share on Facebook, is there something that can be done to help put guard rails on this?

Hossein: Not really, to be honest, because it’s a very deep problem. It’s related to the socio-political situation. It’s related to what I call, actually, the post-Enlightenment era that we’ve entered, where reason and facts and text are replaced with feelings and emotions and images. It’s really hard to expect people to be rational thinkers when they don’t have time to read anything more than one or two paragraphs.

Matt: Right. I’m glad you mentioned the images in particular, because that seemed to be a big theme: the difference here is images. It’s not full articles of false information, but an image with maybe a few words over top. How does that change things?

Claire: It changes things in the sense that you don’t have to click through to an image. On social media, which is increasingly designed to be visual, this stuff is very powerful. You don’t have to expect somebody to link through to it. Our brains are a lot less likely to be critical of visuals. We are trained to trust visuals, and so we don’t need to use the same brain power, or we don’t think we do, to read an image. It goes back to the terminology again, talking about whether this is news. Do we talk about those Russian ads as news?

Claire: I think, because of that, a lot of this discussion has been focused on news and text, and so what that means is the platforms themselves haven’t been focused enough on how do you monitor and make sense of visuals? From a technology point of view, we’ve not thought about memes, and actually they have become the most powerful vehicles of disinformation.

Matt: Mm-hmm (affirmative). I wanted to talk a little bit more about the Russian interference in the election via both ads and shared memes. There was recently an article in the Columbia Journalism Review that questioned the importance of all of this discussion of the power of these memes. One thing it noted was that, in sheer numerical terms, the information to which voters were exposed during the election campaign was overwhelmingly produced not by fake news sites, or even by alt-right media, but by household names like the New York Times and the Washington Post.

Matt: They referenced one study that said an average American probably saw one of these fake ads or memes over the course of the entire election. Given the relative [inaudible 00:23:13] of misinformation or disinformation out there, are we making a bit too much out of something that is … a problem, to be sure, but something that only affected things around the edges?

Hossein: Yes. I think the threat that everybody’s talking about, of all aspects of this information and its effect on elections especially, is definitely exaggerated. Media cannot create anything out of the blue. They cannot make someone be seen as ordinary or as … They can’t create popularity, for example, for politicians. There must be some popularity, and they can only reinforce it, or maybe harm it a little bit. This whole debate about the effect of disinformation, I think, is a bit exaggerated when it comes to elections, but then there are things that we can do to prevent even this minimal impact, whether it’s about foreign interference, or about things that platforms or others could do to stop the spread of disinformation.

Hossein: Disinformation has always been produced, but what is new here is that it’s become very cheap and easy for bots, for committed activists, or for organized groups of people, paid agents, to disseminate it and to amplify it. That’s why we talk a lot about manufactured amplification, which is actually easy for the social media platforms to spot and detect, and they can crack down on it. This is what we can do in terms of …

Hossein: Going back to your previous questions about the solutions, one of the other things that I think platforms should look into is acknowledging the fact that the realm of news is not emotions. We cannot react to news based on our emotions. We cannot show that we like or dislike a piece of news. We can only say we agree or disagree with an opinion, if it’s about an opinion, or we can say, for instance, that we trust or suspect a news story.

Hossein: Liking a news story is like agreeing with a piece of music. It’s just as irrelevant. I think, when it comes to formal news, they should change the way people can engage with it and change those buttons from likes to something else: agree, disagree, suspect, trust, to try to change people’s relation to that piece of information, just as an example, because this emotionality is really detrimental.

Matt: I should also note that the report comes with a lot of recommendations for all sorts of organizations, news organizations, social media companies, governments, et cetera, but that’s just a side note. I know, Claire, you mentioned that you’d read that article before. I’m curious for your thoughts, as well.

Claire: Yeah. I was not very happy with the way that they were juxtaposed, this idea that it wasn’t the fabricated news stories that were the problem, it was the mainstream media, when actually the only news outlet that was measured was the New York Times, and I don’t think it’s an either/or, so I wasn’t crazy about that.

Claire: It’s important that we do more research on this, and I think one of our biggest challenges is how we measure the impact of this type of content. That study that was mentioned came out very quickly, I think it was even last December, saying, “Don’t worry, everybody. There’s no impact.” Actually, Facebook’s whole model is based on the effectiveness of their ads. It’s very difficult … It is notoriously difficult in media studies to find media effects, so what we really need to understand is how this was used, even offline: who read some of those stories about the Pope endorsing Trump and came downstairs to dinner and said to everybody, “You’re not going to believe this”?

Claire: How we measure this stuff is really, really difficult to do, and I think, actually, in an election like we had, when the numbers were so small, voter suppression concerns me more than anything. A lot of the ads, a lot of the fabricated content, was pro-Trump or against Hillary. If you already hated Trump or loved Hillary, or vice versa, that probably wasn’t going to change your mind. What this might have done was suppress the vote. One of the famous visuals that was circulating was micro-targeted to minority communities in the States.

Claire: It targeted districts, saying, “You don’t have to go out and vote for Hillary. You can stay at home and vote via text message.” That’s the kind of disinformation that I really worry about. I’m less worried about reinforcing whether the Pope endorsed Trump, but again, going back to world view, if you are Catholic, that’s an incredibly powerful message.

Claire: I think, in the abstract, when we as a whole think about these ads and say they were effective or they weren’t effective, we don’t yet know, and I think it’s dangerous to make bold claims. I certainly agree with the authors of that article that the New York Times did a pretty horrible job of talking about policy in the lead-up to the election.

Claire: October was entirely about Hillary’s emails, and there was no substantive conversation about how the candidates stacked up against each other. For a newspaper of record, I think that really was unforgivable. That’s an important part of that research, but I wasn’t very happy with the framing of, “Look at this and look at that.” I would say that to the authors’ faces.

Matt: Hossein?

Hossein: There’s something very important in terms of impact and the audience’s understanding or reception of these messages. We’ve tried to show that with our model, which basically has three phases, three elements, and three types of information. In the three elements, we have the element of the interpreter, and there is a reason we use the word interpreter and not recipient: because people are not simple. People have brains, obviously, and when they receive a message, they interpret it first, and then they act upon it, or it affects their world view, or whatever. That’s why, for example, going back to Stuart Hall’s distinction between the three types of reading that anybody can have with a text, which are basically hegemonic, oppositional, and, what was the third one? I forgot the third one, but everybody can …

Matt: It’s the non-important one.

Hossein: Yeah, exactly. The oppositional reading is hugely important, and this is actually … Ironically, I come from Iran. This is where I was born and raised. Interestingly enough, in some authoritarian systems, people are much more inclined to read things oppositionally, or with a negotiated reading (that was the third term, negotiated reading). Compared to more liberal, free-media societies, people there are always suspecting the source and the validity and the truth of the information they get, because it’s mostly news that comes from official channels. They kind of … They are trained with these amazing media literacy skills of always suspecting, always questioning, always interrogating and negotiating with the text.

Hossein: Sometimes, you can easily see what the reality is by looking at the propaganda in an oppositional way. That’s amazing, because, another personal story, I spent some time in prison in Iran, and we didn’t have free access to all kinds of newspapers there. There were only one or two newspapers, very close to the government propaganda line, and I was reading them, especially one of them, which is called [inaudible 00:31:24]. It’s famous for being read oppositionally.

Hossein: Whatever you read there, you could actually see what was happening in reality outside this realm of the world view of this small newspaper. I thought people in more liberal societies don’t have that skill, which is very ironic.

It’s a very difficult line that we’re treading here, which is: how do we inform people while making them understand that quality information is the key to any democratic society? People need quality information to make decisions, but we’re at a time when we have polluted information, so we need to be more skeptical, but we shouldn’t lose trust in everything.
— Claire Wardle

Matt: Is that something that can be learned? Also, how does that feed into this: earlier on, Claire, you mentioned that part of what makes the term fake news problematic is that it has led people to distrust mainstream news organizations. How do we build up that ability to look skeptically upon the sources of information that we’re reading or looking at, while at the same time building back that trust in mainstream media institutions?

Claire: I think part of this is that news organizations have been terrible at actually explaining their processes and procedures and policies to the audience. In the last week, we’ve seen the cases at ABC and CNN where serious editorial mistakes were made, and actually now we’re seeing discussions about what those processes were. What do you have to do at ABC in order to get something on air? [inaudible 00:32:50] that most people have no idea about all of those pretty strict policies that news organizations have.

Claire: I also see news organizations wanting to make themselves distinct from these other publishers (I use publishers in a liberal sense), and so they’re thinking about ways that they can talk about those corrections policies and make it clear that they are distinct. I think news organizations have to be better, hold themselves up when they make mistakes, and make fewer mistakes. That’s the other thing. Unfortunately, newsrooms have been gutted. Local news in this country has horrifically … has been taken away to the point that we have awful news deserts.

Hossein: The funding for serious journalism …

Claire: Yeah, and in that era, we’re now expecting journalists to do more, but what we’re seeing is more mistakes, and when that happens it allows politicians to point the finger. The other thing I’d say, unfortunately, is that in that race for traffic, we’re seeing a lot of journalism reporting on disinformation because any headline with the word bot, Russia, cybersecurity, hacking, or F news gets traffic. What that means is we’re sort of writing stories that kind of say to people, “The system’s broken. You can’t trust anybody. There are outside forces trying to bring down democracy.”

Claire: In many ways, when I look at the ballot box in the midterm elections, I think, “What are the unintended consequences of the reporting that we’re currently doing on disinformation that’s going to cause people to just give up and say, ‘I don’t know who to trust’?” They’ll turn away from information sources and they probably won’t turn out to vote.

Claire: It’s a very difficult line that we’re treading here, which is: how do we inform people while making them understand that quality information is the key to any democratic society? People need quality information to make decisions, but we’re at a time when we have polluted information, so we need to be more skeptical, but we shouldn’t lose trust in everything.

Hossein: Coming from the Middle East, I think we should also acknowledge that part of this problem has been associated with the democratic-leaning, more liberal press as well. Nobody should forget that the whole disinformation campaign about WMD in Iraq was fed into the most liberal newspaper in the US, and it took maybe four or five years until they realized it and apologized, the New York Times, for how they helped the Pentagon’s line to attack Iraq and to occupy Iraq, which has been the source of so many problems in the region.

Hossein: Obviously, everybody knows that. There is a degree of truth to this mistrust in mainstream media, but then, at the end of the day, it all comes down to people’s rational thinking skills, critical thinking skills. Instead of giving people fish, we have to teach them to fish. The fishing here is media literacy, which is connected to public education. That’s why you see that in Northern Europe, these campaigns are much less effective compared to North America and especially the US, because the public …

Claire: You mean much more effective.

Hossein: Much more effective, because in the US (as far as I’m hearing; I haven’t spent that much time here) the public education system is not good enough when it comes to giving ordinary people these critical thinking skills. Obviously, it’s much easier to manipulate people (even though they are very sophisticated, everybody’s very sophisticated), it’s much easier to trick them or to manipulate them in so many ways. Ultimately, it comes down to larger socioeconomic structures, and I don’t think these problems will go away until those things are solved and the public education system is robustly reviewed and restructured.

Hossein: The welfare state is also related to this, because the main difference with those Northern European countries comes down to the welfare state, and the education system is connected to that.

Matt: The name of the report is Information Disorder: Toward an Interdisciplinary Framework for Research and Policymaking. Hossein Derakhshan, Claire Wardle, thank you so much for joining us on PolicyCast. We really appreciate it.

Hossein: Thank you. We’re both on Twitter if anybody wanted to read us.

Matt: What are your handles?

Hossein: Mine is, it’s difficult because I have to use numbers. H0D3R, that’s mine.

Claire: Mine’s cwardle but the L is a number one.

Matt: Okay.

Hossein: It would be probably easier to search our names.

Matt: We’ll certainly have links in the show notes. Thank you again, really appreciate it.

Hossein: Thanks for having us.
