The Breakdown: Claire Wardle on disinformation and today’s journalistic conventions

Unpacking journalists’ role in debunking (and inadvertently spreading) disinformation

Berkman Klein Center
Berkman Klein Center Collection
10 min read · Jun 4, 2020


Oumou Ly (left) interviews Claire Wardle (right) for the latest episode of The Breakdown. Photo: Lydia Rosenberg

In this episode of The Breakdown, Oumou Ly, staff fellow for Berkman Klein’s Assembly: Disinformation program, is joined by Claire Wardle, co-founder and director of First Draft, and a member of the Assembly Forum. Wardle shares insight into how disinformation and conspiracy theories move from online spheres to gain coverage in mainstream media, advice for journalists on engaging in debunking and fact-checking work, and the challenge of reporting on disinformation.

Watch an abbreviated version of the interview from the Berkman Klein Center.

This conversation has been edited and condensed for clarity and brevity.

Oumou Ly (OL): Our conversation today centers on the interplay between disinformation and the professional media ecosystem. Certainly over the past couple of months, as the disinformation related to COVID has turned into an infodemic of sorts, we’ve seen various points at which it appears that there is a pipeline between the online information ecosystem which feeds disinformation into the mainstream media.

My first question is about the pathway that false content may follow to gain mainstream coverage. It often starts in really obscure corners of the internet, moves to fringe news sites, and then eventually makes its way into the mainstream. Can you talk about that process?

Claire Wardle (CW): I think that for those of us who study and think about mis- and disinformation, it’s very tempting to study what’s in front of us. So there’s a disproportionate focus on Twitter. It’s the easiest to study because it has an open API. Similarly, that’s where journalists look for content, sources, and stories. So we end up really just thinking about one platform as the problem when actually we need to think about the full ecosystem. And it’s not always the case, but there are certainly examples of some of these conspiracy theories or trending campaigns or inauthentic activity being coordinated in spaces like, for example, a Discord server or a guild, or it might be 4chan. It might be some of these pretty small spaces, and it would normally be easy to dismiss them and those conversations, because you can see people trying to coordinate, but you don’t necessarily think that’s going to go anywhere.

In a lot of cases, it doesn’t, but sometimes we see this basically “trading up the chain,” which is a term that’s been around for a long time. We talk about this as the trumpet of amplification, because you can see it then move into other spaces. That might be WhatsApp groups or Twitter DM groups, where the coordination gets a little bit more strategic. You then, maybe, see that move into communities — maybe on YouTube or even Reddit or Gab or places that are technically public — but these are often places you’re not spending a lot of time in. They’ve grown up as particular fringe-type communities that journalists are not spending time in. And from there, you see it jump into Instagram, YouTube — the YouTube that you and I spend time in — Instagram or Facebook. And it’s at that point that journalists tend to find it.

And the problem is they don’t necessarily understand that there’s a history to this, that there’s potentially been some kind of coordination behind it. At that level, unfortunately, we sometimes see politicians repeating the conspiracies or influencers repeating the falsehoods. And then at that point, you see the media make a decision that says, “Oh, we’ve now got to cover it because it’s been pushed by a particular influencer or a politician.” But that was part of the plan. That was the aim: to get the media to cover it. The other complication here is that sometimes in the media, even if a politician doesn’t talk about it, there’s a sense of, “Well, hang on. These rumors, conspiracies, falsehoods, fabricated media — it’s got to a point where actually we have to debunk it.” And unfortunately, if newsrooms debunk it in an irresponsible way, then that itself is the end goal of these bad actors. Even that’s a terrible phrase.


The fact that newsrooms are reporting on it means they have the megaphone that many of these fringe communities just don’t have. So the role of the professional media in this whole ecosystem is as critical as the way we understand politicians and influencers, because you can’t think about mis- and disinformation in 2020 without understanding their roles.

OL: One difficulty that I hear reporters and journalists talk a lot about is the decision of whether or not to report on something, given that risk. Right? You want to prebunk, debunk, or maybe even just do simple fact-checking, but reporters often can’t do that without the risk of reamplification.

Given the fact that most major print and broadcast audiences are networked, how do reporters go about doing that really critical fact-checking, debunking, or prebunking work without inadvertently reinforcing the very harms they’re trying to mitigate?

CW: So it’s a great question because the challenge that newsrooms now face is that they have different platforms that they need to consider. So when we do training with journalists, we talk to them a lot about the work of danah boyd around data voids. If mainstream media doesn’t do anything to debunk those conspiracies, if somebody hears about that on their family WhatsApp group, and they go to Google and they type in 5G Coronavirus, if there’s no debunking there, then all they get on Google is the conspiracies. So when it comes to information designed for Google and YouTube, newsrooms actually need to be creating this content and creating a headline that will get picked up by search. However, if you’re thinking about a tweet or a Facebook post that people are stumbling across, you have to be careful not to give oxygen to a rumor that they might not have heard about because unfortunately, our brains are really bad at making sense of truth and falsity.


Even if somebody tells us something is false, a week later, when researchers go back and test people, they’re like, “Oh, somebody said something about Obama being a Muslim. I can’t remember now. Is he or not?” We’re really, really bad at making these distinctions. And so we need to be more careful about ensuring that we don’t tell people rumors that they haven’t heard before. It’s really difficult to make sense of all of these things, and there are no hard and fast rules, but all of this is happening within an economy where newsrooms are increasingly struggling. We would be naive not to recognize that some of these headlines, some of these debunks, actually get a lot of traffic.

A story from, I think, 2012: there’s a now very famous YouTube video of an eagle stealing a baby in a park. And it’s pretty startling, like, “Oh my goodness.” And then it transpired that it was actually made at a university in Canada — students had been given an assignment to create a video that would fool journalists. I was working with a newsroom at the time that ran the video, and I asked, “Oh my goodness. Are you mortified that you ran this?” Without missing a beat, the answer was, “Well, no, the debunk will get twice the traffic.” So there was a recognition that we have to be wary of that, be aware of that, when we have these discussions.

OL: Right. Is there any particular set of best practices — maybe from your trainings — that you recommend reporters and journalists follow? Or is there an understanding within the industry that this is a problem and a wholesale shift in thinking needs to happen?

CW: I would argue that in the last two years there have been many more discussions in newsrooms about the role that they are playing in the information ecosystem, now that we have real challenges with information pollution, and some of the work by Whitney Phillips or Joan Donovan has really, I think, forced some internal conversations about this. And so what you see is newsrooms now saying, “Well, is it right that we use that particular keyword? Because now we’ve learned that if we use the keyword, that will actually send people to conspiracy theories online. Maybe that’s not sensible.” So it is far from where we need it to be. But we’re asking for a pretty big paradigm shift.

Within the news industry, there has always been this idea that sunlight is a disinfectant — that by holding people to account, by reporting on problems, that will actually help. The challenge here is that for the bad actors we talked about at the beginning of the interview, who are potentially sitting on 4chan or Discord or in WhatsApp groups coordinating, getting that coverage is their whole end goal. So it’s very difficult.

And in trainings, when you show journalists, “Listen, look, this is a discussion that they are having about you as journalists and how they can manipulate you into covering them,” it’s only then that you get this light bulb moment from the journalists, who say, “Well, that’s not why I went into journalism. I did not go into journalism to help these people get more coverage.” And I think when you point out that every newsroom already thinks very strategically about how to cover a press release, how to cover suicide, how to cover troop movements during a war, then there’s this sense of, “Oh, okay, this isn’t anything new. We just have to be aware of the unintended consequences of our coverage, which previously we didn’t have to think about when we were covering disinformation.”

OL: Yeah. That really relates to what was going to be my next question anyway, about the structural issues that give rise to this dynamic.

In what ways do you think the intrinsic link between large, legacy professional media organizations and corporate advertising factors into how this plays out?

CW: Well, I mean, we’ve seen some great journalism over the last four years in particular really taking to task the platforms and thinking about the way that the business model of the platforms drives disinformation. But I don’t think we’ve had the same conversation about how that also plays out in the news industry. And I think, again, we’re being naive if we don’t recognize that the selection of certain stories, the framing of certain stories within the news business is designed for clicks and traffic and ad revenue. And because of that, I think there have been stories written that have unfortunately done more harm. And again, it’s very difficult, in the moment, to have these conversations. And I would say that over the last couple of years, I see much more reflection from newsrooms about these unintended consequences. But again, sometimes the people who are thinking about this aren’t necessarily the people who write the headlines or ultimately decide what to cover, so that’s the challenge.

OL: I want to shift gears for a second and talk about politics, because political reporting is ripe for disinformation in a way that’s unique to it. I mean, it’s one of those interesting sectors where the material harm of disinformation is pretty immediately recognizable, because of the way so much of politics plays out in the public sphere. And I know that certainly over the last 10 years, there’s been a growing conversation about the relationship — or maybe not the relationship, but better yet the tension — between balance and objectivity in political reporting. And the goal, when you listen, really is to indemnify news organizations against any claims of bias by either political party.

Can you talk a little bit about that debate and how you see that playing out, maybe in the context of the COVID infodemic?

CW: I mean, the COVID situation is so interesting because we’ve all sat around and been like, “Wow, this is health misinformation.” This just goes to show the platforms could be doing much more than they’ve already been doing, but already we’re seeing how COVID is becoming interlocked with political conversation. So, for example, monitoring the recent anti-quarantine protests and seeing the online conversations, you come to understand that there were a lot of anti-vax groups pushing that, and there were lots of Second Amendment gun rights people pushing that.

And so this idea that there’s health misinformation or there’s political misinformation or disinformation — these boundaries are actually very blurry. But the challenge of reporting on all of this is, as the Network Propaganda book showed us last year, that this is asymmetrical. Because of that, it’s sometimes very difficult to tell these stories, to make sense of this landscape, for journalists who have been trained within an inch of their lives to always take both sides.

And in the same way, it is true that most newsrooms do tend to have people who sit more on the left. And so they’re already trying to counter what they perceive as potential bias — well, they wouldn’t see it as bias, but everybody’s aware of how that might play out. So it’s a really problematic space for everybody, because people are trying to use their training to do journalism, but as I just said, the challenges that journalists now face weren’t taught in journalism school. They weren’t taught how to cover disinformation, because journalism is about covering the truth, not about covering the falsehoods. So journalists now find themselves in a situation that’s really hard. And of course there’s a trust question: how do you report on these spaces, knowing that it’s important that certain communities receive quality information, yet knowing that those communities are much less likely to trust the professional media?

The fear I have is that I see an increasingly polarized country politically, but also, when we look at consumption of professional media, I see half the country going nowhere near the professional media — and those people are actually more likely to be recipients of misinformation. It’s a real problem, and I don’t see an easy way out of it.

This conversation was part of the Berkman Klein Center’s new series, The Breakdown. The first season of the series is produced in collaboration with the Berkman Klein Center’s Assembly program, which for the 2019–2020 year is focusing on cybersecurity approaches to tackling disinformation.


The Berkman Klein Center for Internet & Society at Harvard University was founded to explore cyberspace, share in its study, and help pioneer its development.