Misinformation is Infectious
In May 2020, The Festival of Dangerous Ideas launched FODI Digital, a series of online conversations.
Misinformation is spreading worldwide. How do we distill truth in an algorithm-driven world?
Below is an abridged version of Claire Wardle and Ariel Bogle’s conversation from ‘Misinformation is Infectious’. You can watch the full discussion here.
Ariel — So I thought we’d better start by defining our terms. Can you walk us through the differences, the lines between misinformation, disinformation and what I think you’ve called mal-information, and what we’re seeing under those three categories online during the pandemic?
Claire — Language really matters, particularly if we’re trying to think about interventions and what we could do to help slow these problems down. Disinformation is false information created or shared by people who really do want to cause harm. That is distinct from misinformation, which is also false, but the people sharing it don’t realise it’s false and aren’t trying to cause any harm. So, that might be my mum sharing something on Facebook without realising. That is misinformation and it’s different from disinformation.
This third category, mal-information, is genuine information shared in a way that might cause harm. An example might be revenge porn, or an older photo that’s being pushed again: it’s a genuine photo, but the people sharing it are trying to cause harm. So, we have to think a little bit about the complexity of this space if we’re going to try and get a handle on it.
Ariel — Absolutely. I feel like since the 2016 US election, and the growing acknowledgment of the role of the Russian Internet Research Agency during that time, there’s a lingering desire to always blame the spread of misinformation on darker forces. Sometimes that might be the case, but during this pandemic, with the kinds of misinformation we’re seeing online, what kind of actors are at play? Are there Russians out there or is it just us?
Claire — So, it is a complex space and we have seen phases of coronavirus misinformation. In the first two to three weeks, we were seeing the top 10 tips of things to do. You probably saw them: gargle with salt water; if you can’t hold your breath for 10 seconds, that probably means you’ve got it. We all saw them, whatever country we lived in.
We saw text messages saying your country’s going to go into lockdown; it didn’t matter if you were in Dublin or Mexico City. But in the last three weeks we’ve seen a lot more conspiracy theories, which I think we’re going to talk more about. And we are also seeing countries pushing certain narratives.
So at the moment there’s a tussle between the US and China over the origin story of the virus. So we are seeing different foreign entities using information. Some of that is, you know, old-school disinformation. Some of it you could argue is propaganda, trying to push your own narrative. We are seeing everything right now, and we’re seeing it all at the same time, across the world.
We’ve never had a moment where we’ve seen the same kind of rumours and conspiracies travel as quickly as we’re seeing right now.
Ariel — So, it’s clear during this pandemic that a lot of people who are already committed to conspiratorial thinking, who are really dedicated to conspiracy theories, say around 5G, or who might be members of the strong anti-vaccination community, want to influence the media and get coverage in the media. But they also want to influence the influencers, and I think it’s been interesting to see the role of celebrity in all this.
People watching will, I think, be aware of the 5G conspiracy theory: the baseless idea that 5G caused or contributed to the coronavirus pandemic. And that got a huge signal boost because of celebrities like Woody Harrelson. Could we talk about influencing the influencers? We can’t control a celebrity’s Instagram account, I guess, can we?
Claire — No. I mean, what I would give to do verification classes with some of these influencers.
But we live now in a time when audiences for news are dropping. I mean, there are still high levels of trust, particularly in countries like the UK and Australia.
But in the US, for example, half the population doesn’t go anywhere near the professional media. They do get a lot of information from their social media feeds, though. Partly they’re getting that from friends and family, but they’re also getting it from celebrities we would recognise, like Kim Kardashian or Woody Harrelson, and also from YouTube and TikTok creators or influencers.
And those people actually have huge audiences and they are trusted by the people who watch them every day, who actually have a relationship with them. And so we often forget them: when we think about the media ecosystem, we talk about the mainstream media or we talk about friends and family. But these online influencers are a really important part of this ecosystem. I was once talking to the agent of a YouTube influencer who said, “Yes, I’m really worried that my star is going to say something that she shouldn’t. I don’t know how we’d cope. I don’t know how we’d issue a correction.” It was something she really thought about. She said her star gets sent things all the time to retweet and she doesn’t do the verification; she’s not a journalist. But she has huge influence and therefore responsibility.
Ariel — At the ABC we’ve been inviting the public to send us examples of misinformation and it’s interesting: as you’ve said, a lot of it is occurring in private groups on WhatsApp or forwarded messages on Facebook Messenger. Even in neighbourhood Facebook groups or on Nextdoor, which is a social media network for communities tied to your postcode. How do you talk to somebody, your aunt, your uncle, your next-door neighbour, in those groups, where they’re sharing stuff with that caveat you mentioned: “You know, I saw this, thought I’d share it, not sure if it’s true”?
How do you talk to someone like that, and also, as the media, how do you cover it?
Claire — So the question of how to talk to one another is really, really important. We’ve invested quite a lot of money in media literacy training, but we haven’t really invested in teaching one another how to have these conversations, which can be quite difficult. The problem is, if you see somebody sharing something, you might say, “Hey, Bob, I saw you shared that on Facebook, but it’s actually not true. Here’s the debunking article from the ABC fact check. Just to let you know, it’s best if you don’t share that.”
Now, you’ve immediately made Bob feel not great about himself, and there’s a lot of psychological literature showing that Bob is more likely to double down on his views. Because he feels ashamed, it’s actually very difficult for him to say, “Oh, thanks so much for that. Let me read this. Oh, you’re right. I was wrong.”
It’s just not a very human thing for us to do. It’s better to say, “Hey Bob, I saw you shared that Plandemic documentary. I’ve seen it all day on my Facebook feed too, but I’m really worried about this kind of misinformation in our communities. And I’m worried about people sharing this information when it hasn’t been verified. I think it’s important for us to think about these things, but I’d love to talk to you more about why this has been circulated. Why is it being circulated now? Should we believe it? I’d just love to have a conversation with you.”
And so it’s about using language like “we” and “us” and “community”, as opposed to “I’m right, you’re wrong”. We’ve just been doing it in a way that’s not very helpful and doesn’t really fit with the way our brains work. We have to recognise that the information we share is very tied to our identities; we perform those identities via the information we share. So when you tell somebody that what they’re sharing is wrong, you’re actually talking about the identity they’re constructing and the worldview they have.
So we have to be much more sensitive than we’re currently being, and recognise that fact-checking articles don’t always do everything we want them to do, even though they should.
Ariel — So, of course, social media companies, Facebook, Twitter, YouTube, TikTok, all operate on algorithms that are pretty hard to scrutinise, or impossible to scrutinise from the outside. Yet trusted institutions, the World Health Organisation, local hospitals here in Australia, the Department of Health, have to compete in that ecosystem alongside Donald Trump, but also anti-vaccination groups and all kinds of other information. Are our institutions losing the information war? And how can they get better at competing in these quite confusing ecosystems?
Claire — So the huge challenge is that, as humans, we have emotional relationships to information. But those of us who work in the quality information space, whether that’s journalists, researchers or fact-checkers, unfortunately like to think that people have a rational relationship to information. So I often talk about, for example, how anti-vax communities really understand the role of emotion. They are excellent at creating very emotive, visual memes. Very sharable, very clickable. And they’re pushing this emotion-laden content to people and it’s pulling on people’s heartstrings: “This is what happened when my baby was vaccinated.” Then on the other hand, you have the WHO putting out an 87-page PDF with an image of a dripping needle on the front cover. It’s an asymmetrical playing field.
And I would say that many health authorities are excellent at public health but not particularly good at communications. We just aren’t very good at creating content for the age of social media. In the US, for example, Andrew Cuomo of New York does a press conference every day and he’s got PowerPoint slides. And you see people saying, “Oh, he’s got good slides.” How, in 2020, did we come to think that a good PowerPoint slide is the best that can be done?
So I do think there’s a long way to go in terms of teaching health authorities how to create visually engaging, sharable content, in the same way as those people who are pushing emotive material. It feels very difficult to think about emotion if you’re a journalist, but we have to recognise that people on the other side are experts at using emotion.
Claire Wardle is a leading expert on user generated content, verification and misinformation. She is co-founder and director of First Draft, the world’s foremost nonprofit focused on research and practice to address mis- and disinformation.
Ariel Bogle is an award-winning technology reporter at the Australian Broadcasting Corporation (ABC). She writes, edits and makes radio about technology, science and culture.