Analysis | The human element

Fighting disinformation at home and abroad requires a holistic, human-centered strategy.

Alistair Somerville and Jonas Heering via Public Diplomacy Magazine

In this piece, originally published online in the University of Southern California’s Public Diplomacy Magazine, ISD publications editor Alistair Somerville and ISD research assistant Jonas Heering argue that efforts to counter disinformation should take a human-centered approach, building on the findings from ISD’s October 2020 working group report, “The New Weapon of Choice: Technology and Information Operations Today.”

Insurrectionists storming the U.S. Capitol on January 6, 2021 (Image: Wikimedia Commons)

Bots. Troll farms. Deepfakes. All are now buzzwords in the world of online information operations. But these terms suggest that the problem of malign influence operations, specifically those conducted online, is primarily technological. As influence operations have evolved, however, especially since the 2016 U.S. election, it has become increasingly clear that the spread of mis- and disinformation is a fundamentally human problem, exacerbated by technology. We are the target of these operations, and human susceptibility to comforting false narratives and conspiracy theories can have deadly consequences, as the January 6 insurrection in the United States so shockingly demonstrated. Moreover, the problem blurs the lines between domestic and foreign policy, and therefore requires a response that draws on both the national and international policy toolkits.

As an October 2020 report from the Institute for the Study of Diplomacy highlighted, the human element of this problem means that ordinary citizens — alongside their governments — must also be part of the solution. The Institute convened a group of experts in the field of information operations from government, think tanks, academia, and the private sector to provide insights and recommendations on a path forward. The final report noted that attempts to mitigate the effects of information operations must take a whole-of-society approach, and that educators, journalists, and the private sector should all play an important role, alongside governments.

Ordinary users of social media platforms drive the problem, but they can also help to fix it. Not everyone who spreads misinformation (false information circulated unwittingly) or disinformation (false information disseminated intentionally) is a Russian bot. Far from it. Human beings drive the spread of false narratives, whether out of a complex array of economic, political, and social motivations, or unwittingly. While it is important to recognize the threat that fake social media accounts and developments in artificial intelligence pose to online political discourse through automation and fabrication, researchers should be wary of primarily technological explanations. It is all too easy to attribute disinformation to foreign actors without substantiation, and thereby further compound the circulation of false narratives. False attribution can contribute “to a belief in pervasive inauthenticity” if people come to assume that every misleading political post online comes from a fake account. In fact, many of those who spread disinformation in the United States today are real Americans: members of anti-vaccine Facebook groups and the former president are just two examples. Our discourse around mitigation strategies needs to reflect that reality, while acknowledging that all societies, not just the United States, experience this problem.

Take the 2016 case of two competing political Facebook groups: “Heart of Texas” and “United Muslims of America.” Heart of Texas trafficked in conspiracies about Muslims implementing Sharia law in Texas, while United Muslims of America claimed to be on a mission to save “Islamic knowledge.” Escalating tensions between the two groups led to a physical standoff in Houston in late spring 2016. But the seemingly spontaneous protest was far from an organic expression of the two sides’ First Amendment rights. Rather, American citizens found themselves pawns in a Russian information operation. The now-infamous Internet Research Agency, based in St. Petersburg, had created both rival Facebook groups and stoked the protests. Unwitting Americans did the rest, demonstrating one of the many ways in which foreign and domestic actors interact in attempts to sow confusion and erode faith in democracy as a system of government.

The scenes at the U.S. Capitol on January 6 were the destructive consequence of years of disinformation. Only this time, the disinformation came not from the Kremlin but from the White House itself. Starting months before the election, then-President Trump and his allies spread the false narrative that the election was rigged — the so-called #StopTheSteal campaign. Republican leaders spread this disinformation so rapidly and so vociferously across all forms of media that thousands of Americans became convinced their violence was a defense of democracy.

In these cases, and in so many others, social media platforms provided a vehicle to inflame and exacerbate existing tensions. After-the-fact account suspensions, like those that followed January’s insurrection, were a necessary but insufficient band-aid. As the results of the 2020 U.S. election have only confirmed, partisanship leaves domestic political tensions ripe for exploitation through propaganda and disinformation from foreign governments and domestic players alike. The electoral appeal of populist leaders across democratic societies suggests that divisive politics is here to stay, in an era in which governments have largely failed to address deeply rooted economic problems and the feelings of dislocation that globalization has produced.

The global spread of the QAnon conspiracy reminds us that the transnational flow of ideas across free societies paradoxically also poses challenges for the very system in which the spreaders live freely. Coordinated domestic and diplomatic responses are necessary. To borrow a phrase from the technology world: a fractured body politic, at least in the United States, seems to be a feature, not a bug.

QAnon conspiracy supporter (Image: Wikimedia Commons)

Easy as ABC?

Because disinformation is a human problem that spans the foreign and the domestic, mitigating it requires a human-focused solution. Technological progress may facilitate the quick removal of fake social media accounts and misleading posts, as well as limit their virality, but content moderation alone is insufficient. Just as we will develop better technological tools to fight disinformation, so will those who disseminate it upgrade their toolkits. As the disinformation researcher Camille François has described it, attempts to tackle disinformation must focus on the “ABC”: actors, behaviors, and content. On both sides of the information operations divide, influential actors shape the behavior of others. In our attempts to mitigate the problem, we need “validators of truth” who can transcend partisan lines to help inform the public. In the past, news organizations and experts at trusted institutions fulfilled this function; Walter Cronkite, longtime anchor of the CBS Evening News, is the oft-hailed archetype. But the COVID-19 pandemic has once again exposed deep distrust of the media among Americans and non-Americans alike. Notably, between May 2020 and January 2021, trust in traditional media across 27 countries, including the United States, declined by an average of eight percentage points.

Identifying new validators of truth requires creative solutions. In Finland, the government worked with social media influencers to spread factual information about COVID-19 among parts of the population that do not consume traditional media. Flipboard, a popular news curation app, launched its Truth Seekers project, which aims to highlight and elevate trusted, objective voices from across the American news media, just ahead of the U.S. election. While such steps alone cannot overcome polarization in the United States, they are a crucial first step. We need champions, influencers, and validators who can transcend at least some of the myriad political divisions facing the country. This is also an important message for diplomats and other public diplomacy practitioners, who, now more than ever, must identify and collaborate with local influencers around the world to combat false narratives.

Another approach, in the United States and elsewhere, could enlist public libraries and librarians as independent sources of trusted information. Nearly 80 percent of Americans, across most age groups and ethnicities, trust public libraries to help them find reliable information. Libraries offer the advantage of existing public infrastructure for fighting disinformation, rather than requiring new programs built from scratch. More importantly, as trusted pillars of their local communities, libraries offer an excellent avenue for highly localized disinformation mitigation through better media literacy training. This approach can help overcome the shortfalls of national fact-checking initiatives, which are often ineffective in countering the localized spread of disinformation, especially around election season. Increased funding for public libraries, as well as public diplomacy initiatives overseas that harness local library capacity, can equip large parts of the population with the skills to identify mis- and disinformation.

While a human-centered approach toward mitigating disinformation should be front and center, its implementation must come in tandem with additional measures by technology companies. Our research at the Institute for the Study of Diplomacy charted the steps that social media companies have taken so far, especially over the course of the U.S. presidential election and in attempts to tackle the spread of false information about the coronavirus. Twitter decided to flag misleading information from President Trump and his associates about both the pandemic and the election results, and it acted rapidly to prevent users from sharing links to a misleading October 14 New York Post story about a laptop supposedly belonging to Joe Biden’s son, Hunter. The story may have been an unwitting vector for a Russian information operation. Content-focused actions by platforms can be effective, particularly in the short term — research by Zignal Labs showed that in the week after Twitter and other social media platforms removed the accounts of Trump and his key allies in January, election fraud misinformation dropped by nearly 75 percent.

Journalist photographs cardboard cutouts of Facebook CEO Mark Zuckerberg outside the U.S. Capitol, before Zuckerberg testifies before the Senate (Image: Avaaz/Flickr)

However, these steps alone are not sufficient. Returning to Camille François’ “ABC” framework, the platforms’ algorithms mean that user behavior is just as significant as content moderation in shaping the spread of information online. As the rapid growth of the pro-Trump “Stop the Steal” Facebook group — by as many as 1,000 new members every ten seconds — showed, Facebook has unleashed an algorithm that it cannot itself control. This has created a transnational army of witting and unwitting “disinformers” on the platform, particularly within Facebook groups, as the researcher Nina Jankowicz has described them. A long-term approach is therefore necessary, including media literacy education campaigns delivered by trusted local actors. As Facebook’s head of cybersecurity policy, Nathaniel Gleicher, noted in an October New York Times interview: “One of the most effective countermeasures in all of this is an informed public. So we have to do the best — all of us, not just Facebook — to amplify authentic information.”

A path forward

Linking up local, national, and international efforts to combat this problem will be a crucial challenge. Government officials around the world, from the local to the national level, will need to foster transnational cooperation to share best practices and learn from media literacy and election security success stories in countries like Taiwan and Finland. On the diplomatic level, the United States and other democracies should work together to pursue the development of international standards and norms to govern social media companies. Several countries have started to press ahead in designing new rules of platform governance, as exemplified by the European Union’s Digital Services Act. But information flows on social media transcend national and regional boundaries. Thus, to be truly effective, countries will have to cooperate to set new rules and standards.

At the societal level, local leaders and trusted influencers will be key, especially in the United States. The role of trusted local players has implications for both U.S. foreign and domestic policy. Public diplomacy initiatives will require Washington to adopt the latest digital communications tools, while giving diplomats in the field enough freedom to shape narratives, grounded in U.S. values, that harness the power of influential local players. Conversely, stories of how U.S. embassies successfully counter disinformation abroad can hold valuable insights for addressing the issue domestically. In Latvia, for example, the U.S. embassy developed a program to provide media literacy training for school teachers. Government initiatives in the United States to train teachers in media literacy education could be modeled on such efforts.

Information operations and the spread of disinformation will never cease altogether, no matter what tech companies and governments do, and regardless of new technologies to detect false narratives. Some citizens will no doubt continue to believe whatever information fits their worldview, regardless of its source or veracity, but public awareness remains a critical element. To build long-term resilience, democracies will need a stronger civil society, independent media, fact-checkers, and governments that model transparency. Harnessing both the domestic and diplomatic toolkits will be crucial.

Alistair Somerville is the publications editor at the Institute for the Study of Diplomacy and editor of The Diplomatic Pouch. Follow him on Twitter @apsomerville.

Jonas Heering is a research assistant at ISD. He is also the Bunker graduate fellow in diplomacy, and a master’s student in the School of Foreign Service.

Read the original piece on page 50 of the magazine.
