Coordinated Authentic Behavior: a framework for mitigating online disinformation campaigns

--

Coordinated Authentic Behavior is a project created during the 2020 Assembly Fellowship at the Berkman Klein Center at Harvard University. One of three tracks in the Assembly: Disinformation Program, the Assembly Fellowship convenes professionals from across disciplines and sectors to tackle the spread and consumption of disinformation. Each fellow participated as an individual, and not as a representative of their organization. Assembly Fellows conducted their work independently with light advisory guidance from program advisors and staff.

The Coordinated Authentic Behavior project and this post were authored by Maggie Engler, Olya Gurevich, Michelle Linch, and Dylan Moses; the project team has backgrounds in policymaking, cybersecurity, and linguistics.

Online disinformation campaigns are insidious, and stopping them can feel like playing whack-a-mole. From efforts to undermine elections to the spread of erroneous health information, adversaries can promote disinformation at scale through network effects, relying on those most susceptible to misinformation to continue its spread.

Combined with the freedom, anonymity, and ubiquity of the Internet, disinformation tactics create information challenges that no single entity can solve. In a world where disinformation is seemingly everywhere, how can individuals and institutions respond? We sought to understand how and when institutions and individuals have successfully pushed back against disinformation: to learn what is already working in regions around the world and to identify strategies that could be abstracted and presented for non-experts.

In our analysis, we saw two broad categories of responses. The first is overarching institutional change by governments and platform companies in an effort to address disinformation. The second is individuals and resource-strapped organizations responding with the tools and capabilities at their disposal.

The vast majority of people affected by false information fall into the second category. They are the small nonprofits on the ground in crisis areas, the candidates for local office hoping to make a difference in their towns, and the grassroots policy advisors looking to pass new legislation. We aimed to highlight successful responses by individuals and small organizations, as well as by large institutions whose tactics might be usefully applied by smaller organizations.

Because existing models for mitigating disinformation mainly serve large institutions, we developed the Coordinated Authentic Behavior (CAB) Framework as a way to distill tactics for everyday individuals and communities. A riff on Facebook’s “coordinated inauthentic behavior,” CAB centralizes tactics, case studies, and data, connecting people on the ground with resources to successfully mitigate online disinformation campaigns. By distilling the unique blend of capabilities among institutions into an actionable framework, we hope to give individuals and communities paths to positively shape the information ecosystem for themselves.

Coordinated Authentic Behavior: A Framework

Below, we outline a course of action that people or groups can consider when mitigating disinformation, online or otherwise. These tactics can be applied individually or read sequentially by users looking to take action over varying periods of time.

[Figure: The Coordinated Authentic Behavior framework]

Remove Harmful Content

Naturally, when one sees misinformation, the first instinct is often, “how do we get rid of this content?” This tactic can be pursued independently or by working with journalists, as we saw in our Irish Abortion Referendum case study. There, the Transparent Referendum Initiative (TRI), a volunteer-led civic initiative, formed an early partnership with the news agency Storyful and began training media professionals in open-source intelligence techniques. TRI and Storyful tracked disinformation narratives and published their investigation methods and results. These stories were picked up by larger news outlets, notably BuzzFeed, eventually encouraging both Facebook and Google to adjust their advertising policies.

Limit The Scope of Disinformation

During the 2014–2016 Ebola outbreak, organizations like UNICEF and the WHO worked together to train village leaders in West Africa on facts related to Ebola. These leaders then used their social capital to circulate WHO-verified information in their villages. One does not have to be a large international organization to use this tactic, however; others can replicate the strategy by recruiting credible influencers, whoever they may be in their community, to disseminate correct information. Influencers don’t have to be social media stars; they could be local community leaders, trusted religious institutions, or other individuals who represent the voice of their region. Their messages then limit the scope of false information by amplifying the correct information.

Expose the Disinformation

Sources of disinformation can hide in the depths of the Internet, often making attribution difficult. In 2018, the New York Times confronted this problem as well. Because platform companies’ misinformation policies were in their nascent stages, and most inauthentic behavior was enforced under spam policies, exposing disinformation for what it was remained a challenge. Recognizing the chasm between platform policy and the real world, journalists from the New York Times and everyday readers partnered to “name and shame” sources of disinformation across the web. The result was a live, ongoing series of reporting on online mis/disinformation activities leading up to the 2018 elections.

Handle Bad Actors

Given the anonymity of the Internet, how can one hold disinformers accountable? Deplatforming, the process by which a platform company permanently bans a user from its service, is one tool institutions use to accomplish this goal. Though users are not typically consulted before a company deplatforms a bad actor, they can contribute to the body of evidence these companies use to make their decisions. In some cases, the government will also act to punish propagators of disinformation, usually through the judicial system. Alex Jones, a right-wing conspiracy theorist, is known for promoting a number of harmful conspiracies online, chief among them that the tragedy at Sandy Hook Elementary was a hoax. The parents of Sandy Hook victims were able to appeal to the legal system: Jones was sued for defamation and ordered by the court to pay the plaintiff $100,000. While legal action is not an option for many victims of disinformation, depending on local laws, there can be legal recourse. Understanding whether there are many victims of the same disinformation campaign can also help people band together to fight it.

Build Public Awareness and Resilience

Ultimately, people and organizations fighting disinformation want to inoculate the public by making them aware of the harms of a disinformation campaign. To accomplish this, we suggest partnering with other organizations that share the same goal. During the Ebola outbreak, UNICEF worked with the WHO to conduct on-the-ground interviews to understand which erroneous beliefs were most important to address. With community buy-in, they were able to equip people to correct false information across platforms and media as it appeared over time. The same technique can be used regardless of the size or location of a region under attack by false information.

Connect with Us to Use the Framework to Counteract Disinformation

If one thing from our fellowship is clear, it is that online disinformation campaigns are not going away anytime soon. And while we believe the CAB framework is an excellent first step for individuals and communities, more is certainly needed. We hope to demonstrate the applicability of these tactics across cultures and geographies. To that end, we hope to partner with nonprofits, the groups closest to the actors this framework serves, to carry on our work and continue to establish positive use cases through local, on-the-ground initiatives. Our aim is that others will see the value in this collection and continue to add examples of techniques used to mitigate disinformation, creating a living document with comprehensive coverage of effective responses.

For more information on Assembly: Disinformation, visit www.bkmla.org
