What students and fellows learned and created during Assembly: Disinformation

By Zenzele Best

Over the past year, Assembly: Disinformation has brought together students, professionals, and experts deeply committed to understanding and making progress on problems related to disinformation and the spread of inauthentic content. This year’s program had three tracks: the Assembly Forum, Assembly Fellowship, and Assembly Student Fellowship. Guided by problem definitions developed during meetings of the Assembly Forum — an expert discussion group made up of leaders in academia, civil society, government, and the private sector — the Assembly Fellows and Assembly Student Fellows developed team-based public interest project prototypes, sketches, and provocations scoped to specific aspects of the spread and consumption of disinformation. Students and fellows conducted their work independently, with light guidance from program advisors and staff.

Below, we share more about the two fellowship tracks, what the cohorts learned, and what the students and fellows created.

Assembly Fellowship

The 2020 Assembly Fellowship launched just as the number of COVID-19 cases across the United States was beginning to spike, prompting — among countless other changes across the globe — a significant shift in the scope and design of the program. This year’s cohort gathered 17 competitively selected fellows, drawn primarily from the private sector, with additional representation from government and nonprofits. Each fellow participated as an individual, and not as a representative of their organization.

2020 Assembly Fellows

The group convened in Cambridge for two weeks at the end of February to connect as a cohort, learn about key problem spaces around disinformation from leading researchers, and form project teams. Although the fellows were unable to return to Cambridge as planned, they nonetheless spent the three-month duration of the program leveraging their considerable expertise across policymaking, design, development, and journalism to learn together and design five projects — a combination of draft white papers, conceptual frameworks, and prototypes of technical tools to combat the spread of disinformation.

Below are the 2020 Assembly Fellows’ final showcase and an overview of each of the five projects. Development on some of the projects is ongoing.

Assembly Fellowship Project Showcase

Disinfodex, developed by fellows with backgrounds in journalism, policy, and cybersecurity, is a prototype of a searchable database that indexes “public disclosures” of disinformation campaigns issued by online platforms. Upon discovering coordinated disinformation activity, platforms are increasingly disclosing their findings to the public through blog posts and reports. Currently, the project captures disclosures from Facebook, Instagram, Google, YouTube, Twitter, and Reddit. By aggregating these resources in one searchable database, Disinfodex provides a way to analyze and interpret publicly available information about disinformation. An accompanying white paper details the process of building Disinfodex as an independent aggregator, explores the benefits of advancing a shared infrastructure in the field, and outlines plans to build upon the database.
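
As a purely illustrative sketch of what such an index might support, the snippet below models a disclosure record and a keyword search over a collection of them. The field names and filtering logic here are assumptions for demonstration, not Disinfodex’s actual schema.

```python
from dataclasses import dataclass

@dataclass
class Disclosure:
    # Hypothetical record fields; not Disinfodex's actual schema.
    platform: str   # e.g. "Twitter"
    date: str       # ISO date the disclosure was published
    title: str
    url: str        # link to the platform's blog post or report
    summary: str

def search(index: list[Disclosure], query: str,
           platform: str | None = None) -> list[Disclosure]:
    """Case-insensitive keyword search over titles and summaries,
    optionally filtered to a single platform."""
    q = query.lower()
    return [
        d for d in index
        if (platform is None or d.platform == platform)
        and (q in d.title.lower() or q in d.summary.lower())
    ]
```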

Visit the team’s website, watch their presentation, and read their blog post.

Into the Voids analyzes data voids and offers a framework for conceptualizing the harms they pose. The risks posed by “data voids” — the absence of high-quality, authoritative sources in search results — have only recently been explored in the context of disinformation. The team, whose members bring expertise in design, policymaking, and human rights, created a harms framework to evaluate both existing and emerging data voids and developed a data-driven methodology for mapping the lifecycle of data voids across Google search trends, Wikipedia page edits, and Mediacloud journalistic coverage. The resulting lifecycle maps make it easy to identify the “spike” and “long tail” phases of data voids, and highlight the role of mainstream media coverage and organic search in amplifying and countering disinformation narratives. The team hopes its work will add further clarity, structure, and empirical data to the conversation around data voids and their implications.
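
As a rough illustration of the lifecycle idea (not the team’s actual methodology), the sketch below locates a “spike” and “long tail” in a weekly attention series; the 10%-of-peak cutoff is an arbitrary assumption chosen for demonstration.

```python
def lifecycle_phases(weekly_interest: list[float],
                     tail_threshold: float = 0.1) -> dict:
    """Locate the spike and long-tail phases in a weekly attention series
    (e.g. normalized search volume for a query). The 10%-of-peak cutoff
    is an arbitrary illustrative choice."""
    peak = max(weekly_interest)
    spike_week = weekly_interest.index(peak)
    # Long tail: weeks after the spike where attention stays above the
    # threshold fraction of peak interest.
    tail_weeks = [
        week for week, interest in enumerate(weekly_interest)
        if week > spike_week and interest >= tail_threshold * peak
    ]
    return {"spike_week": spike_week, "tail_weeks": tail_weeks}

# A query that spikes in week 3, then decays slowly.
print(lifecycle_phases([0.0, 0.1, 0.3, 1.0, 0.5, 0.2, 0.08, 0.02]))
# -> {'spike_week': 3, 'tail_weeks': [4, 5]}
```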

Visit the team’s website, watch their presentation, and read their blog post.

Semaphore is a beta-version browser extension that enables users to share data about instances in which they flag inauthentic content through the “report an issue” feature on platforms; the tool is currently available for use on Twitter. Disinformation researchers are limited by their lack of access to data under the control of platforms, including records of takedowns and copies of removed content (indeed, platforms themselves frequently don’t collect the latter). Developed by technologists and a human rights expert, Semaphore springs into action when a user with the extension installed flags a piece of content, capturing both the flagged content and its related metadata. That information can be archived in a database for future study and analysis, effectively creating an alternative means of data access for researchers. The team is interested in building upon Semaphore to allow users to share their data with interested parties like research communities. The team hopes that by expanding the data available to experts through a crowdsourced donation and sharing model, it can enable more people to take part in the fight against disinformation.
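
The archival half of such a pipeline might look something like the sketch below, which stores flagged items in SQLite. The table layout and fields are illustrative assumptions, not Semaphore’s actual design.

```python
import json
import sqlite3

def init_archive(path: str = "flags.db") -> sqlite3.Connection:
    """Create a minimal archive for flagged content and its metadata."""
    conn = sqlite3.connect(path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS flagged_content (
               id INTEGER PRIMARY KEY,
               platform TEXT,    -- e.g. "Twitter"
               flagged_at TEXT,  -- ISO timestamp of the user's report
               content TEXT,     -- copy of the flagged post's text
               metadata TEXT     -- related metadata, stored as JSON
           )"""
    )
    return conn

def archive_flag(conn: sqlite3.Connection, platform: str, flagged_at: str,
                 content: str, metadata: dict) -> None:
    """Preserve one flagged item, even if the platform later removes it."""
    conn.execute(
        "INSERT INTO flagged_content (platform, flagged_at, content, metadata) "
        "VALUES (?, ?, ?, ?)",
        (platform, flagged_at, content, json.dumps(metadata)),
    )
    conn.commit()
```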

Visit the team’s website, watch their presentation, and read their blog post.

Misinfo Motives puts forward an initial framework to try to answer a complex question: “why do some people create and share mis/disinformation?” Most interventions against mis- and disinformation attempt to treat its symptoms, but don’t address the underlying incentives that encourage its creation and dissemination — in part because those incentives are often not well understood. The team, made up of professionals from the nonprofit sector, government, and the tech industry, worked to better understand the complex interplay between actors, motivations, and goals to bring more clarity to efforts aimed at countering mis- and disinformation. Over the course of the fellowship, the team developed a draft white paper that lays out a taxonomy of disinformation actors, identifies four major motivations behind the efforts of those actors, and maps existing interventions to those taxonomies.

Visit the team’s website, watch their presentation, and read their blog post.

Coordinated Authentic Behavior collects cases of successfully mitigated disinformation and draws lessons for whole-of-society responses to future campaigns. Despite the documentation and journalistic coverage disinformation campaigns have received in recent years, there has been relatively little focus on successful responses to the problem that might serve as models. The team, consisting of fellows with expertise in policymaking, linguistics, and data science, contributed case study analyses scrutinizing how journalists, governments, social media platforms, and civil society organizations have effectively strategized and executed responses to disinformation. Over the course of the fellowship, the team developed a prototype of a searchable database that collects examples of successful mitigation efforts, as well as a framework outlining potential strategies for countering disinformation campaigns.

Visit the team’s website, watch their presentation, and read their blog post.

Assembly Student Fellowship

The 2019–2020 Assembly Student Fellowship was made up of an interdisciplinary group of 16 competitively selected students from 9 schools across Harvard University, coming from domains including data science, mythology, and law. During the fall semester, students attended a biweekly seminar series led by experts at the forefront of the disinformation field. Students spent the final weeks of the semester scoping project ideas and forming project teams; after returning to campus in January, they consulted with faculty advisors before jumping into independent project work. The four projects are:

WTF is CDA is an interactive tool about Section 230 of the Communications Decency Act (CDA 230), one of the most influential laws shaping the Internet as we know it today. Developed by students at Harvard College, the Graduate School of Design, the Kennedy School, and the Law School, the explainer provides a history of the law’s genesis and judicial interpretations. It also gives users the opportunity to engage with important issues related to platform moderation efforts, particularly mis- and disinformation, by posing a series of questions around hypothetical situations involving controversial content on social media; the answers reveal how different approaches to content moderation reflect different policy values. The project also directs readers to authoritative, in-depth resources examining the history, interpretation, and challenges of CDA 230. This independent student project is in beta, and the team is actively gathering feedback to improve the tool.
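
To make that mechanic concrete, the sketch below shows one hypothetical way to represent a scenario question whose answers map to policy values. The prompt, choices, and value labels are invented placeholders, not content from the actual tool.

```python
# A hypothetical scenario question: each answer maps to the policy value
# implied by that moderation choice. Placeholder text, not the tool's content.
scenario = {
    "prompt": "A user posts a misleading claim about an election. "
              "What should the platform do?",
    "options": {
        "Leave it up": "prioritizes free expression over harm reduction",
        "Add a warning label": "balances speech against user context",
        "Remove the post": "prioritizes harm reduction over speech",
    },
}

def reveal(question: dict) -> None:
    """Print the prompt, then the policy value behind each possible answer."""
    print(question["prompt"])
    for choice, value in question["options"].items():
        print(f"  {choice}: {value}")

reveal(scenario)
```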

Visit the team’s website and read their blog post.

Team Politics focused on the experiences of underrepresented minority politicians and candidates online. Made up of students at the Business School, the Law School, the Graduate School of Arts and Sciences, and the School of Engineering and Applied Sciences, the group explored the intersection of disinformation and harassment in political campaigns and its disproportionate impact on minority candidates. Disinformation that targets these candidates often mobilizes racial stereotypes and gendered language alongside false information. Over the course of its research, the group learned that while social media is critical to running a successful grassroots campaign, platform content policies often fail to adequately protect minority candidates, many of whom feel unprepared to campaign online and are uncertain about what protections are afforded to them.

Read the team’s blog post.

A Taxonomy of COVID-19 Disinformation, developed by three students at the Medical School, the College, and the Graduate School of Arts and Sciences, maps COVID-19 disinformation and mitigation efforts. The team conducted preliminary data collection and analysis, and developed two infographics: one that explores the targets of COVID-19 related disinformation and the motivations behind its spread, and one that explores common interventions by governments, academics, platforms, and journalists to mitigate its impact. The group hopes its efforts will contribute to the growing body of work around better understanding and pushing back against this infodemic.

Visit the team’s website and read their blog post.

The Youth and Disinformation Literacy project conducted research for an infographic to improve media literacy among first-time voters in the 2020 US elections. The project, developed by students at the College, the Kennedy School, and the School of Engineering and Applied Sciences, focused on four major topics: why disinformation is created, how disinformation spreads, real-world examples of inauthentic online content, and how to identify false information online. The team hopes the infographic can help increase students’ awareness of false content on the internet and encourage further conversation.

Read the team’s blog post.

These nine projects created by students and fellows during Assembly 2019–2020 reflect the diversity of backgrounds and disciplines present in both cohorts. The work of this year’s project teams speaks not only to their considerable talent and expertise, but also to their enthusiasm, tenacity, and commitment to the public interest.

First launched in 2017, the Berkman Klein Center’s Assembly Program convenes professionals from industry, government, academia, and civil society to explore and better understand seemingly intractable technology policy issues. The program has covered privacy and security, the governance and ethics of artificial intelligence, and disinformation. For more information, visit bkmla.org.
