Today @DFRLab announced that we are partnering with Facebook to expand our #ElectionWatch program to identify, expose, and explain disinformation during elections around the world. The effort is part of a broader initiative to provide independent and credible research about the role of social media in elections, as well as democracy more generally.
Democracy depends on debate, but productive debate depends on facts. In government for and by the people, facts are a foundation. Too often in recent years, we have witnessed attacks on this foundation: the deliberate spreading of false information, hostile state actors promoting divisive content, and attacks on fact-based reporting and evidence-based research.
One fact is clear: humans are more connected than at any other time in the history of our species.
Our connectivity has been an overarching force for good and is a trend not likely to be reversed. By reducing the time and space between families, communities, and businesses, we are closer than we’ve ever been. However, connectivity has also deepened the fault lines that run along the seams of societies. Bad actors or aggressive minorities can abuse connections for power, profit, or propaganda.
The world is on the move, and in movement we see contradictions. It’s these paradoxes that we must solve and shape.
In recent testimony in front of the United States Congress, Mark Zuckerberg explained:
It’s not enough to just connect people, we have to make sure those connections are positive. It’s not enough to just give people a voice, we have to make sure people aren’t using it to hurt people or spread misinformation. It’s not enough to give people control of their information, we have to make sure developers they’ve given it to are protecting it too.
We agree.
@DFRLab is committed to the mission of making sure that tools designed to bring us closer together aren’t used to instead drive us further apart. That mission starts with facts.
Our societies increasingly get information from tools designed to connect, rather than inform; platforms built for engagement, not analysis. This marks a tectonic shift, in which we have an expanding encyclopedia of information literally at our fingertips. We’re firm believers that the only limiting factors of human capacity are our own creativity and will. But as we continue to design and create, we must ask not only whether we can do it, but also whether we should do it. Will it, whatever it is in the near and far future, make society more free and more fair?
@DFRLab’s mission is to expose and explain falsehood online, and to identify its sources and amplifiers. Our team is looking at everything from the conflict in Syria, to protests in Russia, to politically motivated automation and bots in Malaysia. Using open source means that we do not ask our readers to take our credibility for granted: we present our findings in a way that anyone with access to the internet can verify them.
We want to lay out a range of information that we can either prove or disprove and let our audience draw their own conclusions. We also emphasize breaking examples of disinformation down into their component pieces and explaining the overall challenge, because terms like “fake news” and “botnets” make the problem feel more pervasive and daunting than it has to be.
We’re building a community of #DigitalSherlocks in journalism and civil society, who incorporate the same methods into their own work.
Disinformation isn’t a new challenge. People have been propagating false narratives to achieve ideological aims since before Gutenberg invented the printing press. But, today, technology allows information to leapfrog a traditional marketplace of ideas. Disinformation can spread on industrial levels and evolve with the tools enabling it.
Elections are a prime target, because they are a process in which people come together to debate, then decide their future.
What we’ve seen across the democratic process is a toxic mix between deliberate disinformation, unintentional misinformation, existing social rifts, and increasing polarization. This is true across cases @DFRLab has analyzed over the past two years. Russia mounted a sophisticated operation to influence elections in the United States. Certain parties used false images and campaign material to stir emotive reactions from voters in Germany. The specter of #MacronLeaks hung over elections in France. Politicians automated populism with opt-in bots during Italian elections.
Not all of those tactics are outside of the realm of “acceptable” political discourse. The unifying characteristic in each case was, again, that disinformation was designed to drive citizens further apart rather than closer together.
Our partnership with Facebook is forward looking and will allow @DFRLab to focus more closely on the challenge of defending democratic debate in elections around the world. It will not change the way @DFRLab works: we will continue to cast an independent and critical eye on all platforms, including Facebook itself. We will not be monitoring elections for Facebook: the company has its own dedicated team for that, expected to reach 20,000 people this year. Our mission will be to monitor the whole information space, from social media giants to emerging and locally relevant platforms to traditional media, and to the engagement spaces in between.
The challenge is understanding the scope and trends of disinformation so that we can move from being reactive to proactive. Earlier this month, our team saw automated narratives ahead of local elections in Malaysia on May 9. Two hashtags directed at opposition parties reached more than 300,000 users and generated more than half a million impressions. While the accounts spreading those messages were mostly fake, the citizens reached were likely not.
We are partnering with the world’s largest online community because we face a critical moment in and among democracies worldwide — regionally, nationally, and at the local level. From the 2018 midterms in the United States to the European Parliament elections in 2019, and major national elections this year in Brazil and next year in India, to name just a few, voters will head to the polls to cast their votes on the future. Social media has given us more ways to express ourselves and hold leaders more accountable, but we need to account for the vulnerabilities that accompany those benefits.
Forging digital resilience will not be immediate. It will take a lot of work that cannot be done exclusively by social media companies or a research community. We need to close the current information gap between governments, tech companies, and media in order to solve challenges like disinformation. Collective challenges require collective solutions. Acting in silos is not enough.
Some within our community may view this partnership with skepticism. We would expect nothing less and encourage more. In fact, healthy skepticism is a vital part of digital resilience. Our partnership with Facebook is built on @DFRLab’s core values of openness and open-source research, and of intellectual independence and verifiable information. We will continue to base our research on those values as we look to defend the integrity of democratic debate.
We hope you’ll continue to join us in the effort.
Graham Brookie is the Director and Managing Editor of the Atlantic Council’s Digital Forensic Research Lab (@DFRLab).
Follow along for more in-depth analysis from our #DigitalSherlocks.