#TrollTracker: Facebook Uncovers Active Influence Operation
Preliminary analysis on accounts suspended by Facebook for inauthentic behavior surrounding politics in the United States
The accounts suspended by Facebook on July 31 for being inauthentic sought to exacerbate divisions and set Americans against one another.
It is too early to attribute them, but an initial scan shows behavioral patterns and use of language reminiscent of the troll operations run from Russia in 2014 through 2017.
Facebook shared eight pages with @DFRLab 24 hours before the takedown. This post gives an initial description of their open-source, publicly visible features, focusing on those which suggested a lack of authenticity or a resemblance to earlier troll operations.
Background: Inauthentic And Coordinated
False information and online polarization are on the rise in the United States. While countless factors drive both organically among Americans, covert influence campaigns, some steered from abroad, are using disinformation to push Americans further apart and to weaken trust in the institutions on which democracy stands.
On July 31, Facebook announced the removal of around 30 pages and accounts on its platform for “coordinated and inauthentic behavior.” In an announcement describing the actions taken on its platform, Facebook representatives explained:
We’re still in the very early stages of our investigation and don’t have all the facts — including who may be behind this. But we are sharing what we know today given the connection between these bad actors and protests that are planned in Washington next week. We will update this post with more details when we have them, or if the facts we have change.
It’s clear that whoever set up these accounts went to much greater lengths to obscure their true identities than the Russian-based Internet Research Agency (IRA) has in the past.
Facebook concluded, based on internal data, that the accounts were “inauthentic” and did not represent the individuals and groups they claimed to. @DFRLab has not had access to that data; however, our team has spent the last 24 hours poring over the specific pages against which Facebook took action.
@DFRLab’s mission is to identify, expose, and explain disinformation. Our goal is to create more digital resilience against influence operations like those Russia mounted against the United States during the 2016 elections. As such, we intend to make every aspect of our research broadly available as soon as we have had a chance to analyze the accounts. The effort is part of our #ElectionWatch work and a broader initiative to provide independent and credible research about the role of social media in elections, as well as in democracy more generally.
This post provides an initial overview of the pages. Subsequent posts will focus on thematic trends and the behavior patterns exhibited in more detail.
The pattern of behavior by the accounts and on the pages in question makes one thing abundantly clear: they sought to promote divisions and set Americans against one another.
Their approach, tactics, language, and content were, in some instances, very similar to accounts run by the Russian “troll farm” or Internet Research Agency between 2014 and 2017.
Similarities included language patterns that indicate non-native English and consistent mistranslation, as well as an overwhelming focus on polarizing issues at the top of any given news cycle with content that remained emotive rather than fact-based.
The set of accounts appeared, however, to use much stronger operational security. They focused on building an online audience and then translating it into real-world events, such as protests. Further, this specific set of accounts focused exclusively on engaging and influencing the left end of the American political spectrum.
Of note, the events coordinated by, or with help from, the inauthentic accounts did have a very real, organic, and engaged online community; however, the inauthentic activity appeared designed to catalyze the most incendiary impulses of political sentiment.
Again, these are initial findings. We will provide a more in-depth assessment as soon as possible.
The first account identified by Facebook was called “Resisters” (username @resisterz), which described itself as:
Online and offline feminist activism against fascism. Widespread public education and direct action, amplifying the voices from targeted communities.
The page was effectively anonymous. It did not name any page managers or moderators, and the only contact detail provided was a Facebook Messenger address.
@resisterz was created on March 21, 2017.
It took a liberal or left-wing stance on issues around gender, race, immigration, and human rights. The page’s most-engaged post concerned a rape case in South Africa; it outperformed the page’s other posts by hundreds of thousands of engagements.
However, it repeatedly made linguistic errors which are uncharacteristic of colloquial American English yet characteristic of native Russian speakers, especially the omission or misuse of the articles “the” and “a/an,” and difficulties with singular and plural verb forms. As @DFRLab has reported, this was one of the most telling identifiers of the troll accounts which targeted the U.S. from Russia in 2014–2017.
The page played on America’s internal divisions in the same way that Russian-run accounts targeted the U.S. in 2014–17. Its main focus, in keeping with its self-description and name (Resisters with a stress on “sisters”), was on gender issues.
The page also created an event in protest of the Trump administration’s policy of separating the children of undocumented migrants from their families.
Most immediately, it was the primary organizer of a counter-march against the “Unite The Right 2” rally, a far-right demonstration scheduled for August 10 in Washington, D.C.
The original “Unite The Right” rally, in August 2017, saw American neo-Nazis holding a torchlit procession through Charlottesville, Virginia. Fighting broke out between the far right and their opponents, and one woman was killed when a far-right supporter rammed his car into the crowd.
An inauthentic account promoting a counter-rally to “Unite The Right 2” suggested a desire to provoke further confrontations and violence.
Other events that it promoted — but did not necessarily host or co-host — included protests against U.S. Immigration and Customs Enforcement (ICE), U.S. President Donald Trump’s tax plan, protests against Trump’s Muslim ban, and a “March against rapist cops.”
These events tapped into the anger of the anti-Trump “resistance,” the Black Lives Matter movement, and supporters of America’s Muslim communities.
Promoting such events is not, in itself, indicative of malicious activity; the identification of this account as inauthentic rests primarily on Facebook’s own assessment, based on the full range of data to which only Facebook had access.
What an initial open-source scan did reveal was that the account featured language errors characteristic of earlier troll operations run from Russia, and that it amplified divisive content and events targeting the left of the political spectrum.
This page, @theancestralwisdom, created on March 12, 2018, focused on the African-American community and emphasized the history of Africa in general, and Ancient Egypt in particular, as a source of African pride.
Many of its posts were memes without comment from the page, limiting the amount of textual data available for analysis. These were largely left-leaning.
Black pride, and the beauty of African women in particular, were recurring themes.
Its only lengthy post was written in unexceptionable English, with none of the grammatical errors observed on @resisterz, though it did contain some typos.
The Facebook post did not credit the website for the article, even though it reproduced it wholesale (including spelling errors such as “conducive” for “conductive”). This is a small detail in itself, but recalls the plagiarism practiced by Russian intelligence account “Alice Donovan.” It is also, arguably, a case of inauthentic behavior.
More broadly, the emphasis on the African-American community and its alleged Egyptian roots was the same as that shown by known Russian troll accounts such as “Black4Black,” “Pan-African Roots Move” and “Nefertiti Community.” This extended, not just to similar sentiments, but to the invocation of identical models.
Most of the account’s posts were not overtly political. It appeared aimed at growing an audience among African-Americans; by July 31, 2018, however, it had achieved only seven followers.
Its overall behavior closely resembled that of known Russian accounts which targeted the African-American community in the 2014–2017 operation. However, the nature of its posts left too little information for a confirmed identification or attribution from open sources.
This page, @warriorsofaztlan, was created on March 20, 2017, one day before the @resisterz page. Its theme was an idealized portrayal of the Central American background of modern Latin Americans, very similar in mentality to the @theancestralwisdom page.
Its “About” section was laconic, providing no contact details other than a Facebook Messenger address. Its motto was “We empower our gente,” using the Spanish word for “people.”
The page promoted divisive content, emphasizing the suffering of Native Americans and Hispanics at the hands of the white community in general, and the Trump administration in particular.
Some of its posts invoked the possibility of armed resistance to oppression.
As with the @theancestralwisdom page, many of this page’s posts were reposts of content written by others, thus reducing the amount of linguistic evidence available for analysis. Those which appeared original were written in largely idiomatic English, without the errors typical of @resisterz.
Also as with @theancestralwisdom, the account’s memes, tone and posts strongly resembled those of known Russian troll account “Brown Power.”
This is insufficient for a conclusive identification from open sources. However, in light of Facebook’s identification of the account as inauthentic, it suggests that the account’s purpose was likely to promote divisions between the Latin American community and the white community.
This account, Progressive Nation, was created on July 11, 2017, and took an explicitly anti-Trump stance.
It combined attacks on Trump with praise for former President Barack Obama.
Like @resisterz, its use of language was non-native and contained grammatical errors characteristic of earlier Russian troll operations.
Like @warriorsofaztlan, it also posted anti-white memes focusing on the plight of Native Americans and modern immigrants.
It also posted about race issues, especially the “Take A Knee” controversy.
Again, few of its posts were of sufficient length to allow for a detailed linguistic analysis. One of the few lengthy ones attacked the Republican Party’s education cuts; it was copied word for word, including the imagery, from a 2015 article on website politicsusa.com.
Another lengthy (and literary) piece posted in November 2017 quoted a passage from Albert Camus to denounce America’s use of the death penalty. It was identical in image and wording to an Instagram post of May 20, 2017.
The account appears to have targeted the anger of a broad swathe of progressive, liberal, and minority groups. However, it achieved no successes; by March 12, 2018, when it made its last post, it had zero likes and zero followers.
This was not a high-impact account, but it did resemble earlier influence operations, combining non-native language use, plagiarism, and a range of divisive content. The plagiarism was probably an attempt to avoid the pitfalls of writing in a foreign language, one of the telltale features of earlier Russian campaigns.
As before, there is insufficient open source evidence for a firm attribution. Facebook’s action marked it as inauthentic, and the nature of the posts appeared to support that view. More precise identification is, as yet, not possible.
The newest account was @mindfulbeingz, created on May 31, 2018. Its content primarily focused on health, wellbeing, and spiritual issues.
Most of its posts were apolitical, often featuring memes with few or no supplementary comments.
A few came closer to being political, focusing on the mainstream media, social media, the pharmaceuticals industry, or all of them in the same post.
Most of the posts were meme-only. Only two were longer; both reproduced content posted by other sources, one attributed and recent, the other unattributed and older.
Unlike the Progressive Nation page, this account achieved at least some impact, with around 900 followers and likes by July 31, 2018.
It is unclear from these public posts why the account was singled out as inauthentic. Its behavior would be consistent with an attempt to build an audience in this particular community before targeting it with more overtly political content; this was the practice of earlier Russian trolls, as @DFRLab has chronicled. However, it would be equally consistent with a genuine account.
Given Facebook’s conclusion that these accounts were inauthentic, they appear to have constituted an attempt by an external actor — possibly, though not certainly, in the Russian-speaking world — to infiltrate left-wing American communities. The @resisterz account, in particular, attempted to mobilize its audience for a confrontation with the far right.
Such online activity poses a danger of both disinformation, which we define as the deliberate spread of false information, and misinformation, which we define as the unintentional spread of false information. The Russian operation of 2014–2017 showed how easily disinformation actors could seed their falsehoods into genuine American communities on the right and the left; Americans thus became the unwitting amplifiers of Russian information operations.
The accounts which Facebook suspended appear to have been primed to take the same approach and, more explosively, to trigger standoffs between genuine Americans, bringing the risk that false stories could provoke real-life violence.
Their behavior differed in significant ways from the original Russian operation. Most left fewer clues to their identities and appear to have taken pains not to post too much authored content. Their impact was, in general, lower: none approached the 300,000 followers amassed by the Russian troll account “Black Matters.”
Information operations, like other asymmetric threats, are adaptive. These inauthentic accounts, whoever ran them, appear to have learned the lessons of 2016 and 2017 and taken more steps to cover their traces. This was not enough to stop Facebook from finding them, but it does reveal the challenge facing open-source researchers and everyday users.
Their exposure underscores the ongoing threat facing American social-media users on both sides of the political spectrum. It would be dangerous to fall into the disinformation trap, but equally ruinous to believe or claim that every user who holds opposing views is part of a Russian information operation.
Above all, the exposure of these accounts reinforces the need for evidence-based analysis, clear open-source criteria for identifying influence accounts, and heightened awareness as the mid-term elections approach.
Ben Nimmo is Senior Fellow for Information Defense at the Atlantic Council’s Digital Forensic Research Lab (@DFRLab).
Graham Brookie is Director and Managing Editor at @DFRLab.
@DFRLab team members Nika Aleksejeva, Lukas Andriukaitis, Christina Apelseth, Michael Sheldon, and Aric Toler made this report possible with their research.
@DFRLab is a non-partisan team dedicated to exposing disinformation in all its forms. Follow along for more from the #DigitalSherlocks.
DISCLOSURE: @DFRLab announced that we are partnering with Facebook to expand our #ElectionWatch program to identify, expose, and explain disinformation during elections around the world. The effort is part of a broader initiative to provide independent and credible research about the role of social media in elections, as well as democracy more generally.