#TrollTracker: Twitter Troll Farm Archives

Part Four — Expanding on key conclusions from the Russian and Iranian troll farms

(Source: @DFRLab)

On October 17, Twitter released an archive of over ten million tweets posted by accounts from 2013 through 2018. Of the total, over nine million tweets were attributable to 3,800 accounts affiliated with the Internet Research Agency, also known as Russia’s infamous St. Petersburg troll factory. Over one million tweets were attributable to 770 accounts originating in Iran.

Both sets are included in the same archive; however, because the actors and their activity were separate, we analyzed them separately.

In an effort to promote shared understanding of the vulnerabilities exploited by various types of online influence operations, as well as social media’s role in democracy, @DFRLab had a brief advance opportunity to analyze the nearly complete archive.

What sets this archive apart is Twitter’s consolidation and release of all accounts the platform assesses with high confidence to be associated with Russia’s Internet Research Agency, along with the separate Iranian accounts.

In Part Four of our series, @DFRLab expands on assessments of content from the entirety of the archive — including Russian and Iranian troll farms.


1. All Content Points Home

Both troll operations put their governments’ needs first. Russia’s troll operation primarily targeted Russian speakers, while Iran’s focused on pushing regime messaging abroad by promoting aligned websites. The Russian operation’s subsequent use of English-language posting showed how a capability designed for domestic influence could be turned abroad.

2. Multiple Goals

The Russian operation had multiple and evolving goals. One main purpose was to interfere in the U.S. presidential election and prevent Hillary Clinton’s victory, but it was also aimed at dividing polarized online communities in the U.S., unifying support for Russia’s international interests, and breaking down trust in U.S. institutions.

3. Community Targeting

Both operations targeted highly engaged, highly polarized online communities, especially in the United States. The Russian operation attempted to infiltrate and polarize them, while the Iranian operation tried to message them. Any attempts to increase domestic resilience should prioritize working with such communities.

4. Equal-Opportunity Troll Farms

The Russian trolls were non-partisan: they tried to inflame everybody, regardless of race, creed, politics, or sexual orientation. On many occasions, they pushed both sides of divisive issues. It is vital to recognize this factor to end the partisan perception that Russian influence operations focused on one side of the political spectrum. Their focus shifted over time, and at specific moments, based on the target audience.

5. Opportunism

The Russian trolls often chose targets of opportunity, especially elections and terrorist attacks, in their attempts to interfere in local politics. This included promoting anti-Islam hashtags after the Brussels terror attacks, a pro-Leave hashtag on the day of Britain’s Brexit referendum, and leaks targeting French President Emmanuel Macron before his election. These opportunistic attacks had little to no impact on the target populations.

6. Evolution

Both troll operations evolved, apparently through a process of trial and error in content and messaging. Their activities in 2014 were different from their activities in 2018. Countermeasures will have to take further evolution into account.

7. Low Impact

Other than in the United States, the troll operations do not appear to have had significant influence on public debate. There is no evidence to suggest that they triggered large-scale changes in political behavior purely on the basis of their social media posts.

The full analysis is below.


1. All Content Points Home

Both the Russian and the Iranian campaigns put their governments’ needs first. It should always be remembered that the Russian operation began by targeting the Russian opposition, and was exposed by Russian journalists.

The Russian troll operation, run by the “Internet Research Agency” in St. Petersburg, posted far more in Russian than in English in its early years.

Timeline of posts in Russian, English and undeclared languages by the Russian troll farm, 2010–18. (Source: Twitter)

Activity spiked in late 2014 and early 2015. For context, at the time Russia was fighting an undeclared war in Ukraine and facing the “Manezhka” demonstrations at home, inspired by anti-corruption activist Aleksei Navalny. The content sought to maintain support for the military incursion and quell domestic dissent.

The traces of that activity are readily visible in the Twitter archive. On one day alone, January 15, 2015, the troll farm boosted the hashtag “#АнтиМанежка2” (anti-Manezhka2) 1,968 times. Of those, 178 were invitations to other users to join the hashtag storm, posted by dozens of different accounts.

Image from the Twitter archive of tweets from January 15, 2015, with the text “КТО ХОЧЕТ В КОМАНДУ ВЫВОДИТЬ ТЭГИ? ПОЛУЧАТЬ БУДЕТЕ ФОЛЛОВЕРЫ ЗАВТРА ВЫВОДИМ ТЭГ В 16:00 МСК ПИШИТЕ РЕТВИТ #АнтиМанежка2” (translated from Russian: “Who wants to join the team pushing tags? You’ll get followers. Tomorrow we push the tag at 16:00 MSK. Write ‘retweet.’ #AntiManezhka2”). (Source: Twitter)

The Russian troll farm referred to Navalny in Russian 14,728 times throughout the operation, including launching hashtags such as #ТьфуНаТебяАлексейНавальный (approximate translation: “Up yours, Aleksei Navalny”), #НавальныйВор (“Navalny thief”), #ПровальныйНавальный (“Failed Navalny”), and #РусскаяВесна (“Russian Spring”), the last of which was a sarcastic comparison of the domestic protests with those of the Arab Spring.
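Tallies like these can be reproduced from the raw archive. Below is a minimal sketch of hashtag counting, assuming the tweets have already been read out of Twitter’s CSV release into a list of text strings (the sample tweets are invented for illustration):

```python
def count_hashtag(tweets, tag):
    """Count tweets whose text contains the given hashtag, case-insensitively."""
    tag = tag.lower()
    return sum(1 for text in tweets if tag in text.lower())

# Tiny stand-in for the ~9 million archived tweets.
sample = [
    "#НавальныйВор очередное расследование",   # "Navalny thief: another investigation"
    "Опять этот #НавальныйВор",                # "That Navalny thief again"
    "Погода сегодня отличная",                 # unrelated tweet
]
print(count_hashtag(sample, "#НавальныйВор"))  # → 2
```

Case-insensitive matching matters because hashtag capitalization varied wildly across the troll accounts.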

The Russian troll farm mentioned President Putin 166,482 times. This included hashtags such as #ОкейПутин (“Okay, Putin,” 1,346 times on December 18, 2014, referring to his end-of-year press conference), #ПрямаяЛиния (“Direct Line,” 3,493 times in 2014–16, referring to his annual marathon public appearance), and even #PutinPeacemaker (377 times in a few hours on September 28, 2015, as Putin addressed the UN General Assembly).

Other pro-government hashtags included #СДнёмРожденияЛавров, wishing a happy birthday to Foreign Minister Sergei Lavrov, and #СДнёмРожденияШойгу, wishing the same to Defense Minister Sergei Shoigu. These hashtags had some penetration among Russian users not linked to the troll farm.

“Clever, right, ours. #HappyBirthdayLavrov.” Tweet by @shadddaur on March 21, 2017, using the troll farm’s hashtag. (Source: Twitter / @shadddaur)

The Russian troll farm accounts stepped to Russia’s defense when needed, such as after Malaysia Airlines flight MH17 was shot down over Ukraine on July 17, 2014. The day before the shootdown, the trolls posted 18,949 times. On the day itself, their output dropped to 13,027 tweets on a range of topics.

The following day, they posted 57,646 times, clustering around three hashtags: #КиевСбилБоинг (“Kiev shot down the Boeing”), #ПровокацияКиева (“Provocation by Kiev”), and #КиевСкажиПравду (“Kiev, tell the truth”).

Scan of the hashtag #ПровокацияКиева (translated from Russian: “provocation by Kiev”), posted on July 18, 2014, showing some of the 22,240 results. (Source: Twitter)

The operation was not static. By mid-2016, the trolls were posting in English more than in Russian. At the same time, their center of gravity shifted towards U.S. political issues. However, the English-language trolls also defended Russia when needed, although not on the same scale.

For example, they mentioned special counsel Robert Mueller and his Russia investigation 6,712 times, often with the words “witch hunt” added. This was a substantial figure, but only half the traffic on Navalny. Some of the comments were in favor of Mueller, in keeping with the accounts’ cover identities.

Much of their posting consisted of automated shares of American websites, judging by the “IFTTT” marker (short for “If This Then That,” an automation service) appended to their posts.

Screenshot from a scan of one of the posts terming Mueller’s probe a “witch hunt,” shared 78 times by different accounts; note the IFTTT references on the right. The headline phrase came from conspiracy site truthfeed.com. (Source: Twitter)

They also defended Russia’s allies. For example, they posted #SyriaHoax 201 times after Syrian President Bashar al-Assad’s forces used sarin gas on civilians in April 2017, triggering U.S. strikes.

Most came from a single account, @covfefenationUS, which posted the same string of hashtags and emojis dozens of times at a rate of roughly twice a minute.

Screenshot from a scan of the posts mentioning #Syriahoax, showing the total number of results and repetitive posts by a single account. The timestamp is on the right. (Source: Twitter)

The same account led calls to fire President Trump’s son-in-law, Jared Kushner, seen as the architect of the Syria strikes. It posted #FireKushner hundreds of times, again with emojis and other hashtags, on April 8–9, 2017.

Screenshot from a scan of the Russian troll farm’s posts on #FireKushner, April 2017. The ones with red series of emojis are from @covfefenationUS. (Source: Twitter)

It is important to keep the troll farm’s original purpose and targets in mind. This was a domestic operation designed to harass the Russian opposition, which then evolved into a tool of foreign influence. Russians were its first, and most important, victims, and the Russian government was its main beneficiary.

The same can be said of the Iranian operation, which was smaller in scale and much less subtle in approach. Over its lifetime, the operation posted 1,122,937 tweets, roughly one tenth of the Russian output. It, too, spiked in late 2014, then accelerated through 2017.

Timeline of posts by the Iranian troll operation, 2011–18. (Source: Twitter)

The surge in 2014 was driven by hyperactive tweeting by a small number of accounts, rather than a larger-scale operation. For example, Iranian account @marialuis91, a French-language account claiming to be a journalist, posted the same article to different users 422 times on December 1, 2014. The same day, another Iranian account, @BenhadiiHedia, posted a different French-language article 223 times.

Screenshot of a scan showing shares of the same headline, unrelated to Iran’s interests, by @BenhadiiHedia, on December 1, 2014. (Source: Twitter)

On November 11, 2014, during the rapid surge in volume, all the Iranian accounts together posted 1,143 times. Of those, 681 were shares by @marialuis91 of an article on anti-Muslim racism in France. The same account shared an article on French imperialism in Syria 103 times, and an article about French ex-president Nicolas Sarkozy 264 times. In total, 1,048 of the posts that day — almost 92 percent — came from the same account.

All those shares were of articles by the French-language version of awdnews.com, one of many websites which took Iranian pro-regime messaging and stripped it of its attribution, in an apparent attempt to launder it to new audiences. As our separate analysis shows, this was the Iranian operation’s main goal: driving users towards pro-Iranian websites.

Over the life of the operation, the Iranian accounts, especially @marialuis91 and @BenhadiiHedia, shared links to awdnews.com 329,730 times, more than a quarter of all traffic. They shared links to iuvmpress.com, the core of the laundering operation, another 15,000 times.

Screenshot of a scan showing shares of links to awdnews.com. Note the slew of shares by @marialuis91 and @BenhadiiHedia. (Source: Twitter)
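Link tallies of this kind reduce to extracting and counting domains across the archive. The sketch below illustrates the idea; note that many links in the real archive are t.co redirects that would first need resolving, and the sample tweets and URLs here are invented for illustration:

```python
import re
from collections import Counter
from urllib.parse import urlparse

URL_RE = re.compile(r"https?://\S+")

def count_link_domains(tweets):
    """Tally the host of every URL found in a list of tweet texts."""
    domains = Counter()
    for text in tweets:
        for url in URL_RE.findall(text):
            host = urlparse(url).netloc.lower()
            if host.startswith("www."):
                host = host[4:]
            domains[host] += 1
    return domains

sample = [
    "Lisez ceci http://fr.awdnews.com/article-1",
    "A lire http://fr.awdnews.com/article-2",
    "Breaking https://iuvmpress.com/story",
]
print(count_link_domains(sample).most_common(1))  # → [('fr.awdnews.com', 2)]
```

Sorting the resulting counter by frequency is what surfaces outliers like awdnews.com, which accounted for more than a quarter of all Iranian traffic.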

Such massive amplification of pro-regime (and possibly regime-controlled) websites indicates that the troll operation’s main purpose was to promote pro-regime messaging to online audiences. This was not about dividing Iran’s enemies, but about getting them to read Iran’s messages.

2. Multiple Goals

The Iranian troll operation focused on promoting regime messaging, but the Russian troll operation had multiple goals which evolved as the campaign progressed. It interfered in the U.S. presidential election in 2016, attempted to undermine trust in U.S. institutions, and sought to polarize American online communities. (We discuss polarization in point four, below.)

The prime target of the election interference was Hillary Clinton. Russian attacks on her began even before she declared her candidacy, on April 12, 2015. At 07:53 that morning, troll account @politweecs posted, “#Hillary would be an ‘excellent president’ — #Obama. Tastes differ.”

Two days later, the Russian troll accounts clustered around the hashtag #HillaryNoThnx, using it over 250 times in a day.

Screenshot of posts by the Russian troll farm on #HillaryNoThnx. Note the highlighted claim of her being an “old Russophobe.” (Source: Twitter)

Throughout the campaign, they posted anti-Clinton content. Hashtags pushed by the Russian troll accounts included #DemsWontPass (304 times), #Hillary4Prison (200 times), #LockHerUp (1,459 times), #NeverHillary (3,593 times), and #CrookedHillary (1,993 times).

Screenshot of a scan of posts by Russian troll accounts on #CrookedHillary. (Source: Twitter)

After Wikileaks began publishing emails hacked from Clinton campaign manager John Podesta by Russian military intelligence, the trolls posted on #PodestaEmails 858 times.

Simultaneously, the Russian troll farm began to spread content in support of Senator Bernie Sanders, in an apparent attempt to weaken Clinton’s support within her own party. This included posting using a hashtag popular among Sanders supporters — #FeelTheBern — 1,840 times, mostly in late 2015 and early 2016. Typical posts included these:

“It’s so hard to choose when you’re a liberal #FeelTheBern #HillaryForPrison” (@TheFoundingSon, February 17, 2016)
“#Bernie is the only candidate that is on the board. #FeelTheBern” (@TrayneshaCole, April 2, 2016)
“I would rather see the first honest president than the first female president. Bigger milestone.#FeelTheBern” (@MissouriNewsUS, June 24, 2016)

The approach to Republican candidates was much less consistent. Some early posts backed Donald Trump in August 2015, under the hashtag #TrumpBecause, posted 1,201 times, with text such as “#TrumpBecause Trump is the only legitimate candidate consistently speaking the truth, and leading in polls” (@LazyKStafford, August 13, 2015).

Others mocked him with #TrumpCampaignSlogans, such as “A chicken in every pot, and a bankruptcy in every casino” (retweeted by @cassieweltch) and “CAN’T TYPE. NEED TO OVERCOMB” (@AronHolden8, July 25, 2015). The Russian troll farm engaged on this hashtag 525 times.

The approach to Jeb Bush was similar. On February 2, 2016, Russian troll farm accounts coalesced around the hashtag #JebWeCan. Some posted in favor of Bush, with messages such as, “Bush rly loves our country! #JebWeCan #IloveObama” (@Ivey_Mexx) and “Jeb it’s time to serve your country! #JebWeCan #IloveObama” (@RachDurant).

Others attacked him, with lines such as, “I guess you are smart enough not to vote for Jeb! #JebWeCan #ILoveHillary #IloveObama” (@brightandglory) and “Jeb Bush doesn’t even have any clear position. What is he going to do during his presidency? #JebWeCan #ILoveHillary” (@owenywade).

Frequently, and well into 2016, the Russian troll farm posted in favor of Ted Cruz. They appear to have divided their attention between Cruz and Trump until the latter secured the nomination. For example, during the Republican primary debate in South Carolina on January 14, 2016, posts by Russian troll accounts included these:

“Our country has future only with Trump #GOPDebateSC” (@patriotraphael)
“#Trump will beat em all! #GOPDebateSC” (@judelambertusa)
“We have to fight for a future where everyone has a chance at the American Dream and choose Cruz as our president #GOPDebateSC” (@priceforpierce)
“Christian country needs a Christian candidate! Go #Cruz #GOPDebateSC” (@_nickluna_)

In late February, Russian troll farm accounts joined the hashtag #CrookedCruz to a string of pro-Trump hashtags such as #TrumpWillWin.

Screenshot from a scan of posts by the Russian troll farm on #CrookedCruz and other hashtags. (Source: Twitter)

However, as late as the South Carolina primary and Nevada caucus in late February 2016, Russian troll farm accounts still attacked Trump and backed Cruz.

“#scPRIMARY #NevadaCAUCUS voters it’s time 2 place the #Trumpster in the dumpster. Send Trump” (@jacquelinisbest)
“Elect a REAL #conservative 2 reverse the damage President Obama has done to our county. #scPRIMARY #NevadaCAUCUS @TedCruz is the man #PJNET” (@jeannemccarthy0)
“Reagan: It’s your BIRTHRIGHT to dream great dreams! CRUZ can revive the DREAM! #CruzCrew #PJNET #TCOT #CCOT” (@westernwindwes)

Once Trump secured the nomination, many Russian troll farm accounts threw their weight behind him. By October 2016, in the buildup to the election, their posts were routinely pro-Trump and anti-Clinton, and amplified the Wikileaks dumps. On election day, they were particularly one-sided.

“Judgement Day is here. Please vote #TrumpPence16 to save our great nation from destruction! #draintheswamp #TrumpForPresident” (@DevineDevinBr)
“I know vote is pvt, unless 1 chooses to reveal. I’m behind (not voting obviously) #HillaryForPrison2016 Your colours? #TrumpForPresident” (@J0hnLarsen)
“I don’t want a criminal in office! I’d vote for Monica before I vote for Killary! #Trump #MakeAmericaGreatAgain #TrumpForPresident” (@AmelieBaldwin)

This was a large-scale influence campaign aimed at suppressing support for Clinton and galvanizing support for Trump.

Simultaneously, the Russian troll accounts tried to paint a picture of America as undemocratic and the election as rigged. This narrative was built up by Kremlin outlets RT and Sputnik over the year; the troll accounts provided incidental amplification. Five Russian-language accounts retweeted an RT article alleging that Google was rigging its search results to favor Clinton. Strikingly, Wikileaks tweeted a Sputnik article making the same claim.

Screenshot of five Russian troll farm accounts which retweeted RT on Google’s autocomplete results. (Source: Twitter)

On election day itself, the trolls pushed the hashtag #riggedelection almost 100 times. Over the campaign as a whole, they posted on rigging almost 1,500 times, although not always in the context of the election, and #voterfraud 290 times.

Screenshot from a scan of Russian troll farm posts on the term “voterfraud.” (Source: Twitter)

On election day, troll @TEN_GOP posted two claims of voting machines rejecting votes for Trump. They scored a combined 31,847 retweets and 15,857 likes, a significant figure on such a crucial day.

These posts seemed to have two goals: to undermine faith in the legitimacy of the vote and to provoke anger against an expected Clinton win. Only with Trump’s victory did the claims of fraud die away.

3. Community Targeting

The Russian operation targeted highly engaged and polarized online communities in the United States from its earliest days. In most cases, the trolls targeted communities on both sides of the divide, attempting to infiltrate them, and then radicalize them with more polarizing messaging.

This is a vital factor in considering future resilience. Not all American communities were equally targeted and not all targeted communities were equally infiltrated; in general, the more engaged and polarized the community, the better the lookalike trolls performed. Attempts to build online resilience should take a community-based approach, and engage with verified leaders of the community to raise awareness of the danger of foreign influence.

Some of the Russian operation’s first English-language accounts, set up in the second half of 2013, masqueraded as members of the Black Lives Matter movement. Others posed as defenders of the police, under the hashtag #BlueLivesMatter.

Screenshot from a scan of the hashtag #BlackLivesMatter, showing the biographies of some of the accounts. Note the creation dates in July–August 2013, highlighted in red, and the bios, highlighted in blue. (Source: Twitter)
Screenshot from a scan of the hashtag #BlueLivesMatter, showing the biographies of some of the accounts. (Source: Twitter)

Various Russian accounts posted on a range of racial issues, including the rights of African Americans, Native Americans, and Hispanic groups.

Tweet by Russian troll account @Crystal1Johnson. (Source: Twitter / @Crystal1Johnson, via tweetsave.com)

Some Russian troll farm accounts posed as supporters of gun rights, using the hashtag #2A (short for “second amendment”) in their bios. They often claimed to support Donald Trump, using the hashtag #MAGA (“Make America Great Again”), while others opposed him, using the hashtags #Resist and #Impeach45 (a reference to Trump as the 45th president).

Screenshot of the profile page for Russian troll account @TEN_GOP, one of its most successful accounts. Note the #MAGA hashtag. (Source: Twitter / @TEN_GOP, via Reddit /r/the_donald)
Tweet by Russian troll account @KaniJJackson, cached by Google. The text in bold shows the account’s bio, including both #Impeach45 and #Resist. (Source: Twitter / @KaniJJackson, via Google cache)

The trolls targeted many communities. Some of their accounts posed as LGBT groups, others as Christian conservatives.

List of Russian troll accounts, showing the three LGBT handles. (Source: House Democrats)
Screenshot from a scan of posts from the Russian troll farm on the far sides of the political spectrum on #GayWeddingCake, dated April 2, 2015. (Source: Twitter)

Some of these personality accounts were hugely successful, scoring tens of thousands of followers and retweets. One in particular, @Jenn_Abrams, which posted on everything from race relations to Kim Kardashian, was regularly featured in international collections of funny posts.

Their approach was to post positive content about the target community, and then progressively introduce negative and polarizing content as its audience grew. @TEN_GOP, masquerading as the unofficial account of the Republican party in Tennessee, exemplified this strategy; @DFRLab analyzed it here.

Iran’s operation was much less community-specific in its origins, but after Trump’s election, and especially in 2018, it began creating accounts which played on American political divides.

Screenshot from a scan of Iranian troll farm accounts using “resistance” hashtags in their bios. Note the creation dates, all of which came after Trump’s victory. (Source: Twitter)

Later Iranian accounts also moved away from claiming identities as journalists or news outlets, and began to pose as everyday Twitter users, apparently realizing that, on social media, personality-based accounts drive engagement.

Such accounts included @InariAlesia, bio “a mother and a cook,” created on August 1, 2017; @NinaHanderson, claiming to be a Texas-based lawyer and created on September 1, 2018; @AllynBeake, again apparently in Texas, created on December 18, 2016, whose bio proclaimed, “Alone. Alone. Social Activist. Writer. Analyst. #Resistance #ImpeachTrump #Resist #Progressivism;” and @zarasmi75452578, created on June 26, 2018, bio, “live in US,Ohio,cleveland. Feminist and proud, cleveland State University.”

While they claimed personalities, they did not show them. Almost all their output was retweets of other Iranian troll accounts, mostly on Iranian foreign policy issues.

Screenshot from a scan of posts by Iranian-run Ohio feminist account @zarasmi75452578. Note the high number of retweets it posted, and the focus on Iranian foreign policy issues. (Source: Twitter)
Screenshot from a scan of posts by Iranian-run “mother and cook” @InariAlesia. Note that almost all are retweets of Iranian-run account @libertyfrontpr, leading to its website. (Source: Twitter)

In the most remarkable instance of an attempt to gain more personality, Iranian account @libertyfrontpr changed its name to @berniecratss, apparently trying to woo supporters of Senator Bernie Sanders, another highly engaged online group.

Before and after: screenshots of the profile page of the Iranian account, which changed its name in July 2018. (Source: Twitter / @libertyfrontpr / @berniecratss, via Google cache)

The lack of personalized content undermined these personalized accounts; few Iranian tweets achieved significant levels of engagement.

4. Equal Opportunity Troll Farms

The Iranian accounts consistently promoted regime narratives, but the Russian troll farm accounts were more complex. Simultaneously hyper-political and apolitical, they regularly took extreme positions on both sides of America’s most toxic debates.

It is vital to recognize this factor to end the partisan perception that Russian influence operations focused on one side of the political spectrum. Their focus shifted over time, and at specific moments, based on the target audience.

We have already noted their focus on opposing groups, such as Black Lives Matter and Blue Lives Matter or #MAGA and #Resistance users. This also took in individual issues.

For example, during the “take a knee” controversy, Russian troll farm accounts posted on both sides on the same day, using the unidiomatic hashtag #TakeTheKnee. The use of the odd hashtag may indicate a desire to have a unique hashtag for measuring purposes, or a lack of English skills. Posts included:

“Rosa Parks took a seat. We must #TakeTheKnee.” (@JemiSHaaaZzz, September 24, 2017)
“White America be like: ‘Why can’t they protest peacefully?’ Also White America: ‘But not like this. Or like that.’ #TakeTheKnee #TakeAKnee” (@wokeluisa, September 24, 2017)
“#IStandWithVillanueva True patriot. Thank you for your service. Thank you for STANDING UP. #TakeTheKnee” (@amconvoice, September 24, 2017)
“The NFL Should #TakeTheKnee for Player Violence Against Women” (@brristasi, September 24, 2017)

The text of the last tweet listed above reproduced a headline from conspiracy site truthfeednews.com, and was posted using the automation tool IFTTT.

Repeatedly, the Russian troll farm accounts clustered around hashtags which referred to breaking news, suggesting that they had been ordered to coordinate in this way.

After the mass shooting in San Bernardino, California, in December 2015, for example, they gathered around #Prayers4California. The great majority posted in favor of gun rights, but a handful used the same hashtag to call for gun control.

“Mass shooting occurs even in #GunFreeZones so people is the problem not guns #Prayers4California” (@micparrish, December 3, 2015)
“I’m tired of the whole anti gun thing. Saying that Guns cause murders is like saying Steering Wheels cause car wrecks #Prayers4California” (@JudeLambertUSA, December 3, 2015)
“mass shooting wont stop until there are #GunFreeZones #Prayers4California” (@LazyKStafford, December 3, 2015)
“Another shooting. Nothing new in America. We need to fix this issue and not be living it weekly.We need gun legislation! #Prayers4California” (@PatriotRaphael, December 3, 2015)

Interestingly, the identical tweet, including the typo “weekly.We”, was posted 33 minutes later by another account, @riogithief, suggesting that it had been copied and pasted within the troll farm from a list of potential tweets.

Screenshot from a scan of two tweets and a retweet using the identical text, including the typo. (Source: Twitter)
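Copy-and-paste reuse of this kind is straightforward to surface mechanically: group tweets by text and flag any text posted by more than one account. A minimal sketch, assuming (text, account) pairs have already been extracted from the archive (the rows below are invented for illustration):

```python
from collections import defaultdict

def find_shared_texts(rows):
    """Map each tweet text to the set of accounts that posted it,
    keeping only texts used by two or more distinct accounts."""
    by_text = defaultdict(set)
    for text, account in rows:
        by_text[text.strip()].add(account)
    return {t: accts for t, accts in by_text.items() if len(accts) > 1}

rows = [
    ("We need gun legislation! #Prayers4California", "PatriotRaphael"),
    ("We need gun legislation! #Prayers4California", "riogithief"),
    ("Unique tweet", "someoneelse"),
]
shared = find_shared_texts(rows)
print(shared)  # one shared text, posted by two accounts
```

In practice one would also normalize whitespace and casing before grouping, since operators introduced small variations when pasting from a shared list.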

After the Charlottesville neo-Nazi demonstration and violence, the vast bulk of Russian troll farm accounts posted tweets accusing “Antifa” of starting the fighting. However, a few with an apparently liberal persona took the opposing side.

“Milo Perfectly Explains Why the LEFT is Responsible for Charlottesville” (@elinsstr, August 13, 2017)
“WATCH: Father talks about daughter killed at #Charlottesville protest Heartbreaking #WhiteSupremacists” (@ellastrs, August 15, 2017, via IFTTT automation)
“Rufus Wainwright’s ‘Hallelujah’ dedicated to Charlottesville victim Heather Heyer was performed tonight at People’s State of the Union. Brings tears to my eyes. She will never be forgotten. #PeoplesSOTU” (@KaniJJackson, January 30, 2018)

The main exception to this bipartisan hate speech was when the issues concerned the Obama administration, or the United States’ financial or business elite.

For example, in late 2015 and early 2016, after the administration decided to let in thousands of Syrian refugees, the Russian troll farm accounts used #Syrianrefugees to attack the decision.

“Why do liberals invite #SyrianRefugees when the Real Unemployment rate is 18%? Blindness? #BeingPatriotic #PJNET” (@patriotous, February 10, 2016)
“#StopIslam #IslamKills Imagine how hard it would be to successfully vet #SyrianRefugees who have all “lost” their documents.” (@ChesPlaysChess, March 22, 2016)

During the controversy over the Dakota Access Pipeline, the trolls routinely attacked the American business and political establishment. In a scan of all 491 posts to use the phrase “Standing Rock,” the name of the Native American reservation involved, the great majority were negative, either criticizing the pipeline or accusing the authorities of violence against “water protectors.”

“Drone footage shows water cannons being used against #WaterProtectors #NoDAPL #StandingRock” (@BleepThePolice, November 25, 2016)
“A Historic Victory 21st century score: Indians 1, Army 0. #DAPL #StandingRock” (@Crystal1Johnson, December 5, 2016)

These posts appeared aimed at exacerbating hostility between American communities, and against the government and business elite.

5. Opportunism

The Russian troll farm accounts regularly latched onto breaking events with hashtag campaigns. Some resembled short-lived attempts to interfere in political debates in other countries, notably Britain’s Brexit debate and France’s presidential election.

We noted the reaction to the San Bernardino shooting. After the Brussels terrorist attacks on March 22, 2016, the Russian troll farm accounts pushed the inflammatory hashtag #IslamKills almost 3,800 times.

Screenshot from a scan of Russian troll farm account posts on #IslamKills. (Source: Twitter)

Some of these moments appear to have been planned in advance and dropped on schedule, such as the bizarre series of posts around #My911Story on September 11, 2015, with each troll account posting several conflicting stories.

For example, Russian troll account @micparrish posted:

“In 7th grade home room. TV was on. I think we were all in such a shock that we didn’t feel the fear quite yet #My911Story”
“it was a common day for me, I knew nothing about the terrorists attack till the evening #My911Story”
“I saw it on the news, I was surprised by how easy it was for terrorist to take over the planes #My911Story”

@DorothieBell posted:

“I knew something horrible had just happened, and I began to feel sicker #My911Story”
“I was at the post office when I heard that a plane flew into the first tower… #My911Story”
“I was trying to sleep when my wife woke me up after the first plane crashed #My911Story.”

The last post is particularly interesting, since the “Dorothie” account posed as a “Conservative wife, mother,” and is unlikely to have lived in a same-sex marriage.

At other times, their hashtags came out a day or two after the event in question, suggesting that editorial decisions were slow to respond to breaking news.

For example, after Hillary Clinton declared her candidacy on April 12, 2015, it took the troll farm’s accounts until April 14 (Russian time) to coalesce around the hashtag #HillaryNoThnx, attacking her. Again, the hashtag was noteworthy for its curious construction.

Screenshot from a scan of Russian troll farm posts on #HillaryNoThnx. (Source: Twitter)

Some of these hashtags of opportunity appeared to be attempts to interfere in foreign political processes. On June 23, 2016, as Britain held its Brexit referendum, the troll farm’s accounts posted #ReasonsToLeaveEU 1,102 times, a mixture of authored tweets and retweets. They were apparently spearheaded by @WorldOfHashtags, which posted, “Everybody is obsessed with #EUref today. So let’s play #ReasonsToLeaveEU.”

Screenshot from a scan of Russian troll farm posts on #ReasonsToLeaveEU on Brexit day. (Source: Twitter)

This appears to have been an attempt, on voting day, to make a pro-Leave hashtag trend. However, it should not be taken as evidence of a larger Russian attempt to interfere in Brexit. Over its entire career, the Russian troll farm posted on #VoteLeave only 35 times and mentioned “Brexit” 4,437 times, mostly after the vote, suggesting that there was no concerted campaign around the issue.
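Counts like these can be reproduced from the published archive with a short scan. The sketch below is illustrative only: it assumes the archive is a CSV with the tweet text in a `tweet_text` column (the file name and column name are assumptions; adjust them to the release you downloaded).

```python
import csv
from collections import Counter

def count_terms(csv_path, terms):
    """Count tweets containing each term, case-insensitively.

    Assumes one tweet per row, with the text in a 'tweet_text'
    column (an assumption about the archive's CSV layout).
    """
    counts = Counter()
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            text = (row.get("tweet_text") or "").lower()
            for term in terms:
                if term.lower() in text:
                    counts[term] += 1
    return counts

# Hypothetical usage against a downloaded archive file:
# count_terms("ira_tweets.csv", ["#VoteLeave", "Brexit", "#ReasonsToLeaveEU"])
```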

Similarly, on May 6, 2017, just before France held the second round of its presidential election, troll farm accounts posted the #MacronLeaks hashtag 77 times. The leaks were reportedly perpetrated by Russian hackers. They were pushed on Twitter by American far-right activist Jack Posobiec and Kremlin validator WikiLeaks, as @DFRLab reported at the time.

Screenshot from a scan of Russian troll farm posts on #MacronLeaks, showing the small number of posts, and the names of the accounts. (Source: Twitter)

Some of the troll farm’s highest-profile accounts, including @SouthLoneStar (346 retweets), @Pamela_Moore13 (1,526 retweets) and @Jenn_Abrams (19 retweets), amplified the hashtag. A few also posted on hashtags boosting Macron’s opponent Marine Le Pen, around #Marine2017 (172 times) and #LaFranceVoteMarine (four times).

While these resemble short-term influence attempts, they cannot be classified as serious influence campaigns of the kind the troll farm ran against the United States. They were targets of opportunity, not systematic operations.

6. Evolution

Both troll farm campaigns evolved. We observed how the Iranian operation adopted more personal profiles, although not more personal content, in 2018; this may have been inspired by the Russian effort. We also observed how the Russian campaign adapted its messaging towards Republicans, shifting between Trump and Cruz before settling on the former.

Early Russian troll farm accounts attempted to spread entirely fake stories about manmade disasters. The most elaborate was the claim, on September 11, 2014, that a chemical plant in Louisiana had exploded; clustered around the hashtag #ColumbianChemicals, the campaign included tweets, SMS messages, YouTube videos, and even a Wikipedia page.

Another, on March 10, 2015, reported a phosphorus leak near American Falls, Idaho. This used the hashtag #PhosphorusDisaster and even included a post to CNN’s crowdsourced iReport site.

Screenshot of the CNN iReport story, posted by a user called Maxim Shilo. (Source: CNN, via the Wayback Machine)

The Russian troll farm pushed the story massively and tweeted about it over 2,500 times.

Screenshot from a scan of Russian troll farm traffic on #PhosphorusDisaster. Note the sheer number of results. (Source: Twitter)

Stories such as these forced the authorities to respond and deny the incidents, but the stories did not gain significant traction. By late 2015, the Russian troll farm accounts appear to have abandoned the attempt to create fake incidents and instead focused on real-life ones.

They appear to have taken a similar approach to their “personality” accounts. Early examples, such as the “African American” cluster created in late 2013, largely retweeted with little original content. These did not perform well and gathered only a few hundred followers in five years of activity.

Later accounts built up individual personalities, which changed as they received more audience response. @TEN_GOP, for example, often shared headlines straight from websites in its early life (November-December 2015) and added minimal comment.

“Severed pig’s head thrown at Philadelphia mosque door https://t.co/Xswstjh1my #WakeUpAmerica #ISIS” (Posted on December 8, 2015, zero likes, two retweets. The headline was shared from the Washington Post.)
“Fire her! Anti-american school principal bans Pledge of Allegiance and Christmas!” (Posted on December 13, 2015, one like, 10 retweets. The article came from Breitbart.)
“UK Anti-Terror Cops Reveal Muslims Offer No Help in Combatting Extremism. What about US?” (Posted on December 27, 2015, three likes, nine retweets. The article came from Breitbart.)

By early 2016, it was more outspoken and achieved greater impact.

“They should remove the muslim community if the American flag is a threat to them.” (Posted on January 30, 2016, 128 likes, 189 retweets. The article came from Conservative Post.)

Posts which engaged directly with other users tended to perform well, especially when combined with polarizing messages.

“Since all MSM lies.. let’s do our OWN POLL! RT for #Trump2016 LIKE for #Hillary2016 We the People and we decide!” (Posted on June 3, 2016, 15 likes, 197 retweets.)

This progressive radicalization of the account epitomized the feedback loop of online discourse: the Russian troll reacted to audience sentiment and became increasingly aggressive, while the audience responded with increasing appreciation. By election day, 2016, its aggressive posting style and hyper-partisan disinformation were bringing it thousands of amplifiers.

Screenshot of an election day post by @TEN_GOP alleging fraud. The post was retweeted over 25,000 times. (Source: Twitter / TEN_GOP, via archive.org)
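The feedback loop described above can be made visible by charting an account’s average engagement per tweet over time. This sketch makes the same layout assumptions as before; the column names (`user_screen_name`, `tweet_time`, `like_count`, `retweet_count`) are assumptions about the archive release, not confirmed field names.

```python
import csv
from collections import defaultdict

def monthly_engagement(csv_path, screen_name):
    """Average likes + retweets per tweet, per month, for one account.

    Column names ('user_screen_name', 'tweet_time', 'like_count',
    'retweet_count') are assumed; timestamps are assumed to start
    with 'YYYY-MM', so the first seven characters identify the month.
    """
    totals = defaultdict(lambda: [0, 0])  # month -> [engagement_sum, tweet_count]
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if row.get("user_screen_name") != screen_name:
                continue
            month = row["tweet_time"][:7]
            engagement = int(row.get("like_count") or 0) + int(row.get("retweet_count") or 0)
            totals[month][0] += engagement
            totals[month][1] += 1
    return {m: s / n for m, (s, n) in sorted(totals.items())}
```

A rising curve in the returned month-to-average mapping would correspond to the radicalization trajectory described for @TEN_GOP.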

After Twitter suspended the troll farm’s main accounts in the fall of 2017, the trolls appear to have retrenched. They launched a second wave of accounts, targeting the same communities (see our analysis here), but they seem to have tried to mask the more obvious signs of their foreign nature.

One striking feature of this second wave was its heavy use of IFTTT (“If This Then That”) automation, which lets users trigger posts automatically from a number of associated apps and feeds. The Russian troll farm accounts used IFTTT 291,275 times, the majority during the second wave, mostly to post links to hyper-partisan websites.

Screenshot from a scan of Russian troll farm posts using IFTTT software; note the number of results, and the dates in late 2017.
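The IFTTT count can be approximated by tallying the posting client recorded for each tweet. As with the other sketches, `tweet_client_name` is an assumed column name for the archive release, not a confirmed one.

```python
import csv
from collections import Counter

def client_counts(csv_path):
    """Tally tweets by posting client (e.g. 'IFTTT', 'Twitter Web Client').

    'tweet_client_name' is an assumed column name; rows without a
    recorded client are grouped under 'unknown'.
    """
    counts = Counter()
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            counts[row.get("tweet_client_name") or "unknown"] += 1
    return counts
```

Restricting the same tally to tweets dated after the fall 2017 suspensions would show whether automation was concentrated in the second wave, as the archive suggests.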

It is not clear what caused this evolution; it could indicate a lack of manpower (or trollpower), or a desire to reduce the scope for linguistic errors. In either case, it appears to have reduced the trolls’ impact, as their content was less personal, and therefore less engaging.

7. Impact

The Russian Twitter operations in Russian and English had mixed results, and their impact should not be exaggerated. The troll farms certainly scored individual successes. In Russian, they flooded politically important hashtags with pro-government commentary; in English, they spread false claims of election fraud and posted high-impact, polarizing comments targeting an assortment of communities.

Against these successes should be set the large number of accounts which achieved negligible results, and the many hashtags the troll farms launched which failed to take off — partly because their use of English was so poor. A handful of accounts, such as @TEN_GOP, achieved substantial followings, but many hundreds more did not.

The Russian experiments in the UK and France, and the whole of the Iranian experience, point to their limitations. Russian troll farm accounts did not run a campaign in favor of Brexit, or against Macron: their interventions were short-lived and unlikely to have had any real impact. The Iranian accounts seldom engaged with users, and seldom received engagement in return.

The troll farm Twitter accounts achieved some remarkable moments; but overall, they had a limited effect, especially in other countries.

Conclusions

The Russian and Iranian troll farm operations were fundamentally different in goal and outlook, but used similar tools, in the shape of Twitter accounts masquerading as members of the target communities. Russia’s attempt was far larger, and far more effective.

Many conclusions will be drawn as the research community analyzes these posts. @DFRLab’s immediate concern is with building resilience against future operations, particularly during elections.

The two operations show that American society was deeply vulnerable, not to all troll farm operations, but to troll accounts of a particular type. The type hid behind carefully crafted personalities, produced original and engaging content, infiltrated activist and engaged communities, and posted in hyper-partisan, polarizing terms.

Content spread from the troll farm accounts was designed to capitalize on, and corrupt, genuine political activism. The trolls encapsulated the twin challenges of online anonymity — since they were able to operate under false personas — and online “filter bubbles,” using positive feedback loops to make their audiences ever more radical.

The positive conclusion of this is that the trolls were less effective than may have been feared. Many achieved little or no impact, and their operations were washed away in the firehose of Twitter. Few scored significant followings. The Iranian operation shows how easy it is to ignore troll farm accounts, if they are not skillfully operated.

The negative conclusion is that the most effective Russian trolls used exactly the techniques which drive genuine online activism and engagement. That made it much harder to separate them out from genuine users. It will continue to do so. Identifying future foreign influence operations, and reducing their impact, will demand awareness and resilience from the activist communities targeted, not just the platforms and the open source community.

The influence operations on social media were an attack on groups within society. Any effective response will need to engage those groups, more than hostile actors can.


Ben Nimmo is Senior Fellow for Information Defense at the Atlantic Council’s Digital Forensic Research Lab (@DFRLab).

Graham Brookie is Director and Managing Editor at @DFRLab.

Kanishk Karan is a Digital Forensic Research Assistant at the Atlantic Council’s Digital Forensic Research Lab (@DFRLab).

Follow along for more in-depth analysis from our #DigitalSherlocks.