#TrollTracker: Facebook Uncovers Iranian Influence Operation

Iranian narratives buried in divisive content target United States and United Kingdom

On October 26, Facebook announced that it removed 82 pages, accounts and groups for “coordinated inauthentic behavior that originated in Iran and targeted people in the US and UK.”

According to Facebook, while these pages and accounts originated in Iran, it has so far identified no ties to the Iranian government. It was the second time in two months that Facebook shuttered an Iranian network, after taking down over 600 accounts in August. Last week, Twitter published over 1 million tweets from that same network; @DFRLab analyzed them here.

Facebook shared 10 pages and 14 Instagram accounts with @DFRLab 12 hours before the takedown. These accounts masqueraded primarily as American liberals, posting small amounts of anti-Saudi and anti-Israeli content interspersed with large volumes of divisive political content on topics such as race relations, police brutality, and U.S. President Donald Trump. This evolution from the more blatant pro-Iranian messaging of earlier operations suggests the operation had learned from previous takedowns.

These assets were designed to engage in, rather than around, the political dialogue in the United States. Their behavior showed how much they had adapted from earlier operations, focusing more on social media than on third-party websites and becoming much more engaging.

This post sets out the most important findings, giving an initial description of the assets' open-source, publicly visible features, focusing on those which suggested a lack of authenticity or a resemblance to earlier troll operations. It is vital to understand the evolution of this threat, to ensure that responses to it also evolve.

@DFRLab intends to make every aspect of our research widely available. The effort is part of our #ElectionWatch work and a broader initiative to provide independent and credible research about the role of social media in elections, as well as democracy more generally.

@DFRLab will analyze the accounts in further detail in the coming days.

1. Election Targeting

Some of the pages posted directly about American political processes, either discussing their own votes or calling on others to vote.

Post by @nornowar. (Source: Facebook / @nornowar)
Post by @nornowar. (Source: Facebook / @nornowar)
Post by @nornowar. (Source: Facebook / @nornowar)
Post by @nornowar. (Source: Facebook / @nornowar)

Given how close the U.S. midterm elections are, this is the most immediate and consequential component of the accounts' output.

2. Big Hitters

Some of the Facebook pages had very large followings and impressive numbers of shares. One, called I Need Justice Now (@INeedJusticeNow), had over 13 million video views; another, No Racism No War (@nornowar), had over 412,000 likes and almost half a million followers.

Analysis of the top performing Facebook pages, showing the total number of interactions, likes and video views. (Source: Facebook / CrowdTangle)
Screenshot of the “community” page from the account @nornowar, showing the number of likes and follows. (Source: Facebook / @nornowar)

3. Recent Creations

The accounts and pages were recently created, unlike the Twitter and Facebook assets and the websites taken down in August, many of which dated back years. This is significant: the dates imply either a separate Iranian effort or an evolution of the earlier campaign, one focusing more on divisive content.

Two of the most-followed pages, @VoiceOfChangee (with 113,155 followers) and @INeedJusticeNow (with 61,507 followers), were created this year, on February 3 and April 1 respectively.

Screenshots of the info pages for @VoiceOfChangee and @INeedJusticeNow, showing creation dates. (Source: Facebook / @VoiceOfChangee / @INeedJusticeNow)

The earliest, @nornowar, dated back to January 2016.

Screenshots of the info page for @nornowar, showing creation date. (Source: Facebook / @nornowar)

4. More Engaging, More Engagement

The accounts posted a much more engaging range of content than the earlier operation, which focused on using social media to drive users toward websites laundering pro-Iranian regime messaging. The latest batch of accounts sought to drive engagement on the platforms, rather than off them, with a mixture of memes, videos, and authored comments.

The approach appears to have worked, with posts on both Instagram and Facebook receiving large numbers of shares and replies.

Post by @VoiceOfChangee on Facebook. Note the positive tone and the hundreds of shares. (Source: Facebook / @VoiceOfChangee)
Post by @VoiceOfChangee on Facebook. Note the positive tone and the hundreds of shares. (Source: Facebook / @VoiceOfChangee)
Post by @know_the_realities on Instagram, identified by Facebook as part of the Iranian network. We have anonymized the comments to protect innocent users. (Source: Instagram / @know_the_realities)
Post by @INeedJusticeNow on Facebook. Note the number of reactions, shares and views. (Source: Facebook / @INeedJusticeNow)

5. Divisive Content

The great majority of posts by these accounts consisted of divisive and polarizing content, especially attacks on President Trump and the Republican Party. This approach resembles that of Russia's troll operation, although the Russian operation targeted both sides in America's most painful debates.

One Instagram account — @RepublicansUnited2 — masqueraded as a conservative Christian user. Its messaging was not explicitly divisive, and it ceased posting in September 2017; nevertheless, its identification as part of the network may indicate an initial intention to target both sides.

Post by @republicansunited2. Note the one dissenting voice among the comments, suggesting that even this post was, to some degree, divisive. (Source: Instagram / @RepublicansUnited2)

Most of the others posed as left-wing accounts, attacking Trump and the Republicans, and praising the Democrats.

Post by @shut_racism on Instagram; note the number of likes and the comments. We have anonymized all comments to protect the identities of apparently innocent users. (Source: Instagram / @shut_racism)
Post by @VoiceOfChangee. (Source: Facebook / @VoiceOfChangee)
Post by @nornowar. (Source: Facebook / @nornowar)

Many of the attacks were personalized against President Trump.

Post by @INeedJusticeNow. (Source: Facebook / @INeedJusticeNow)
Post by @VoiceOfChangee. (Source: Facebook / @VoiceOfChangee)
Post by @shut_racism. (Source: Instagram / @shut_racism)

Other posts focused on different divisive political issues, notably race relations, and especially police violence against the African American community.

Post by @shut_racism. (Source: Instagram / @shut_racism)
Post by @INeedJusticeNow. (Source: Facebook / @INeedJusticeNow)
Post by @nornowar. (Source: Facebook / @nornowar)

Some of the posts hinted at the need for violence, or called for it outright.

Post by @INeedJusticeNow on Facebook. (Source: Facebook / @INeedJusticeNow)
Post by @VoiceOfChangee. (Source: Facebook / @VoiceOfChangee)

This divisive posting constituted the majority of content, suggesting that one main aim of the Iranian group of accounts was to inflame America’s partisan divides. The tone of the comments added to the posts suggests that this had some success.

6. Anti-Israel, Anti-Saudi, Pro-Yemen

In between these posts, the accounts repeatedly attacked Iran’s regional rivals, Israel and Saudi Arabia. This is the strongest external indication that they were part of an organized Iranian network designed to amplify the regime’s chosen narratives, in the way that earlier networks did.

Post by @shut_racism, linking Ivanka Trump to the plight of the Palestinians. (Source: Instagram / @shut_racism)
Post by @wupame. (Source: Instagram / @wupame)
Post by @wupame. (Source: Instagram / @wupame)
Post by @know_the_realities. (Source: Instagram / @know_the_realities)
Post by @know_the_realities. (Source: Instagram / @know_the_realities)

Some opposed the United States’ policy in the Middle East more generally.

Post by @victoria.freeromwarrior. (Source: Instagram / @victoria.freeromwarrior)

7. Recycled Content

Some of the content posted by these accounts appeared original, but much more appeared to have been taken from authentic websites. This appears to have been an attempt to blend in with authentic communities and, perhaps, to attract the attention and endorsement of genuine users.

On September 10, 2018, for example, the page @TMag-English shared a video of a “kinetic door” which had been posted on YouTube the month before.

Left, post on Facebook by @TMag-English. Right, the same video on YouTube. (Source: Facebook / @TMag-English / YouTube / Ali Mumtaz)

On July 8, 2018, the same page shared a cartoon attacking France for hosting a meeting of the controversial MEK group, which is regarded by Iran as a terrorist organization and was listed as such by the United States until 2012. The original cartoon dated to 2015 and was posted on a Blogspot page called Latuff Cartoons.

Left, post by @TMag-English. Right, Latuff Cartoons post. (Source: Facebook / @TMag-English / Blogspot / Latuff Cartoons)

A post by @INeedJusticeNow on August 28, 2018, copied a Twitter appeal by former cricket star Kevin Pietersen to find two lion hunters, posted two days before.

Left, post by @INeedJusticeNow; right, tweet by @KP24. (Source: Facebook / @INeedJusticeNow / Twitter / @KP24)

8. Many Memes

All these accounts were heavy on meme content and light on text. This may have been a way of driving engagement, but it may also have reduced the need for original written posts, and thus the chance of language errors which would have betrayed them.

Meme by @know_the_realities, on Palestine. (Source: Instagram / @know_the_realities)

Some of the memes were remarkable for the errors they made, errors which appear unlikely to have been made by Americans.

One post contrasted the deaths of American soldiers in WWII with modern neo-Nazi marches in America, but the image it used was of Soviet soldiers, not U.S. GIs.

Left, post by @VoiceOfChangee. Right, eBay listing of the same image, identified as Red Army snipers in the URL. (Source: Facebook / @VoiceOfChangee / eBay)

Another was structured as a meme with two parallel images, but only made sense if read from right to left, as if translated from Arabic.

(Source: Facebook / @VoiceOfChangee)

9. Mostly Negative

While some of the pages made positive posts, as shown above, the great majority of their content was negative. This closely resembles the approach adopted by the Russian “troll farm” in its attacks on America from 2014 to 2018, and may indicate that the Iranian account managers were drawing on Russia's experience.

(Source: Facebook / @VoiceOfChangee)
(Source: Facebook / TMag-English)
(Source: Instagram / @shut_racism)

10. Artificial Amplification

Some of the accounts' amplification statistics, especially their shares, were so disproportionate to their followings that they suggested artificial amplification.

Post by @INeedJusticeNow. The account had 52,000 followers, but managed over 790,000 shares of this meme… (Source: Facebook / @INeedJusticeNow)
The same post shared by @VoiceOfChangee on a different occasion; note that this time, it only had 3,500 shares. (Source: Facebook / @VoiceOfChangee)
The most shared post by @VoiceOfChangee. Compare the likes (1,100) with the shares (41,600). (Source: Facebook / @VoiceOfChangee)
The most-shared post by @AlQudsDays; note the 8,000 reactions and over 40,000 shares, and over 2 million views. (Source: Facebook / @AlQudsDays)

Conclusions

The shuttered assets focused on promoting divisive content in America (one account also boosted left-leaning posts in the UK). In between that content, they amplified posts attacking Israel for its behavior toward the Palestinians, and Saudi Arabia for its treatment of Yemen and of its own citizens.

The foreign-policy messaging was in keeping with earlier Iranian networks, which primarily amplified Iranian government narratives on the Middle East. The focus on divisive content is much closer to the behavior of the Russian information operation, although the Iranian operation does not appear to have targeted conservatives as much as liberals.

Taken with Facebook’s own attribution, this suggests that the accounts were indeed part of an information operation supporting the Iranian regime, but that they adapted and evolved in light of earlier Iranian and Russian operations. This confirms earlier assessments that such troll operations have moved on since 2016, but are still active in evolving permutations, underscoring the importance of ongoing analysis to keep up with them.


Ben Nimmo is Senior Fellow for Information Defense at the Atlantic Council’s Digital Forensic Research Lab (@DFRLab).

Graham Brookie is Director and Managing Editor at @DFRLab.

@DFRLab team members Donara Barojan, Lukas Andriukaitis, Kanishk Karan, Aric Toler, Michael Sheldon, and Nick Yap made this report possible with their research.

@DFRLab is a non-partisan team dedicated to exposing disinformation in all its forms. Follow along for more from the #DigitalSherlocks.


DISCLOSURE: @DFRLab announced that we are partnering with Facebook to expand our #ElectionWatch program to identify, expose, and explain disinformation during elections around the world. The effort is part of a broader initiative to provide independent and credible research about the role of social media in elections, as well as democracy more generally.
