On April 12, Facebook removed 234 Indonesian pages, accounts, and groups for spreading polarizing political messaging just a week before the country’s elections.
Indonesia’s 192 million voters go to the polls on Wednesday, April 17, for presidential, parliamentary, and local elections. The presidential race pits incumbent Joko Widodo, or “Jokowi,” against former general Prabowo Subianto, who has been accused of human rights abuses and seeks to dramatically bolster the military establishment. Indonesia reportedly has the fourth-highest number of Facebook users in the world, making the platform a prime target for manipulation in the country.
Facebook shared the names of some of the pages and groups with the DFRLab shortly before the takedown. These assets mixed attacks on the incumbent president and praise for his challenger with apolitical content, possibly as an audience-building technique.
In an update to an earlier blog post, Facebook stated that it had removed these assets:
“…for engaging in coordinated inauthentic behavior as part of a domestic network in Indonesia. This activity originated in Indonesia and the people behind it misled others about who they were and what they were doing. They used fake accounts and frequently posted about local and political news including topics like upcoming elections, alleged election fraud, candidate views, and alleged misconduct of political figures. As always, we took action based on the behavior of these actors, not the content they posted.”
Their impact was low, as measured by follower numbers and engagements, but their potential reach was higher because of the number of groups they managed.
A Known Fake Name
Three pages were using the identity of a known false persona, Annisa Madaniyah, which had previously spread a hoax about the North Maluku gold mine being given to the Chinese and had also posted comments insulting Jokowi’s mother in 2017.
The pages often published the same posts within a tight time-frame, suggesting coordination across the network.
Linked by Groups
The DFRLab identified four pages that managed over ten groups each. Most of the managed groups were the same across these pages.
The pages Satu Dua Tiga, Sekilas Info, and Berita Sekitar Indonesia posted the same links to Gelora.co, Nusanews.id, and Gelora.news at nearly the same time. This behavior is typical of coordinated networks with shared managers or a shared content strategy whose intent is to amplify a specific outlet, while making the amplification look organic.
One page, RBNS News, posed as a news outlet. In February and March, it shared posts in English, Chinese, and Russian, linking to three Islam-focused Blogspot pages (respectively called islam-news-online, islam-xinwen, and islam-novosti-russia). The posts in each language concerned the same subjects, usually Islamic prayer and scripture, and used the same images.
Prior to these posts, the page had been silent since late November. Earlier, it had primarily shared posts from a website called superbola.win. That website is no longer operational, but it shared a name with the Facebook page Superbola.com, which also featured in this Facebook takedown.
The page was ostensibly devoted to sports, but its recent posts were more political or religious in nature and linked to a website called sew0rd.com that described itself as a “portal for Islamic news.”
This report presents a snapshot of the pages that Facebook took down for “coordinated inauthentic behavior” ahead of the election.
The pages visibly formed a dense and self-reinforcing network, with clusters of pages sharing links to the same articles and websites at almost the same time. Some of the identities they claimed had already been exposed as false ones.
The pages mingled religious and human-interest content with political content, predominantly attacking incumbent President Joko “Jokowi” Widodo and supporting his main rival, Prabowo Subianto.
Their overall direct follower count was low, at around 17,000 (with probable redundancies between the follower counts for different pages). Typical posts only achieved a handful of engagements, if any. Nevertheless, these pages managed a large number of groups, well over 100, which gave them the chance to achieve significantly more reach.
This does not appear to have been an especially effective operation. Given the proximity of the elections, however, its potential for spreading false or divisive messaging remained significant.
Kanishk Karan is a Digital Forensic Research Associate at the DFRLab.
Ben Nimmo is Senior Fellow for Information Defense at the DFRLab.