Kenya’s paid Twitter propaganda machine

ANCIR iLAB · Published in ANCIR · 10 min read · Jan 13, 2022

How a network of paid agents uses online “astroturfing” campaigns to set political agendas.

By Code for Africa

Executive summary

Social media platforms such as Twitter and Facebook have become major avenues for ordinary citizens to discuss politics, disseminate news, and organise collective action.

Coordinated messaging is inherent to any information campaign, which consists of a group of people who want to convey specific information to an audience. This is also true for genuine grassroots movements, which we define here as a movement initiated by one or several regular users, expanding organically by convincing other users of the movement’s merit. Its participants also send out similar messages, but they are intrinsically motivated, and usually organised in a more decentralised fashion than is the case for agents of an astroturfing campaign.

“Astroturfing” is defined on Wikipedia as: “The practice of masking the sponsors of a message or organization to make it appear as though it originates from and is supported by grassroots participants.”

We have identified one such astroturfing campaign in Kenya. On 2 August 2021, Kenya’s Deputy President William Ruto was blocked at Nairobi’s Wilson Airport from travelling to Uganda. The revelation that a Turkish man was part of Ruto’s planned entourage to Uganda caused debate in Kenya, especially after it emerged that he bore a similar name to an individual charged with terrorism in Germany two decades ago.

Our interest in the events that followed was piqued by the hashtag #RutoPlanningViolence. The first tweet under this hashtag was posted on 4 August 2021 at 6:30am, and took aim at Ruto’s motives. Our analysis suggests that this tweet opened a Pandora’s box: it led to a myriad of posts containing allegations of a Ruto-orchestrated plot to grab power through violence.

The network used current and historical events and controversies involving Ruto to propagate core narratives during the campaign. It spoke of an alleged “master plan” to use violence to win the upcoming election in August 2022, claimed Ruto had links to terrorists, and highlighted his role in Kenya’s 2007/08 post-election violence. The network also employed tactics such as using graphically manipulated images and automated accounts to amplify the campaign’s messaging.

The campaign

Online social media has revolutionised how people access news and information — and how we form opinions. By enabling exchanges that are unhindered by geographical barriers, and by lowering the cost of information production and consumption, social media platforms have enormously broadened participation in civil and political discourse.

Although this could — and does — sometimes strengthen democratic processes, there is increasing evidence of malicious actors polluting the information ecosystem with disinformation and manipulation campaigns.

Our analysis is based on 7,311 tweets collected using Meltwater, a social media monitoring and analysis tool, between 3 August 2021 and 8 August 2021. We focused on posts that used the hashtag #RutoPlanningViolence. Our analysis revealed that on 4 August 2021 at 6:30am, a post on Twitter from what we refer to as a ‘seeder’ account started the hashtag campaign, before being amplified by a network of eight Twitter accounts.
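For readers who want to retrace this step, here is a minimal sketch of the filtering logic in Python. It assumes a hypothetical CSV export from Meltwater with illustrative column names (`created_at`, `screen_name`, `text`); the real export schema and filename may differ.

```python
# Minimal sketch: isolate #RutoPlanningViolence posts in the collection
# window and find the earliest ("seeder") post. Column names and the
# filename are illustrative assumptions, not Meltwater's actual schema.
import pandas as pd

df = pd.read_csv("rutoplanningviolence_tweets.csv", parse_dates=["created_at"])

# Keep only posts in the 3-8 August window that carry the hashtag.
in_window = (df["created_at"] >= "2021-08-03") & (df["created_at"] <= "2021-08-08")
has_tag = df["text"].str.contains("#RutoPlanningViolence", case=False, na=False)
tagged = df[in_window & has_tag].copy()

# The seeder is simply the earliest matching post.
seeder = tagged.sort_values("created_at").iloc[0]
print(seeder["screen_name"], seeder["created_at"])
```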

The hashtag, as is clear from this graph, picked up momentum and peaked within an hour before starting to decline.

Timeline showing posting frequency between 4 August and 5 August 2021 (Meltwater / CfA)

Further analysis revealed that 65% of the posts under the hashtag were retweets, while only 25% were original tweets. The full breakdown is in the animated chart below.

Chart showing Tweets by type (Meltwater / CfA)
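This type breakdown can be approximated from the raw tweet text with a simple heuristic, sketched below as a continuation of the previous snippet. Meltwater exports may instead carry an explicit tweet-type field, which would also be needed to count quote tweets separately.

```python
# Minimal sketch: classify each post as a retweet, reply or original
# from its text alone. Continues from the `tagged` DataFrame above.
def tweet_type(text: str) -> str:
    text = str(text)
    if text.startswith("RT @"):
        return "retweet"
    if text.startswith("@"):
        return "reply"
    return "original"  # quote tweets need an explicit field to detect

tagged["type"] = tagged["text"].map(tweet_type)

# Percentage share per tweet type, mirroring the chart above.
print((tagged["type"].value_counts(normalize=True) * 100).round(1))
```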

Popular topics

A word cloud generated from the Twitter dataset revealed that “power” and “violence” were the most common terms used by the network, indicating that the conversation was centred on the planning of violence. As expected, Ruto was the most prominent entity mentioned.

Word clouds showing top keywords (left) and top entities mentioned (right) (Meltwater / CfA)
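A comparable keyword cloud can be generated outside Meltwater with the open-source `wordcloud` package, as sketched below; the extra stopwords are illustrative choices to strip Twitter boilerplate.

```python
# Minimal sketch: render a keyword word cloud from the tweet texts,
# as a stand-in for Meltwater's built-in visualisation.
from wordcloud import WordCloud, STOPWORDS

stopwords = STOPWORDS | {"RT", "https", "co", "rutoplanningviolence"}
corpus = " ".join(tagged["text"].astype(str))

wc = WordCloud(width=800, height=400, background_color="white",
               stopwords=stopwords).generate(corpus)
wc.to_file("keywords_wordcloud.png")
```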

Sentiment analysis

The use of extremely sensationalised language and visual media was a key element in this campaign. A word cloud comprising tweets containing #RutoPlanningViolence over the three-day period revealed that the bulk of the conversation revolved around specific key terms that were used to evoke emotive reactions from the public.

A word cloud showing key negative words used in the tweets (Meltwater / CfA)

In particular, a video posted by the Twitter account “@Lynne_Soi” purported to explain a fully formulated plan to use violence to win the upcoming elections, highlighting alleged key players who would be part of the master plan.

A screenshot of the explainer video (Meltwater /CfA)

Top locations

The conversation’s reach was primarily national, that is, within the Kenyan social media space, with a smaller number of posts originating from countries such as the United States and the United Kingdom, where the majority of Kenyan diaspora communities are located. Mapping the geolocations of the posts shared on Twitter confirms this: the activity is concentrated in these regions.

Map (top) showing locations of posts; below, a chart showing volume of tweets by top countries (Meltwater / CfA)

The coordinated network

The coordination becomes even clearer when one filters the network graph to identify accounts with high interaction rates, i.e. accounts that have frequently retweeted multiple posts from other accounts in the same network. A large retweet network is clear evidence of a coordinated Twitter campaign. Our analysis revealed a highly coordinated network of accounts under this hashtag that retweeted posts from a small number of influential accounts within a short time span.

Screen grabs of accounts with “digital marketer” in their bios (Sources: @ItsPOS, @smileycherry2, @Jr_LincolnKE, @CharleeOddie1 / CfA)

A network graph comprising tweets containing #RutoPlanningViolence revealed that the key amplifiers of this hashtag, and several related ones, described themselves as digital marketers in their Twitter bios. This possibly indicates the use of paid influencers and sock puppet accounts to amplify the hashtag.

A network analysis graph showing accounts with high interconnectedness (CfA / Gephi)
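A simplified version of this graph can be built with `networkx` before styling it in Gephi, as sketched below. The `retweeted_user` column is a hypothetical field naming the account being retweeted; ranking nodes by in-degree surfaces the small set of hubs that the rest of the network amplifies.

```python
# Minimal sketch: build a directed retweet graph and list its hubs.
# `retweeted_user` is an assumed column; derive it from the raw text
# or an API field if the export does not provide one.
import networkx as nx

G = nx.DiGraph()
for _, row in tagged[tagged["type"] == "retweet"].iterrows():
    # Edge points from the amplifier to the account being amplified.
    G.add_edge(row["screen_name"], row["retweeted_user"])

# Accounts retweeted by many distinct others are the campaign's hubs.
hubs = sorted(G.in_degree(), key=lambda kv: kv[1], reverse=True)[:10]
print(hubs)
```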

We noted that some of the individuals in the network were included in a Twitter list called ‘Keyboard Warriors’. Its members are described as a “premium non-negotiable set of character assassins”, indicating that they can be paid to target or manipulate conversations on Twitter.

A screengrab of the paid influencer Twitter list (Twitter / CfA)

Top accounts based on volume of posts

The campaign thrived on a few individuals posting specific messages, seemingly to achieve the campaign’s goals. From the tweets collected, the table below shows activity from a set of accounts within the first hour of the campaign (6:30am to 7:30am EAT).

A table showing the top 10 accounts that influenced the campaign within the first hour (Twitter / CfA)
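The first-hour ranking itself is a straightforward filter-and-count over the same hypothetical DataFrame, for example:

```python
# Minimal sketch: rank accounts by posting volume in the first hour of
# the campaign. Timestamps are assumed to already be in EAT; convert
# with tz_localize/tz_convert first if the export uses UTC.
first_hour = tagged[(tagged["created_at"] >= "2021-08-04 06:30") &
                    (tagged["created_at"] < "2021-08-04 07:30")]
print(first_hour["screen_name"].value_counts().head(10))
```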

Top 5 Twitter profiles with the highest interaction rates

Our analysis revealed that retweets contributed 65% of the campaign’s activity. The table below shows the accounts that were retweeted the most.

A table showing the top 5 accounts with the highest number of retweets in the network (Twitter / CfA)

The posts with the highest number of retweets addressed key political issues in Kenya, such as land grabbing, Ruto’s plan for the upcoming elections in 2022 and election-related violence.

Screen grabs of the most retweeted tweets (Sources: @marcymontezz, @Jr_LincolnKE / CfA)

Cross-platform operation

Our analysis revealed that the campaign was primarily run on Twitter, with little interaction on Facebook. Using CrowdTangle, we identified only seven Facebook posts from three individuals using this hashtag.

Screenshots (above) of posts shared on POV (Source: posts by Ivy Luther, Mark Kamande / CfA)

Key narratives

The Twitter campaign used several narratives, developed by the network and derived from both current and past controversies involving Ruto. In particular, the narratives were curated to suggest that Ruto has personally been involved in political violence, or is linked to individuals implicated in political violence or terrorism.

The “master plan”

One of the most retweeted posts was from user @marcymontezz, who claimed that a neatly orchestrated plan was in motion to enable Ruto to win Kenya’s next presidential elections.

The post claims to highlight key individuals, sources of funding, and the sequence of events intended to secure Ruto’s victory.

Labelling people or organisations as agents of propaganda risks reputational harm, even if the allegations later prove to be false. Several popular tweets amplified this claim and shared images of the people allegedly delegated to execute the “master plan”.

Screen grabs of tweets posted about alleged key players in Ruto’s “master plan” (CfA / Twitter)

Alleged links to terrorism

A Turkish businessman, Harun Aydin, was allegedly one of the five people set to be part of Ruto’s entourage on a visit to Uganda. While Ruto was blocked at Nairobi’s Wilson Airport from travelling to Uganda, Aydin made the trip, and was detained and interrogated by immigration officers and Anti-Terrorism Police Unit (ATPU) detectives upon his return. It was later established that Aydin shares a name with a Turkish national who is serving a jail term in a German prison for terrorism. This narrative was also exploited by the network, with an influx of posts claiming that Ruto was closely aligned with, and funded by, a terrorist.

Screenshots on the Harun Aydin connection: post 1 (top left), post 2 (bottom) and post 3 (top right) (CfA / Twitter)

Alleged links to militia and post-election violence

Our analysis also revealed a narrative, accompanied by graphic images, suggesting that Ruto would become violent, or support violence carried out in his name, if he does not win the upcoming elections.

The suggestion of post-election violence is especially unsettling given Kenya’s historical and recent context. The 2007/2008 general elections were closely contested, and the process was marred by anomalies and irregularities. Delays and irregularities in the count had already sparked rumours of rigging, and within minutes of the announcement of the results and the swearing-in of President Mwai Kibaki, fighting and mass protests broke out in different parts of the country, with many Kenyans denouncing the results.

Much of the unrest was characterised by ethnic conflict between communities that voted for Raila Odinga, the Orange Democratic Movement (ODM) candidate, and those that voted for Kibaki, the incumbent and candidate of the now defunct Party of National Unity (PNU). Ruto, who was then a Member of Parliament and a member of the Pentagon, ODM’s governing body, was blamed for the attacks because of his strong tribal rhetoric prior to the election.

Screen grabs of tweets showing claims of violence (Source: @keylah_ke (left), @geraldngaoPk7 (top right) and @chepchumbaa (bottom right) / CfA)
Screenshots highlighting claims of a repeat of Kenya’s post-election violence linked to Ruto (Source: @am_dagi (top left and top right), @redflamekenya (bottom right) / CfA)

The network’s tactics

The campaign used several tactics to gain popularity and to pass on the narratives highlighted above:

  1. The use of sensationalised, graphic and manipulated images
  2. The use of automated accounts

Use of sensationalised, graphic and manipulated images

The campaign actors bolstered the perceived legitimacy of the hashtag by promoting sensational, controversial and engaging (but manipulated) content, which in turn drove engagement on the platform.

Some of the images posted: “devil’s employee” (top left); Junta and Adolf Hitler (bottom left) (CfA / Twitter)
Images of a politician wielding a bloody machete were used within posts (Source: @MudGuard, @Lily_nganga / CfA)

Use of automated accounts

From the data collected, we analysed the profiles using the Botometer API, a Twitter bot detection and analysis tool, and identified 281 accounts within this network with bot scores above 70%. We also noted that some of the accounts flagged as bots appear to be regular users whose posting patterns merely resemble bot-like behaviour; these are likely false positives.
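A minimal sketch of such a Botometer check is below. It assumes the `botometer` Python package, Twitter app credentials and a RapidAPI key (the placeholder strings must be replaced), and reads the overall universal score per the Botometer v4 response format; the 0.7 cut-off mirrors the 70% threshold used above. Note that checking every account this way is slow, as the API is rate limited.

```python
# Minimal sketch: flag accounts in the network whose Botometer score
# exceeds 0.7. Credentials are placeholders and must be supplied.
import botometer

twitter_app_auth = {
    "consumer_key": "...",
    "consumer_secret": "...",
    "access_token": "...",
    "access_token_secret": "...",
}
bom = botometer.Botometer(wait_on_ratelimit=True,
                          rapidapi_key="...",
                          **twitter_app_auth)

likely_bots = []
for handle in tagged["screen_name"].unique():
    result = bom.check_account("@" + handle)
    # Universal "overall" raw score: 0 (human-like) to 1 (bot-like).
    if result["raw_scores"]["universal"]["overall"] > 0.7:
        likely_bots.append(handle)

print(len(likely_bots), "accounts above the 70% threshold")
```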

Some notable accounts had been suspended at the time of reporting, e.g. @Akintola4Saheed (72 tweets within this network) and @lincolnopiyo (43 retweets within this network).

Key accounts in the network with bot scores above 80% include @clint_ke and @ntomugania. Notably, @clint_ke used stock images for its profile picture and cover image, and had been created just five months before the campaign, yet had posted a total of 14.8k tweets in that period. The account had only 559 followers as at the date of reporting.

Screen grabs of accounts that surfaced as bots in the behaviour analysis (Source: @Clinty_ke, @Ntomugania / Botometer / TruthNest / CfA)

Conclusion

A network of paid Twitter influencers was used to coordinate a campaign against Kenya’s deputy president, using the hashtag #RutoPlanningViolence. The network used current and historical events and controversies involving Ruto to propagate core narratives during the campaign. These included an alleged “master plan” to use violence to win the upcoming elections, links to terrorists, and references to Kenya’s previous post-election violence. The network also employed tactics such as graphically manipulated images and automated accounts to amplify the campaign’s messaging.

Recommendations

We recommend that:

  • Media literacy programmes seek to inoculate individuals against misinformation and disinformation by giving them the means to build resistance to manipulative messaging and propaganda. This will reduce their susceptibility to misinformation and disinformation, and lead them to question both the veracity of the information presented to them and the legitimacy of its source.
  • Media organisations and members of civil society be encouraged to promote healthy scepticism among their users when consuming online content. This includes providing media literacy resources to users and enhancing the transparency of content distributors.
  • Policymakers recognise that ‘naming and shaming’ alone is clearly not enough while these campaigns remain active. Our recent investigations indicate that some mis- and disinformation campaigns are well resourced, so policies and legal frameworks may need to be reviewed to address this phenomenon.

By ANCIR iLAB investigative team.

Cite this report as: Allan Cheboi, Robin Kiplangat, Justin Arenstein, ‘Kenya’s paid Twitter propaganda machine’, January 2022, African Digital Democracy Observatory (ADDO), Code for Africa.

The iLAB is ANCIR’s in-house digital forensic team of data scientists and investigative specialists.