Bad-faith actors use a variety of online manipulation practices to disrupt elections or damage the reputation of public figures. Some of these practices rely on traditional paid-for advertising. Yet, using the services of Google and Co. can be notoriously expensive. Services like AdSense or Facebook Pixel also leave an easily identifiable digital footprint. Instead, many manipulation campaigns rely on the seemingly organic spread of content — giving their message free exposure across social networks. How can bad-faith actors achieve this organic spread of content? One of the techniques is ‘astroturfing’.

What is astroturfing?

To understand astroturfing, let’s take a look at the original meaning of the word. ‘AstroTurf’ is a brand of artificial turf that has become a generic trademark: astroturf is any synthetic surface replicating the look and feel of natural grass. Over the past two decades, the term has increasingly been used as a verb — not in the context of sports surfaces, but politics. ‘Astroturfing’ is the attempt to replicate the look and feel not of natural grass, but of ‘grassroots’ social movements. This way, bad-faith actors can create the illusion of popular support for any political agenda. The phenomenon is not exclusive to the internet. Yet, the logic of virality on most social media platforms has allowed bad-faith actors to radically expand their reach — at low cost and with a low risk of detection. …


The political sphere has been abuzz over the last few weeks with the announcement by Twitter founder and CEO Jack Dorsey that his platform would no longer accept political advertisements. The policy change will take full effect on 22 November, just in time for the UK General Election in December. It is no wonder that the decision was widely greeted with enthusiasm, both in the UK and internationally. As I have explained elsewhere, micro-targeted ads may have a significant potential to disrupt fair elections — particularly in winner-takes-all systems.

Twitter’s stance on political advertisement is in stark contrast to Facebook’s recent decision to exempt political campaigns from fact-checking, citing freedom of speech concerns. It is easy to see how this is a PR disaster for Zuckerberg and Co, as many users are left wondering why Facebook is still happy to cash in on the growing problem of political disinformation. It seems disingenuous to have entire teams dedicated to stopping bad actors from exploiting their platforms — while at the same time the ad department is happy to help the same actors manipulate users through micro-targeted and often misleading messages. Unfortunately, amidst all the praise (for Twitter) and finger-pointing (at Facebook), some nuance has been lost. …


Disinformation campaigns are being used everywhere to undermine democracy. Yet, most accounts of online disinformation swaying elections come out of the US and the UK. The media’s disproportionate coverage of these two countries is partially to blame for this — but are there similarities between the US and UK political systems that make them more vulnerable to manipulation attempts?


Both the US and the UK effectively have a two-party system in which the winner takes all. This means whoever gets the most votes in a constituency represents the entire region — regardless of how close the runner-up came to beating the eventual winner. In the UK, this led to around 68% of ‘wasted votes’ in 2017: votes that were not cast for the winning candidates in their respective constituencies. In the US, the same dynamic is illustrated by the gap between the popular vote and the electoral vote.
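To make the arithmetic concrete, here is a minimal Python sketch that computes the share of wasted votes under the simple definition above (every vote not cast for the winning candidate in a constituency). The constituency names and vote counts are invented for illustration; they are not real election data.

```python
# Wasted votes under first-past-the-post: every vote in a constituency
# that was not cast for the winning candidate counts as "wasted".
# The constituencies and vote counts below are hypothetical.

constituencies = {
    "Constituency A": {"Party X": 18_000, "Party Y": 17_500, "Party Z": 4_000},
    "Constituency B": {"Party X": 9_000, "Party Y": 22_000, "Party Z": 6_500},
    "Constituency C": {"Party X": 15_000, "Party Y": 14_900, "Party Z": 10_000},
}

total_votes = 0
wasted_votes = 0

for name, results in constituencies.items():
    votes_cast = sum(results.values())
    winner_votes = max(results.values())  # plurality wins the whole seat
    total_votes += votes_cast
    wasted_votes += votes_cast - winner_votes  # all other votes elect no one

print(f"Wasted votes: {wasted_votes:,} of {total_votes:,} "
      f"({wasted_votes / total_votes:.1%})")
```

Note that stricter definitions, such as the one used by the Electoral Reform Society, also count a winner’s surplus votes beyond the margin needed to win; the sketch sticks to the simpler definition given in the paragraph above.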

