Pro-Russian internet trolls fueled claims that Scotland’s independence referendum in 2014 was rigged, and amplified demands for a revote.
The behavior of these accounts is pro-Kremlin, and consistent with that of accounts known to be run by the so-called “troll factory” in St. Petersburg, Russia, during the 2016 U.S. presidential election. However, it is not possible to determine from open sources whether some or all of the accounts are independent actors or linked to Russian information operations.
Given the concerns expressed in the United Kingdom over Russian trolls’ support for Brexit, and in the U.S. over Russian interference in the 2016 election, much more research is needed into the activity of pro-Kremlin trolls around Scottish independence, and much more investment is needed in building Britain’s resilience against online disinformation.
The referendum and the results
Scotland’s independence referendum was held on September 18, 2014. The referendum asked a single question:
Should Scotland be an independent country?
Scottish National Party leader Alex Salmond accepted the vote in the early hours of September 19. He said, “I call on all of Scotland to follow suit in accepting the democratic verdict of the people of Scotland.”
Claims of fraud
However, even as Salmond was speaking, claims of fraud began to circulate online. The fraud claims included a number of videos which purported to show vote-rigging. The most influential was a video posted by “Elite NWO agenda”, which was viewed over 800,000 times. NWO stands for the conspiratorial “New World Order”, a theory of a secret elite plot to dominate the world.
The post used video and still photos to claim the vote was rigged. The claims were debunked, but links to the video continued to circulate online.
Most of the accounts that shared it appear to be genuinely Scottish-focused; however, a significant minority, especially among the earliest accounts to post, look more like pro-Kremlin trolls. These accounts were among the most vocal amplifiers of the video, posting it repeatedly and tagging different users.
According to a machine scan of tweets sharing the YouTube clip, @w_nicht posted it six times in 90 minutes, each time tagging other users in an apparent attempt to spread it more widely.
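Repeat-share patterns like this can be surfaced programmatically. The sketch below is a minimal illustration only, not @DFRLab’s actual tooling: the account names, timestamps, and data layout are hypothetical, and a real scan would pull this data from the Twitter API rather than a hard-coded list.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical sample data: (account, timestamp, users tagged in the tweet).
tweets = [
    ("w_nicht", datetime(2014, 9, 19, 7, 0), {"userA"}),
    ("w_nicht", datetime(2014, 9, 19, 7, 20), {"userB"}),
    ("w_nicht", datetime(2014, 9, 19, 7, 45), {"userC"}),
    ("w_nicht", datetime(2014, 9, 19, 8, 0), {"userD"}),
    ("w_nicht", datetime(2014, 9, 19, 8, 15), {"userE"}),
    ("w_nicht", datetime(2014, 9, 19, 8, 25), {"userF"}),
    ("scot_user", datetime(2014, 9, 19, 9, 0), set()),
]

def flag_repeat_amplifiers(tweets, min_posts=3, window=timedelta(minutes=90)):
    """Flag accounts that share the same link repeatedly in a short window,
    tagging different users each time -- a common amplification pattern."""
    by_account = defaultdict(list)
    for account, ts, mentions in tweets:
        by_account[account].append((ts, mentions))
    flagged = {}
    for account, posts in by_account.items():
        posts.sort()
        times = [ts for ts, _ in posts]
        all_mentions = [m for _, m in posts]
        # Require min_posts shares within the window, tagging several
        # distinct users across the run of posts.
        if (len(posts) >= min_posts
                and times[-1] - times[0] <= window
                and len(set().union(*all_mentions)) >= min_posts):
            flagged[account] = len(posts)
    return flagged

print(flag_repeat_amplifiers(tweets))  # → {'w_nicht': 6}
```

On this toy data, only the account posting six times in under 90 minutes, each time tagging a fresh user, is flagged; a one-off share from an ordinary account is not.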
@w_nicht is a strongly pro-Kremlin account. It regularly takes positions consistent with known Russian propaganda narratives. Its use of English is sometimes erratic in ways which are consistent with the errors made by native Russian speakers.
Its posts accuse the West of hypocrisy or falsification over the shooting-down of Malaysia Airlines flight MH17 over Ukraine, which criminal investigators concluded was downed by an anti-aircraft missile brought into Ukraine from Russia. The account’s shares also include opinion pieces by Kremlin propaganda outlet RT, which recently registered as a foreign agent in the U.S.
Other posts lambast Turkey for shooting down a Russian plane on November 24, 2015; attack UK Prime Minister Theresa May and her Conservative party for warning against Russian interference in UK domestic politics; defend Russia against claims of systematic Olympic doping; and denigrate murdered Russian opposition leader Boris Nemtsov.
All these are characteristic of pro-Kremlin trolls, in general, and the “troll factory” in St. Petersburg, in particular.
@w_nicht was by no means the only pro-Kremlin account to spread the claim of vote-rigging. Another striking case was @ArianeDaladier. This account appears to have been inactive since September 2015, but, when active, shared pro-Kremlin content in both English and Russian.
This appears to be an almost exclusively Ukraine-focused, pro-Kremlin troll. Nevertheless, on September 19, it retweeted a post sharing the Scottish referendum video, in the midst of its usual Ukraine-centric posts.
This behavior suggests it was a pro-Kremlin troll, possibly run from the troll factory, whose main focus was Ukraine, but which was diverted to amplify the claim of vote-rigging in Scotland.
According to the machine scan, 671 users in total shared the video alleging voter fraud. A manual scan of the earliest 100 showed over a dozen which appear to be pro-Kremlin trolls — sharing, for example, Kremlin narratives on Ukraine, Crimea, MH17, Turkey, and the Syrian conflict. These were among the most active accounts, pinging the link multiple times to different users, just as @w_nicht did.
One of these accounts, @glopol_analysis, shared another video making the same claims and using the same footage; this will be referred to as the “Boom!” video.
Again, this account echoes many top Kremlin narratives, both on its Twitter feed and on the associated blog. It portrays Ukraine as headed by a far-right, U.S.-led coup; attacks Turkey and anti-corruption protesters in Russia; and claims that a sarin attack in Syria was a Turkish false-flag operation.
OCCRP wrote that the video was “widely shared by a Russian troll network” with which the hoaxer was close, and “reposted widely by accounts logged under Russian names, including via a Facebook page called NovoRossiya (a reference to breakaway regions of East Ukraine) and several others where dozens of comments can be found in the Russian language”.
Facebook and Twitter searches for these reposts returned only a handful of hits, suggesting that many have since been deleted. However, a Russian-language exposé of the fake recorded some of the Russian-language posts which had accompanied it, supporting the OCCRP’s report.
Pro-Kremlin and Russian accounts such as these played a role in helping to promote the “vote-rigging” video. They were a minority, but an active minority, sharing the claims of fraud repeatedly, and tagging other users and news outlets to amplify the message.
The same applies to the “Boom!” video shared by @glopol_analysis. Many of the accounts which posted it appear to come from Scottish users, but some — especially among the earliest movers — resemble Kremlin trolls.
The very first account to share the “Boom!” video was called @skull322. This is a highly active account, which has posted almost 200,000 times since its creation in 2008; its profile picture is taken from the Fox television series “Fringe”.
At 06:20 local time on September 19, @skull322 shared the video, using the same headline as the video itself (suggesting that this was either directly shared from YouTube, or an automated post).
@Skull322 is a sedulous promoter of pro-Kremlin narratives. Many of its tweets copy word for word the headlines of the links they share, suggesting that the account may be automated (or merely unoriginal). For example, it repeatedly shares anti-Ukrainian content, including from propaganda sites Sputnik and Russia Insider.
It also shares conspiratorial, outlandish, and anti-Western posts on MH17 and the conflict in Syria.
It also posts on non-Russia-related issues, including terrorism, space exploration, and UFOs; it tweeted on all of these topics on September 19, 2014 alone.
The account thus shares large quantities of pro-Kremlin messaging, but it is not a solely pro-Kremlin account, and its other posts bespeak a preference for conspiratorial content.
Another early amplifier, @just1fix2004, exhibited similar behavior. It also shared the video in the early hours of September 19.
This account, again, shares many posts which amplify Kremlin narratives or outlets, including on Ukraine, MH17, and Russian President Vladimir Putin. Many are YouTube shares, again suggesting that this account may be partly automated. Its favorite sources include RT, Iranian outlet Press TV, and far-right commentator Paul Joseph Watson.
There are nuances in the behavior of the accounts named above, and others like them which also shared videos alleging vote-rigging in Scotland. Some appear to be run by non-native-speakers, whose allegiance is primarily to Russia; others appear less focused and may be automated. All, however, post considerable quantities of pro-Kremlin messaging, and have little or no apparent connection to Scotland.
Open source methods cannot confirm whether these were simply pro-Kremlin accounts, or troll-factory ones. More research is needed to determine the full extent of such pro-Kremlin accounts’ activity, and their reach and impact.
One claim which can be traced directly to Russia emerged even earlier, as the first results were coming in. This was a claim from a Russian election observer that the vote did not meet international standards. The claim was important for the way in which it fed calls for a revote.
Within hours of the comment, a Facebook page, “Rally for a Revote”, had been created to demand a rerun of the referendum; as pointed out by the OCCRP, its first post was a Guardian article picking up on the Russian claim.
The “Rally for a Revote” page accompanied a petition on website Change.org calling for a full revote (not merely a recount), “counted by two individuals, one of whom should be an international impartial party without a stake in the vote,” in a possible acknowledgement of the Russian observer’s complaint.
Other petitions were also launched. One was a unilateral declaration of independence; a second called for a “public judicial review” of the referendum process; a third was hosted on pro-independence website Yes2014.net, again demanding a revote.
A fourth petition was submitted to the official UK Parliament petitions page, calling for a recount rather than a revote.
The petitions appear to have been launched by Scottish users. The declaration of independence was launched by a user called Martin Keatings; the petition linked to the Facebook page was launched by a woman named Kirstie Keatings. Both gave locations not far from Edinburgh.
Kirstie Keatings subsequently complained of harassment on Facebook and changed the petition’s listed author to “Rally for a Revote”. The addressee was also changed from Salmond, who resigned after the vote, to his successor, Nicola Sturgeon.
There is no reason to suspect that these petitions were launched by anyone other than Scots dissatisfied with the outcome. In particular, Kirstie Keatings was quoted by the Scotsman (her surname spelt as “Keating”) on September 28, 2014, suggesting both a local presence and local verification.
The petitions received very different levels of support. The declaration of independence gathered 3,888 signatures. The call for a review gathered 25,905. The Yes2014 version had 18,821 signatures by September 21. The petition to the UK Parliament had 23,697 signatures by March 2015.
Far more than any of these, however, the “Rally for a Revote” petition gathered 100,261 signatures by the time it closed.
This is a remarkably high figure, given that total turnout in the referendum was 3.6 million, and that the formal petition to the UK Parliament attracted less than a quarter as many signatures. It raises the question of whether an attempt was made to inflate the signature count artificially.
Change.org only requires an email address and name to sign petitions. @DFRLab asked Change.org twice how it verifies signatures on its petitions; Change.org had not replied by the time of publication. By contrast, the UK Parliament petitions page is limited to UK residents and citizens, and requires a postcode for verification.
According to an archive of the Change.org petition, which showed the ten most recent signatures, they included submissions from Germany, Spain, and England, confirming the petition was not limited to a geographical area.
Signatures on the Yes2014 petition included “Dr Evadne Hinge Hinge” and “Dame Hilda Bracket” (references to comedy stage duo “Hinge and Bracket”) and “Cliff Richard’s Vaginal Deodorant Yewtree”, indicating a lack of credible verification methods.
The signatures of the “Rally for a Revote” petition appear to have come rapidly. A Twitter account called @Hypermobile2011 tracked them during the day, and recorded 3,000 signatures in half an hour, 7,000 signatures in two hours and 50,000 signatures in eight hours.
Little social-media evidence remains of how the petitions spread. The “Rally for a Revote” Facebook page has been deleted, so data on its spread are no longer available. A machine scan of tweets sharing the Change.org petition returned only 2,287 posts, suggesting either that many tweets were deleted in the interim, or that Twitter traffic was limited.
Again, however, links to the petitions were amplified by conspiracy-minded, pro-Kremlin social-media users and websites in their later stages, and at least one small network of apparently automated “bot” accounts.
One post, for example, was made on September 22 by a website called Kickass Cookies (kickass-cookies.co.uk). This was not an early mover; rather, it amplified the petition after it had already gained significant traction.
The website calls itself an “independent, viewer-supported news platform that helps alternative and non-mainstream viewpoints reach a wider audience.” It is an aggregator rather than a primary source, and lists as its “friends” far-right, pro-Kremlin, and conspiratorial websites such as Infowars, Prison Planet, 21st Century Wire, and Iran’s Press TV.
Its own posts on Twitter regularly amplify pro-Kremlin messaging on issues such as Ukraine, MH17, the murder of Nemtsov, and the crisis in Russian relations with Turkey. These include shares of the Kremlin’s own propaganda outlets, RT and Sputnik.
According to a WhoIs search, the website’s registrant has not been validated and has chosen to hide their address.
This site demonstrably shares pro-Kremlin messaging, but also shares far-right and conspiracy sites. Its affiliation and identity cannot be established at this stage; however, it is clearly not a user focused on Scotland. Other than a handful of tweets on September 19, 2014, most of its uses of the word “Scotland” were attempts to hijack the hashtag while promoting other pro-Kremlin content.
The post by @Kickass_Cookies on the referendum petition was retweeted fifteen times. Of those, nine were posted at exactly the same second, 22:23:41 UTC on September 22, 2014, a classic indication of automated bot activity.
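Same-second clusters of this kind are straightforward to detect once retweet timestamps are collected. The sketch below is a hedged illustration using hypothetical timestamps shaped like the fifteen retweets described above; it simply counts how many retweets land in each second and flags unusually large clusters.

```python
from collections import Counter
from datetime import datetime

# Hypothetical timestamps for fifteen retweets of a single post (UTC).
retweet_times = (
    [datetime(2014, 9, 22, 22, 23, 41)] * 9   # nine in the same second
    + [datetime(2014, 9, 22, 22, 30, s) for s in (5, 12, 19, 33, 47, 58)]
)

def same_second_bursts(times, threshold=3):
    """Return timestamps shared by `threshold` or more retweets.

    Genuinely human retweets are rarely simultaneous to the second;
    large same-second clusters suggest a coordinated bot network."""
    counts = Counter(times)
    return {ts: n for ts, n in counts.items() if n >= threshold}

print(same_second_bursts(retweet_times))  # one burst of nine at 22:23:41
```

The threshold is a judgment call: two coincident retweets happen naturally on popular posts, but nine in the same second on a low-traffic post is the kind of anomaly worth manual review.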
These accounts have bot-like behavior patterns, and repeatedly share anti-Ukrainian, anti-NATO, and anti-Obama tweets, together with conspiracy theories on MH17. All these are characteristic of Kremlin trolls, but also of far-right users, making attribution difficult.
All, for example, shared a post on MH17, tweeting it at the same minute (6:52 pm on August 7, 2014). The post was taken from the Centre for Research on Globalization, an anti-Western site notorious for false reporting, and argued that “Obama definitely caused the Malaysian airliner to be downed.”
Another simultaneous tweet was anti-NATO, and hijacked other hashtags to amplify its reach.
Yet another simultaneous post focused on Chinese and Russian military advances and their threat to the United States. The headline was taken from fringe site 21stcenturywire.com; that article was based on an RT original. The founder of 21stcenturywire.com is listed as a contributor to RT, the Centre for Research on Globalization, and far-right American site Infowars, showing how far-right, conspiratorial, and Kremlin propaganda intersect.
Other late-coming amplifiers of the petition shared the pro-Kremlin stance. @MarquisLeDain, for example, retweeted a post about the petition; it repeatedly shares RT propaganda on Ukraine, shares content in Russian, and also amplified Russian accusations of Turkey’s oil trade with the Islamic State “Daesh” terrorist group.
Its profile proclaims it to be “Anglo Norman Scottish” and based in London. Despite this, it appears unable to spell the name of the English city “Southampton”.
Accounts such as these were in the minority, and in some cases, their pro-Kremlin messaging is matched by far-right messaging, obscuring their overall affiliation. However, they did play a role in amplifying the various calls for a revote, especially the poorly-verified ones on Change.org.
In the aftermath of Scotland’s referendum, a range of accounts which post pro-Kremlin content amplified claims of election fraud and calls for a revote. Russia’s election observer did the same. The trolls did not create the claims or the calls; instead, they boosted the existing signal, in a manner consistent with known Kremlin operations, especially in the United States.
Open source research cannot establish definitively which of these accounts were run from the “troll factory” or associated with the Kremlin’s known information operations, and which share a looser ideological alignment; as we know from the experience of the United States, some Russian troll accounts managed to masquerade as Americans for many months, making open source attribution extremely difficult. Nor can it establish how much impact these troll posts had, compared with posts from genuinely Scottish accounts making the same claims.
However, overall, the impact of claims of fraud (not least the Russian observer’s allegation) was considerable. According to a report by the UK’s Electoral Commission, a third of all respondents to a survey conducted by the Commission “thought that fraud took place at the referendum, more than in any previous post-election survey.”
“Asked why they thought so, the main response was ‘I heard or saw stories in the media’.”
Moreover, “Respondents who identified themselves as ‘Yes’ voters (42%) were considerably more likely to think fraud took place compared with No voters (21%).”
The allegations of fraud demonstrably had an impact; pro-Kremlin accounts demonstrably boosted those allegations. The anger and disappointment felt by many Yes voters were entirely sincere, and are not the subject of this analysis; however, those sentiments were fanned by pro-Kremlin trolls, in a manner characteristic of Russian influence operations.
A number of responses are needed. First, more research is required to establish, as far as is possible, the scale of pro-Kremlin troll activity on Facebook and Twitter during the referendum campaign; this article only considers the day after, but the months before were more important politically. Second, the platforms themselves should analyze their own data for signs of Russian activity, as they have done for the United States election and the Brexit campaign.
Third, research is needed into the online petition platforms, especially Change.org, and the way in which their petitions are conducted. These seem like a sitting target for malignant actors who seek to “game” online polls in order to create a political effect.
Finally, as in other countries, attention should be paid to the question of digital resilience. Troll accounts are not impossible to identify, although attribution remains challenging; bots can be identified in many ways (for a sample, see here). Online attempts at manipulating political processes in democratic countries are only likely to grow in the coming years. With Britain facing the political upheaval of Brexit, and the possibility of further referenda, online vulnerability remains a glaring problem which needs to be addressed.