Defining Political Influence after the Russian Invasion of Ukraine

Tactical Tech
Mar 21, 2022


by Amber Macintyre, Project Lead, The Influence Industry Project

The Russian invasion of Ukraine has prompted many of us to ask ourselves: what can we do right now, how might our future change, and what could we have done? For those of us working on digital influence, whether on political communications, misinformation campaigns, or data-driven marketing, this last question is particularly pertinent given that information on Russian influence campaigns is not new. The Oxford Internet Institute reported that since at least 2014 there has been evidence of Russian state actors conducting influence campaigns in Ukraine, describing it as “the most globally advanced case of computational propaganda.”1 Researchers and practitioners of digital influence alike must consider with hindsight: have we been acting on our knowledge as well as we could have? Have our recommendations and responses been appropriate, and was there evidence we should have taken more seriously?

In this article, we re-examine our worldwide research on political influence to understand what has framed our work so far and how our approach might change. We identify the shifting features of political influence, in practice and in ethics, in light of the severe impact of Russia's persistent, disruptive influence campaigns. We conclude by considering what this means for the role and responsibilities of civil society. Our analysis shows how our work will continue to reinforce the need for secure data practices in political campaigns, question the role of the industry working on political influence, and adapt our resources to include civil society's emerging responses to disruptive influence techniques.

Data used for influence can change hands

Political influence worldwide is conducted on the basis of personal data that is collected, analysed and hosted in databases held by political groups, private companies and platforms.2 This data can change hands, and when it does, its purpose and meaning can change too. In Nigeria, data held by phone companies has been used by the government in the name of national security, and in turn for political election campaigns.3 In the US, public administrative data is used by political parties to profile their audience and target communications.4 In Ukraine, data collected during campaigning in the run-up to an election became government data when the political party won in 2019.5

While it is clear that data can change hands and purposes quickly, what we don't know yet is how data might change hands, and meaning, in this war. Political groups in Ukraine and private firms hold datasets representing the level of support for political parties during the last election in 2019. Datasets held by political parties that conducted campaigning may also contain developed profiles of individuals. For example, in the run-up to the election, the Volodymyr Zelensky campaign had "segmented its audience into 32 categories according to age, gender, professional affiliation, or political interest."6 These profiles are usually connected to the proposed political leaning of the person based on the data available about them, and may identify people by sensitive features of their identity such as LGBTQ+ identity, religious group, or economic status, including migration status. Furthermore, data on the followers of political parties' social media profiles is not only held by the platform; names and profile photos are often also visible to the general public.
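To make "segmentation" concrete, here is a minimal, hypothetical sketch of what bucketing supporter records by a few attributes looks like in practice; the field names and values are our own illustration, not the schema of any real campaign:

```python
# A toy illustration of audience segmentation: bucketing supporter
# records by a few attributes. The field names and values here are
# hypothetical, not the schema of any real campaign.
from collections import defaultdict

def segment(records):
    """Group person records into segments keyed by shared attributes."""
    segments = defaultdict(list)
    for person in records:
        key = (person["age_band"], person["gender"], person["interest"])
        segments[key].append(person["id"])
    return segments

voters = [
    {"id": 1, "age_band": "18-29", "gender": "f", "interest": "economy"},
    {"id": 2, "age_band": "18-29", "gender": "f", "interest": "economy"},
    {"id": 3, "age_band": "60+", "gender": "m", "interest": "security"},
]
for key, members in segment(voters).items():
    print(key, "->", members)  # each segment gets its own tailored message
```

Each bucket can then be targeted with its own tailored message, which is what makes such datasets valuable, and sensitive, long after a campaign ends.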

These datasets are politically sensitive, and the protection of the data can be weak. The privacy policies of most organisations are vague and ambiguous, leaving open how data could be used. Sometimes there is no privacy policy at all: in Ukraine, only two of the parties in the last election had privacy policies on their websites.7 Further, very few privacy policies on any platform establish procedures for how data will be processed in the case of a conflict-based crisis. Even when there is a privacy policy, the software is sometimes not adept at keeping data secure: in one campaign app in Uganda, a little technical knowledge opened up access to users' uploaded photos and related metadata, such as their location or the model of their phone.8
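The Uganda example is worth dwelling on, because photo files carry far more than the image itself. The following minimal sketch, assuming the Pillow imaging library and a hypothetical filename, shows how easily anyone who obtains an uploaded photo can read its embedded EXIF metadata, including the phone's make and model and, if present, GPS coordinates:

```python
# A minimal sketch of reading EXIF metadata from a photo file.
# Assumes Pillow (pip install Pillow); "upload.jpg" is hypothetical.
from PIL import Image
from PIL.ExifTags import TAGS, GPSTAGS

def photo_metadata(path):
    """Return the EXIF tags embedded in an uploaded photo."""
    exif = Image.open(path).getexif()
    info = {TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}
    # GPS data lives in its own sub-directory (IFD) of the EXIF block.
    gps = exif.get_ifd(0x8825)  # 0x8825 is the GPSInfo tag
    info["GPS"] = {GPSTAGS.get(t, t): v for t, v in gps.items()}
    return info

meta = photo_metadata("upload.jpg")
print(meta.get("Make"), meta.get("Model"))  # phone manufacturer and model
print(meta.get("GPS"))                      # location, if the camera recorded it
```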

The risk of data changing purposes is one that motivates the work of digital security experts and privacy campaigners. The sudden change of political environment in Ukraine, and the much more open threat to the current leadership, is a reminder that we must also consider how data is stored, archived, and repurposed. If the parties host the data, they could delete it quickly to rule out inappropriate use, but they could also use it in their current communications, now framed by conflict. If another group ends up in control of the data, they could use it to track or persecute people who have been identified as unsupportive of their goals. Influence practices that are 'routine' or 'safe' can create data that may be repurposed in dangerous ways. Anyone working with data in digital campaigns for politics or civil society should take care over what data they choose to collect, how they choose to store it, and what they will do with it in the future.
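One concrete way to act on "what they will do with it in the future" is to decide retention up front and enforce it mechanically. The sketch below is a hypothetical illustration, assuming a 90-day policy window and a record format of our own invention:

```python
# A hypothetical sketch of enforcing a data-retention policy: records
# older than the policy window are dropped rather than archived. The
# 90-day window and the record fields are assumptions for illustration.
from datetime import datetime, timedelta, timezone

RETENTION_WINDOW = timedelta(days=90)  # assumed policy, not a standard

def purge_expired(records):
    """Keep only records still inside the retention window."""
    cutoff = datetime.now(timezone.utc) - RETENTION_WINDOW
    return [r for r in records if r["collected_at"] >= cutoff]

records = [
    {"id": 1, "collected_at": datetime.now(timezone.utc) - timedelta(days=10)},
    {"id": 2, "collected_at": datetime.now(timezone.utc) - timedelta(days=400)},
]
print([r["id"] for r in purge_expired(records)])  # -> [1]
```

The hard part in practice is not the loop but its reach: deletion has to cover backups, exports, and copies handed to consultants, which is exactly where data tends to change hands.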

Technologists change roles

Just as datasets are repurposed, so are the skillsets of the technologists who manage the data. Technology professionals regularly adapt their skills to new contexts as they move between companies and causes. Furthermore, there has been a notable crossover of knowledge and personnel between political communications and for-profit marketing.9 In our research, we found many people working at Facebook who had previously worked in government relations, policy or security. For example, the former Deputy Prime Minister of the UK, Nick Clegg, is now the President of Global Affairs at Meta, formerly Facebook.10 The exchange of personnel between private firms working within political influence and the political causes or candidates themselves leads to an exchange of knowledge, values and skills, which may be of great advantage in a new context but may also clash with the needs of their new roles.

The role of a technologist can also change within a single context. In Ukraine, Mykhailo Fedorov was the chief digital strategist for Zelensky's election campaign in 2019. Once Zelensky was elected, Fedorov was appointed Minister of Digital Transformation and Vice Prime Minister. The role has adapted again to the context of war, as Fedorov runs Ukraine's "formidable war machine"11 with a team of skilled technologists. The team are running a public campaign urging technology firms to boycott Russia in whatever way they can, receiving donations in cryptocurrency, and inviting citizens to submit images and videos of the Russian military's movements.

The nature of crisis can change what is asked of technologists, and their role within politics: for example, they may change affiliation (either between groups or from a 'neutral' company to a partisan one) or come under new forms of pressure and consequently share, transfer or leak data. We have to ask: how can the experience of a technologist in one context, such as a marketing firm or a political election campaign, benefit or hinder influence within a war? Are there conflicts of interest, such as a continued focus on profit, when a technologist changes role or company? When technologists change roles, do they take their data with them, adding to the intelligence they can use towards their new goals? And finally, what technical training or political education do those of us working with and on digital influence need to account for this fluid environment between politics, security, and marketing?

Tools and tactics of influence

The practice and theory of what tools are used for political influence has always been framed by often dichotomous judgements on current events: Barack Obama's presidential election campaign's data team and tactics were considered a prototype for many political campaigns;12 tech-driven influence campaigns in the run-up to elections in Nigeria and Kenya led to an uptake of the same tactics in Gambia and Senegal;13 Cambridge Analytica and the polarising tactics of the Brexit campaign in the UK focused scrutiny on the influence industry; the Arab Spring prompted many organisations to examine the role of social media in protests and uprisings. Now, Russia's invasion of Ukraine prompts us to examine the research on digital influence that we have conducted so far and to iterate our understanding once again.

Many of the routine tactics of election and political campaigns remain relevant in a crisis, including segmenting, profiling and targeting individuals with personalised communications on Facebook or WhatsApp.14 However, the tactics of crisis, conflict, and disruptive campaigns also differ from those of routine political election campaigns. In the run-up to Russia's military invasion of Ukraine, and during it, some tactics, such as bots, trolls and misinformation, have been deemed unacceptable, outside the norm, and anti-democratic approaches to political influence. Regarding Russia's influence in Ukraine, there are substantial demonstrations of these techniques: more than 2,000 Facebook profiles in Ukraine were connected to a Russian 'profile farm', other Facebook pages were found spreading misinformation, and one investigation found offers from a bot farm to create hundreds of fake accounts on Facebook and leave tens of thousands of 'comments' in support of or against a particular candidate.15
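Detecting this kind of coordination is a research field in itself, but the simplest signal is many "different" accounts posting near-identical text. A toy heuristic, sketched below with hypothetical data fields and threshold, flags comment texts shared by an implausible number of authors:

```python
# A toy heuristic for spotting coordinated commenting: flag comment
# texts shared, near-verbatim, by many distinct accounts. The data
# fields and the threshold are hypothetical.
from collections import defaultdict

def flag_copypasta(comments, min_accounts=20):
    """comments: iterable of dicts with 'author' and 'text' keys."""
    authors_by_text = defaultdict(set)
    for c in comments:
        # Light normalisation so trivial edits don't hide duplicates.
        key = " ".join(c["text"].lower().split())
        authors_by_text[key].add(c["author"])
    return {text: authors for text, authors in authors_by_text.items()
            if len(authors) >= min_accounts}
```

Real profile farms evade exactly this check by paraphrasing, which is why investigations like those cited above also lean on account metadata and timing.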

Some tools are seen as neither purely acceptable nor unacceptable. For example, negative campaigning is considered acceptable in election campaigns (when pointing out flaws in the opposition or their previous policies), but becomes contentious when used to encourage distrust in an institution and disrupt political stability. Targeted advertising that recommends products or music may be seen as more acceptable than sensitive and risky political profiling that uses personality types or sentiment analysis. If our views change based on the context of who uses these tools and when, is it worth using them at all, and if so, what safety measures or regulations do we need when using them?

International Relations and Influence

One of the defining features of the current context of influence is that the political groups looking to gain advantage through influence are nation states. State-coordinated information campaigns aimed at a state's own citizens, or at the citizens of another nation, shift the discussion of influence beyond elections and social movements into international relations. The influence tactics change on this basis too. For example, Russian-linked hackers were connected to attempts to disrupt state electronic infrastructure in Ukraine in 2019.16 As Carole Cadwalladr has shown, Russia has exerted influence financially, politically and through social media around the world.17 Those of us working with campaigns and influence have to treat the term as multi-layered: from social media influencers, to promoting a candidate in an election, to disrupting national stability.

Another notable feature of nation-state influence tactics is the use of established and traditional media outlets. In Russia, only state-approved media coverage is accessible. China is likewise showing only Russian-controlled media coverage of the war. On the other hand, many countries across Europe have banned Russia Today. Furthermore, states can control access to social media platforms: Russia has banned Facebook and Instagram within the country, and banned sites can only be reached by taking the risk of using a VPN, which is illegal in Russia. It is not uncommon in some countries for governing groups to carry out internet or media blackouts around elections or conflicts. The scale and control of information in this war has highlighted how communication strategies must deal not only with the over-saturated media environment of the internet but also with limited, controlled, and censored media environments.

The Influence Industry: The money in influence

Many influence techniques are not instigated by political parties alone, but with assistance from companies that are part of the influence industry. This business is lucrative. During the last Ukrainian election in 2019, it was estimated that parties spent over 1,800,000 US dollars on Facebook ads.18 A bot farm that could leave 40,000 comments for or against a candidate could cost up to 20,000 Euros.19 In the UK, up to 500,000 GBP was spent on one digital influence consultant in a single campaign.

The public-facing and established side of this industry has been well documented by Tactical Tech.20 However, the industry also has a private, sordid face, made up of smaller, harder-to-track networks and freelancers. These ubiquitous but underground techniques are far harder to find. In our research so far, we have seen only hints: paid fake-news pedlars in Sub-Saharan Africa,21 one or two companies advertising the sale of misinformation techniques,22 and companies offering objectionable services to undercover journalists posing as political parties. This shady space is much harder to monitor.

One approach to monitoring the role of the industry would be to track its financial spending. However, while around 60% of countries require campaign spending to be reported, this is not consistently carried out or easily accessible to monitor.23 In these cases other methods can be used. For example, using Facebook's ad library it was possible to find that in Ukraine's last election "only three parties of those participating in the 2019 elections declared spending compatible with the amounts indicated in Facebook's library of political advertisements."24 These quantitative techniques of investigation are useful when there is data; interviews or journalists embedded within campaigns can help when there is little public data. Transparency of political influence worsens substantially in conflict and crisis, as political influence becomes covered by national security protections.
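That kind of comparison can be assembled from Meta's public Ad Library API. A minimal sketch follows; the endpoint and field names track the public documentation as we understand it, but the access token, search term, and API version are placeholders:

```python
# A minimal sketch of querying Meta's Ad Library API for political ad
# spend. Endpoint and field names follow the public documentation as
# we understand it; the token, search term, and version are placeholders.
import requests

URL = "https://graph.facebook.com/v16.0/ads_archive"
params = {
    "access_token": "YOUR_TOKEN",               # placeholder
    "ad_type": "POLITICAL_AND_ISSUE_ADS",
    "ad_reached_countries": '["UA"]',           # ads shown in Ukraine
    "search_terms": "party name",               # placeholder query
    "fields": "page_name,spend,currency,ad_delivery_start_time",
}
for ad in requests.get(URL, params=params).json().get("data", []):
    spend = ad.get("spend", {})                 # reported as a range
    print(ad.get("page_name"),
          spend.get("lower_bound"), "-", spend.get("upper_bound"),
          ad.get("currency"))
```

Spend is deliberately returned as a range rather than an exact figure, which is one reason declared party spending and the library's numbers are hard to reconcile precisely.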

Regulation requiring transparency measures from the parties and companies themselves could give us an accurate idea of who is spending what, on which services, with which companies. In the meantime, investigative methodologies are needed to understand how many more companies are offering services, what exactly these services entail, and their impact on politics, so that the industry can be monitored, regulated and held accountable.

Responses from Regulators and Technology Companies

Government regulation and technology company policies can be used, to some extent, to manage the consequences of digital influence. However, it is well recognised that policy and regulation are far behind the needs of the online information environment, whether in managing omnipresent social media platforms or in identifying fake news. In many cases, the issues have been raised but have yet to be acted on: either because policy takes time, as with the long-awaited Digital Services and Digital Markets Acts, or, as was the case in Ukraine, because situations were "deemed not extensive enough to affect the voting process or the electoral outcome"25 to warrant limiting the content of disruptive influence campaigns.

Yet, since Russia began its invasion of Ukraine, there has been a substantial sanction-style response from the tech industry: Facebook stopped monetising or selling ads to Russian state media, Twitter paused all Russian ads in Ukraine, and tech companies have stopped access to, or stopped selling, their services in Russia.26 There are many questions to examine about why the companies did not act to create policies limiting such influence in the first place, including the Euro-centrism of the response to this war. There is also an important question as to whether their response is appropriate and effective now, which can only become clear after some time has passed. Either way, what can we learn from it for the international regulation of digital communication spaces going forward?

Responses from Civil Society engaged in Influence

Many campaigns and civil society groups are mounting their own counter-campaigns against the influence techniques of bots, trolls, and misinformation. There are tools and strategies for debunking fake news stories.27 Existing resources on digital security are being shared.28 One Telegram group, called the "IT Army of Ukraine", assigned tasks to hackers to disrupt access to Russian state websites. People have responded to censored contexts by flooding information where possible: pro-Ukrainian posts have flooded VK, a Russian-language social media platform, and flooded Instagram before it was banned.29 This content ranges from hard-hitting evidence from the ground to light-hearted memes.30 Building civil society's capacity for influence has always meant balancing training in data-driven tools with their ethical consequences; now those skills increasingly need to include responding to disruptive influence techniques.

Image: Meme4Ukraine, a Twitter account posting well-known memes adapted to a pro-Ukraine message. https://twitter.com/Meme4Ukraine/status/1501441217556529152/photo/1

Digital influence is a shifting concept, changing both with the available technologies and with the ways we have seen them used. The Russian invasion of Ukraine has reinforced a broader understanding of digital influence that includes disruptive techniques, and has shown the potential severity of the situations in which influence is used. The industry, from individual consultants to established technology companies, shapes the use of influence technologies, and often works unregulated, inconsistently, and with little transparency. At Tactical Tech, we will continue to build local capacity for the investigation and monitoring of digital influence, conduct partnership-based research for an international perspective on digital influence, and develop digital literacy resources on the practices and impacts of influence campaigns. For others working on political influence, our initial analysis leads us to make three recommendations:

1) Digital influence practitioners, including campaigners, social media influencers, and marketers, should continue to review their data policies and ensure that the collection, retention, and archiving of personal data for political influence accounts for a politically turbulent environment in which the security of personal data can be affected by crisis and conflict.

2) Investigators of digital influence should examine the impact of the values and practices of the international influence industry, including partisan companies, for-profit tactics used in political contexts, political tactics used in national security contexts, and the technologies shared between social movements, election groups, and national security communications.

3) Digital literacy projects should not only continue to build the capacity of the public and civil society to engage with communication technologies and social media platforms in an effective and privacy-conscious way, but should also include tactics for combating trolls, bots, and misinformation. These tactics are evolving, and for now include how to find diverse media sources, verify information, and promote the verified information.

If you want to know more, you can check out our work here, sign up to our newsletter, or get in touch with the team.

Many thanks for comments and feedback from Marek Tuszynski, Christy Lange, Björk Roi, and Glyn Thomas.

1 Samuel C. Woolley and Philip N. Howard, eds., "Computational Propaganda Worldwide: Executive Summary", Working Paper 2017.11, Oxford, UK: Project on Computational Propaganda, comprop.oii.ox.ac.uk, 14 pp.

2 Tactical Technology. “Personal Data: Political Persuasion (How It Works),” March 2019. https://cdn.ttc.io/s/tacticaltech.org/methods_guidebook_A4_spread_web_Ed2.pdf.

3 Amber Macintyre. “The Imports and Exports of Sub-Saharan Africa’s Influence Industry” Sept 2020. https://medium.com/@Info_Activism/the-imports-and-exports-of-sub-saharan-africas-influence-industry-d189a7bb9edf.

4 Hersh, Eitan D. Hacking the Electorate: How Campaigns Perceive Voters. New York, NY: Cambridge University Press, 2015.

5 Tetyana Bohdanova, “Personal Voter Data Use in the 2019 Ukrainian Parliamentary Elections: A Report on Digital Influence Outside the Scope of Disinformation”, July 2020, https://cdn.ttc.io/s/ourdataourselves.tacticaltech.org/The-Use-of-Personal-Voter-Data-During-2019-Elections-in-Ukraine_EN.pdf.

6 Bohdanova (2020)

7 Bohdanova (2020)

8 Tactical Technology, “The National Resistance Movement App and Digital Politics in Uganda”, April 2021, https://medium.com/@Info_Activism/the-national-resistance-movement-app-and-digital-politics-in-uganda-558df35b7b48.

9 Daniel Kreiss and Shannon C. McGregor, "Technology Firms Shape Political Communication: The Work of Microsoft, Facebook, Twitter, and Google With Campaigns During the 2016 U.S. Presidential Cycle." Political Communication 35, no. 2 (April 3, 2018): 155–77. https://doi.org/10.1080/10584609.2017.1364814.

10 Kari Paul, "Nick Clegg promoted to top Facebook role", The Guardian, February 2022, https://www.theguardian.com/technology/2022/feb/16/nick-clegg-facebook-meta-president-global-affairs.

11 Tom Simonite and Gian M. Volpicelli, "Ukraine's Digital Ministry Is a Formidable War Machine", WIRED, March 2022.

12 Daniel Kreiss, Prototype Politics: Technology-Intensive Campaigning and the Data of Democracy. Oxford Studies in Digital Politics. Oxford, New York: Oxford University Press, 2016.

13 Amber Macintyre, "The Imports and Exports of Sub-Saharan Africa's Influence Industry", Sept 2020, https://medium.com/@Info_Activism/the-imports-and-exports-of-sub-saharan-africas-influence-industry-d189a7bb9edf.

14 Tactical Technology, "Personal Data: Political Persuasion (How It Works)", March 2019, https://cdn.ttc.io/s/tacticaltech.org/methods_guidebook_A4_spread_web_Ed2.pdf.

15 Bohdanova (2020)

16 Bohdanova (2020)

17 Carole Cadwalladr, Twitter: https://twitter.com/carolecadwalla/status/1502430347832745985?t=L-aRHlsploVIj0gWvoe5xQ&s=09

18 Bohdanova (2020)

19 Bohdanova (2020)

20 Amber Macintyre, "The Influence Industry Long List: The Business of Your Data and Your Vote", Tactical Tech.

21 Amber Macintyre, "The Imports and Exports of Sub-Saharan Africa's Influence Industry", Sept 2020, https://medium.com/@Info_Activism/the-imports-and-exports-of-sub-saharan-africas-influence-industry-d189a7bb9edf.

22 Max Fisher, "Disinformation for Hire, a Shadow Industry, Is Quietly Booming", The New York Times, July 2021, https://www.nytimes.com/2021/07/25/world/europe/disinformation-social-media.html.

23 International IDEA (n.d.), Political Finance: Design Tool, https://www.idea.int/political-finance-design-tool.

24 Bohdanova (2020)

25 Bohdanova (2020)

26 https://twitter.com/annargrs/status/1497670943434366976

27 Bohdanova (2020)

28 https://www.accessnow.org/cms/assets/uploads/2022/03/Ukraine_-Safety-tips-for-civil-society_2022-UA.pdf

29 https://www.politico.eu/article/ukraine-russia-disinformation-propaganda/

30 https://twitter.com/uamemesforces/status/1502265977488261133?t=w7GOONmY4Iey8yd5XmzR4g&s=09


Tactical Tech

Tactical Tech is an international NGO that engages with citizens and civil-society organisations to explore and mitigate the impacts of technology on society.