Between interference and disinformation — The Future of our Democracies

--

By Léa C. Glasmeyer, Core Writers’ Group

It is nothing new that election interference and disinformation threaten democracy. The 2022 report of the Special Committee on foreign interference in all democratic processes in the European Union investigated, inter alia, how foreign powers manipulate information and discredit European democratic processes. It also examined how malign actors influence elections, conduct cyberattacks, hire former senior politicians, and deepen the polarization of public debate, often without fear of repercussions. The resulting risk of subversion of democratic processes exposes the fragility of current regulations. Election interference has become a foreign policy tool for several state and non-state actors, notably Russia, which has used such means to advance its national interests. With the recent shift to digitalisation, tracing the origins of such interference has become increasingly difficult. It is therefore crucial for democracies to maintain citizens’ confidence in the legitimacy of their institutions by strengthening their democratic resilience and digital capabilities to withstand such obstacles.

From election campaigns to the war in Ukraine

On February 15, 2023, the Forbidden Stories consortium published an investigation into the Israeli so-called “Team Jorge” and its disinformation campaigns. Team Jorge, an agency created by former Mossad and Israeli army officers, specializes in spreading disinformation, cyberespionage, and the propagation of fabricated stories. The agency, whose clients have ranged from intelligence agencies and corporations to political campaigns, claims to have manipulated more than 30 elections, notably through a software platform called Aims (Advanced Impact Media Solutions). It offers its clients a range of services, from the creation of fake accounts on social networks to infiltration through bot networks, document falsification, and image embellishment.

These comprehensive sabotage operations reveal the existence of a global private disinformation market encompassing a large network of companies that use and control thousands of social media accounts. Avatars and fake profiles are easily created, existing accounts infiltrated, and real persons manipulated. Nonetheless, despite the ubiquity and repeated use of disinformation and foreign interference, there is a general lack of awareness of the implications and dangers of information manipulation. Within the EU, this is exacerbated by legislative gaps and a lack of coordination between Member States. Both Russia and China have been especially active in this domain. The war in Ukraine makes decisive, concrete steps against the spread of disinformation and interference more necessary than ever: in times of war, disinformation becomes a weapon used to mislead and manipulate public opinion, including that of European citizens.

In Russia, coordinated networks of volunteers have become very active, notably on Telegram, translating Russian articles about the war into various other languages with the purported goal of “reestablishing the truth”. This community of volunteers, called “Info Defense”, disseminates content on major social media networks to international audiences, acting as a grassroots, decentralized propaganda network. The message: “Ukraine and its allies are responsible for the war; Russia is only liberating oppressed populations”. Open to anyone who wishes to join, this low-cost model could prove compelling for both state and non-state actors.

Election interference

Technological developments have played a key role in this regard, offering growing opportunities while posing major challenges: they give political actors a wide reach at low cost and facilitate the collection of personal data for micro-targeting. Micro-targeting generates social echo chambers that reinforce and perpetuate distorted perceptions of public discourse and predominant ideas by artificially amplifying specific opinions and false narratives. These echo chambers, in turn, hamper citizens’ ability to make political decisions based on pluralistic information.

According to a 2020 Oxford Internet Institute report, 81 countries were the target of campaigns to manipulate public opinion. While disinformation campaigns have long been a staple of international relations, it is only with the ubiquity of “fake news” and Russian interference in the 2016 U.S. elections that their relevance to the democratic functioning of our societies became obvious to a broader public. Citizens then began to worry about the noxious effects of social media and its effectiveness in controlling societies through lies. Russia had already been interfering in elections in neighboring countries since 1991 to support pro-Russian parties and to polarize society. While such interference initially targeted only post-Soviet countries, after 2014 its scope broadened to include Western democracies as well.

Ukraine especially has been the target of repeated interference, beginning with Russia’s support for Viktor Yanukovych during the Ukrainian presidential elections in 2004, which eventually led to the Orange Revolution. Ten years later, the anti-Russian Euromaidan protests were equally targeted by Russian interference, soon followed by the illegal annexation of Crimea in 2014. Simultaneously, Russia began including cyberattacks and online disinformation in its repertoire. Since then, hack-and-leak operations have become very popular, as they are cost-effective and serve well the purpose of disrupting state institutions and elections. Altogether, the Kremlin has been waging a long-term information war against Ukraine, seeking to generate ideological subversion to its own benefit.

China, on the other hand, refrains from massively interfering in Western elections, but uses tools of disinformation within its own sphere of influence, mostly in Taiwan and Hong Kong, but also towards its own population. Iranian troll farms have also become increasingly common, with disinformation operations predominantly directed at specific political events. Furthermore, independent right-wing entities are also progressively employing disinformation tactics to promote their own political agenda, often by denigrating political adversaries and questioning the legitimacy of institutions and elected individuals.

Democracy under threat

Disinformation campaigns represent a major threat to democracy, as they undermine the plurality of opinions and the contest of ideas, thus hindering unbiased opinion-forming. Systematic misinformation affects independent journalism and, most importantly, independent and free elections. So-called “infodemics”, continuous waves of rapidly spreading disinformation, became prominent during the Covid-19 pandemic. Consequently, “the majority of people will see more false information than true information in 2022”. Disinformation has indeed become an important business, increasingly shifting crime into the digital sphere. While the measurable influence of disinformation campaigns appears to have been relatively limited so far, these campaigns continue to attract more and more clients, and their means will keep growing, leading to more consequential results in the long term. Altogether, the industry remains opaque and, as the hacking methods available to these organizations improve, is likely to move towards a blackmail business.

Cybersecurity represents another notable concern, especially for counterintelligence services. Cyber weapons are used to capture sensitive data or to conduct sabotage operations, and can be employed to destabilize and weaken institutions, damage the reputation of an organization or a public figure, or even support terrorist attacks. Cybersecurity experts estimate that the number of companies specializing in disinformation and manipulation is constantly growing. Addressing cyber threats relies on capabilities that democratic societies still need to develop in order to better face malign influence and a biased information environment. Coordination between services is particularly crucial, at both the national and international levels. Detecting manipulated content has become much more complex with the rise of audio and video materials; both media are increasingly used for manipulation and require greater attention. Finally, the hybridity between criminal and state operating methods poses a major challenge when it emanates from states that do not cooperate with democratic justice systems.

Democratic and digital resilience

Altogether, election influence operates through manipulation, cyber operations, or political grooming. Information now spreads rapidly and extensively through social media, which offers various opportunities for abuse and renders the gatekeeper model of public discourse obsolete. Algorithms reinforce exposure to biased content, creating echo chambers and privileging content that generates abundant reactions.

In international law, interference is prohibited when it amounts to “coercion” that hinders independent opinion-forming and thereby distorts the democratic process. Because manipulative conduct has similar consequences, it amounts to coercion and should be considered a violation of state sovereignty and of the right to self-determination under Art. 1 of the International Covenant on Civil and Political Rights (ICCPR) and Art. 1 of the International Covenant on Economic, Social and Cultural Rights (ICESCR). Accordingly, states should adapt international law to the novel threats posed to democracy and electoral processes, going beyond the mere enactment of new legislative measures tackling disinformation.

Within the EU, Parliament negotiators have aimed to reach agreements on the rules among Member States in time for the 2024 European elections. In view of revelations of alleged foreign interference and investigations into corruption, the Special Committee on Foreign Interference in all Democratic Processes in the European Union, including Disinformation II (ING2) was entrusted with further areas of responsibility and capabilities. The Committee is tasked with identifying legislative shortcomings in transparency, integrity, accountability, and anti-corruption and delivering reform recommendations. In 2022, the European Parliament also adopted the Digital Services Act and the Digital Markets Act to build a safer and fairer digital sphere that protects users’ fundamental rights and establishes a level playing field for businesses. Understanding how adversaries manipulate and weaponize information, and what their short- and medium-term strategic goals are, is crucial for domestic defense.

Therefore, a more comprehensive understanding of influence campaigns is needed, relying more heavily on open-source intelligence (OSINT) resources. More transparency, better protection against misinformation and foreign interference, and a single market for political advertising are necessary. Social networks constitute a major challenge and need to be regulated: while the veracity of their content should be guaranteed, the networks themselves can also serve to identify and detect fake accounts. Only through coordinated efforts among institutions and intelligence services can disinformation be both debunked and countered. Altogether, states need to improve both their cyber defense capabilities and media literacy, enabling citizens to better navigate the vast amount of information while responding to concerns about free speech and civil liberties.

Léa C. Glasmeyer is part of the European Horizons Core Writers’ team. She holds a Master of Public Policy from the Hertie School in Berlin and the Munk School of Global Affairs and Public Policy at the University of Toronto, as well as a Franco-German BA from Sciences Po Aix and the University of Freiburg. She is part of Netzwerk F, an intersectional network for the promotion of a feminist foreign policy, and a member of the Diverse Young Leaders initiative, where she works to bring young people with migration biographies closer to politics. Passionate about theatre and literature, Léa is also a fervent European citizen and particularly interested in democracy and the rule of law.

--

The European Horizons Editorial Board
Transatlantic Perspectives.

European Horizons empowers youth to foster a stronger transatlantic bond and a more united Europe.