The series “Guidance Systems” discusses technologies that seem to improve our lives by offering us new choices, while in fact shaping or removing our ability to decide things for ourselves.
Back in August a Twitter user, “Conspirador Norteño,” began documenting a telling pattern among certain accounts posting alt-right messages.
Conspirador Norteño picked one user from this group, DavidJo52951945 — one of more than 60,000 in a network of similar accounts.
DavidJo52951945 went on a daily tear, writing over the years about Brexit, about Ukraine, about Hillary Clinton—whatever was going on in the world that was politically divisive in the West, and in which the Kremlin has a stake. He was remarkably on top of it.
Conspirador Norteño writes that he believes this account is a human, not a bot. For one thing, DavidJo52951945 keeps a schedule. A very telling schedule.
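Conspirador Norteño's actual method isn't spelled out here, but the sort of schedule check he alludes to can be sketched in a few lines. Everything below (the timestamps, the `active_hours` helper, its threshold) is invented for illustration:

```python
from collections import Counter
from datetime import datetime

# Hypothetical timestamps standing in for a scraped posting history;
# a real analysis would use thousands of tweets, not nine.
timestamps = [
    "2017-07-03T06:15:00", "2017-07-03T09:40:00", "2017-07-03T13:05:00",
    "2017-07-04T06:02:00", "2017-07-04T11:30:00", "2017-07-04T17:55:00",
    "2017-07-05T07:20:00", "2017-07-05T14:10:00", "2017-07-05T17:45:00",
]

def active_hours(ts, threshold=1):
    """Count posts per hour of day; hours at or above threshold are 'on shift'."""
    hours = Counter(datetime.fromisoformat(t).hour for t in ts)
    return sorted(h for h, n in hours.items() if n >= threshold)

hours = active_hours(timestamps)
print(f"posting window: {min(hours):02d}:00-{max(hours):02d}:00")
# prints "posting window: 06:00-17:00"
```

On a real posting history, a contiguous daytime window with overnight silence, repeated day after day in a single time zone, is what suggests a human working shifts rather than a bot posting around the clock.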
In the end, DavidJo52951945 was about as influential as a user can be. A perfect Twitter user. Disciplined, engaged, active all day long.
This troll might seem like a random anecdote, but it is part of a much larger pattern, one identified in a research paper forwarded to me by a colleague a few months ago. It’s a 2014 analysis by Jolanta Darczewska, a member of a Polish think tank, the Centre for Eastern Studies. Her subject is specifically the propaganda war in Ukraine ahead of Russia’s invasion of Crimea—the deliberate manipulation of public opinion there by Russian agents and propagandists to soften things up for invasion. But more broadly Darczewska has written a discourse on Russia’s larger strategy for destabilizing its enemies the best way it knows how: through information.
What makes this report so astounding and informative is that it compiles not guesswork and talking-head analysis but a detailed account of exactly what Russia has said it is up to, drawn from the public speeches of information officials and military officers whose job it is to protect Russia’s interests against the United States and the EU.
First of all, it’s amazing to read just how clearly and directly some of the top strategists of information warfare against the West articulate the distinction between our societies, and their ambitions for Russia. Aleksandr Dugin, a professor at Lomonosov Moscow State University, a highly visible political analyst, and a founder of the Eurasia Party, is quoted at length. (Dugin has various ties to the Kremlin, and clear views on Russia’s president, best articulated in a 2007 public appearance: “Putin is everywhere, Putin is everything, Putin is absolute, and Putin is indispensable.”) Here, he describes Russia’s future, what the paper describes as a “neo-conservative post-liberal ideological superstate.” Dugin writes:
Russia is not and will not be a pre-liberal power. It is a post-liberal revolutionary force struggling for a just multipolar world, for genuine dignity and freedom. In its war on liberalism Russia will defend tradition, conservative values and true liberty.
Meanwhile, Igor Panarin, formerly of the KGB and now a professor at the Diplomatic Academy of the Ministry of Foreign Affairs of the Russian Federation, is described as having “laid the foundations for the Information Security Doctrine of the Russian Federation.” Here, they’re laid out for us.
In practice these are operations of influence, such as: social control, i.e. influencing society; social manoeuvring, i.e. intentional control of the public aimed at gaining certain benefits; information manipulation, i.e. using authentic information in a way that gives rise to false implications; disinformation, i.e. spreading manipulated or fabricated information or a combination thereof; the fabrication of information, i.e. creating false information, and lobbying, blackmail and extortion of desired information.
Panarin, in his book “Information World War II,” described the need to organize a central “information KGB.” In presenting his book “Information Warfare and Communications” (eleven of his fifteen books use the phrase “information warfare” in their titles) he described the need to create a “national information warfare system…based on the best Soviet experiences. It must be enriched by the US and Chinese experiences.” But here’s where I begin to notice a pattern. The report describes one Panarin contribution to national efforts this way.
Panarin distinguishes the following stages in the process of information operation management:
(1) forecasting and planning
(2) organisation and stimulation
(4) operation adjustment
(5) performance control
(I often think, as I’m waiting to order at some pristine cafeteria-style eatery, that our most advanced form of capitalism, in which we choose from among expertly prepared but largely interchangeable variations of, say, a shirt, or an organic wrap, closely resembles textbook communism. Long lines, few choices, happy citizens. But that’s neither here nor there.)
What grabs me reading the playbook of propaganda and information warfare laid out in this report is that it echoes—or at least complements—the playbook of social media companies in the United States today.
The paper describes the principles of propaganda that modern Russia has adapted from Soviet times. Darczewska writes that in recent information battles it’s clear that these include “the clarity principle,” by which “the message is simplified, uses black-and-white terms, and is full of loaded keywords.” Or consider the “principle of desired information,” in which the planners of the information campaign try to exploit messages they know the audience is receptive to. Most haunting to me is the “principle of emotional agitation,” in which the campaign brings people “to a condition in which they will act without much thought, even irrationally.”
I read all of this as good propaganda, sure. But it also seems like an echo of the strategies I hear executives describing onstage and in private meetings about the best way to capture and retain users. And if a platform is built to make these principles easy to enact—filtering out unwanted information, analyzing keywords to deliver the most emotionally impactful content, rewarding the shortest, clearest, most powerful headlines—then it suddenly dawns on me that we’ve built the perfect machine for another country to capture and retain users.
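To make that incentive concrete, here is a deliberately crude sketch, with every name, keyword and number invented for illustration, of a feed ranker that scores posts by loaded keywords and brevity. A system optimized this way surfaces exactly the content the propaganda principles above prescribe, no matter who wrote it:

```python
# Toy feed ranker, invented for illustration: posts are scored by a crude
# "emotional engagement" proxy (loaded keywords plus brevity), then sorted.
LOADED = {"outrage", "betrayal", "invasion", "corrupt", "traitor"}

def engagement_score(text: str) -> float:
    words = text.lower().split()
    loaded = sum(w.strip(".,!?") in LOADED for w in words)
    brevity = 1.0 / len(words)  # shorter, punchier posts score higher
    return loaded + brevity

posts = [
    "A nuanced fifteen-point analysis of regional trade policy outcomes",
    "Corrupt traitor caught!",
    "Outrage over betrayal at the border",
]
feed = sorted(posts, key=engagement_score, reverse=True)
print(feed[0])  # the shortest, most loaded post rises to the top
```

The ranker has no notion of who is posting or why; it simply rewards the clarity, emotional charge and brevity that both an engagement-driven platform and a disciplined propagandist are optimizing for.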
Written as it was to understand how Russia used information warfare to prepare Crimea for invasion, the paper takes pains to avoid suggesting that Russia could do the same to the United States. But it points out that there are still dangers for this country in what Russia is pursuing:
Russians also play on the various motivations of various social groups in the West (using pacifists’ fear of war, politicians’ fear of unpredictability and entrepreneurs’ fear of losses)… Furthermore, public opinion is not aware of the fact that they are the object of a planned and coordinated information struggle.
When I consider that there is a geopolitical player that has been working for most of its modern history to perfect its means of disrupting and destabilizing the United States, that its principles of doing so revolve around taking advantage of emotion, and that Facebook and Twitter have made an addictive product out of our emotions…well, it’s clear we’re in the middle of a very tight knot.
Conspirador Norteño pointed out that when DavidJo52951945, now going by DavidJoBrexit, went silent for just three days this summer, American Twitter users who’d been following him began to ask if he was all right, and when he’d be back.
Facebook, Twitter and the rest have helped to create the digital communities by which we find comfort in grief, support as parents, hope in our personal and professional lives. But they’ve also created a platform by which agents of another country can influence American public opinion, simply by being expert, active users.