Hack. Leak. Amplify — A Fast Primer on Russia’s Information Warfare Program in the United States
A great deal has been written concerning Russian information warfare operations in the United States, particularly those connected to the 2016 presidential election. I have written and presented extensively on the subject both in the United States and abroad, and there is a wealth of other information available from reliable journalistic and intelligence sources.
This post provides a primer on the broad objectives and strategies of the Russian campaign, which can be understood as Hack, Leak, and Amplify. More precisely, Hack-Leak constitutes one track, driven primarily by the cyber warfare arms of Russia's GRU (military intelligence) and FSB (the successor agency to the KGB). These are known as APT (Advanced Persistent Threat) 28 and 29. Amplify is the second track.
HACK-LEAK. APT 28 and 29 routinely engage in broad phishing campaigns aimed at government, business, academic and military targets. (Cybersecurity firm FireEye has an excellent report on these activities here.) These phishing campaigns commonly involve sending large quantities of spoof emails designed to imitate legitimate sources (such as Google), warning the recipient to take an action (such as resetting their password) through a spoof website. The phishing emails contain various elements designed to make them appear legitimate at a quick glance, including what appears to be the sender's name and email address, logos and graphics, and familiar vocabulary and phraseology. Through a technique known as typosquatting, the attackers may host their fake website at a domain that appears legitimate at a quick glance, such as g00gle[dot]com, with the "o"s in Google actually being zeros. The purpose is to make the web address of the site to which the recipient is directed appear legitimate.
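The lookalike-domain trick described above can be illustrated with a short sketch. This is a minimal, illustrative example, not any actual defensive tool: the allowlist and the character-substitution table are assumptions chosen for the g00gle example, and a real defense would use a far larger set of confusable characters and domains.

```python
# Minimal sketch: flag a typosquatted domain by normalizing common
# look-alike character substitutions and comparing the result against
# an allowlist of known-legitimate domains.
# The tables below are illustrative, not exhaustive.

HOMOGLYPHS = {"0": "o", "1": "l", "3": "e", "5": "s", "rn": "m"}
LEGITIMATE = {"google.com", "microsoft.com", "apple.com"}

def normalize(domain: str) -> str:
    """Replace common look-alike character sequences with the
    characters they imitate (e.g. '0' -> 'o')."""
    d = domain.lower()
    for fake, real in HOMOGLYPHS.items():
        d = d.replace(fake, real)
    return d

def is_suspicious(domain: str) -> bool:
    """A domain is suspicious if it is not itself on the allowlist
    but normalizes to a domain that is (g00gle.com -> google.com)."""
    return domain not in LEGITIMATE and normalize(domain) in LEGITIMATE

print(is_suspicious("g00gle.com"))  # True: the zeros imitate the letter o
print(is_suspicious("google.com"))  # False: the genuine domain
```

In practice, mail filters and browser protections apply much more sophisticated versions of this idea, including Unicode confusable tables and edit-distance comparisons, but the underlying check is the same: does this address merely resemble a trusted one?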
Beyond these mechanical elements, the language is often intended to cause alarm or concern in the hopes that the recipient acts reflexively as intended, and before taking a closer look at the email.
Once the target takes the intended steps, the effect is to transmit the target's credentials to the hackers, who in turn use them to gain access to the legitimate account and exfiltrate the information. This is what happened to Hillary Clinton's campaign chairman, John Podesta.
As raw data is gathered, the intelligence services must curate the information to determine what is useful. This is an important step, since one cannot simply steal terabytes of data and have its strategic utility appear obvious. Someone with strategic knowledge, rather than merely IT skill, needs to go through it. In Podesta's case, as with the DNC and DCCC, whose networks were penetrated by APT 28 and 29 for months, the curation process provided the Russian intelligence services with evidence that the Clinton campaign worked behind the scenes with the DNC to give Clinton an advantage that Sanders supporters could see as unfair.
Hacking, and the theft of data through it, is followed by curation and analysis to determine what is potentially useful for the agency sponsoring the penetration — in this case, the Russian government.
Leaking information is not a single act, but in itself is a campaign. Russian intelligence used Wikileaks and its own website, DCLeaks, as the primary portals through which to distribute the information it selected for release on a specific timeline to maximize its effect. In one dramatic example, the information that the DNC had worked with the Clinton campaign to tilt the nomination process was released just before the start of the Democratic National Convention, forcing the resignation of then-DNC Chairwoman Debbie Wasserman Schultz.
While the impact on the result of the 2016 election can never be fully known (it can neither be measured nor repeated), certainly events such as forcing the resignation of the leader of the Democratic Party on the eve of the party's convention impacted the flow of the national campaign.
When the DNC hacking was reported and quickly attributed to Russian intelligence services, those services engaged in a "denial and deception" campaign of misdirection. "Guccifer 2.0," an online persona purporting to be a Romanian hacker, claimed responsibility for the theft of DNC data and the subsequent leak.
In fact, Guccifer 2.0 was not a real person, but rather what is known in intelligence circles as a cutout, or sock puppet. Denial and deception campaigns are often multi-faceted, as was the case with the DNC hack and leak. While “Guccifer 2.0” was claiming responsibility, Russian propaganda outlets such as RT and Sputnik pushed a conspiracy theory that former DNC staffer Seth Rich, who was murdered in what authorities concluded was a botched robbery, was an insider who stole and leaked the information. There is no evidence behind this claim.
Denial and deception efforts do not need to "prove" a false narrative to be successful; the objective is to create confusion about those truly responsible. CrowdStrike, the respected cybersecurity firm retained by the DNC to investigate the hacking, concluded the DNC's network had been compromised and sending information to Russia for months before the intrusion was discovered.
Hack-Leak is a strategy employed by Russian intelligence services in numerous other high-profile cases, including those involving the World Anti-Doping Agency and the German and French governments, among others.
AMPLIFY is the second track of the information warfare campaign. It operates largely independently of the hack-leak track (which was driven by APT 28 and 29), and is waged primarily by Russian troll farms such as the now-infamous Internet Research Agency based in St. Petersburg.
The New York Times Magazine in 2015 published an excellent piece by Adrian Chen on the Internet Research Agency. Read The Agency.
This track is designed to amplify social tensions within the target population. A wealth of reporting describes the strategies and tactics used by the troll farms to achieve this objective through thousands of fake Facebook, Twitter and other social media accounts, propaganda outlets such as RT and Sputnik, and independent websites carrying Russian-generated content.
The content of the posts in this campaign focused exclusively on emotionally charged subjects, almost always in one of four areas: race, religion, ethnicity, and separatism. Content was aimed at both the left and right, and designed to push each to the extreme. Content never focused on more benign and unemotional topics such as taxes, trade, regulation, and education. You can review examples of the Russian posts in this Washington Post story by Dan Keating, Kevin Schaul and Leslie Shapiro.
Paid ads funded by the troll farms promoted various accounts which appeared at first to be genuine and innocuous, and of interest to target audiences. TENGOP was a Twitter account aimed at appealing to Republicans, while Blacktivist was a persona aimed at appealing to African-American Democrats. Many of the posts in both accounts were innocent and typical in appearance; if every post were extreme, it would turn off people in the target audiences. Over time, increasingly radical posts were included in each account. In between innocuous and normal posts, TENGOP promoted the accusation that Barack Obama wanted to convert all children to Islam. In an effort to suppress African-American voter turnout, Blacktivist posts later in the campaign urged followers to vote for Green Party candidate Jill Stein, or not to vote at all.
The strategy: appeal to millions of Americans on the left and right through accounts imitating real people, then gradually push them toward the ideological extremes, with the objective of driving polarization by normalizing out-of-the-mainstream views.
On May 26, 2016, Russian accounts promoted both a pro-Islam and an anti-Islam rally in Houston at the same time and place, with the obvious intent of provoking violence. Natasha Bertrand describes the operation in greater detail in this Business Insider piece.
Russian information warfare campaigns are ideologically promiscuous: worldwide, the Russians are just as apt to promote far-right anti-NATO parties in Europe as far-left anti-NATO parties, for example.
The first step in combating information warfare campaigns by foreign governments is to raise awareness, and encourage Americans to be more skeptical of online content. One study found that 85% of posts shared by Facebook users were not read by the user first — they simply shared based on the headline or the photo.
The Russian troll farms and intelligence services did not shut down on Election Day in 2016 — they continue operating with targets around the world. By understanding how they operate — their objectives and methods — we can reduce their effectiveness and protect our country from those who want to weaken us from within.
Ron Nehring served as the presidential campaign spokesman for Texas Senator Ted Cruz in 2016. The views expressed are his own.