Facebook on Russian Election Hacking: A Risk Analysis

Published in Var City UW · 11 min read · Feb 16, 2019

Image courtesy of Time Magazine

This article was written by Abby Huang, an Informatics junior at the University of Washington.

New technology brings new techniques of political advertising, and with them come unpredictable responsibilities and ramifications that demand the cooperation of many stakeholders. In 2017, the United States formally accused Russia of interfering in the 2016 presidential election, a landmark accusation that quickly resulted in several rounds of sanctions against Russia. Yet when it came to explaining how exactly the Russians were able to infiltrate American politics on the new front of social media, neither lawmakers, tech billionaires, nor the general public were prepared. The series of events and decisions that unfolded offers an important case study of how an increasingly polarized political environment, no longer moderated by television, introduces new risks to consider.

The indictment filed in July by special counsel Robert S. Mueller III’s investigation into collusion between Trump, his associates, and Russian interests made Russia’s intervention in the 2016 U.S. presidential election undeniable. Detailed reports cited Facebook, the largest social network with over two billion members, as the platform most critical to the Russian operation (41 mentions in the 37-page indictment, compared to Twitter’s 9). Following Facebook’s disclosure of foreign ad purchases and the discovery of hundreds of fake accounts, it became clear that Facebook’s unparalleled powerhouse of information distribution had been used by at least 13 Russians and 3 Russian companies in a scheme to subvert the democratic process. Facebook had unwittingly become the biggest driver of Russian messaging, ensnaring millions of participants in its social experiment. The list of offenses grew to include cyber attacks on 21 state voting systems, the hacking of the Democratic National Committee (DNC) servers, and the subsequent leaks of Democratic campaign emails before the November election. This article analyzes the elements of risk in Facebook’s tools and platform that allowed Russian infiltration of U.S. politics, bringing into question some of the glaring insecurities and susceptibilities of our destabilized electoral systems.

In 2016, the once heavily favored presidential run of Hillary Clinton was undermined by a discrediting campaign of disinformation, email hacks, and pro-Trump messaging. In parallel came the seemingly inexplicable rise of an unlikely prospect who shook up traditional politics in the two years leading up to the election. Donald J. Trump burst into the public eye as the first presidential candidate since the collapse of the U.S.S.R. to controversially admire the deeply unpopular and undemocratic agenda of the vindictive Russian leader Vladimir Putin. His desire to do business with Russia, despite international criticism of Russia’s handling of human rights, marked the start of a very mutual understanding between the two (Frenkel, et al). What ensued was a series of contacts between Trump’s associates and a range of Russian intermediaries during the campaign, including oligarchs, diplomats, former military officers, and suspected agents. Experts describe the engagements as “loosely coordinated effort[s] by Russian intelligence both to get insight into the campaign and to influence it.” Donald Trump, now President of the United States, has repeatedly dismissed any possibility of collusion throughout the campaign and the investigation, proclaiming it an unfair “witch hunt.”

It is important to understand the motivation behind Russia’s interest in the election in the first place. Mr. Putin is a former K.G.B. officer with a known resentment of American “superiority” and nostalgia for Russia’s lost power. He was irritated by the United States’ support and funding of democratic movements and anti-Russian forces throughout the 2000s. When Russia invaded Ukraine in 2014, America’s condemnation of his practices pushed the Kremlin into particularly open hostility toward the United States. One of the biggest proponents of democracy in Russia happened to be Hillary Clinton, then secretary of state: “[Russians] deserve the right to have their voices heard and their votes counted, and that means they deserve free, fair, transparent elections and leaders who are accountable to them” (Shane, et al). Since then, Putin has openly expressed his exasperation with polarizing American “democracy” and a domineering Hillary Clinton. In the background, Russia had begun practicing a form of cyber warfare, honing hacking and influence operations throughout Eastern Europe before launching its attack on the United States (Shane, et al).

While Facebook had always played a role in the dissemination of political information, the growing interference from Russia forced Zuckerberg to confront the dangers of such an embedded platform. In 2014, a Putin-mandated St. Petersburg company called the Internet Research Agency (IRA) began manipulating the Facebook platform, creating fake accounts based on identities stolen from social media. The group then used those accounts to populate and promote Facebook pages rallying aggrieved users around social and economic issues. One such page, called “Woke Blacks,” cited Clinton’s supposed hostility to African-Americans as a reason to abstain from voting. The operation comprised 80–100 competitively compensated young Russian employees who worked round-the-clock shifts to copy and create propaganda that emulated disaffected Americans with remarkable efficiency (Shane, et al). They operated with explicit instructions to tarnish and criticize Clinton and other candidates while bolstering support for divisive candidates like Bernie Sanders and Donald Trump. A pattern of angry, corrupt, or crazed Clinton memes emerged, while Trump was portrayed as a resolute, honest, and capable candidate. The Russian operation also boosted other divisive candidates, such as the Green Party’s Jill Stein, to draw even more votes away from Clinton (Shane, et al).

Facebook’s first acknowledgment of the potential misuse of its tools did not come until 2017. The initial $100,000 it found spent on ads appeared negligible compared to the tens of millions committed by the Trump and Clinton campaigns. But the Russians’ impact came from the viral spread of triggering propaganda, which was free. Subsequent investigations uncovered 2,700 fake Facebook accounts and 80,000 posts, many of which resonated uncannily with their well-defined target audiences. Facebook’s estimate of the operation’s reach grew to a shocking 126 million Americans, a striking figure considering that 137 million real people ended up voting in the 2016 presidential election (Shane, et al).

Starting in 2015, the Russian IRA “specialists” used a combination of stolen and rerouted PayPal accounts to pay for Facebook ads promoting viral posts, spending hundreds of thousands of dollars on outreach efforts. The operation became increasingly well staffed and prioritized, at one point spending over $1.25 million a month under the direction of a CEO who met frequently with Putin (WIRED). In the summer of 2016, the IRA evolved its efforts into organizing and coordinating political rallies within the U.S. while pretending to be activists themselves. From several time zones away, they orchestrated large turnouts by promoting made-up events through a network of fake Facebook accounts and by partnering with the admins of other well-established groups and communities. Rallies were held in multiple cities, including Washington, D.C., with specific agendas such as convincing crowds that Hillary Clinton would turn the country over to Sharia law. At such events, paid actors staged specific demonstrations or held controversial signs bearing misattributed quotes.

In an official September blog post, Facebook disclosed 3,000 advertisements posted on the social network between June 2015 and May 2017 that were linked to Russia. The Washington Post confirmed that the ads came from the same Russian company, the Internet Research Agency. CNN later reported that a portion of those ads was geographically targeted at residents of Michigan and Wisconsin, two states where margins of victory under one percent were key to securing Trump’s Electoral College win. Trump defeated Clinton by narrow margins of roughly 10,700 votes in Michigan, out of more than 4.8 million cast, and 22,700 votes in Wisconsin (CNN).
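To see just how thin those margins were, here is a quick back-of-the-envelope check using only the rounded figures above (a sketch, not official election math):

```python
# Back-of-the-envelope check of the Michigan margin, using the
# rounded figures cited above: a ~10,700-vote margin out of
# roughly 4.8 million ballots cast.
michigan_margin = 10_700
michigan_ballots = 4_800_000

margin_pct = michigan_margin / michigan_ballots * 100
print(f"Michigan margin: {margin_pct:.2f}% of ballots cast")  # ~0.22%
```

The margin amounted to roughly one ballot in every 450 cast, which is why even narrowly targeted advertising in such states draws scrutiny.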

A BuzzFeed analysis discovered that in the three months leading up to the election, the top fake election news stories on Facebook drew more engagement than the top stories from long-established news sources such as The New York Times, The Washington Post, and The Huffington Post. The IRA’s efforts had undermined the very foundations of how consumers get their information. By September 2016, disinformation was spreading even under the names of widely followed and respected public figures, most famously a fabricated story claiming the Pope had endorsed Trump (Madrigal). Facebook’s platform further enabled the business of fake news by allowing its creators to profit from viral sharing. Not only was the campaign’s engagement extensive throughout the U.S.; the opportunity Facebook presented had begun to inspire other manipulations. In the days leading up to the critical election, reporters traced at least 100 pro–Donald Trump sites to a town of 45,000 people in Macedonia, where a group of teenagers was making money off posting about the election. These teenagers were unknowingly furthering the Russians’ original initiative, helping Trump overwhelm Clinton’s expected win (Madrigal). A separate study from the Tow Center for Digital Journalism at Columbia University found that a verified six of the more than 470 publicly known Russia-linked Facebook pages had content shared 340 million times. Given these figures, simple multiplication puts the potential shares of this propaganda in the billions (Madrigal).
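That “simple multiplication” is a linear extrapolation, sketched below. It assumes the remaining pages performed like the six measured ones, which likely overstates the total, since researchers tend to measure the most prominent pages first:

```python
# Rough linear extrapolation from the Columbia figures cited above.
# Assumption (ours, not the study's): the ~470 known pages averaged
# the same share counts as the six pages actually measured.
measured_pages = 6
measured_shares = 340_000_000
known_pages = 470

avg_shares_per_page = measured_shares / measured_pages
estimated_total = avg_shares_per_page * known_pages
print(f"Estimated total shares: {estimated_total:,.0f}")
# -> 26,633,333,333, i.e. tens of billions under this naive assumption
```

Even read as an upper bound, the scale of the estimate explains why researchers described the propaganda’s potential spread in the billions.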

To this day, collusion between Russia and Trump’s campaign has been neither proved nor disproved. Everyone from expert pundits to analysts to voters disagrees heatedly about what the actual impact of the meddling was. Whether or not it suppressed votes, changed individuals’ votes, or played a role in the difference of fewer than 100,000 votes across three swing states, what is known is that the percentage of Republicans who view Vladimir Putin favorably has doubled from 11 percent to 25 percent, according to a poll by the Pew Research Center. Meanwhile, another October 2017 poll showed that 63 percent of Democrats and just 38 percent of Republicans saw “Russia’s power and influence” as a significant threat to the United States (Shane, et al). What matters now is to learn what can be done from the situation and apply it to future decision-making and strategy, not only at Facebook but across the sector.

The remainder of this article looks at the specific risk exposures Facebook should consider over this period.

Operational Risk: The risk of direct or indirect loss from inadequate or failed internal processes, people, or systems (including external events). In terms of people, multiple employee accounts and leaked company memos describe the company’s disregard of safety and responsibility as Facebook pursued exponential growth. People management has itself become a risk in question, as Facebook struggles to retain and attract top talent in the wake of such scandals. More and more people are considering leaving a company once widely considered the place in Silicon Valley that nobody wanted to leave. In terms of leadership, before official hearings where the evidence could properly be addressed, Facebook’s chairman and C.E.O., Mark Zuckerberg, repeatedly maintained in public that the amount of Russian content disseminated on social media was too small to matter. But subsequent investigations, both by outside organizations and by Facebook itself, uncovered evidence that he had again misjudged the gravity of the situation, with estimates of up to 150 million Americans exposed to Russian propaganda on its platforms. In terms of inadequate or broken automated systems, Facebook has been under fire to monitor abusive content more stringently while walking the line of not intruding on political free speech.

Strategic Risk: Facebook faces risks from the rapid pace of disruptive innovation in emerging personalization technology, which may outstrip its means of managing them without significant changes to its business model. That powerful business model, built on gathering personal data and targeting ads at remarkably specific demographics, has made Facebook very successful: the company made $40 billion in revenue in 2017, of which $39.9 billion came from its digital advertising products. However, the onslaught of scrutiny from multiple fronts over its algorithm for bringing custom, desirable, albeit filtered and easily manipulated news to users may require reconsidering the very strategy that made it so profitable.

Program Risk: Facebook’s most successful programs, from News Feed and ad-sales products to even the “I Voted” button, have to be assessed for their full range of impact, good and bad alike. Current algorithms weight shares over comments and likes, and their prioritization of emotionally responsive content, like videos and photos, has driven virality and unprecedented growth in engagement. Facebook has the power to boost a site’s traffic by 70 percent, effectively owning media distribution (Madrigal). For example, starting in 2015, Breitbart’s Facebook page went from 100,000 to 1.5 million likes over the course of a year, eventually outperforming The New York Times in interactions by millions that July. Facebook’s powerful engagement tools work by aggregating interactions to predict and push homogeneous content onto users under the premise of personalization, as sketched below. This, however, perpetuates a “filter bubble,” as Eli Pariser described back in 2011 (Madrigal). There is substantial risk in how a curated set of facts can deter public debate and in how this formula for engagement can be manipulated. This is the opposite of what users want when they look to Facebook to openly connect with the world, and it may no longer be an acceptable standard for its predictive models. Facebook’s platform had hyper-engaged what was originally a weak following into a powerful political force. Facebook has demonstrated its ability to affect electoral politics since at least 2012, when a team of researchers from the University of California, San Diego and Facebook published a study arguing that the design of Facebook’s “I Voted” button had led to a substantial increase in voter engagement and turnout in the 2010 congressional elections. Facebook has to be more transparent about how it will responsibly use, and restrain, this power in the future (Madrigal).
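As a concrete illustration of that mechanic, here is a minimal sketch of engagement-weighted ranking. Everything in it (the weights, the media boost, the field names) is hypothetical and ours, not Facebook’s actual algorithm; it only shows how weighting shares and emotive media inevitably favors viral content:

```python
from dataclasses import dataclass

@dataclass
class Post:
    shares: int
    comments: int
    likes: int
    has_media: bool  # videos/photos, the emotionally responsive formats

# Hypothetical weights: shares count most, mirroring the dynamic
# described above. Illustrative values only, not real parameters.
W_SHARE, W_COMMENT, W_LIKE, MEDIA_BOOST = 5.0, 2.0, 1.0, 1.5

def engagement_score(post: Post) -> float:
    """Toy engagement-weighted ranking score."""
    score = (W_SHARE * post.shares
             + W_COMMENT * post.comments
             + W_LIKE * post.likes)
    return score * MEDIA_BOOST if post.has_media else score

# A heavily shared photo meme outranks a widely liked text post:
# exactly the dynamic that rewards viral, emotive content.
meme = Post(shares=900, comments=150, likes=2_000, has_media=True)
article = Post(shares=120, comments=400, likes=6_000, has_media=False)
print(engagement_score(meme) > engagement_score(article))  # True
```

Ranking feeds by a score like this, and then training on the resulting clicks, is the feedback loop that narrows what each user sees, producing the “filter bubble” dynamic described above.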

Activity Risk: The posting of fake news, which Facebook then automatically promotes, has created a complex struggle over what activity the platform will show, allow, and enable. Cases of unmanaged activity risk range from the sale of unlicensed guns to the broadcasting of live killings to the granting of social media fame to anyone. The company’s business depends on people being highly engaged with what is posted on its sites, which needs to be assessed further, especially to reduce foreign manipulation (Frenkel, et al). Facebook has since hired 3,000 people worldwide to help monitor content and review reports of harm or harassment. It has also changed its advertising policy so that any ad mentioning a candidate’s name goes through a more stringent vetting process.

Reputational Risk: Facebook’s brand is contending with numerous scandals: exposure of personal user data, abuse of its tools for political corruption, and internal distrust, with many executive-level people leaving over disagreements about how to handle problems the company keeps failing to mitigate. Others include loss of trust in its ability to handle user content and the spread of hate speech that fueled real genocide in Myanmar. The resulting bad publicity has damaged Facebook’s reputation, leading to the #DeleteFacebook campaign and eroding faith in the CEO. Under public pressure, Zuckerberg and other company officials were called to testify before Congress in 2018.

Asset Risk: Facebook has seen its stock drop nearly 40 percent since July 2018 (CNBC).

Regulatory Risk: Facebook’s repeated run-ins with Congress proved a substantial risk when user data was leaked to Cambridge Analytica, a scandal that drew regulatory fines and set the stage for stricter regulatory changes and scrutiny. This will affect the way Facebook manages content and user data, as well as how it produces and delivers new products.

Macroeconomic Risk: Volatility and uncertain political leadership in national and international markets subject Facebook to constant, large-scale response and mitigation efforts to prevent misuse of its platform, as well as to the anticipation of new requirements. These extra costs and this surveillance can limit its original growth opportunities.

Conclusion

In many ways, Facebook in its short history as an enterprise has made significant efforts to address the risks it has faced across many categories. As always, however, new technology, processes, and cultural shifts bring a need to renew risk assessments of existing protocols, both to mitigate risk and to take advantage of it. Anticipating risk isn’t always a bad thing. In Facebook’s case, taking the time to make sure its execution aligns with its business goals will help it get back on track while it is under acute scrutiny from not only its customers but also the government and the world. The analysis and recommendations here are meant to inform and complement the fixes Facebook is pursuing on its journey through this patch of uncertainty.

Var City UW: Empowering the University of Washington’s Computer Science, Design and Technology community.