Illustration by Dave Chenell

How Two Facebook Engineers Could Decide the Presidential Election

Berit Anderson
Scout: Science Fiction + Journalism
7 min read · Jul 22, 2016


Thanks to Facebook’s renowned hacker culture, just two developers working together could push an update to the site’s news feed algorithms that changes the outcome of the 2016 presidential election. According to former Facebook developers Scout has spoken with, it takes one engineer to write the code and one to approve it and push it to production.

Done smartly, these kinds of manipulations would be very targeted and extremely difficult to detect. Imagine two developers pushing Facebook’s ‘I Voted’ banner to the top of news feeds among liberal voters in swing states to drive more of them to the polls. Imagine showing politically moderate users more favorable posts about Hillary — or about Trump — in their feed in the weeks leading up to the election.

These kinds of moves could swing the election.

If they did, there’s virtually no chance any of us would even realize what had happened.

We took these ideas to their logical end points in three speculative fiction scenarios, which ask “Will Silicon Valley Stop Trump?”

Facebook has already performed non-consensual research on its users to intentionally affect their emotions and behavior through incredibly subtle changes to the news feed. In 2012, the company successfully manipulated the moods of nearly 700,000 users by including more positive and more negative posts in individual users’ feeds. In 2010, it used a simple banner ad at the top of the feed to convince an additional 60,000 users to vote.

Facebook’s public stance on this has been firm: News feed is its most important asset and it has a range of safeguards in place to maintain compliance and control of its code. However, Scout has spoken with multiple past Facebook developers who say a pair of developers on or close to the news feed team could get code past that review process so long as it didn’t contain any bugs, harm engagement or business metrics, or affect user privacy.

Scout reached out to Facebook to confirm this on July 12th, but as of July 21st had not received a response.

***

This spring, Facebook was confronted with one of its worst controversies in recent memory. A former employee went on the record alleging suppression of conservative news in its Trending Topics section.

Though the jury was split on whether conservative news really was suppressed, everyone involved missed the bigger story. Facebook’s corporate politics and Mark Zuckerberg’s personal political leanings are beside the point: changes to the news feed wouldn’t need to be a company-led effort.

In fact, according to our sources, they could feasibly be made without the knowledge of company leadership. The following is the process they described to us.

Once a pair of engineers agrees that code is ready, it moves toward production: the process of gradually rolling the code out to the entire Facebook user base. Every Tuesday, teams across Facebook meet to approve new code from the last week. From there, the approved code moves into a series of tests monitored by the operations team — as a January Slate article described it, “first in an offline simulation, then among a tiny group of Facebook employees, then on a small fraction of all Facebook users — before it goes live.”

Here’s the important part. According to our sources, these safeguards — Tuesday meetings, operations and compliance monitoring, segmented release — depend mostly on automated review to catch three things: bugs, changes to data privacy, and material changes to Facebook’s business and engagement metrics. As long as new code doesn’t raise red flags in any of these areas, it would theoretically move easily through the review process.
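
To make that concrete, here is a minimal sketch, in Python, of the kind of automated gate our sources describe. Every name and threshold below is our own illustrative assumption, not anything Facebook has published; the point is simply that a review focused on bugs, privacy, and business metrics has no slot for ideology.

```python
# Hypothetical sketch of the automated release gate our sources describe.
# Every name and threshold here is an illustrative assumption, not Facebook code.

from dataclasses import dataclass


@dataclass
class CanaryMetrics:
    crash_rate_delta: float      # change in error rate vs. the control group
    new_privacy_fields: int      # user-data fields the change newly reads or shares
    engagement_delta: float      # relative change in engagement metrics
    revenue_delta: float         # relative change in business metrics


def passes_automated_review(m: CanaryMetrics) -> bool:
    """Approve a change if it shows no bugs, no new privacy exposure,
    and no material movement in engagement or business metrics."""
    if m.crash_rate_delta > 0.001:       # looks like a bug
        return False
    if m.new_privacy_fields > 0:         # touches data it shouldn't
        return False
    if abs(m.engagement_delta) > 0.01:   # material engagement change
        return False
    if abs(m.revenue_delta) > 0.01:      # material business change
        return False
    return True


# A feed tweak that nudges which political stories users see could leave
# all four numbers flat and sail straight through this kind of review.
tweak = CanaryMetrics(crash_rate_delta=0.0, new_privacy_fields=0,
                      engagement_delta=0.002, revenue_delta=0.0)
print(passes_automated_review(tweak))  # True
```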

There is no process that we are aware of within Facebook for understanding the political, ideological and societal effects news feed updates might have on users.

That means that, assuming their code is error-free and doesn’t negatively impact privacy, engagement, or business metrics, two employees could collude to manipulate the beliefs and behavior of Facebook’s 1.65 billion users without anyone — users, fellow employees, executive leadership, Mark himself — even noticing.

A person identified as a Facebook employee implied in a 2011 post that Mark Zuckerberg reviews all changes to news feed algorithms. Facebook has yet to confirm or deny that to Scout. However, we have it on good authority that, though the review has at least existed in the past, it was superficial at best.

***

Scout’s editorial team actually predicted that bias against conservative news would become a problem for Facebook way back in May of 2015.

At the time, we were preparing to launch a Kickstarter campaign and brainstorming new types of content that could illustrate our larger focus — the social implications of technology. We wanted to show how strategic foresight could be used to identify possible future risks for society. So we decided to create a press release from the near future.

We already knew that the world’s largest social platform, which plays communicator and confidante to 1.65 billion users, allowed just two developers to make changes or updates to the site.

At the time, the 2016 primary was just revving up. Donald Trump had yet to enter, but Hillary Clinton and Ted Cruz were already taking potshots at each other. It was clear that this was going to be an exceptional year for national political contempt.

And Facebook was its most powerful magnifying glass.

So we created a fake press release from the future (below) to articulate one way that could possibly play out.

Draft design by Pope Wainwright Wykes.

Remember, we put this together last May — before Trump announced he was running.

Our goal with this was not to demonize Facebook or Clinton or Cruz. We believe that Facebook employees and leadership are smart people with generally good intentions.

Instead, we wanted to explore what it meant that a company with a demonstrated willingness to psychologically manipulate its users also has a hacker culture that allows small groups of individuals to ship code that affects the minds of billions.

Ultimately though, we abandoned the draft. We didn’t want to undermine either candidate’s campaign or contribute to the global overabundance of chain email conspiracy theories.

That is until this spring, when reality began to mimic the release. In May, Michael Nunez at Gizmodo reported on the suppression of conservative news in Facebook’s Trending Topics section.

The furor that followed clearly put Facebook on the defensive. The U.S. Senate Commerce Committee sent a letter of inquiry about the Trending Topics selection process, Facebook arranged a meeting between Mark Zuckerberg and conservative leaders, Glenn Beck penned a pointed vindication of Facebook, and the company launched an internal investigation.

Not long after, Facebook announced that it would be changing its Trending Topics selection process to eliminate the possibility of anti-conservative bias.

The real story though, which the media had already begun to explore, was the outsized influence of the news feed and its role in shaping public thought.

Unlike Trending Topics, the news feed is central and stuffed with political opinion, its scroll never-ending. And it’s run by algorithms, which, unlike humans, don’t speak to the press about how they work.

Those algorithms have a lot of power. They influence which of your posts will go viral and which will reach just a small dribble of friends and followers. They suppress posts by organizations and individuals that you ‘Follow’ and elevate posts from friends and family.

As Facebook’s mood manipulation experiment taught us, they can literally change your mood just by subtly changing the types of posts you see.
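
To show why that power is so hard to see from the outside, here is a toy ranking sketch in Python. The features, weights, and example stories are entirely hypothetical, not Facebook’s actual model; the point is only that a single quiet change to one weight reorders what a matching user reads while the product looks exactly the same.

```python
# Toy feed-ranking sketch. Features, weights, and stories are hypothetical;
# the point is that a one-line weight change quietly reorders the feed.

def rank_story(story: dict, weights: dict) -> float:
    """Score a story; higher scores appear nearer the top of the feed."""
    return (weights["friend_affinity"] * story["friend_affinity"]
            + weights["recency"] * story["recency"]
            + weights["page_follow"] * story["page_follow"]
            + weights["topic_boost"] * story["political_lean_match"])


stories = [
    {"id": "family photo", "friend_affinity": 0.9, "recency": 0.7,
     "page_follow": 0.0, "political_lean_match": 0.0},
    {"id": "news story A", "friend_affinity": 0.2, "recency": 0.9,
     "page_follow": 1.0, "political_lean_match": 1.0},   # matches the user's inferred lean
    {"id": "news story B", "friend_affinity": 0.2, "recency": 0.9,
     "page_follow": 1.0, "political_lean_match": -1.0},  # opposes it
]

baseline = {"friend_affinity": 1.0, "recency": 0.5, "page_follow": 0.3, "topic_boost": 0.0}
tweaked = dict(baseline, topic_boost=0.4)  # the subtle, two-engineer change

for weights in (baseline, tweaked):
    ranked = sorted(stories, key=lambda s: rank_story(s, weights), reverse=True)
    print([s["id"] for s in ranked])

# Baseline: ['family photo', 'news story A', 'news story B']
# Tweaked:  ['news story A', 'family photo', 'news story B']
```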

***

Let me be clear: We’re not exploring this issue because we think Facebook is some kind of Machiavellian mastermind or because we believe that Mark Zuckerberg and Sheryl Sandberg are out for world domination or that Facebook engineers are maniacal hackers. It’s not that we think that this will definitely happen or even that it’s already going on.

However, based on publicly-available information and conversations with former employees, we believe it could happen.

Until we as a public are confident that there are formal safeguards in place to protect against massive electoral and ideological manipulation, this possibility will continue to be a problem for Facebook and a source of fear and distrust among the public.

Facebook has shown a willingness to address these types of weaknesses and bias in its system. We invite Facebook and CEO Mark Zuckerberg to begin a conversation with the public about what, if any, safeguards currently exist to protect against manipulation of public opinion.

Up to now, we as a society have given Facebook a free pass on its control of the world’s most powerful tool to affect political and ideological perceptions.

It’s time to get transparent about how it works, why it works, and what stake we as users have in its outcome.
