Facebook Data and the Suppression of Votes

Facebook Developer: two taps and you too can extract Facebook user data

The Trump campaign ran a Facebook campaign late in the election to dissuade Clinton voters from casting ballots. This strategy used the powerful tools that Facebook provides to advertisers to alter voters’ behavior.

The Facebook campaign was run by Jared Kushner, Trump’s son-in-law, and his advocacy was likely critical to its acceptance by Trump the technophobe. Kushner was likely convinced of the power of Facebook by Peter Thiel. Thiel was an early, major investor in Facebook, sits on its board, and was associated with Kushner and Trump at the time. Thiel also co-founded Palantir Technologies, the largest tech company you have never heard of. Palantir sells software that government intelligence agencies use to monitor and conduct counter-insurgency operations against terrorists.

Kushner is said to have surprised tech people with his technical savvy and his ability to run the Facebook campaign like a startup. It is likely that he received input from Thiel, and from Robert Mercer, a brilliant mathematician, computer scientist and quant hedge-fund manager who donated millions to the Trump campaign.

Kushner ran a lean campaign with the intent of trying many different approaches and quickly cutting those that proved ineffective. He recruited Brad Parscale, the CEO of a little-known San Antonio web design company, to run the Facebook effort. Parscale’s firm had previously produced websites for Trump businesses, but Parscale had no particular technical expertise with respect to Facebook. He did, however, have the essential (for Trump) qualities of being familiar, hard-working and inexpensive. Parscale first practiced online recruitment and Facebook advertising by selling Trump campaign merchandise, raising small campaign donations and generating lists of Trump supporters. The proceeds from sales and donations were used to fund the Facebook campaign. Other groups provided key databases and technical expertise.

Links to download complete voter lists from the Cuyahoga County Board of Elections website. Behind the links are CSV files listing all registered voters.

Republican National Committee (RNC) chairman Reince Priebus had a voter database that his team had started to assemble in 2012. The RNC group had a deep understanding of the particulars of US presidential elections. The RNC database held the names, addresses, party affiliations and voting patterns of registered voters in the US, data gathered from US boards of elections. Typically, complete lists of voter records are available for download on county board of elections websites. In addition to names, addresses and party affiliation, voter records include information on behavior in previous elections: for example, whether the person voted at all, voted early or voted by mail.
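As an illustration, a few lines of Python are enough to pull the voters of interest out of one of these files. This is a minimal sketch only: the file name and column headers are hypothetical, since layouts differ from county to county.

```python
# Minimal sketch of filtering a county voter file.
# The file name and column headers are hypothetical; real layouts vary by county.
import csv

def load_voters(path):
    """Read a voter-file CSV into a list of dicts, one per registered voter."""
    with open(path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f))

def registered_democrats(voters, precinct):
    """Registered Democrats in one precinct who voted in the 2012 general election."""
    return [
        v for v in voters
        if v.get("PARTY_AFFILIATION") == "D"
        and v.get("PRECINCT_NAME") == precinct
        and v.get("GENERAL_2012") in ("X", "Y")  # hypothetical ballot-cast markers
    ]

voters = load_voters("county_voter_file.csv")        # hypothetical file
targets = registered_democrats(voters, "PRECINCT-01-A")
print(f"{len(targets)} registered Democrats with a 2012 voting record")
```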

Cambridge Analytica, a company which ran a Facebook campaign for the Brexit Leave group, provided additional Facebook expertise and data. Cambridge Analytica had collected psychological profiles of American voters linked to Facebook user data.

Facebook and Behavioral Contagion

Cambridge Analytica’s parent company, Strategic Communication Laboratories (SCL), first became interested in Facebook as a tool for voter persuasion in 2008, when an SCL employee encountered research conducted by David Stillwell, a student at Cambridge University. Ten years ago, a prevalent concern was that users misrepresent their true selves on social media. Stillwell’s research project (the myPersonality Project) studied how individuals’ actual personalities relate to the way they represent themselves on Facebook. SCL approached Stillwell to see if he would sell his data. Stillwell refused, so SCL acquired its own personality test results linked to Facebook. Cambridge Analytica still offers a test on personality and political leanings.

Stillwell’s data continues to be used for Facebook research through myPersonality.org

Since Stillwell’s project, Facebook research has moved to new problems, including behavioral contagion. How behaviors and attitudes can be infectious on Facebook is of interest to academics, advertisers and political campaigns. Behavioral contagion at its evolutionary root is a way for social animals to protect themselves from predators. A school of fish moves in concert away from a shark. The fish distant from the shark sense not the shark but signals from their neighbors. Perception of a threat can spread in a similar manner through groups of people.

The conditions that maximize behavioral contagion include a shared mood and a large number of people in close proximity. These conditions are present in large crowds, and we recognize the result as “mob behavior”. Not surprisingly, the behavior is more contagious if influential individuals send the signals. Contagion also increases when inhibitions are lowered, for example by alcohol.

Before technologies made communication at a distance possible, behavior was contagious because of things seen, heard or felt from the people around us. With new means of communication, it became evident that behavioral contagion could occur without direct perception of other human beings, and with much milder stimuli than the threat of being eaten. On Facebook, the means of communication can be reduced to ‘likes’ and the perception of likes. Other properties of real-world behavioral contagion also have Facebook analogues. For example, the number of connections in a group of Facebook users substitutes for the size of a crowd. Alcohol lowers inhibitions on Facebook too, and so does a shared disdain for political correctness. Academics made Facebook their laboratory because of the financial support of advertisers, and because Facebook makes it possible to answer questions that cannot be addressed in any other way.

A key paper on how to identify susceptible and influential people and how to use behavioral contagion to cost-effectively affect attitudes, published in a leading scientific journal.

The Science of Behavioral Contagion on Facebook

Three features of Facebook make it the social medium of choice for research into behavioral contagion. Facebook has a very large, diverse and active user base; the connections between people are precisely defined; and Facebook provides powerful tools to conduct experiments.

Facebook has nearly 200 million active users in the United States and nearly 2 billion worldwide. While the user base skews somewhat younger and more female than the world at large, it is reasonably representative, and access to millions of users gives great statistical power.

Within Facebook, connections between users are explicit and meaningful, encoded in friends lists. We speak of degrees of separation between people, but it can often be challenging to define the links; the difficulty is illustrated by the fact that “Six Degrees of Kevin Bacon” is a challenging game. In contrast, the connections between people are easily accessible in Facebook data.

The tools that Facebook provides to advertisers are incredibly powerful. In a typical Facebook research experiment, an ad is presented to users and, when a user ‘likes’ the ad, notifications are sent to some of that user’s friends. If the notified friends are more likely to ‘like’ the ad than the un-notified friends, then behavioral contagion has occurred. With this simple set of tools, many different questions can be addressed. Ads can be tested against each other to see which have greater ‘infectivity’. One can test whether repeat notifications are more effective (they are), whether a diversity of sources increases infectivity (it does), and whether there are individuals who are especially influential at, or susceptible to, infection (there are).
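The core measurement is simple enough to sketch in a few lines. The code below is a toy illustration of the comparison just described, not Facebook’s or any researcher’s actual pipeline; the data structure is invented for the example.

```python
# Toy illustration of the contagion measurement: compare how often notified
# friends 'like' an ad versus friends who were not notified.
# The data here are invented; real experiments run inside Facebook's ad system.

def like_rate(friends):
    """Fraction of a group of friends who liked the ad."""
    if not friends:
        return 0.0
    return sum(1 for f in friends if f["liked_ad"]) / len(friends)

def contagion_lift(notified, not_notified):
    """Ratio of like-rates; a value above 1.0 suggests behavioral contagion."""
    baseline = like_rate(not_notified)
    return like_rate(notified) / baseline if baseline else float("inf")

notified = [{"liked_ad": True}, {"liked_ad": False}, {"liked_ad": True}]
not_notified = [{"liked_ad": False}, {"liked_ad": False}, {"liked_ad": True}]
print(f"contagion lift: {contagion_lift(notified, not_notified):.2f}")
```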

Perhaps it isn’t surprising that people differ in how susceptible and how influential they are. Cambridge Analytica uses its personality test to identify people who are susceptible or influential. However, it turns out that personality tests are not necessary: work published in 2012 showed that susceptible and influential Facebook users could be identified from age, sex and relationship status alone. Perhaps it isn’t a revelation that older men have more influence on average, nor that young women and people in relationship flux are more susceptible.
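To make the idea concrete, here is an illustrative sketch of how such a prediction might be set up: a logistic regression from age, sex and relationship status to an observed ‘susceptible’ label. The data are synthetic and the model is not the one used in the 2012 study.

```python
# Illustrative sketch: predicting 'susceptibility' from age, sex and
# relationship status. The data are synthetic; this is not the published model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
age = rng.integers(18, 70, n)
is_female = rng.integers(0, 2, n)
single = rng.integers(0, 2, n)

# Synthetic ground truth: younger, single users are more often 'susceptible'.
p = 1 / (1 + np.exp(-(2.0 - 0.05 * age + 0.3 * is_female + 0.5 * single)))
susceptible = rng.random(n) < p

X = np.column_stack([age, is_female, single])
model = LogisticRegression().fit(X, susceptible)
print(dict(zip(["age", "is_female", "single"], model.coef_[0].round(3))))
```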

Your Facebook Data, Free For the Asking

If you ask the internet if Facebook sells your data, a Facebook web page succinctly and clearly states that Facebook does not sell your data. And indeed, Facebook does not sell your data — Facebook gives it away.

Facebook gives advertisers enormous power to target their ads. The power to target users lies in the user data that Facebook gives away.

Facebook gives your data away every time you use a Facebook login button to sign into a website. Facebook gives your data away every time you take a personality test on Facebook, every time you play a game on Facebook and every time you sign into an app on your phone with your Facebook ID.

The list of data types that Facebook sends to a third party when a user uses Facebook Login

All of these actions activate the same snippet of code. Facebook calls the code ‘Facebook Login’, a confusing name, because the code runs in response to more than just logging in (personality quizzes and games, for example).

Facebook Login sends a unique ID number for you, your first and last name, your email address, photo, sex, age range, locale, time zone, and a complete list of your friends. More sensitive data can be sent if Facebook approves it (and if you consent): your religion, your politics, relationship details, work history, even the content of your Facebook private messages.
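Below is a sketch of the kind of Graph API request an app can make once it holds an access token from Facebook Login. The token is a placeholder, the API version is illustrative, and which fields actually come back depends on the permissions Facebook and the user have granted the app.

```python
# Sketch of a Graph API request made with a user access token obtained through
# Facebook Login. The token is a placeholder and the API version is illustrative;
# the fields returned depend on the permissions granted to the app.
import requests

ACCESS_TOKEN = "EAAB..."  # placeholder user access token

fields = ",".join([
    "id", "first_name", "last_name", "email", "picture",
    "gender", "age_range", "locale", "timezone", "friends",
])

resp = requests.get(
    "https://graph.facebook.com/v2.8/me",
    params={"fields": fields, "access_token": ACCESS_TOKEN},
    timeout=10,
)
profile = resp.json()
print(profile.get("first_name"), profile.get("age_range"), profile.get("locale"))
```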

The number of times that user data is extracted from Facebook by third parties is enormous. According to Facebook, the website login buttons that activate Facebook Login are used tens of millions of times per day alone. Facebook does not provide numbers for how often Facebook Login is activated when users play games, take personality tests or log in on phone apps.

The data that is extracted from Facebook doesn’t exist in isolation, either from the data of other users or from third-party data on a user. For example, Breitbart News has a Facebook login button which you can use to comment on their site. The Breitbart implementation of Facebook Login collects only the default Facebook user data. Breitbart data analysts can surmise that, since you are a Breitbart commenter, your political views probably align with those of others on the site. But in addition, your comments will tell them exactly where you stand on the political spectrum. If they track your statements through time, or track your comments in response to particular stories, they can determine how effectively each story engages you. And because they have your list of Facebook friends and the friends lists of other Breitbart commenters, they can map the Facebook network of Breitbart users. Breitbart can also submit the names of its users to have Facebook target ads to people with similar characteristics. Former Breitbart executive chair Steve Bannon said that Facebook was key to the growth of the site’s user base.
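A sketch of how collected friends lists could be stitched into such a network map is shown below; the commenter data are invented, and the graph library (networkx) is simply one convenient choice, not anything Breitbart is known to use.

```python
# Sketch: stitching the friends lists collected from many commenters into one
# network graph. The commenter data are invented; networkx is one convenient tool.
import networkx as nx

# Each entry: a commenter's Facebook id mapped to the ids on their friends list.
commenters = {
    "u1": ["u2", "u3", "u9"],
    "u2": ["u1", "u3"],
    "u3": ["u1", "u2", "u4"],
    "u4": ["u3", "u5"],
    "u5": ["u4"],
}

G = nx.Graph()
for user, friends in commenters.items():
    G.add_edges_from((user, friend) for friend in friends)

# The best-connected users inside the commenter network.
print(sorted(G.degree, key=lambda pair: pair[1], reverse=True)[:3])
```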

Cost-Effective Advertising Through Behavioral Contagion

With a database of many users, the Facebook network can be reconstructed, at least in part. Knowledge of how people are connected, the structure of the network, holds the key to influencing people cost-effectively, as we will see. If a network map is made of a small number of Facebook users, the pattern often resembles spokes radiating from a hub: a central individual with a large number of connections to others on the periphery. This hub-and-spoke pattern disappears as more individuals are added. In larger maps, a pattern emerges of many groups, loosely connected to each other, with dense connections within each group. Within a group the connections form something of a mesh, or netlike pattern, rather than a hub with spokes. Not surprisingly, the groups consist of people with shared interests: the echo-chambers of Facebook.
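Standard community-detection methods make these groups visible. The sketch below uses a built-in example graph from networkx as a stand-in for a real friend network and separates the dense groups from the loose ties between them.

```python
# Sketch: separating the dense groups ('echo-chambers') from the loose ties
# between them. A built-in example graph stands in for a real friend network.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

G = nx.karate_club_graph()  # stand-in for a Facebook friend graph

communities = greedy_modularity_communities(G)
print(f"{len(communities)} densely connected groups found")

# Edges whose endpoints fall in different groups are the loose connections
# between echo-chambers.
group_of = {node: i for i, com in enumerate(communities) for node in com}
between = [(u, v) for u, v in G.edges if group_of[u] != group_of[v]]
print(f"{len(between)} between-group connections")
```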

The pattern of a Facebook network of friends consists of groups of individuals with many connections between them, loosely connected to other groups. Individuals within a tightly connected group have shared interests — the echo-chambers of Facebook.

Researchers have asked if influential and susceptible people reside in different parts of the network, the echo-chambers or the connections between them. On average, influential people reside in echo-chambers, while susceptible people reside in the looser connections between them.

Can advertising be made more cost-effective with behavioral contagion? Is it more cost-effective to target the many loosely connected susceptibles, who are poor influencers, or the not-very-susceptible influencers who reside in echo-chambers? While it might seem obvious to target the people who can be converted to a cause, the easily convinced are unlikely to ‘infect’ any of their friends because they are unlikely to be influential. When researchers tested both strategies, targeting the echo-chambers proved more cost-effective, even though influencers have relatively few connections to susceptible people outside their group. Of course, given enough money, both the echo-chambers of influencers and the dispersed susceptibles could be targeted.

There is little in the published literature on whether positive or negative Facebook messages are more infective. Academic research on Facebook would not pass ethical review if the propagation of fear or hatred were proposed, so most published studies use researcher-generated stimuli that are positive or neutral (likes or empathy, for example). Despite the lack of evidence, it seems highly likely that negative stimuli such as fear are more infective: the ability to sense and respond to fear should promote survival and confer a selective advantage in evolution. The more potent the threat, the more infective the message, and this may be why extremist groups such as white supremacists have been tolerated on Breitbart even though they may be outside the range of beliefs the site admits to.

Breitbart’s Facebook login for comments collects default information, including name, unique id, age range, sex, locale, and list of friends.

Preparing the Soil

The Trump Facebook campaign was the finishing battle in a long-term campaign started after the previous presidential election by a trio of organizations: Cambridge Analytica, the Government Accountability Institute and Breitbart News. All three had substantial financial support from Robert Mercer and had Steve Bannon in a leadership role. Bannon co-founded the Government Accountability Institute in 2012, became a board member of Cambridge Analytica when it was created in 2013, and became executive chair of Breitbart News after the death of Andrew Breitbart in 2012.

Many now know Trump advisor Stephen K. Bannon, but Robert Mercer has remained largely out of the public eye. Mercer has been one of the largest donors to conservative political causes in the US. Mercer is co-CEO of Renaissance Technologies, which runs the most profitable hedge fund in the world: from 1994 to mid-2014 its Medallion fund had returns averaging over 70% per year. Renaissance limits the size of the fund’s holdings and restricts its investors to people inside the company. Mercer has used his riches to fund PACs supporting Republican candidates, donating over $22 million to conservative PACs in the 2016 campaign cycle alone.

The way that Mercer’s Renaissance Technologies earns its money has some parallels to the Facebook strategy. Both involve active, repetitive probing of complex networks, large data sets and powerful computers. Renaissance Technologies is a quant, or black-box, hedge fund. The popular perception of quant hedge funds is that they employ mathematicians who reveal trends in the stock market. While this is partly true, it is incomplete, because it implies a passive approach. Hedge funds like Mercer’s have assets ($65 billion) that they use to actively affect the market. Markets respond to stimuli: for example, an attempt to purchase a billion dollars of a stock would cause the price of that stock to increase. While this particular response is detrimental to the purchaser’s goal, responses that are beneficial are also possible, if less direct and less obvious. In fact, there is no need to understand why particular responses happen (hence the black box); one only needs responses that are profitable. Repeat the action at high speed over and over again, and large gains can accrue from many small movements. The profits from this endeavor gave Mercer the wherewithal to fund the long-term media campaign between elections.

The common goal of the three Bannon/Mercer groups was to use negative information to undermine Republican and Democratic presidential candidates to the left of Bannon and Mercer. The Government Accountability Institute wrote and seeded investigative journalism in books and respected newspapers. These books and articles focused on crony capitalism by politicians who were likely Republican and Democratic presidential candidates. Breitbart News and its army of alt-right trolls weaponized those legitimate news stories against the candidates. The negative themes were exploited again by Cambridge Analytica in the Facebook campaign of the final two months.

The code for Facebook Login from the Breitbart comments section, retrieving user information from Facebook. ‘public_profile’ returns id, first and last name, age range, gender, locale, profile picture and time zone.

The long ground game by the Government Accountability Institute and Breitbart News to make Clinton a vulnerable candidate through negative messaging was aided by many other events, including the congressional Benghazi investigations, the hacking of the DNC, the WikiLeaks release of the Podesta emails and Comey’s announcement of the FBI investigation.

The Final Facebook Assault

If we think of it at all, we tend to think that advertisers can target particular demographic groups on Facebook, for example white suburban housewives. But in addition to advertising to groups, advertisers can target individuals by name. An advertiser (or campaign) can submit a list of the names of people in a voting precinct, or only the names of registered Democrats in that precinct, or only the names of young women Democrats in that precinct. If the list of names is known to be incomplete, Facebook can also target individuals who have characteristics similar to those on the list.

Facebook allows advertisers to directly advertise to people by name (Custom Audiences) or to individuals who are similar to those on a list of names (Lookalike Audiences)
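Mechanically, this is a Custom Audience upload through the Marketing API: identifiers are hashed and posted to an audience that ads can then target. The sketch below uses a placeholder token, a placeholder audience id and example emails; real use requires an approved ad account.

```python
# Sketch of a Custom Audience upload via the Marketing API. The token, audience
# id and emails are placeholders; real use requires an approved ad account.
import hashlib
import json
import requests

ACCESS_TOKEN = "EAAB..."      # placeholder
AUDIENCE_ID = "1234567890"    # placeholder custom audience id

emails = ["voter1@example.com", "voter2@example.com"]
hashed = [hashlib.sha256(e.strip().lower().encode()).hexdigest() for e in emails]

resp = requests.post(
    f"https://graph.facebook.com/v2.8/{AUDIENCE_ID}/users",
    data={
        "payload": json.dumps({"schema": "EMAIL_SHA256", "data": hashed}),
        "access_token": ACCESS_TOKEN,
    },
    timeout=10,
)
print(resp.status_code, resp.json())
```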

The Trump campaign targeted very specific Facebook users. Because of the winner-take-all nature of the Electoral College, it targeted registered voters in swing states. Because the strategy was to dissuade opposition voters from voting, this set was refined to probable Clinton supporters, and refined further to voters who were susceptible to persuasion: the young and women. The campaign also targeted people by race, which could be imputed from Facebook data. Lastly, a voter’s level of idealism and liberality could be deduced from demography and the Cambridge Analytica personality test results.

The Trump campaign said it targeted three distinct categories of voters in the swing states: young women, idealistic white liberals, and African Americans. On one day in August of 2016, the campaign tested around 100,000 different variants of its ads to determine which worked best in each target group. Like academic researchers, the campaign monitored the infectivity of ‘likes’ among the friends of the targeted users. It continued to test large numbers of ad variants, as many as 30,000 to 40,000 per day. The ads were negative and tailored to the group; African Americans, for example, were targeted with ads that said “Hillary Thinks African Americans are Super Predators.”
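At that scale, picking winners is largely a bookkeeping exercise: for each target group, keep the variants whose ‘like’ rate among targeted users is highest. A minimal sketch with invented variant ids and rates:

```python
# Minimal sketch of selecting the best-performing ad variant per target group.
# The variant ids and like rates are invented.
from collections import defaultdict

# (target_group, variant_id) -> observed like rate among targeted users
results = {
    ("young_women", "ad_001"): 0.011,
    ("young_women", "ad_002"): 0.019,
    ("white_liberals", "ad_001"): 0.008,
    ("white_liberals", "ad_003"): 0.014,
    ("african_americans", "ad_004"): 0.021,
    ("african_americans", "ad_005"): 0.016,
}

best = defaultdict(lambda: (None, 0.0))
for (group, variant), rate in results.items():
    if rate > best[group][1]:
        best[group] = (variant, rate)

for group, (variant, rate) in best.items():
    print(f"{group}: keep {variant} (like rate {rate:.3f})")
```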

Once early voting was underway, the effect of the Facebook campaign on actual voting could be measured directly. Up-to-date lists of who has voted early or by mail are provided by the boards of elections. By monitoring these lists and comparing past behavior with current behavior, the campaign could determine how effective it was at suppressing Clinton votes. These data, combined with early exit polls, indicated that the Facebook campaign was effective.
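A sketch of that comparison is below: join the voter file, the daily early-vote list and the list of targeted voters, then compare past and current early-voting rates. The file names and column headers are hypothetical.

```python
# Sketch: did targeted voters return early/absentee ballots at a lower rate
# than in the previous election? File names and column headers are hypothetical;
# VOTED_EARLY_2012 is assumed to be a 0/1 column.
import pandas as pd

voters = pd.read_csv("voter_file.csv")        # includes 2012 early-vote history
early = pd.read_csv("early_votes_2016.csv")   # updated daily by the board of elections
targeted = pd.read_csv("targeted_ids.csv")    # voters shown the campaign's ads

voters["voted_early_2016"] = voters["VOTER_ID"].isin(early["VOTER_ID"])
voters["was_targeted"] = voters["VOTER_ID"].isin(targeted["VOTER_ID"])

rates = voters.groupby("was_targeted")[["VOTED_EARLY_2012", "voted_early_2016"]].mean()
print(rates)  # a larger 2012-to-2016 drop among targeted voters suggests suppression
```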

How Well Did It Work and What Does It Mean For the Future?

The contest between Trump and Clinton was unprecedented in many ways, and because there were so many anomalies it is not clear, and likely will not become clear, how much the Facebook strategy contributed to the Trump victory. The individuals who led the Facebook strategy believe that it was effective, and the sudden come-from-behind victories of both the Brexit and Trump campaigns are consistent with that belief. It is likely that future campaigns by others will try a similar strategy. More importantly, it is likely that the Trump team will continue to employ it.

Most members of the team that led the campaign are now members of Trump’s administration. Bannon, Parscale, Priebus, Kushner, Thiel and Mercer’s daughter Rebekah all have roles in the transition or in the administration itself.

Thiel’s Palantir Technologies sells data and network analysis tools to the intelligence community

What uses will this approach be put to in the future? It is likely that elected GOP members of Congress have been threatened with Facebook primary campaigns if they do not cooperate with President Trump.

It seems that the Kushner-led group would like to extend the approach beyond elections. Given that negative messaging preceded the Facebook battle to prepare the soil, what are the current negative themes on Breitbart? Not surprisingly, neither Hillary nor crony capitalism is a theme on Breitbart any more. The current themes are the lies and bias of the mainstream media, the violence of anti-Trump protestors, radical Islamic terrorism, and immigration.

What the election has changed are the tools available for executing the strategy. The administration has worked to control the information coming from the White House, and it has demonized the press. It also controls the national intelligence agencies. When an administration controls the information coming from the government, suppresses information from other sources, and can increase surveillance of its own people, the methods of behavioral contagion could be put to darker ends than winning an election.

References

Facebook Tools in the Social Sciences

https://www.ncbi.nlm.nih.gov/pubmed/26348336

http://mypersonality.org/wiki/doku.php

Facebook Tools

https://developers.facebook.com/docs/facebook-login

https://www.facebook.com/business/products/ads/ad-targeting

https://www.facebook.com/business/help/381385302004628

The Science of Behavioral Contagion

https://www.ncbi.nlm.nih.gov/pubmed/22722253

https://www.ncbi.nlm.nih.gov/pubmed/24621792

https://www.ncbi.nlm.nih.gov/pubmed/24889601

http://www.pnas.org/content/109/16/5962.long

https://academic.oup.com/jcr/article-lookup/doi/10.1086/518527

http://www.nature.com/articles/srep37825

News Stories on the Trump Facebook Campaign

https://www.nytimes.com/2016/11/20/opinion/the-secret-agenda-of-a-facebook-quiz.html?_r=0

http://adage.com/article/campaign-trail/cambridge-analytica-toast/305439/

https://www.bloomberg.com/news/articles/2016-11-21/how-renaissance-s-medallion-fund-became-finance-s-blackest-box

https://www.bloomberg.com/news/articles/2016-10-27/inside-the-trump-bunker-with-12-days-to-go

http://www.bbc.com/news/technology-34922029

http://www.forbes.com/sites/stevenbertoni/2016/11/22/exclusive-interview-how-jared-kushner-won-trump-the-white-house/#4c7c9d112f50

https://medium.com/startup-grind/how-the-trump-campaign-built-an-identity-database-and-used-facebook-ads-to-win-the-election-4ff7d24269ac#.4fjcjxr1i
