Facebook Ad Targeting Still Enables Voter Suppression — “probably”

Angela M
Published in The Startup
Nov 11, 2019 · 9 min read

Photo by Annie Spratt on Unsplash

In a hearing before the U.S. House of Representatives Financial Services Committee on 23 October 2019, Congresswoman Alexandria Ocasio-Cortez questioned Mark Zuckerberg about Facebook’s policy not to fact-check advertisements paid for by political figures. “Could I pay to target predominantly black zip codes and advertise them the incorrect election date?” she asked. Zuckerberg said no, explaining that Facebook policies do not allow lies about how and when to vote, even if that lie comes from a political figure. Ocasio-Cortez followed up: “Would I be able to run advertisements on Facebook targeting Republicans in primaries saying that they voted for the Green New Deal?” After some evasion, Zuckerberg managed to commit enough to answer, “I think probably.”

Based on Facebook policies, both answers do seem to be true — at least, probably. Since the 2016 U.S. presidential election, Facebook has pledged to block voter suppression that takes the form of misleading or incorrect information about the date of an election, or about where, when, or how to vote. However, AOC’s second question addresses voter suppression in a broader sense: targeting supporters of one’s political opponents to discourage them from participating in the election. This is a tactic that the Trump campaign has already demonstrated, that the election and data manipulation company Cambridge Analytica (connected to both the Trump campaign and to Facebook) has carried out, and that Facebook’s policy of not fact-checking political figures only encourages. As Zuckerberg noted, that policy does (probably) allow exactly this sort of campaigning.

An example from Cambridge Analytica illustrates the risks — and potential — in discouraging voters by encouraging apathy. Facebook’s Cambridge Analytica scandal is well known at this point and is the subject of the Netflix documentary The Great Hack. The film explains Cambridge Analytica’s involvement in both the Brexit referendum and the Trump campaign. It also provides background on Cambridge Analytica and its parent company, SCL Group, which has been involved in election manipulation around the world.

One example detailed in the film illustrates exactly why this approach to voter suppression can be so dangerous, and so influential. According to a Cambridge Analytica sales presentation included in the documentary, SCL worked on the 2010 election in Trinidad and Tobago on behalf of the majority-Indian United National Congress (UNC) party, campaigning against the People’s National Movement, made up primarily of Afro-Trinbagonian voters. The company helped its client by painting graffiti slogans around the city and attributing them to young UNC supporters. SCL also claimed credit for the “Do So” campaign, previously assumed to be grassroots, which encouraged young people to refuse to vote as a sign of resistance. SCL predicted that Indian voters would still turn out to vote, while young Afro-Trinbagonians would maintain their boycott. The UNC won the election. The party denies involvement with SCL Group (although one affiliated party has admitted “discussions and some engagement”), and an investigation is ongoing. To what extent the election victory can be attributed to this campaign is also debatable. But SCL, in its incarnation as Cambridge Analytica, went on to work alongside Facebook staff on the Trump campaign, which employed this same strategy of discouraging voting.

While it can always be debated to what extent campaign advertising has influenced an election, Facebook has extensively explored its ability to affect voting by varying what users are shown on its platform. During the 2010 U.S. congressional elections, Facebook conducted a “randomized controlled trial of political mobilization messages delivered to 61 million Facebook users.” Those experimented on received different versions of a message reminding them to vote. The researchers found that users shown pictures of friends who had voted were themselves more likely to vote. Facebook claimed credit for directly increasing turnout by 60,000 voters and indirectly by another 280,000. In the study, the researchers remind us that the 2000 U.S. presidential election was decided by 537 votes in Florida, as if their newfound ability did not already strike readers as concerning enough.

During the 2012 election, Facebook again decided to see what it could do. From 25 August to 9 November 2012, Facebook researchers altered the News Feed algorithm for two million users so that if any of their friends shared a hard news story, that post would be placed at the top of the feed. According to a follow-up survey, self-reported voter turnout among those exposed to more news increased from 64% to 67%. Facebook requested that video of the talk describing this experiment be removed, though a copy is available here. Facebook continues to feature reminders to vote, or to register to vote, and it has more than enough data about everyone on its platform to offer those reminders only to certain groups, with just its word as assurance that this is not what happens. Doubtless this influence is a selling point it uses to encourage the spending of campaign dollars on its platform.

Given Facebook’s ability to target at such a focused level, sending a particular message to a particular group of individuals with the intention of discouraging them from voting can be very effective, and it should also be considered a form of voter suppression. While political campaigns have always been able to direct ads to their opponents’ supporters as well as their own, the focused targeting that social media platforms enable takes this practice to a more personal level. The difference also lies in the message of the ad. A campaign could target opposing voters and try to persuade them to vote for its candidate rather than the opposition, or it could target those voters just to keep them from voting. The former tries to change someone’s mind while still encouraging voting; the latter discourages political participation. Those targeted are certainly still able to vote, so it could be argued that this is not voter suppression. But the Trump campaign did see it as voter suppression, and they hoped to use it to their advantage.

In fact, the 2016 election provided examples of both types of ads mentioned by AOC. Facebook delivered ads encouraging mainly African American voters to “boycott the election”. In addition, at least four Twitter ads, branded to look as if they came from the Hillary Clinton campaign, instructed individuals on how to vote by text message. The ads used imagery taken directly from Clinton’s messaging. One was entirely in Spanish. Each included the line “paid for by Hillary for President”. Twitter initially claimed the ads did not violate its terms of service, though it later said the issue had been fixed. Regardless of whether these came from the official Trump campaign, Russia, or simply other motivated individuals, this form of voter suppression is now covered by Facebook policy — probably. Since 2016, Facebook has forbidden offers to buy or sell votes; misinformation about the dates, times, locations, or qualifications for voting; and misinformation about how to vote or which votes will or will not be counted. A more recent update emphasizes that Facebook does not allow advertisements discouraging people from voting. As Zuckerberg told AOC, this should also apply to ads taken out by political figures, despite their exemption from fact-checking. But if AOC can (probably) target advertisements to Republicans saying that their candidates voted for the Green New Deal, then Facebook policy will not prevent advertisements intended to discourage election participation by turning citizens away from their party’s candidates. Voters who dislike their party’s candidate will probably not vote for the opposite party, but they might stay home on Election Day.

The Trump campaign counted on this very possibility. Days before the 2016 election, a senior campaign official commented, “We have three major voter suppression operations underway.” These were not conducted solely over social media but were integrated with radio, debate arguments, and general campaign talking points. In debates and on Twitter, the campaign focused on Clinton’s support for the Trans-Pacific Partnership to diminish interest from Sanders supporters. It tried to discourage young women by emphasizing the women who had accused Bill Clinton of sexual assault and Hillary of harassment or threats. And it targeted African Americans by replaying Hillary’s 1996 line calling young African American men “super predators”, taking out radio spots on certain stations and in certain zip codes. This final operation, at least, relied heavily on capabilities offered by Facebook. An animation of Clinton delivering the line was distributed through Facebook dark posts, so that, as the campaign put it, “only the people we want to see it will see it.” As the senior official claimed, “We know because we’ve modeled this. It will dramatically affect her ability to turn these people out.” To its credit, Facebook has eliminated dark posts: all ads currently being run by a page are viewable on that page, and all ads related to political or social issues are archived in its ad library for seven years (though it does not share information on whom each ad targets). However, Facebook has also decided not to fact-check ads run by political figures (although it has proven choosy about when to apply this policy), and its microtargeting continues to offer the potential to deliver ads specifically to one’s political opponents.

Despite Mark Zuckerberg’s almost passing comment in his speech this October that “of course we don’t allow voter suppression”, there is no reason to trust Facebook’s position. Its policies on voter suppression remain insufficient. The Trump campaign has already displayed its willingness to use Facebook to suppress votes by encouraging voter apathy, and the door is now open wider for it to happen again. Moreover, Facebook has proven willing to experiment on citizens during elections. It should be noted, however, that this is not a partisan issue: articles written in the early years of the Trump administration warning about Facebook’s potential influence on voters raised concerns that the company might use this influence to encourage only Democrats to vote. Facebook’s microtargeting allows campaigns to target either their supporters or their opponents with advertisements, meaning they can try either to motivate or to discourage voting. The Trump campaign described its own ad campaigns as voter suppression in 2016. Facebook’s policies have done nothing to prevent this, only enabling it while raising the stakes. Rather than supporting political engagement, Facebook allows, or even supports, its tools being used to discourage election participation. As long as Facebook enables the targeting of advertising, political or otherwise, at such a personal level, this remains among the risks of the platform. But maybe there is nothing to worry about. After all, as Zuckerberg told AOC, “Well, congresswoman, I think lying is bad.”

PS: In July 2019, Zuckerberg commented on Elizabeth Warren’s calls to break up Big Tech at an internal Facebook meeting:

“I mean, if she gets elected president, then I would bet that we will have a legal challenge, and I would bet that we would win the legal challenge. And does that still suck for us? Yeah. I mean, I don’t want to have a major lawsuit against our own government… It’s like, we care about our country and want to work with our government and do good things. But look, at the end of the day, if someone’s going to try to threaten something that existential, you go to the mat and you fight.”

It is unclear whether Zuckerberg is speaking about going to the mat during the campaign or waiting until this anticipated lawsuit materializes. Regardless, it sounds like he wants to work with the government and do good things — until what the government thinks is good differs from what Facebook thinks is good. After all, as he has admitted, sometimes what is good for the world is not good for Facebook. In these cases, it seems, he chooses Facebook.

A string of recent revelations suggests he intends to fight preemptively: Facebook’s decisions to allow misinformation in ads from political figures — but only those figures it deems legitimate; to name the Daily Caller, founded by Tucker Carlson and linked to white supremacists, as one of its six fact-checking organizations; and to include Breitbart among the trusted publishers for its upcoming News tab. Zuckerberg’s recently reported meetings and dinners with conservative journalists, commentators, and political figures, along with a former Facebook employee’s description of its D.C. office as a place where “Everyone in power is a Republican” (including Vice President for Global Public Policy Joel Kaplan, who sat behind Brett Kavanaugh during his Senate hearings and threw him a party after his confirmation to the Supreme Court), only reinforce concerns that Facebook might be choosing a side in this election. Facebook is trying to avoid regulation by courting Trump’s favor, especially after facing charges of liberal bias. While it should come as no surprise that Facebook acts in its own self-interest in fighting regulation, Facebook is uniquely positioned because of its potential influence over voters. It does not need to rely on political figures placing advertisements to discourage voting. Regardless of which method it chooses, Facebook is fully capable of voter suppression. What does it mean for the News Feeds of Warren supporters when Zuckerberg decides to go to the mat and fight?
