In the aftermath of the 2016 Brexit referendum and US presidential election, much has been written about how personal data was used to target voters with advertisements and other messages over social media. We’ve since learned that actors both foreign and domestic employed information operations, computational propaganda, and cyberattacks weaponizing our commercial media infrastructure. The question at hand is whether our democratic process can endure a hyper-personalized data-driven media and propaganda environment that our founders could never have imagined.
Financed by the reclusive hedge funder, mega-donor, and computer scientist Robert Mercer, Cambridge Analytica (part of the British firm SCL Group Ltd.) constructed a significant data set built on US voter rolls, which it legally acquired while working for the Cruz and Trump campaigns. The firm enriched those rolls with information derived from Facebook and with commercial data purchased through brokers, then sold the resulting voter dossiers to the Trump campaign. Controversy surrounds the donation of SCL’s “election management” services for Brexit.
Some respected observers argue that the firm’s psychographic profiling methods may not be nearly as potent as its sales pitch suggests, although evidence is emerging that its geo-targeted voter profiles, combined with ideological modeling, may have helped the Leave.EU campaign find voters with “authoritarian” leanings. Regardless, the firm’s psychographic practices remain as unreviewable as the campaign’s voter advertising: journalists and regulators can never examine the hyper-targeted dark posts deployed in 2016, because Facebook did not preserve or release them for public review.
With the help of researchers in Europe, we learned that Cambridge Analytica is subject to laws that have no parallel in the US. After we submitted a request for personal data, a Cambridge Analytica voter profile was delivered and publicly exposed for the first time. Does this prove that our voter data is stored and processed in the UK? Couldn’t our request have been denied if personal data had never left US territory? Americans don’t have a basic right to request and view our own voter data profiles and ideology models, a right that citizens enjoy in the UK and other European nations. We are concerned about whether SCL/Cambridge Analytica complied with the UK Data Protection Act 1998, and we have instructed solicitors in the UK to write to the firm with a view to court action.
Pledge to support our CrowdJustice campaign to pursue legal action against SCL (Cambridge Analytica) under the UK Data Protection Act.
What we fear is a future in which potent personal data is combined with increasingly sophisticated technology to produce and deliver unaccountable personalized media and messages at a national scale. Combined with emerging data-driven media technologies, it is clear that the use of behavioral data to nudge voters with propaganda-as-a-service is set to explode. Imagine being able to synthesize a politician saying anything you type, then uploading the highly realistic video to Facebook with a fake CNN chyron. Expect early versions of these tools to be available before 2020.
At the core of this is data privacy, or as it is more meaningfully described in Europe, data protection. Unfortunately, the United States is headed in a dangerous direction on this issue. President Trump’s FCC and the Republican Party repealed rules limiting ISPs’ ability to sell and monetize their paying customers’ data. Anticipate this administration further eroding privacy protections as it confuses the public interest with the interests of business, even though broadband privacy is an issue on which roughly 95% of voters agree, across every partisan and demographic segment, according to a HuffPost/YouGov poll. We propose three ideas to address these issues, which are crucial to preserving American democracy.
First, citizens need to put pressure on Google, Facebook, and other technology platforms to behave in the interests of the democracies that enabled their unrivaled success. Public pressure is surprisingly effective at holding these companies accountable. When people drew attention to the epidemic of viral “fake news” hoaxes, or to the fact that jihadi and Nazi YouTube videos were being monetized by the world’s largest brands, funding extremists, these companies answered to their customers. By zeroing in on the duopoly’s duty to democracy itself, we can ask whether these firms are on our side or not.
Second, those in the US concerned about these issues should seek alliances across political lines to advance reforms. Industry lobbyists will insist to lawmakers that such rules inhibit innovation. The only “innovation” that reasonable privacy bright lines would inhibit is the further degradation of the informed electorate that undergirds our ability to hold politicians accountable at the polls. Illinois, the only state with a facial recognition privacy law on the books, is leading the way, and other states, including California, Connecticut, and New Mexico, are considering similar protections. Every voter should have the right to request their personal voter data dossier and to learn how politicians might be using predictive ideological models to manage campaigns in their district.
Finally, the most ambitious solution may be abolishing the Electoral College. The advent of geo-targeted predictive microtargeting, and the redefinition of the public sphere into filter bubbles, constitutes a new threat model for elections. Because a tiny fraction of votes counts more than the vast majority, and because we may have shown that voter data is being processed internationally, we must take drastic action to equalize the value of every vote. Deploying hyper-targeted voter media that constructs narrow or outright fabricated versions of the truth to influence small subsets of voters in strategically important geographies is a scenario our founding fathers never imagined.
We may never know the true scale of the Cambridge Analytica voter data and hyper-targeted media operation in the 2016 election. But what is clear is that in the future these methods will only become more powerful, matched with new, machine-driven methods to produce artificial reality media and even more powerful social platforms to deliver it. Unless we direct our collective outrage at tech companies, state legislatures and Congress for diminishing our data privacy, we risk ceding democracy to plutocrats with dark databases and vast resources to surreptitiously exert their will.
Justin Hendrix is Executive Director of NYC Media Lab.
David Carroll is Associate Professor of Media Design at Parsons School of Design.