Why data privacy is critical to US elections—in 2020 and beyond

Benjamin Brook
Transcend
Nov 2, 2020

Cambridge Analytica was a low-tech privacy breach, but far more damaging invasions of personal data privacy could threaten free and fair elections for years to come.

As we approach the eve of the US general election — which only four years ago was marred by the meddling of Cambridge Analytica — it’s fair to be frightened about how our data trails are being used in unintended and malicious ways this time around. Yet, we don’t have to just sit back and watch. Any business leader with online consumer data can play a powerful role in fixing the broken system.

First, let’s acknowledge a sobering reality: not much has changed in the way of government regulation and oversight since 2016, save for California’s passage of minimal data rights through the California Consumer Privacy Act (CCPA). Meanwhile, artificial intelligence and machine learning have only advanced, and, in turn, data misuse now enables more successful (and more sophisticated) behavior manipulation than it did four years ago.

But it is incomplete to assume that the only potential solutions are consumers taking personal responsibility and hyper-vigilance over their data, or regulators compelling advertising technology companies like Facebook or Google to clean up their mess (although they definitely should).

The truth is that all companies bear responsibility for the impact their products have on data privacy and behavior manipulation, regardless of their proximity to an advertising-based business model. Even your box of Cheerios comes with the sale of user data.

The real solution is a “Yes, and.” Yes, there are actions that consumers can consider to manage their data footprints. Yes, ad-tech companies must also take action on better business practices, and yes, there is a systems-oriented way for all companies to better prioritize data privacy. By moving out of the realm of “compliance” and into a strategic mindset of protecting data privacy, businesses large and small can create a positive impact, reap the reputational benefits, and avoid costly missteps.

This is especially important now because this election once again reignites a debate over data privacy and free and fair elections that has been burning for years in the US. Back in 2016, the political consultancy Cambridge Analytica built psychological profiles of 70 million Americans from improperly obtained Facebook data. The company then ran micro-targeted campaigns, aimed particularly at people profiled as “more prone to impulsive anger or conspiratorial thinking than average citizens,” with the intent to influence their vote. The Facebook data used to identify targets and craft messages came from an academic researcher who violated individual privacy and Facebook’s policies by giving Cambridge Analytica access.

We now know that Cambridge Analytica was, in many ways, an unsharpened tool. Election interference continues steadily and is already far more sophisticated with new layers of obfuscation, as evidenced by the charges filed against a Russian troll farm for 2018 US midterm meddling and new claims by Twitter and Facebook that 2020 election interference is becoming more difficult to spot.

Going further, a recent whistleblower post from former Facebook employee Sophie Zhang highlights an even greater amount of election manipulation and interference occurring on the global stage. In 2020, more data and more advanced algorithms mean more effective interference that is more attuned to our individual psychology and behavior.

These rapid advancements in data application mean that solutions need to be just as sophisticated and holistic. In particular, data today is spilling over the edges of businesses and co-mingling in all sorts of ways. It’s not just social media. Sure, Facebook has dominated this conversation, but that’s only part of the problem. There are thousands of examples of how data can be used to manipulate and change the behavior of a consumer. Any business engaging in the information-sharing economy can be used to help paint a clearer picture of who may be voting for whom and where to focus microtargeting efforts.

Take a hypothetical example: a local Sam’s Club and an electrical company each sell a customer list specifying names and purchases. A troll farm buys and cross-references the lists, finding people who bought both hunting equipment from Sam’s Club and solar energy from the electrical company. The troll farm correlates this dual purchase with swing voters, identifies 10,000 voters by name, and launches a campaign to micro-target and misinform them.

Unfortunately, this is a simplified example: with machine learning, thousands of datasets are used to identify even more subtle correlations. Additionally, this scenario doesn’t touch on the fact that data is also used to build deep psychological profiles of voters and predict which micro-targeting campaigns will push their buttons and nudge them into the desired vote.
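Mechanically, the cross-referencing in this hypothetical is just a join on names across purchased lists. Here is a minimal Python sketch; every record and name is invented for illustration:

```python
# Hypothetical customer lists a troll farm might purchase.
# All names and purchases here are fabricated.
hardware_list = [
    {"name": "Alice Smith", "purchase": "hunting equipment"},
    {"name": "Bob Jones", "purchase": "groceries"},
]
energy_list = [
    {"name": "Alice Smith", "purchase": "solar panels"},
    {"name": "Carol White", "purchase": "wind credits"},
]

# Pull out each group of interest by purchase type.
hunters = {r["name"] for r in hardware_list if r["purchase"] == "hunting equipment"}
solar_buyers = {r["name"] for r in energy_list if r["purchase"] == "solar panels"}

# A simple set intersection surfaces the dual purchasers to target.
targets = hunters & solar_buyers
print(targets)
```

The point of the sketch is how little sophistication this takes: two sold lists and a set intersection are enough to produce a named target group.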

So if all companies are part of the system, what can we do together to fix it? Based on my research on data privacy practices at over 200 companies, I believe there are five clear actions business leaders need to take:

1. We need better privacy from everyone, not just the obvious players. Instead of raking only the usual suspects over the coals, we need to help all companies meet a high data standard. Consent and transparency must exist with all actors. That requires creating the engineering tools all companies need to be better stewards of people’s data. The system is only as strong as its weakest part. Regulation plays a role in ubiquitous data privacy, but true change comes when senior executives see data privacy as an important business opportunity. The good news is that many of the data privacy challenges that businesses face are core engineering systems problems that are ripe for innovation. Companies need only seek out new and better solutions.

2. We need smart and accessible consumer education. As with evolutions in financial literacy, preventative health care, and sustainability, we need to break down the often cryptic legalese of data privacy into easy-to-comprehend, contextual chunks that fit into consumers’ busy lives when most relevant to the decisions they are making. Company privacy policies that surpass 1,000 words should be first on the chopping block. So-called “dark patterns,” such as hiding opt-outs or obscuring privacy choices, should also be removed. These practices serve a short-term interest, such as marketing not wanting to change a web flow, but they get in the way of building a better top-line reputation.

3. We need better suppression logic for when users opt out. Data doesn’t flow from one company to another on a single track, but that’s how most companies treat the configuration of their opt-out flows. Instead of rolling a person’s preference across all of their data touch points, including the website, payment systems, and servers, companies are only able to turn off the most obvious of data hoses. Engineering can build tools with the surgical precision to hear a user’s preference and act on that directive everywhere. In other words, when users opt out, that choice should be respected, but there are infrastructure hurdles to overcome.

4. We need to draw the boundaries more clearly. Protections need to follow the data wherever it goes and no matter who has it. In California, for example, data privacy includes a “Do not sell” opt-out: by selecting it, a consumer is saying that a company can’t give the consumer’s data to anyone else for their own use. But consumers are opted in by default, and therefore have to know who is actually selling their information in order to tell them to stop. In the rest of the US, anyone’s data can be sold to a broker who in turn sells it to a political group.

5. Finally, we need to redefine “data privacy” as “data respect.” A common misunderstanding of data privacy is that if you have nothing to hide, why should you care? Consumers no longer buy this trope. Our recent survey of 1,000 Americans shows that 93% of people would switch to a company that prioritizes the issue. So, instead of “privacy,” what we’re really talking about today is “respect.” Showing a consumer respect is simple: use their data only for its intended purposes, tell them where it goes and to whom it is sold, and return it when they ask. When the discussion of data privacy expands to data respect, a more accurate picture of the problem and its solution appears.
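The suppression logic described in point 3 amounts to fanning one preference out to every system that holds a user’s data, rather than flipping a single switch. Here is a minimal sketch of that idea; the system names and classes are invented for illustration, not any particular vendor’s API:

```python
class DataSystem:
    """A stand-in for one data touch point (website, payments, etc.)."""

    def __init__(self, name):
        self.name = name
        self.suppressed = set()  # user IDs whose data must not be used

    def suppress(self, user_id):
        self.suppressed.add(user_id)


def propagate_opt_out(user_id, systems):
    """Fan a single opt-out across every touch point, not just one track."""
    for system in systems:
        system.suppress(user_id)


# Hypothetical touch points a company might hold data in.
systems = [DataSystem(n) for n in ("website", "payments", "analytics", "crm")]

propagate_opt_out("user-123", systems)
```

The contrast the sketch draws is between this fan-out and the common single-track setup, where only the most visible system (say, the website) ever hears about the opt-out.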

Companies that want to stay far away from Facebook’s perceived missteps have seen the trends in data privacy and are wising up in boardroom and C-suite conversations. How can they get an edge in showing consumers they care about data privacy and respect a person’s data consent? Companies that combine the above five principles with smart engineering can improve the integrity and reliability of their systems, enhance their security, and create compelling user experiences that improve their bottom line.

As elections show, the future of data privacy lies at the intersection of mass data sharing and behavioral manipulation through hyper-personalized echo chambers, micro-targeting, and misinformation. No election in the modern era will be free from manipulation without the right data controls. We may not be able to directly change a tech giant’s algorithm, but we can all start now to meet the demands of today’s consumer by treating them and their data with the respect they deserve.

Ben Brook is the CEO and co-founder of Transcend, the first-ever Data Privacy Infrastructure technology company that powers data privacy for brands including Patreon, Robinhood, and Indiegogo. Transcend is backed by Accel and Index Ventures.
