Why we can’t just protect “smart” people from data tracking

Vi- Grail
6 min read · Apr 8, 2024


In 2018, Mark Zuckerberg appeared before the United States Senate. Most of us remember the memes about the evil robot man anxiously drinking water. But not all of us paid attention to why he was there in the first place.

Facebook had given its users’ data to the company Cambridge Analytica, which was interested in influencing elections. This is dangerous. There are a million tiny ways that everyday factors influence our decision making. When you arrive at the voting booth and mark down your preference, your decision is influenced by every part of your life leading up to that. What political ads did you see? How were you feeling when you saw them? Did you watch the news right after dinner when you were full, or beforehand when you were hungry? Did a candidate in the debate call their opponent a psychopath instead of a different word, and do you care about that kind of language? How fast do news websites reporting left-wing views load in your browser? How are the traffic lights between your kids’ school and your house timed, and are you frustrated by them when you’re listening to the radio news?

These influences, big and small, planned and unplanned, add up to make your decision in the voting booth. A single small incident could influence your decision years later in massive ways we cannot understand.

But Cambridge Analytica wanted to understand it. And use it.

And Facebook had the data to make it happen.

If you can see all the data in a society, you can study how small changes influence people. If you want to appeal to a voter who prefers Pepsi over Coke, should you target Jews, or trans people? Should your political website load HTML content before the page script has initialised, or should the page be blank while it’s loading? Should ads defaming immigrants be displayed next to posts about trucks, or posts about hunting? Will young people vote less or more if you manufacture a scandal about Taylor Swift right before the election?

But it gets subtler than that. Those are the obvious questions. With big enough data, you can ask seemingly nonsensical questions and get useful answers.

Does increasing the price of spaghetti make people vote left or right? What effect does Facebook’s colour scheme have on the amount of news people read? How many commas per sentence should you average if you want readers to think trans people are pedophiles? How many pictures of a sun does a person see in an average day, and what relationship does it have with veganism? Does the amount of whitespace per post have a greater effect on people’s opinion of Israel, or on their Tucker Carlson viewership?

These are weird questions. But Facebook levels of data make answering them possible.
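To make the scale argument concrete, here’s a toy sketch of the kind of analysis this enables. Everything in it is hypothetical — the feature names echo the questions above, the data is random, and this is not how Meta’s actual systems work. The point is purely statistical: with a million users, even a vanishingly small real correlation rises above the noise floor.

```python
import numpy as np

rng = np.random.default_rng(0)
n_users = 1_000_000  # at Facebook scale, even tiny effects become measurable

# Hypothetical per-user signals, stand-ins for the oddball questions above
features = {
    "spaghetti_price_seen": rng.normal(3.0, 0.5, n_users),
    "sun_pictures_per_day": rng.poisson(4, n_users).astype(float),
    "whitespace_per_post": rng.normal(120.0, 30.0, n_users),
}

# Hypothetical outcome we want to move, e.g. support for a candidate
outcome = rng.normal(0.5, 0.1, n_users)

# Scan every signal for a linear relationship with the outcome
for name, values in features.items():
    r = np.corrcoef(values, outcome)[0, 1]
    # The standard error of r is roughly 1/sqrt(n) ~= 0.001 here,
    # so a true effect of |r| ~= 0.01 would already stand out clearly
    print(f"{name}: r = {r:+.4f}")
```

With this random data every correlation comes out near zero, as it should. The danger is the same loop run over real behavioural data: scan thousands of weird variables, keep whatever clears the noise floor, and you’ve found a lever you can pull — no theory of why it works required.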

I really like the game Watch_Dogs. Nobody remembers the plot of that game, they just remember that you can hack everything and everyone is edgy. But the plot (at least, the political side of it) is smart. A city has an interconnected smart system controlling everything. Traffic lights, billboard ads, people’s phones, police radio, security cameras, smart car systems. The company that contracts this system to the government and to hundreds of private enterprises is working on a predictive system that can use all this data to control elections. Make someone’s commute 5 seconds slower, and you turn them into a Republican. Sound familiar? The game is literally about what Facebook and CA were actually doing at the time. It’s a smart city instead of a social media site because fiction has to be big and flashy and the point of the game is that the player can hack everything, but it’s there. This issue became science fiction and reality at basically the exact same time.

Cambridge Analytica worked for the 2016 Donald Trump campaign. Mark Zuckerberg’s lack of moral compass may have been a deciding factor in Trump’s presidency. Fascism is growing in America, democracy is in serious danger, and trans people are fearing for their lives in the US and basically everywhere. And we have, in part, maybe a big part, Cambridge Analytica to thank. This isn’t an abstract problem. This isn’t a hypothetical, it isn’t a warning. It’s already happening and it’s one of the biggest-scale problems in the world.

The GDPR came into force just months after these events came to light, and it is the most prominent legal answer to them. A lot of people use individualised language around the data privacy issue, including supporters of the GDPR. “We need to control how a company can use YOUR data”. No. This is wrong.

If you’re a trans person in the US, your data is important. It could be used to out you, to harass you, to doxx you. That’s bad, but that’s not what you should be scared of. That’s not a big deal. You should be scared that everyone else’s data is going to be used to influence an election, implement Project 2025, and send you to a concentration camp to be killed. That’s the danger. The stuff about “protecting your data”? It’s a distraction. You need to be scared of everyone else’s data.

Adblockers, data tracking blockers, Do Not Track requests, secure messaging apps. This stuff is important, especially if you’re a vulnerable minority or you’re planning political action. But it’s not the big concern. The big concern is the 3.59 billion people who are currently using Meta services. That’s enough people to get the data needed to understand how to influence elections. That’s enough people to influence to take away your rights and destroy your democracy. And that’s not even the whole problem, because I didn’t mention the people who’ve sworn off Meta but still use Google. Or a website that implements Google AdSense. Or a Windows computer. Or Discord, or TikTok, or Tumblr, or Reddit, or Twitter. And you might have requested data privacy from all these companies. But has the average user?

You can’t just install a privacy extension in your browser and act like you’re safe from data mining. The danger is global. The danger is not you being manipulated, it’s your society and your democracy being manipulated.

So I see some people falling into the trap of “If I just teach all my friends to use safe software and privacy tools, then at least us, the smart people, will be safe”. No! You won’t! You’re not safe until everyone is safe. Whatever solution we come up with to data tracking, it has to protect the most illiterate, most incompetent, most ignorant people in our society. It has to protect Ethel from the retirement home, and it has to protect your racist neighbour Greg, and it has to protect 3 year old Kaiden whose babysitter is an iPad with YouTube on it, and it has to protect Kyle who spends all day shitposting on 4chan and looking at transgender incest pornography. If Ethel, Greg, Kaiden, and Kyle aren’t safe from having their data harvested, what makes you think you’re safe from having your rights taken away by God-Emperor Elon Musk?

The GDPR is a good start. The point of it is to make it as easy as possible for everyone to stay safe from data tracking. So far, the number of CA-style scandals seems to have gone down. Is that because the GDPR actually works, or because it’s spurred companies to do a better job of hiding their activities? I don’t know. I do know the GDPR isn’t being followed.

During the Reddit API scandal, many users deleted their comments en masse using automated tools, only to find their comments mysteriously restored, or to discover comments they could no longer delete. Reddit, at least, is not complying with data deletion requests. And many websites run arcane, illegal cookie consent interfaces that go unprosecuted.

We need an international, global push to protect democracy from data mining, not one from the EU alone. And we need stronger enforcement of existing laws, plus newer, stricter ones. The GDPR should be the start.

And don’t ever act like it’s okay for companies to violate our trust because “the smart people” can just install the right tools. That’s not gonna help Kaiden and Ethel, and that means it’ll be of limited help to you.


Vi- Grail

Nonbinary Goddess explores philosophy, politics, and pop culture to find lessons that can improve people and help improve the world. http://soulism.net