Solidarity Under Surveillance Capitalism

The Invisible Worker
The Digital Labourer
Feb 17, 2020

Words: Miriam Shestack

Images: Lena Yokoyama

“How do you ethically steer the thoughts and actions of two billion people’s minds every day?” That question guides the work of Tristan Harris, the self-styled conscience of Silicon Valley. Harris, a former Google Design Ethicist, founded the Time Well Spent movement, which aims to fix the misalignment between a tech industry that relies on capturing attention and the best interests of society. Our apps and devices are addictive by design, Harris says. Every platform needs to keep our eyes on it for as long as possible, needs us to engage as much as possible, to maximize the information it can collect, the ads we see, and the profit it makes. As a result of this competition, devices and the apps or sites they host drive us to spend our time in ways that we’d rather not. This claim seems to have struck a chord. Mark Zuckerberg has embraced time well spent as a design goal of Facebook. Harris is a prominent figure in a movement in Silicon Valley, brought on in part by the fake news and privacy scandals of the last couple of years, to think more carefully about the effects that technology can have on the world, and to consider what the Valley owes the people who generate value with their attention and data.

It’s time we also start thinking about what data collectors owe us. We can’t rely on government regulation to tame the misuse of data or otherwise protect us from the influence of social media, nor can we rely on the goodwill of Silicon Valley to democratize the technology they built. We need something else — a democratic body accountable only to those whose data makes the tech world go ‘round. We need an organized entity that can withhold personal data to bargain for a cut of its value and for access to the power it holds. We need a union.

To date, online privacy has generally been conceived of as a matter of individual preference and security. However, recent events have demonstrated that the aggregation of so much personal data by private companies has exposed us to a staggering collective vulnerability. Cambridge Analytica was allegedly able to sway the political tides of 2016 in favor of Brexit and Trump using information initially collected through a Facebook quiz. In 2017 Equifax, one of the biggest consumer credit reporting agencies in the United States, admitted that sensitive information — home addresses, phone numbers, credit card numbers — belonging to 148 million Americans had been compromised in a data breach. Amazon’s Rekognition facial recognition technology, used by social media companies to identify fake followers and budding influencers, has also been used by police departments in Florida and Oregon to analyze mug shots. Amazon even pitched the service to US Immigration and Customs Enforcement last summer to aid in their crackdown on undocumented immigrants. We are vulnerable to gentle suggestions built into the design of social media that encourage us to keep scrolling and feel inadequate, we are vulnerable to identity theft, we are vulnerable to surveillance and discrimination.

And yet, in this collective vulnerability also lies a potentially huge source of economic leverage, if we can organize to harness it. After all, major tech companies would be nothing without our data. Pretty much everyone who uses the internet or otherwise participates in the modern world helps fill the coffers of corporations, tech and otherwise, with our data. We do it when we post pictures of our avocado toast, track our periods in an app, scroll past ads. We also do it when we buy shampoo with a store loyalty card, submit term papers to plagiarism scanners, and repay student loans. Credit rating agencies, after all, figured out how to monetize surveillance long before Facebook came along. Except in relatively rare cases, we don’t get paid for that data. But that doesn’t have to be the case. The first step is to reconfigure how we think about the data we hand over.

Surveillance capitalism, a term popularized by Shoshana Zuboff, refers to a capitalist logic that monetizes data gathered through surveillance. In this stage of capitalism, corporations need to acquire as large a volume and as wide a breadth of data as possible to make better predictions of behavior. The ultimate goal is to actually be able to use that predictive power to influence behavior — say, by placing Pokémon, just waiting to be caught, next to businesses that paid the makers of Pokémon Go for the foot traffic. Some would argue that if we don’t like our data being used this way, we can always opt out of the services. But we all know that that’s not a real option. We don’t have much choice but to give up our data in order to use the services that have become so integral to our social and professional lives. We should understand this extraction of data as not voluntary but coercive. Just as workers must keep working to live, we must keep giving up our data to participate in society. But at the same time, corporations that use our data couldn’t function without it.

It’s hard to estimate how much any given individual’s data is really worth. Some estimates put Facebook’s annual revenue per user globally at $12 in 2016. In general, the total comes to less than a dollar per individual per sale for companies that sell personal data to prospective advertisers. Individual choices to stop using a given platform, therefore, are unlikely to make a difference. But organized at scale, defection from, say, Facebook could bring the company to its knees. The logistics of organizing are daunting, but the nature of social media makes those platforms a potentially strong starting point. As the name implies, social media companies are especially vulnerable to what economists call the network effect — once some users jump ship, the platform becomes a little less appealing to everyone else. The ripple effects of organizing could go far. It might be hard to imagine people organizing for a fair share of a few cents of profit made on their data, but remember: unions negotiate. Just as the boss needs labor, the platform needs data. The fact is that we’ve never tried to withhold data at a scale large enough to see how much money it’s really worth. And besides, a data union need not only demand compensation for data; it could demand changes in corporate behavior. Imagine if anyone with a face could prevent that face from being used to feed predictive policing algorithms. Imagine if we decided that no one’s browsing history would be available to Amazon in any form until its warehouse workers earn a living wage. In Europe, web users must authorize the use of cookies on their computers, though sometimes “continued use of the site” is still taken as full authorization. With a union we could design a much more meaningful system of consent around data collection and sharing.
Given the extent to which tech companies and governments have already been shown to collaborate on surveillance projects, only a totally separate entity can really curtail the practice.

It’s also hard to know exactly how powerful the behavioral influences of technology really are. There’s evidence that Uber has successfully used techniques inspired by video games to persuade drivers to keep working at the times the company most needs them on the road, without any formal rules or scheduling for drivers. Who’s to say when the decision to keep scrolling through Twitter’s endless feed ceases to be our own? Who knows exactly how many votes were really swayed by fake news in 2016? We can’t know; the line is too fine, the experience too subjective. But we do know that some of the wealthiest and most powerful companies in the world have achieved their status in part based on their success in selling the idea that such prediction and control is indeed possible. We can’t say for sure whether Tristan Harris and his less conscientious peers in Silicon Valley really have the power to steer the thoughts and actions of two billion people every day. But what we can say for sure is that we should never trust anyone who would try, even if they claim to have our best interests at heart. We also should not assume that those who created this technology are the only ones who can control it. That so much of our most personal information is held by corporations and used to drive their profits, often against our interests, is not a technological problem but a political one. The first step in addressing it is recognizing that every day that we give up our privacy we also generate a massive economic asset, and there is power in that. An independent and democratic data union could harness the power of that asset to bring democracy into the age of big data.



A zine exploring work and the internet in contemporary capitalism