What Is Surveillance Capitalism?
You may have heard the phrase “surveillance capitalism” thrown around lately, but it may not be obvious what it means. We’re going to briefly explain what surveillance capitalism is, and reflect on what it means for our lives.
This theory comes from Shoshana Zuboff’s book The Age of Surveillance Capitalism; Zuboff is a professor at Harvard Business School. She motivates the theory largely through the story of Google, the first company to tap into this new form of profit-making.
Reinvesting for Customers
But first, it may be useful to do a quick refresher on how industrial capitalism works. Take the example of a car factory. The factory and its equipment are the “means of production.” Workers labor in the factory to make a product. Industrial capitalists generally have an incentive to improve productivity, whether through better workers or better equipment. Their goal is profit: money beyond the costs of labor, materials, and upkeep, which gives them a surplus to reinvest in the means of production, making the factory or the products better. Capitalists want to improve their products so they can compete in a market and maintain dominance. (There is also an incentive to keep worker wages low so profits stay high, but we’ll leave that for another video.) Examples of pioneers in industrial capitalism are the Ford Motor Company and General Motors; Ford’s moving assembly line in particular greatly improved productivity.
Zuboff argues that Google is the Ford Motor Company of surveillance capitalism: they took the old model of doing business and radically transformed it. But before they became surveillance capitalists, they operated with a mindset very similar to the industrial capitalists’. Take the Google search bar as an example. In the early days, Google treated it according to what Zuboff calls the “behavioral reinvestment cycle.” When users typed into the search bar, Google got usage data. Through analytics, they could figure out how to improve search for a better user experience. Then users would use the search bar more, and there was a “virtuous” cycle of reinvestment, similar to improving cars under industrial capitalism.
Reinvesting for Advertisers
The twist came when Google discovered a whole trove of data that wasn’t being used at all, which they called “data exhaust” or “digital breadcrumbs.” It was thought, at first, to be useless. For the search bar, this kind of data includes things like a user’s grammar, tendency to capitalize, use of punctuation, and typos. Eventually, data scientists at Google figured out that this exhaust could actually be used to predict things about users, like their personality or interests. More importantly, they figured out they could predict very specific things, like how a user is feeling in this very moment, how they may feel in the future, or what their goals are.
These predictions became a gold mine as Google got into the business of letting advertisers carefully craft and target ads whose likelihood of being clicked was now grounded in behavioral predictions. This was the turn to surveillance capitalism. Google’s main objective was now to sell prediction products to its real customers: advertisers. Instead of being customers whose search experience was worth improving, users became the raw material to mine in order to make better prediction products. If you’ve ever heard the saying that on social media “you’re the product,” Zuboff argues that’s wrong: the products are predictions about you, and you’re actually the resources. And in the interest of making better and better predictions for advertisers, a new incentive was born: generate and mine as much behavioral data from users as possible. Zuboff calls this the “extraction imperative.”
This entirely changed the game of what it meant to produce and reinvest as capitalists. The old behavioral reinvestment cycle now gained an entirely new layer of machinery. The new means of production in surveillance capitalism are machine intelligence algorithms. The products are predictions, sold in a market of what Zuboff calls “behavioral futures” — that is, selling predictions to those who are interested in your future behavior. And this led to enormous profits.
Instead of simply analyzing user data to make better tools for users, the real incentive now is to analyze user data to make better prediction products for advertisers. Changes to the tools you actually interact with now serve the purpose of encouraging you to share more data, or to behave a certain way so predictions are fulfilled.
A great example of this is the transformation of the Facebook “like” button into a host of emoji reactions. At first, it may seem like Facebook is simply giving you more options to show how you feel about a post. That’s true, but it’s not the main purpose of the upgrade. If you assume Facebook is operating under the extraction imperative, wanting as much behavioral data from you as possible, then it becomes clear that the emoji reactions are aimed at getting more data about your emotions. You can only infer so much from a simple “like,” but reading your sad, angry, or happy reactions allows much better predictions about your current and future mood.
If that wasn’t freaky enough, Zuboff’s conclusions about what else surveillance capitalism incentivizes may put you over the edge. In addition to harvesting more data from you to make more accurate predictions about what you will do, she says there is another way to make predictions more accurate: change your behavior so you do what they predicted. She argues that through subtle manipulations we would never notice, companies may be trying to make us feel certain ways, or do certain things. They can show you certain posts over others to make you sad, then serve you the perfect ad to take advantage of your vulnerability. Google Maps may route you slightly differently so you drive or walk past a restaurant whose coupon ads you saw earlier in the day, making you more likely to buy their food. She calls these systems the “means of behavioral modification,” and argues that they are the logical end of making money off of surveillance and behavior prediction.
So what do we do?
There are probably a lot of thoughts rattling around in your head. So now would be a good time for some reflective questions:
- How does this information make you feel? One of the ways Zuboff argues these companies get away with surveillance is by making us used to feeling spied on. If we take time to reflect on how we feel, we’re already making progress.
- What types of things might someone be able to predict about you based on your data?
- Zuboff reminds us that this system was not an inevitable evolution of capitalism — but an intentional mutation. People made it happen and continue to allow it to happen. If you don’t like it, how do we stop it?
While people who study surveillance capitalism agree on no perfect answer to that final question, one piece of the answer is clear: the more people who know about these ideas, the more likely we are to break free from manipulation. Once you know, you’ll start to see it. So, at the risk of sounding like a plug, sharing this post and others about surveillance capitalism with our friends and communities can be a first step in the journey toward being free of its effects. Supporting organizations like Fight for the Future, who run campaigns around data rights and push back on Big Tech’s encroachment, can help as well.