The product Facebook sells is you
Congress and tech companies are eager for an easy PR victory, but requiring advertiser disclosures won’t solve what’s wrong.
If you’ve ever felt like you’re not in control of your brain anymore, the reason is that you’re not. Your memory, executive functions, reward system and consciousness itself have been deliberately altered by a group of young men who live in California. You sleep less. You remember better (all those posts help commit moments to memory). You’re addicted. And you now have several layers of consciousness, some of which are entirely external to you, like the brain inside Netflix that knows you like political thrillers.
Facebook, and by extension Silicon Valley, is in damage control. “Tech Giants, Once Seen as Saviors, Are Now Viewed as Threats,” The New York Times reports. Among their transgressions: selling Facebook ads to Kremlin trolls, stifling criticism at a Google-funded think tank and allowing racist fringe groups to infiltrate and reframe mainstream political discourse.
As Max Read noted in New York magazine, Facebook has assigned itself the duty of monitoring national elections in foreign countries. “We have been working to ensure the integrity of the German elections this weekend,” Mark Zuckerberg said ahead of last month’s vote.
Read, responding incredulously: “A private company, working unilaterally to ensure election integrity in a country it’s not even based in? The only two companies I could think of that might feel obligated to make the same assurances are Diebold, the widely hated former manufacturer of electronic-voting systems, and Academi, the private military contractor whose founder keeps begging for a chance to run Afghanistan. This is not good company.”
Smelling blood, Congress sees a way to score an easy political victory. “Who wouldn’t want to know if the ad that’s appearing next to your story was actually paid for by a foreign power?” said U.S. Sen. Mark Warner, announcing a bill to force companies like Facebook, Google and Twitter to disclose the source of funding for political ads. Facebook, in fact, has already announced a new policy along those lines, which would require ads on its network to link to the page that paid for them.
Does anyone here believe $100,000 in Russia-funded campaign ads on Facebook is the real problem? I’m not saying transparency isn’t a positive step. But it’s like treating a gunshot victim for his headache. Taking Advil is simpler than trauma surgery, for both the senators and Facebook. By pushing nearly victimless election transparency rules, the social media companies protect their overall business models and the senators get a quick win (if it passes at all—nothing seems quick or easy in Congress anymore) without upsetting San Francisco’s lobbyists.
But the impact will be marginal because the problem isn’t the ads.
You may think Facebook is the product and you’re the client, but that’s not entirely true. There’s a reason tech companies call us users and not customers. It’s because we’re just people who come and use the interface. The product Facebook sells is you. The advertisers are the customers. That goes for all tech companies that make most of their money from ads.
Therefore, in order of priority, advertisers come first and users come second. Much of the time, the interests of those two groups do not compete. But if, say, Facebook has to decide whether or not to infect two billion people with addict-like compulsions, it will refer to its priority list.
“We’ve never had a media device that literally a billion people are kind of being programmed in the same way, where so much influence is in the hands of a few technology designers,” said former Google employee Tristan Harris in a PBS interview.
The Guardian reported on Harris’ campaign recently.
A friend at Facebook told Harris that designers initially decided the notification icon, which alerts people to new activity such as “friend requests” or “likes,” should be blue. It fit Facebook’s style and, the thinking went, would appear “subtle and innocuous.” “But no one used it,” Harris says. “Then they switched it to red and of course everyone used it.”
That red icon is now everywhere. When smartphone users glance at their phones, dozens or hundreds of times a day, they are confronted with small red dots beside their apps, pleading to be tapped. “Red is a trigger color,” Harris says. “That’s why it is used as an alarm signal.”
Harris is promoting his new initiative, Time Well Spent, which argues the ad-based business model needs to change before the companies will stop using slot machine-style techniques to manipulate us.
But the addictive quality of these technologies is just one aspect. They’re changing how our brains work in other ways we don’t fully understand. A vast portion of our lives — perhaps half our waking hours or more — is now mediated through interfaces designed by a handful of companies whose technology has only been pervasive for 10–15 years. We have to consider whether it’s affecting our ability to empathize, to reason and to be persuaded.
I pointed out last week that including the phrase “will make you” in a headline increases the number of Facebook shares more than any other phrase. Will make you cry. Will make you thinner. Will make you outraged. Social media are not fora for deep thinking. The only ethos is compulsion. Don’t think. Click and feel.
Our phones are changing the way we engage with those around us in the physical world. Sometimes we don’t even look at people’s eyes when they’re talking to us anymore.
Beyond Facebook and Google, companies like Uber, TaskRabbit and Upwork affect how we work, whether we work and how much we earn—even if you don’t use those services. By mediating labor, these platforms devalue and atomize our essence as workers. No one but a few programmers can see the whole marketplace; they are both the market’s proponents and its regulatory body.
Our brains. Our economics. Our politics. “We have been working to ensure the integrity of the German elections this weekend.” Who empowered these rulers? By what mechanism do they derive their legitimacy?
I’m not saying technological advances are bad; on the contrary, given the challenges in the world, we need innovation and discovery more than ever. But innovation for innovation’s sake is wrong. Innovation for shareholders’ sake is worse. Advances in science and technology belong to humanity, and, democratically, we should be able to ensure these result in common benefit rather than corporate benefit.
Do you feel better off?