It should be impossible to feel ambivalent about social media anymore. The horse is dead and beaten, its bones picked over: We know services like Facebook harvest our data, spur violence, and elect madmen. Using products like the social network implicates us in a system that has, on occasion, made the world worse in very vivid ways: Rodrigo Duterte, the authoritarian president of the Philippines, has “turned Facebook into a weapon,” and you can see the end result in the bodies lining Manila’s gutters.

Or maybe you won’t see the victims in Southeast Asia, and your pain comes instead over the airwaves as Tucker Carlson goes bonkers over some inane viral tweets about Rudolph the Red-Nosed Reindeer, for crying out loud. Maybe your kid watches some nightmarish Peppa Pig simulacrum on YouTube that an algorithm pieced together based on engagement metrics.

This is all pretty much awful, and there’s no room for ambivalence. Yet ambivalence prevails, and one four-letter word can help explain why: love.


I finally deactivated my Facebook account last week after years of problems: Cambridge Analytica, a recent data breach exposing the personal information of many millions of people, the genocide in Myanmar, the election of Duterte (and Donald Trump), the flattening of the media industry, and inexplicable choices about newsworthy content. What finally broke me was Facebook leadership’s refusal to accept responsibility for just about anything. It’s one thing to know that the product may be harmful on some level, but it’s something else entirely to recognize that no one making it really cares.

Despite algorithm tweaks, the Facebook News Feed has mostly been the same for years: It still rewards sensational content that makes people like, share, and comment. And there’s no incentive to fundamentally change the product by dismantling the News Feed because the data it generates is absurdly profitable. In 2017 alone, Facebook made nearly $40 billion in ad revenue.
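
To see why, it helps to imagine the News Feed’s incentive structure as code. Here’s a minimal sketch of engagement-based ranking, entirely my own invention with made-up weights and fields, not Facebook’s actual system:

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    comments: int
    shares: int

# Made-up weights: interactions that spread content further
# (comments, shares) count for more than a passive like.
WEIGHTS = {"like": 1.0, "comment": 4.0, "share": 8.0}

def engagement_score(post: Post) -> float:
    """Score a post purely by how much interaction it provokes."""
    return (WEIGHTS["like"] * post.likes
            + WEIGHTS["comment"] * post.comments
            + WEIGHTS["share"] * post.shares)

def rank_feed(posts: list[Post]) -> list[Post]:
    # Sort the feed by provocation alone, most engaging first.
    return sorted(posts, key=engagement_score, reverse=True)
```

Note what’s absent: a scoring function like this has no concept of truth or harm, so sensational content rises simply because it gets clicked.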

Though the social network’s blunders are well covered, it would be wrong to frame any of these problems as uniquely Facebookian. We’ve arrived at a point where many of the apps shaping our digital lives have similar issues. They cultivate ambivalence in their users, which only makes everything worse.


So, let’s talk about the word “love.”

Deactivating Facebook meant sacrificing a feature I genuinely enjoy: “Memories,” which shows you posts from years ago on relevant days. (On New Year’s Eve, you might see how you celebrated eight years ago, for example.) I tend to fixate on the present, which is partially what makes the feature so powerful; I’m inclined to take and post pictures during events, and Memories helped situate me in a version of the past that felt genuine and appropriately “small.” It didn’t capture big gestures: The pictures showed my wife at dinner or a friend with a beer. Memories made me smile.

It also kept me coming back to the product, perpetuating a harmful cycle. Remember that the data you feed into Facebook helps train the product, as it does with many online services that use machine learning.

Take, for example, a revelation last week from Cambridge Analytica whistleblower Christopher Wylie, who asserted that the Trump campaign was able to use fashion data to profile potential voters. As he explained it, people who liked certain clothing brands, like Wrangler and L.L. Bean, were less “open” and more likely to be influenced by pro-Trump messaging.

You could argue that there’s something of a butterfly effect here: Something as simple as your taste in denim makes a difference, because your interests and behavior on a service like Facebook end up tangled in a complex web that can eventually have an immense impact on the world around you.
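
To make that concrete, here’s a toy version of the profiling Wylie described. Every number and the placeholder brand name are invented for illustration; this is the shape of the idea, not Cambridge Analytica’s model:

```python
# Toy illustration only: invented weights mapping brand "likes"
# to a crude "openness" estimate.
BRAND_OPENNESS = {
    "Wrangler": -0.6,        # named in Wylie's account; weight is made up
    "L.L. Bean": -0.4,       # same
    "AvantGardeLabel": 0.7,  # hypothetical fashion-forward brand
}

def openness_score(liked_brands: list[str]) -> float:
    """Average the (invented) openness weights of a user's liked brands."""
    weights = [BRAND_OPENNESS[b] for b in liked_brands if b in BRAND_OPENNESS]
    return sum(weights) / len(weights) if weights else 0.0

def is_target(liked_brands: list[str]) -> bool:
    # Per Wylie's description, less "open" users were the ones
    # singled out for pro-Trump messaging.
    return openness_score(liked_brands) < 0.0

print(is_target(["Wrangler", "L.L. Bean"]))  # True
```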

So, I deactivated my account. No more Memories. But Google Photos, a personalized photo gallery, offers something similar with its “Rediscover This Day” feature. The app isn’t a social network, but it has something in common with one: It uses forms of artificial intelligence to dig through the pictures and videos you upload and deliver them in a new context that encourages you to use the app more. “Rediscover This Day” is cute, but it’s also a reason to open Google Photos every day and keep the auto-upload feature on. And then there’s the search bar, which lets you dig through the photos you’ve uploaded using basic terms like “cat” or “food.”

Perhaps adrift without Facebook Memories, I opened Google Photos last week and searched for “love.” I saw hundreds of results uploaded from my personal devices: pictures of my wife when she was my girlfriend, some poor crops of my friend hugging his ex, my mom smiling on her birthday years ago.

That got me curious. Google Photos is interesting because it feels like a more transparent approach to the kind of programming that underpins products like Facebook: Understanding which words the product recognizes could teach us something about how these products manipulate us.

After I swiped through row after row of “love,” I typed in “hate” and found Google Photos returned no results. “Happy” gave hundreds, but “sad” only gave six. There were nearly endless “smile” pictures but no “frowns.” While the app promises “automatic organization for all your memories,” the machine powering it appears to have a bias toward positivity.

Fundamentally, Google wants people to use the product, not get bummed out by it. Happy customers come back. The company didn’t respond to an email about this, but it can’t be an issue of the service not understanding what a “frown” is: The company reportedly trained Photos via Image Search — not exactly the happiest place on Earth.
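
One plausible way to build that bias, and this is pure speculation on my part, is to let the classifier detect whatever it detects but restrict search to a curated vocabulary. A sketch, with a hypothetical whitelist:

```python
# Speculative sketch: a vision model may detect many concepts, but
# the search index could be limited to a curated, upbeat vocabulary.
SEARCHABLE = {"love", "happy", "smile", "sad"}  # hypothetical whitelist

def search(query: str, photos: dict[str, set[str]]) -> list[str]:
    """Return IDs of photos whose detected labels match the query."""
    if query not in SEARCHABLE:
        return []  # "hate" and "frown" silently come up empty
    return [pid for pid, labels in photos.items() if query in labels]

photos = {
    "img_001": {"love", "smile"},  # labels a classifier might emit
    "img_002": {"frown", "sad"},
}
print(search("love", photos))   # ['img_001']
print(search("frown", photos))  # [] -- detected, but not searchable
```

Whatever the real mechanism, the observable behavior matches: the happy labels resolve, and the unhappy ones quietly return nothing.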

Google Photos, like Facebook, Twitter, YouTube, or any other social platform, sucks up your personal data, sorts through it, and uses the information to hook you. None of them offer a realistic representation of your life: They’re constructing an idealized one.

Pay attention and you’ll see this happen in many of the online services you use today. Twitter promises to show you tweets you’ll “care about,” Instagram gives you more content like what you’ve tapped the “❤️” button on, and Snapchat does its best to show you publications you’d enjoy via its “Discover” tab. All of them are constantly gorging on your data — down to how long you glance at a post, even if you don’t touch it — to serve you more of what you like. Every day, you might quietly, even subconsciously make hundreds of little decisions that teach these services exactly how you’d like your digital morphine drip.
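
Even the passive signals are easy to capture. Here’s a minimal sketch of dwell-time tracking, with invented names; no platform publishes its real telemetry in this form:

```python
import time

class DwellTracker:
    """Record how long each post stays on screen, taps or no taps."""

    def __init__(self) -> None:
        self.visible_since: dict[str, float] = {}
        self.dwell_log: list[tuple[str, float]] = []

    def post_entered_view(self, post_id: str) -> None:
        self.visible_since[post_id] = time.monotonic()

    def post_left_view(self, post_id: str) -> None:
        started = self.visible_since.pop(post_id, None)
        if started is not None:
            # A long pause reads as interest and can feed the ranking
            # model, whether or not you ever touch the post.
            self.dwell_log.append((post_id, time.monotonic() - started))
```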

The machinery is primed to keep you happy — or at least engaged — as you’ve probably already recognized on some level. In a recent survey of 1,141 U.S. teenagers, 72 percent said they believe social media companies manipulate them into spending more time on their platforms, yet they were also far more likely to say social media makes them feel better than worse. Meanwhile, 68 percent said they believe these platforms have a “negative impact” on many of their peers.

And that’s really the whole thing: People may believe these companies have impure motives, that they can make things worse for other people, but they keep using the products because they have positive associations with them as individuals. “Instagram may suck for other people, but it feels good when I use it.” That may help explain a recent New York Times column in which philosopher S. Matthew Liao discussed, for several hundred words, the ethical problems of using Facebook, only to conclude he’d stay on the platform.

Sure, not all of these products are world-ending. Google Photos may help a tech giant suck up your valuable data, but your “love” pictures aren’t about to undermine democracy around the world (unless you’ve got something weirder going on than most, but no judgment). The problem is that we’ve seen how some services — particularly those that combine algorithms with viral information and not just pictures — are legitimately destructive. This isn’t ambiguous anymore: Some of the social products you enjoy as an individual are causing measurable harm around the planet.

Continuing to use these services makes them more powerful, which is why maintaining a Facebook account becomes so ethically tricky. Most users wouldn’t have made a connection between their fashion sense and Trump’s election, but the data was relevant to Cambridge Analytica.

Ambivalence about your social media use thus has an impact: Perhaps no action you take in your News Feed is truly meaningless. You have to look beyond the glossy, idealized world these services render on your screen to understand why. I lost my ambivalence toward Facebook’s News Feed not because it wasn’t enjoyable to use on some level, but because its impact on the real world became too destructive to ignore. You might say the same about YouTube, whose algorithms promote and distribute conspiracy videos.

Fundamentally, this tension cannot be resolved so long as these businesses make a one-to-one link between all of this data and their profits. Facebook, Snap Inc., and Alphabet (Google’s parent company) are publicly traded with an obvious goal to create more value for their shareholders. Right now, their incentive is to keep you coming back, engaging, and creating data.

This is a civics problem with no easy solution. Disconnecting my Facebook account means it’s harder for me to see what my friends and family are up to, but it’s a trade-off I was willing to make. And I made the iffy decision to keep my Instagram and Facebook Messenger accounts active for fear of removing myself from the grid too completely.

We may one day look back at these services with a question: How was any of this allowed? But we can’t be so ignorant now as to believe there are no choices. Ambivalence feeds a machine in which millions of us have cogs. Ask yourself which of yours should keep turning.