The Social Dilemma fails to address the systemic issues fueling user behaviors

Jean Burke-Spraker
Published in Plus Marketing
10 min read · Nov 19, 2020

A confession

Phone screen with Instagram, Facebook, and Twitter highlighted. Photo by dole777 on Unsplash

Hi, my name is Jean. I’m addicted to social media, and my notifications have become unmanageable. My drugs of choice are Facebook and Twitter. I have tried different approaches to abstinence, including turning off notifications, timing my use, and scheduling my time. They don’t totally work for me. Of course, it does not help that part of my job requires me to post to and monitor social accounts; nor does it help that, with the pandemic, all of us crave information, and I probably crave it more than most. I spend my time reading. No Candy Crush or Minecraft or PUBG for me. Nope. I read about marketing, disinformation, and politics. But the same rules apply to my reading that apply to your game use. The dopamine hit is just too powerful; the apps that give me my fix are far too addictive.

The Social Dilemma

Broken phone with purple and pink veins. Photo by Agê Barros on Unsplash

But this blog post isn’t about my addiction; it’s about The Social Dilemma, the documentary on Netflix. The documentary frames the problem of social media largely as a user-level problem: we are addicted to our phones, or more specifically to the apps on our phones.

As we watch the fictional family in the series, we come face to face with the power of addiction. At one point, the mother hits on a brilliant idea: Put everyone’s phone in a timed container so that no one can use them during dinner. Next, we see the daughter (wearing safety goggles!) break open the container and accidentally break her brother’s screen in the process. In exchange for having his screen repaired, her brother then promises to go without his phone for a week.

Spoiler alert: He doesn’t last the whole week.

Instead, he falls into a rabbit hole of bad behavior, which ends when he is arrested at an “extreme center” rally. I rolled my eyes hard at that idea. We all know the center is not radicalizing people. The right is. The documentary even includes a graphic re-creation of Heather Heyer’s death at the Charlottesville Unite the Right rally that is in equal measure triggering and irresponsible. It’s no surprise when the film fails to address the role white supremacy plays in social media development and use.

Experts in the field — the guys who invented Facebook and Twitter and the guys who created virtual reality and the Gmail inbox — then admit that they were trained to make these apps addictive. And make no mistake, these are guys with only one exception. And they are nearly all white.

It’s not their problem that we’re addicted to social media. It’s ours. Shame on us users for falling prey to their mind games.

Mind Games by John Lennon

Recommendation algorithms

Green code on a computer screen. Photo by Markus Spiske on Unsplash

Algorithms are sets of rules, processes, or instructions designed to perform a specific task. Social media companies often use algorithms to keep our attention. In the case of YouTube, the recommendation algorithm suggests videos to keep you on the site. During the documentary, the very engineer who wrote that algorithm suggests turning off the Autoplay feature and your notifications.

You should do that now. Here’s how:

How to turn off your Autoplay on YouTube

You see, the YouTube Autoplay feature will keep playing videos for you unless you tell it not to. For example, for me, YouTube took me from Mind Games to Watching the Wheels to an ad for Naked Wines (which is a bit off, tbh) to another Lennon/Ono song and then to an ad for Google My Business. Finally, the algorithm recommended Give Me Love by George Harrison featuring images of Lord Krishna. Perhaps the algorithm has come to know me better than I know myself after all!

At that point, I turned the Autoplay feature off. The recommendation algorithm was trying to distract me, trying to force me to lose my focus on what really matters. Instead, the algorithm tried to focus my attention exclusively on what matters to YouTube: its monetization strategy.
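To make “what matters to YouTube” concrete, here is a toy sketch of an engagement-driven recommender. This is emphatically not YouTube’s actual code; the scoring weights, field names, and candidate videos are all invented for illustration. The point is simply that a business goal, like maximizing watch time and ad revenue, can be written directly into the ranking logic.

```python
# Toy sketch of an engagement-driven recommender. NOT YouTube's real
# algorithm: the weights and fields below are invented to illustrate
# how a monetization goal can be embedded directly in ranking code.

def rank_videos(candidates):
    """Order candidate videos by a score that favors the platform's goals."""
    def score(video):
        # These weights are the "opinions embedded in code": someone chose
        # to value predicted watch time and ad revenue, not user well-being.
        return 0.7 * video["predicted_watch_minutes"] + 0.3 * video["ad_value"]
    return sorted(candidates, key=score, reverse=True)

candidates = [
    {"title": "Mind Games",   "predicted_watch_minutes": 4.0, "ad_value": 0.2},
    {"title": "Wine ad",      "predicted_watch_minutes": 0.5, "ad_value": 9.0},
    {"title": "Give Me Love", "predicted_watch_minutes": 3.5, "ad_value": 0.1},
]

ranked = rank_videos(candidates)
```

Notice that with these made-up weights, the ad can outrank the songs the viewer actually came for, which is the dynamic the Autoplay chain above demonstrates.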

Opinions embedded in code

Apple Airbuds, laptop with code on the screen, and a sneakered foot on table in front of a brick wall. Photo by Joshua Aragon on Unsplash

Why did I start writing this post? Oh yeah! The algorithms.

I am not an expert on algorithms. But in early 2019, I decided to start researching how algorithms work and how we can make them and social media platforms safer. I started following experts on Twitter who understood math, marketing, and misinformation. Primarily, I followed women scholars: Dr. Safiya Noble, Dr. Joan Donovan, and soon-to-be-doctor Kim Crayton. I also followed Dr. Siva Vaidhyanathan, an expert on Facebook who has written about it extensively in The Guardian.

A consensus started to build from my reading. Dr. Cathy O’Neil summarized it succinctly in the film:

“Algorithms are opinions embedded in code.”

That means that the problem is not so much with the algorithms as with their creators.

These creators and their perspectives are very white. Part of that is a function of who founded and built these companies. You can’t interview a person of color as an early team member of Instagram or the VP of engineering for Facebook if people of color did not have those jobs. When these creators talk about the underlying issues with social, however, no one mentions the elephant in the room: white supremacy.

At one point, we are told:

“We invented likes because we wanted to share love with the world. We didn’t think kids would be depressed because they aren’t getting enough of them.”

Yeah, well, if you weren’t a white tech bro, you might have thought about that. These creators just don’t think enough about the impact of their technology because they are not in the populations harmed by it.

The creators have goals, like monetization, that are not neutral, and they have biases. They embed both into their algorithms. Thus, if a group of young white guys builds an app, that app is going to reflect the lived experience of those young white guys and their business goals.

Imagine a group of young (some now middle-aged) white guys telling me that social media is harmful. Then, imagine they tell me that the solution is that I should turn off my notifications. Now, imagine the look on my face.

Young, white woman with the tongue sticking out. Photo by Maria Lysenko on Unsplash

The minute the whole world turns off their notifications, social media companies will find another way to stimulate that dopamine hit.

And that’s where The Social Dilemma fails as a project. It puts the onus on us to correct our behavior, behavior that social media’s chief architects created or modified for their benefit. A white guy telling you to delete social media fails to understand how marginalized communities use that media. For all the harm Facebook groups can do, and we have seen that harm in Kenosha and with QAnon, they can also be lifelines for many to build communities around marginalized sexual identities or to overcome the physical isolation of disability or COVID-19. Moreover, political organizing now often starts online. I would be interested to learn how grassroots organizers used social media to mobilize voters in Georgia.

The film does not push back enough on the social media creators, nor does it offer enough room to critics. The absence of longtime Facebook critic Dr. Vaidhyanathan is particularly glaring. Cynthia Wong (formerly with Human Rights Watch and now with Twitter) and Dr. Rashida Richardson (Visiting Scholar at Rutgers University Law School) both try to refocus our attention on the tech. But they just don’t get enough screen time to make an impact.

Dr. Richardson is the only Black woman interviewed for the film and appears only once. After 60 minutes of back-to-back white tech bros, she is a welcome relief, but her clip is too short. Here’s a longer video of her ideas:

Talk by Dr. Rashida Richardson about data and policing

Social media addiction

Hypodermic needle with fluid on top of razor blades. Photo by abyss on Unsplash

Ultimately, the discussion of addiction in The Social Dilemma lacks a strategic focus. No one really goes beyond “these apps are bad for you, turn off your notifications.”

Some would argue that these apps aren’t addictive, but recent research suggests otherwise. A 2018 blog post from Harvard University pointed to reward prediction error (RPE) encoding, the mechanism behind the so-called slot machine effect, to explain how social media apps keep you coming back for your dopamine hit.

Do you pull down the screen to get Facebook to refresh? That’s how the slot machine effect works. That feeling you get between the pull and the refresh mimics the slot machine showing you whether you won and elicits a specific response in your dopamine pathways.
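The pull-to-refresh loop above can be sketched as a tiny simulation. This is a minimal illustration, assuming a simple Rescorla-Wagner-style learning update; the probability of new content, the learning rate, and the function names are all invented for the sketch, not measured from any real app.

```python
import random

# Minimal sketch of the "slot machine effect": each pull-to-refresh is an
# unpredictable reward, and the reward prediction error (RPE), the actual
# reward minus the expected reward, is what drives the dopamine response.
# All numbers here are illustrative assumptions, not measurements.

def refresh_feed(p_new_content=0.3, rng=random.random):
    """One pull-to-refresh: sometimes there's something new, often not."""
    return 1.0 if rng() < p_new_content else 0.0

def prediction_errors(pulls=10, expectation=0.0, learning_rate=0.2, seed=42):
    """Track how the reward prediction error evolves over repeated refreshes."""
    rng = random.Random(seed)
    errors = []
    for _ in range(pulls):
        reward = refresh_feed(rng=rng.random)
        rpe = reward - expectation          # the surprise: the dopamine signal
        expectation += learning_rate * rpe  # simple Rescorla-Wagner update
        errors.append(rpe)
    return errors
```

Because the reward never becomes predictable, the prediction error never settles to zero, which is exactly what keeps a variable-reward schedule, on a slot machine or a news feed, so compelling.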

As with gambling, addiction treatment strategies can teach us how to help individual users. Since the founding of Alcoholics Anonymous in the 1930s, our approach to addiction has been largely at the individual level. My admission at the beginning of this post is actually the first step in the Alcoholics Anonymous 12-step program: to admit that there’s a problem.

To paraphrase AA’s first step: I am powerless over social media, and my life has become unmanageable.

Just yesterday, members of the Senate Judiciary Committee talked about Facebook and Twitter as similar to cigarette manufacturers. During that hearing, Mark Zuckerberg asserted that their internal research about Facebook and addiction was inconclusive, while Jack Dorsey seemed to acknowledge it. At one time, we did not think that cigarettes were harmful either, until conclusive independent research revealed otherwise.

The cigarette analogy works well. Cigarettes are legal, but they are regulated. If we understand algorithms as similar to the chemicals in cigarettes, then we start to ask whether they should be regulated too. Dorsey proposed an algorithmic marketplace where users could choose which algorithms they wanted to turn on or off. While I love the idea of giving users more choices about how they experience the apps, I am deeply concerned that users don’t have a clear enough understanding of how these platforms work to make informed choices.

Especially our youngest users.

Government regulation of cigarettes includes not allowing anyone under 18 to buy them. Facebook and other platforms set their age limits at 13. Without exception, every parent who appeared in The Social Dilemma said they do not allow their kids to have access or severely limit that time. Are age limits a possible solution? Should only adults have access to social media?

Honestly, I don’t think that’s viable, but I think we need to think across industries for solutions.

Nearly everyone I chatted with about social media addiction is most worried about the youngest users, much like the ones we see in the film. I am too. I want these platforms to be safe for them. But terrifying their moms about too many hours online is not the solution either. The Social Dilemma had a lot of adults talking about kids, but no one actually asking the kids what they would do. You may find that “the kids” have solutions the adults had not considered. They are the digital natives, not us.

While it’s important for users to take personal responsibility for their screen time and usage, recent work on drug addiction has included broader harm reduction strategies like decriminalization and safe injection sites. Those strategies tackle the problem at a structural level and include government support and regulation. We cannot solve the problem of social media’s addictive nature just by focusing on individual failures. We must repair the structural weaknesses as well. We must look at underlying mental health issues, social pressures, and cultural trends that encourage that addiction.

Ultimately, the number of users who would probably qualify as addicted to social media under DSM-5 is small compared to the overall population. But, when we design for the most vulnerable, we make the products safer for everyone.

The impact of The Social Dilemma

Blue honeycomb design notebook cover with computer parts on top. Photo by Jonas Svidras on Unsplash

Since The Social Dilemma premiered on Netflix in August, the world has changed. A lot.

But have users?

I conducted an informal poll on Facebook and Twitter that indicates that nothing has really changed. Some people made small changes like turning off notifications or removing apps from their phones. But most have realized that deleting Facebook, Instagram, Messenger, and WhatsApp (all owned by Facebook) is just not feasible. Facebook’s biggest user base is in India. I have too many friends there to delete the app. I work in marketing. Deleting Facebook just isn’t possible, even if I use other apps to post to my managed pages. I can’t advertise on Facebook without being on Facebook.

Social media is very good at one thing: creating outrage. That’s what this movie did. People watched. They outraged. Some deleted the app. Some went on a break. But those breaks did not last. They never do. Now, it’s November. Do you still think about the documentary? Or have you already forgotten the lessons you learned just a few months ago?

The outrage-delete-reinstall-forget lifecycle is short, but its monetary rewards last long term. People did exactly what everyone making this film expected them to do. And nothing really changed. Yet.

For permanent change to take place, we must look more deeply than individual consumer behavior. The Social Dilemma is just the first step in fixing the problem. The next step is up to you. I will publish my possible solutions in a subsequent post. But, first, I want to hear from you.

What solutions would you propose to make social media safer for you and your loved ones?


Editor. Writer. Marketer. Blogs about books and writing at jeanspraker.com. Loves semicolons.