Social Media Harms

Illuminating the Darkness: Exploring Social Media Harms

#SocialMediaHarmsVictimRemembranceDay #OnlineISRealLife

Sharon Winkler


Image created by Author

The Alexander Neville Foundation has announced the first-ever Social Media Harms (SMH) Victim Remembrance Day, which will take place on June 23, 2023. This day encourages everyone to consider the negative effects of social media use and emphasizes the importance of creating safer online spaces across all platforms. The inaugural remembrance day is dedicated to honoring the memories of two individuals, Alex Neville and Carson Bride.

Alex was only 14 years old when he unknowingly connected with an illegal drug dealer on Snapchat. Tragically, the dealer sold him drugs laced with fentanyl, resulting in Alex’s fatal overdose on June 23, 2020. Carson Bride, at the age of 16, faced relentless cyberbullying from high school peers who used anonymous apps on Snapchat. The immense stress and shame pushed Carson to take his own life on the same date, June 23, 2020. These stories, among others, shed light on the harmful behaviors that go unchecked on numerous social media platforms.

SMH Victim Remembrance Day provides an opportunity for families and loved ones affected by such harm to share their experiences and raise their voices. Currently, these families and allies have little power to prevent or influence the negative actions of other users. Often, when these behaviors are reported, the platforms themselves take no action, leaving survivors, their families, and friends to shoulder the blame and shame for the consequences they face.

Families and friends of people who are targets of digital harm are sometimes unfamiliar with these destructive behaviors and do not recognize the stress and depression that result from these online communications. This article provides a brief overview of the different types of harmful behaviors and consequences of online platform use in order to help recipients of digital mistreatment.

Hate Speech

Hate speech affects many online users. The Anti-Defamation League (ADL) report “Online Hate and Harassment: The American Experience 2022” found a high rate of hate-based harassment targeting users who belong to marginalized or minoritized identity groups: 58% of survey respondents belonging to such groups had experienced harassment in the past 12 months.

GLAAD concluded in its 2023 Social Media Safety Index (SMSI) that anti-LGBTQ rhetoric and disinformation on social media:

  • translates to real-world offline harms
  • is a public health and safety issue
  • is largely unmitigated by platforms, which fail to adequately enforce their own policies

The Biden administration released the National Strategy to Counter Antisemitism in May 2023. This plan aims to reduce hate speech against all minority groups, including Black and Hispanic Americans; Asian Americans, Native Hawaiians, and Pacific Islanders; LGBTQI+ individuals; Muslim Americans; and women and girls.

Goals specific to online activities are included in the plan. Strategic Goal 3.2, “Tackle Antisemitism Online,” calls on online platforms to:

  • Establish “zero-tolerance” terms of service and community standards for hate speech.
  • Permanently ban repeat offenders, including personal accounts and extremist websites.
  • Allocate resources for effective enforcement of terms of service and community standards.
  • Enhance capabilities to prevent the recommendation and promotion of antisemitic and other hateful content.
  • Increase transparency of algorithmic recommendation systems and data, allowing public interest research to understand and improve content moderation tools.
  • Treat antisemitism as a distinct category in transparency reports, reporting on the volume of antisemitic content addressed on platforms.
  • Support and empower trained community moderators to address hate speech and bias, including antisemitism and related tropes.

Illegal Drug Sales

In 2021, the U.S. Drug Enforcement Administration (DEA) investigated 80 overdose death cases involving drug trafficking on internet apps. The DEA reports that drug traffickers advertise on social media platforms such as Facebook, Instagram, Snapchat, TikTok, Twitter, and YouTube, using disappearing 24-hour stories, widely known emojis, and code words to market and sell illegal drugs. Buyers looking for these drugs reply in the comment sections of posts or send direct messages to the drug trafficker. Once contact is established, the parties often switch to an encrypted communications app such as WhatsApp, Signal, or Telegram. Payment for drugs is often made using apps like Venmo, Zelle, Cash App, and Remitly.

Colorado’s state legislature passed a fentanyl law in 2022 (24–31–116, C.R.S.), which directed the Colorado Department of Law to perform a study of how internet and social media platforms are used for the sale and distribution of fentanyl and fentanyl-like products. The Colorado Attorney General released this report in March 2023, focusing on social media platforms because they are not currently addressed by legislation or public policy interventions. The report made the following recommendations:

  • Social media platforms should adopt a uniform, robust set of best practices to prevent and respond to illicit drug activity.
  • More resources should be provided to support existing law enforcement efforts to combat drug distribution online.
  • There should be an increased focus on expanding internet and social media literacy for parents and caregivers.
  • A federal agency should be empowered to oversee social media platforms.
  • Federal legislation should be enacted requiring greater transparency of and access to social media platforms’ data.

Harmful Online Challenges

Harmful online challenges among youth are on the rise. Challenges covered in the media in recent years include the Blue Whale Challenge, the Tide Pod Challenge, the Skull-Breaker Challenge, the salt and ice challenge, the Benadryl Challenge, and The Choking Game (TCG)/Blackout Challenge. The TCG/Blackout Challenge is the most well known, due in part to videos of people demonstrating the challenge being widely shared on internet platforms. TCG can result in death and permanent injuries such as chronic headaches, amnesia, convulsions, and stroke. Erik’s Cause, a 501(c)(3) public charity dedicated to teaching students and parents about the harms of online challenges, has documented 1,385 deaths worldwide from TCG since 1934, and has published a white paper with more information on these destructive challenges.

Online Groups that Promote Eating Disorders and Suicide

A January 2020 study in the International Journal of Eating Disorders concluded that:

“A clear pattern of association was found between SM [Social Media] usage and DE [Disordered Eating] cognitions and behaviors with this exploratory study confirming that these relationships occur at younger-age than previously investigated.”

This study also found that the incidence of disordered eating increased in concert with the number of social media platforms used by both girls and boys.

Another study, published in February 2021 in the International Journal of Environmental Research and Public Health, found that recent scientific literature has identified a growing amount of pro-anorexia (“pro-ana”) and pro-bulimia (“pro-mia”) online content, which encourages anorexic and bulimic behaviors, especially in female teens.

Media outlets including Vice and the New York Times have documented the existence of online forums that encourage suicide and provide detailed information on suicide methods. A 2018 article in the Journal of Affective Disorders documented the existence of online pro-suicide discussion forums on both the public web and on dark web platforms.


Cyberbullying

Cyberbullying has long been a known danger for young people. A 2022 study published in JAMA Network Open, which surveyed 10,414 American children aged 10 to 13, found that online bullying is now the most common type of bullying. It contributes significantly to adolescents feeling suicidal, even more so than other forms of peer aggression and known risk factors. The study suggests that organizations include an evaluation of cyberbullying experiences when assessing the risk of suicide in children and teenagers. A 2022 Pew Research Center study of US teens aged 13–17 found that:

  • 46% experienced cyberbullying
  • 22% had false rumors spread about them
  • 17% received explicit images they didn’t ask for
  • 15% were constantly asked where they are, what they are doing, or who they are with by someone other than a parent
  • 10% received physical threats
  • 7% had explicit images of them shared without their consent
  • Older teen girls stood out for experiencing multiple types of cyberbullying behaviors

Sextortion/Catfishing/Revenge Pornography/Online Commercial Sexual Exploitation

A 2022 study published in JAMA Network Open surveyed 2,639 adults (roughly evenly split between those identifying as male and female, with 1.8% reporting another gender, across a mix of races and ethnicities) and asked whether they had experienced online sexual abuse when they were under 18 years old. Respondents reported:

  • Online child sexual abuse: 15.6%
  • Image-based sexual abuse: 11.0%
  • Self-produced child sexual abuse images: 7.2%
  • Non-consensual sexting: 7.2%
  • Online grooming by adults: 5.4%
  • Revenge pornography: 3.1%
  • Sextortion: 3.5%
  • Online commercial sexual exploitation: 1.7%

The authors noted that the most vulnerable age group was 13 to 17 years old. Contrary to widespread media reports, the aggressors in most types of harms were largely dating partners, friends, and acquaintances, not online strangers.

The Stanford Internet Observatory (SIO) released a report in June 2023 evaluating the prevalence of self-generated child sexual abuse material on several large online social media and internet communication platforms. SIO found that Instagram was the most important platform for large numbers of accounts that present themselves as being run by minors and market self-generated child sexual abuse material for sale.

Screen Addiction

A 2022 study, “Digital Addiction,” by researchers from Microsoft, New York University, and Stanford cited studies documenting that residents of the United States (U.S.) check their smartphones 50–80 times per day. The authors found that when individuals are given the ability to set limits on their future screen time, they spend less time on screens, pointing to problems with self-control over screen use. They noted that other studies indicate people are unaware of developing habits and do not realize when they are having self-control difficulties. The authors concluded that “self-control problems account for 31 percent of social media use.”

A December 2022 study by the Pew Research Center found that 93% of U.S. adults surveyed use the internet and 85% reported owning a smartphone. Another 2022 Pew Research study found that a majority of teens surveyed used TikTok and YouTube every day, and that 54% said it would be hard to give up social media. A third survey found that 45% of teens surveyed are on their smartphones “almost constantly.”

The U.S. Surgeon General’s Advisory on Social Media and Youth Mental Health concluded that obsessive or unmanageable social media use has been linked to sleep problems, attention problems, and feelings of loneliness among teens.

Human Trafficking

The U.S. Department of Homeland Security launched its “Blue Campaign” to educate the U.S. public regarding the dangers of human traffickers. Traffickers use romance scams, false offers of good-paying jobs, and violence to lure people into situations where they can be exploited: forced sex work, forced labor, and unpaid or underpaid domestic work as nannies and maids. People who are trafficked work in cities, suburbs, and rural areas, literally anywhere in the U.S. In 2021, the National Human Trafficking Hotline validated 10,360 human trafficking situations. A 2018 survey of trafficking survivors found that 55% were recruited via text, website, or app.

Extremist Recruiting

The Southern Poverty Law Center and American University’s Polarization and Extremism Research and Innovation Lab (PERIL) created a guide for parents and caregivers to help identify and combat online radicalization during the COVID-19 pandemic. This guide was developed due to concerns that children and teens would be spending more time on internet platforms during the pandemic, and that parents and other caregivers needed to be aware of the causes and methods of online radicalization. Methods that extremist groups use to recruit new members include:

  • Content “rabbit holes”: feed algorithms that present progressively more violent content.
  • “Filter bubbles”: long periods of time spent with users who share a certain viewpoint.
  • Peer sharing: other kids may share propaganda, often in the form of dark memes or videos displaying “edgy” jokes. These types of media can dehumanize entire groups of people and normalize violence against them, leading to further radicalization.
  • Direct contact with extremists online: nearly all platforms, including online gaming and social media, have been infiltrated by extremists. Recruiters for these organizations communicate with youth through direct messages and other forms of personally directed communications.

Parents and other caregivers can take action to prevent radicalization.

  • Take it seriously. If your child or teen starts telling “edgy” jokes, talks about “dark memes,” or starts to parrot hate-based speech, ask them where they heard it.
  • Create a record. If you can, file a report with your school or with the school district. Create a written record and keep a copy for yourself. If you do not receive a response, send a written follow-up communication.
  • Discuss online safety and privacy practices.
  • Remind children that extremists are relatively few in number.
  • Get Help. Recommended resources include the Victim Connect Resource Center website or hotline, 1–855–4-VICTIM, and other organizations listed in the report.

Alex Peiser’s Story

Photo Credit: Author

Alexander Clifford Peiser was a sensitive, wickedly smart guy with a wry sense of humor. He loved acting, video games, the internet and the outdoors. On October 10, 2017, at age 17, Alex died by suicide. His online activities influenced the mental state that caused him to take his life.

Internet and social media industry guidelines and terms of service are clearly inadequate to make online experiences safe for all users. Federal legislation, such as the Kids Online Safety Act of 2023 and the Children and Teens’ Online Privacy Protection Act (COPPA 2.0), is necessary to increase online safety for our children.

Acknowledgements: Many thanks to the members of Fairplay’s Screen Time Action Network Online Harms Prevention Work Group for sharing your research and lived experiences. You are brave, remarkable people.

Social Media Harms was created to help spread information to the public regarding the harms created by social media and other internet platform use. It contains lists of peer-reviewed studies, books, and articles from media outlets that follow journalistic standards, all documenting the negative effects of social media.

This work is licensed under the Creative Commons Attribution 4.0 International License. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/ or send a letter to Creative Commons, PO Box 1866, Mountain View, CA 94042, USA.



Sharon Winkler
Publisher/Editor Social Media Harms, Mother, Grandmother, Retired U.S. Naval Officer