Social Media Harms
“Blackout Challenge”: A Deadly Social Media Game
Parent Sues TikTok After December 2021 Death of 10-Year-Old Daughter
Social media challenges have existed since these platforms came into widespread use in the late 2000s. The “Blackout Challenge” dares users to choke themselves until they pass out. This challenge circulates under different labels, such as the “Choking Game” and “thepassoutchallenge.”
Tawainna Anderson is suing TikTok and its parent company ByteDance in the United States District Court for the Eastern District of Pennsylvania for the wrongful death of her 10-year-old daughter, Nyla. Anderson alleges that TikTok’s service has a “defective design,” allowing the “For You Page” algorithm to recommend the “Blackout Challenge” as an appropriate video for her 10-year-old daughter. Nyla apparently lost consciousness after choking herself and died on December 12, 2021, after several days in the pediatric intensive care unit.
Twelve-year-old Joshua Haileyesus died in April 2021, three weeks after attempting the TikTok “Blackout Challenge.” Joshua’s death was well documented in mainstream media outlets, including The Charlotte Observer, Yahoo, and MSN.
Why These Challenges Are So Dangerous
Jean Rogers, Director of the Screentime Action Network, stated in the podcast No App for Life that many pre-teens and teens do these challenges alone and film them in order to post to social media platforms to get “likes” and additional “friends” or followers.
Children at this age find these videos funny and cannot see the dangers inherent in attempting these challenges. Compounding this misperception is the fact that no videos show users dying while attempting these challenges. She concluded:
“Involved parents, but it [tragedy] still happens”
TikTok Denies Challenges Are Occurring
In his testimony to the Senate on October 26, 2021, Michael Beckerman, Vice President and Head of Public Policy, Americas, TikTok, stated that:
“With regards to reports of scheduled challenges on TikTok, as a prominent disinformation researcher who focuses on TikTok recently pointed out: ‘When I looked into this, I could not find a single TikTok actually endorsing this behavior. All evidence indicates this is a hoax turned into reality by local news and school districts reacting to completely unconfirmed rumors.’”
Joshua Haileyesus’ death in April 2021 disproves Beckerman’s statement.
Better Social Media Moderation Results in Safer Online Experiences
Increased moderation by social media platforms is the best way to prevent these heartbreaking incidents. Good content moderation, however, is both labor-intensive and emotionally distressing for moderators. Many social media companies try to mitigate these problems by using artificial intelligence (AI) algorithms for moderation, with poor results. Internal reports released in the Facebook Papers reveal studies by Facebook engineers estimating that AI removes only 5 percent of hate speech. Yet Meta CEO Mark Zuckerberg testified before Congress that the company removed 94 percent of the hate speech it finds before users report violations.
Regulation is Key
Effective moderation is expensive and directly affects technology company profits. U.S. governmental regulation is the only effective incentive for social media companies to design and build products that are safer for all users.
U.S. children and teens deserve a safer internet, and U.S. governmental regulation is the only way to achieve it.
Contact your U.S. federal elected officials and ask them to endorse the
Social Media Harms provides a listing of peer-reviewed studies, scholarly books, and articles from authoritative sources that document the negative effects of social media use. The site also lists links to organizations dedicated to reducing the harms created by social media platforms and other online services. We do not solicit donations; however, we do welcome additions to our lists of peer-reviewed studies and authoritative books and articles.