Social Media Harms

Protecting Our Children: A Survivor Parent’s Arguments Supporting the Kids Online Safety Act (KOSA) of 2023

It’s Time to Give Parents, Teens and Children More Control Over Their Online Experiences

Sharon Winkler


Photo by Mike Scheid on Unsplash

On October 10, 2017, I received an urgent call at work from my husband: “Come home now.” Little did I know that Alex had not gone to school as expected. Instead, he had stayed home and died by suicide. I knew that he was upset over the ending of a romantic relationship, but I could tell that something more was going on. He brushed off my attempts to find out what was wrong, telling me he was o.k. It wasn’t until after he was gone that I realized that users online had influenced the mental state that led him to take his life.

To be a survivor parent of a child who died by suicide in the United States is to belong to a secret society bound by pain and shame. To suggest that our children’s online activities might have influenced their deaths often draws skeptical looks, paternalistic platitudes and outright denials. Worse, because of the lack of internet regulation, stonewalling by technology companies and the legal shields protecting Big Tech companies from liability, parents feel helpless to effect any change that would make these platforms safer.

In the past few years, U.S. Senate hearings, revelations of internet company abuses and cover-ups, and changes in academic sentiment have provided a ray of hope that parents, teens and children may gain more control over their online lives. The academic community has debated for years whether social media use has any negative effects on mental health. With the release of the U.S. Surgeon General’s Advisory on Social Media and Youth Mental Health and the American Psychological Association’s Health Advisory on Social Media Use in Adolescence, prominent medical organizations are acknowledging that social media use may harm at-risk children and adolescents. In May 2023, the APA recommended that internet platforms:

“Tailor social media use, functionality and permissions to youths’ developmental capabilities; designs created for adults may not be appropriate for children.”

Congress must pass legislation that forces internet platforms to design safe services for minors and to mitigate the spread of specifically defined harmful content to children and teens. The Kids Online Safety Act of 2023 (KOSA) would provide those new protections. Re-introduced on May 2, 2023 by Senator Richard Blumenthal (D-CT) and Senator Marsha Blackburn (R-TN), this version incorporates changes, requested by activists and online privacy groups, to the legislation initially introduced in February 2022 and approved by the Senate Commerce Committee. As of May 17, 2023, KOSA had 33 co-sponsors. KOSA has not yet been scheduled for committee mark-up, so time remains to update the legislation to fill gaps and mitigate unintended consequences.

“Kids & parents want to take back control over their online lives. They are demanding safeguards, means to disconnect, & a duty of care for social media.” (Sen. Richard Blumenthal)

Significant KOSA Provisions:

Duty of Care

Establishes a legal requirement making internet platforms responsible (meaning they can be sued) for directing content to users under 17 years old that encourages the following: suicidal behaviors; patterns of use that indicate or encourage addiction-like behaviors; physical violence; sexual exploitation and abuse; promotion and marketing of narcotic drugs, tobacco products, gambling, or alcohol; and predatory, unfair, or deceptive marketing practices or other financial harms.

KOSA also requires that platforms take reasonable measures in the design and operation of products and services to prevent and mitigate: anxiety, depression, eating disorders, substance use disorders, encouragement of addiction-like behaviors and encouragement of physical violence. KOSA explicitly allows the distribution of evidence-informed medical information on these topics.

KOSA 2023 uses the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (or its most current successor edition) to define the above mental health conditions. This change was made to narrow the subjects and content that platforms could be sued for allowing minors to access.

This provision specifically does not include “gender dysphoria” as a topic for which platforms could be sued for presenting content to minors. The terms “grooming” and “self-harm” were also removed; some advocacy groups had interpreted them as giving state attorneys general overly broad authority to curb online speech.

Covered platforms are not required to prevent or preclude any minor from deliberately or independently searching for, or specifically requesting, content, nor are they prevented from providing resources for the prevention of the harms listed, including evidence-informed information and clinical resources.

Enhance User Controls and Provide Effective Harm Reporting Systems

Requires platforms to provide readily accessible, optional settings to:

• Limit the ability of unrecognized users to communicate with minor users;

• Prevent other users from viewing minor users’ personal data;

• Limit features that extend user time on the platform via addictive design;

• Restrict sharing of minor users’ geolocation;

• Allow minor users to delete their accounts and any personal data collected;

• Limit the amount of time spent on the platform.

Highlighted User Control/Parental Control notes:

KOSA requires that youth be notified if parental controls are turned on.

KOSA does not require the disclosure of minors’ browsing behavior, search history, private messages, or other communications.

Requires platforms to provide a dedicated reporting channel to alert the platform to harms to minors, and requires them to substantively respond in a reasonable and timely manner.

Requires Covered Platforms to Share Data and Perform Annual Risk Assessments

Requires covered platforms to share data with qualified researchers and to participate in independent, third-party risk audits whose findings are shared with the public. This includes information regarding the platforms’ user recommendation systems.

Photo of Alex Peiser from author’s collection

It is my greatest wish that internet platforms be redesigned with the safety of their users as the highest priority. It appears that the senior management of many of these companies does not share that wish. Given the Supreme Court of the United States’ recent decision not to create exceptions to the U.S. laws shielding Big Tech companies from liability for their abuses, federal and state legislation is the only avenue available to ensure online protections for our children and teens.

Don’t let another family suffer a tragic death or injury. Contact your elected officials now and ask them to support the Kids Online Safety Act of 2023.

Social Media Harms was developed to provide a listing of peer-reviewed studies, books and articles from authoritative sources that document the negative effects of social media use. It is meant to be a tool for people who are concerned about social media’s negative effects on people’s emotions, actions and lives. We do not solicit donations; however, we do ask for additions to our lists of peer-reviewed studies and authoritative books and articles.

This work is licensed under the Creative Commons Attribution 4.0 International License. To view a copy of this license, visit or send a letter to Creative Commons, PO Box 1866, Mountain View, CA 94042, USA.



Sharon Winkler
Publisher/Editor, Social Media Harms; Mother, Grandmother, Retired U.S. Naval Officer