Social Media Harms

Safeguarding Young Internet Users: Senate Commerce Committee Greenlights Kids Online Safety Act and COPPA 2.0

U.S. Congress Takes First Steps to Enhance Online Privacy and Safety for Children and Teens

Sharon Winkler
Social Media Harms



Passage of the Kids Online Safety Act (KOSA) of 2023 and the Children and Teens' Online Privacy Protection Act (COPPA 2.0) by the Senate Commerce Committee on July 27, 2023, marks a significant milestone in ongoing efforts to protect children and teenagers online. These acts aim to address growing concerns about online safety, data privacy, and the risks faced by young internet users. The next step in the legislative process is for U.S. Senate leadership to schedule the bills for debate and a vote by the full Senate.

With the release of the U.S. Surgeon General’s Advisory on Social Media and Youth Mental Health and the American Psychological Association’s Health Advisory on Social Media Use in Adolescence, prominent medical organizations are acknowledging that social media use may result in harm to at-risk children and adolescents. In response to this growing concern, the Kids Online Safety Act and COPPA 2.0 have been reintroduced to strengthen the protection of children’s privacy and safety online.

On May 2, 2023, Senators Richard Blumenthal (D-CT) and Marsha Blackburn (R-TN) reintroduced the Kids Online Safety Act (KOSA). This version incorporates changes that activists and online privacy groups requested to the legislation initially introduced in February 2022, resulting in a bill that ensures privacy and search capabilities for minors while empowering parents and guardians to set security features. As of July 27, 2023, KOSA had 43 co-sponsors.

KOSA Major Provisions include:

Duty of Care

The act makes internet platforms legally responsible for content directed at users under 17 years old that encourages specific harmful actions including: suicidal behaviors, addiction-like patterns of use, physical violence, sexual exploitation and abuse, promotion and marketing of narcotic drugs, tobacco products, gambling, or alcohol, and predatory, unfair, or deceptive marketing practices or other financial harms.

KOSA also requires that platforms take reasonable measures in the design and operation of products and services to prevent and mitigate:

  • Anxiety;
  • Depression;
  • Eating disorders;
  • Substance use disorders;
  • Encouragement of addiction-like behaviors;
  • Encouragement of physical violence.

KOSA explicitly allows the distribution of evidence-informed medical information regarding the above topics.

The Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (or the most current successor edition), is used to define the above mental health conditions. This change was made to narrow the range of content for which platforms could be sued for allowing minors access.

This provision specifically does not include “gender dysphoria” as a topic for which platforms could be sued for presenting to minors. The terms “grooming” and “self-harm” were also removed; some advocacy groups had interpreted those terms as giving state attorneys general overly broad authority to curb online speech.

Covered platforms are not required to prevent or preclude any minor from deliberately or independently searching for or specifically requesting content, nor from providing resources for the prevention of the harms listed, including evidence-informed information and clinical resources.

Enhanced User Controls and Effective Harm Reporting Systems

KOSA requires platforms to provide readily accessible, optional settings that:

  • Limit the ability of unrecognized users to communicate with minor users or view their personal data, including geolocation data;
  • Limit features that extend user time on the platform through addictive design;
  • Allow minor users to delete their accounts and any collected personal data, and to limit the amount of time they spend on the platform.

KOSA also requires platforms to provide a dedicated reporting channel for alerting the platform to harms to minors, and to respond substantively in a reasonable and timely manner.

Highlighted User Control/Parental Control notes:

KOSA requires that youth are notified if parental controls are turned on.

KOSA does not require the disclosure of minors’ browsing behavior, search history, private messages, or other communications to their parents or legal guardians.

Requires Covered Platforms to Share Data and Perform Annual Risk Assessments

KOSA requires covered platforms to share data with qualified researchers and to participate in independent, third-party risk audits whose findings are shared with the public. This includes information regarding any platform user recommendation systems.

Children and Teens’ Online Privacy Protection Act (COPPA 2.0)

The Children and Teens’ Online Privacy Protection Act (COPPA 2.0) was reintroduced by Senators Edward Markey (D-MA) and Bill Cassidy, MD (R-LA) on May 3, 2023. This legislation updates children’s privacy protections passed in 1998 when internet services were far less complex. COPPA 2.0 aims to address the evolving challenges posed by social media platforms and smartphones in collecting personal data online from minors and to correct flaws in the original COPPA legislation.

COPPA was meant to minimize the amount of personal data online platforms could collect from users under 13 years old. In his Washington Post article, “Your kids’ apps are spying on them,” Geoffrey Fowler reported on two loopholes that have essentially gutted COPPA enforcement. First, COPPA applies only when platforms have “actual knowledge” that a user is under 13; if a user simply claimed to be over 13, companies argued that COPPA regulations did not apply. Second, platforms could skirt COPPA and Federal Trade Commission (FTC) regulations by marketing apps and online services attractive to under-13 users as intended for “general audiences.”

“By the time a child reaches 13, online advertising firms hold an average of 72 million data points about them, according to SuperAwesome, a London-based company that helps app developers navigate child-privacy laws.” — Geoffrey Fowler, Washington Post

Per Senator Markey, the updated COPPA 2.0 would:

  • Prohibit internet companies from collecting personal information from users under age 16 without the user’s consent;
  • Revise COPPA’s “actual knowledge” standard to cover platforms that are “reasonably likely to be used” by children and to protect users who are “reasonably likely to be” children or minors;
  • Establish a “Digital Marketing Bill of Rights for Teens” that limits the collection of teens’ personal information;
  • Ban targeted advertising directed at children (as opposed to contextual advertising, for example, directing advertisements to a user based upon the content of a visited website);
  • Create an “Eraser Button” for parents and kids by requiring companies to permit users to eliminate a child’s or teen’s personal information when technologically feasible;
  • Establish a Youth Marketing and Privacy Division at the Federal Trade Commission (FTC).

The United States is lagging behind other countries in regulating internet privacy protections for its citizens, and the introduction of these acts is a crucial step toward ensuring the safety and privacy of children and teens online.

Additional actions must be taken by Congress for this legislation to become law.

Contact your elected officials now and ask them to urge leadership in the U.S. Senate and House of Representatives to put the Kids Online Safety Act of 2023 and the Children and Teens’ Online Privacy Protection Act (COPPA 2.0) up for a vote. Readers can find information on how to contact their senators here and representatives here.

Social Media Harms was developed to provide a listing of peer-reviewed studies, books, and articles from authoritative sources that document the negative effects of social media use. It is meant to be a tool for people who are concerned about social media’s negative effects on people’s emotions, actions, and lives. We do not solicit donations; however, we welcome additions to our lists of peer-reviewed studies and authoritative books and articles.

This work is licensed under the Creative Commons Attribution 4.0 International License. To view a copy of this license, visit or send a letter to Creative Commons, PO Box 1866, Mountain View, CA 94042, USA.




Publisher/Editor Social Media Harms, Mother, Grandmother, Retired U. S. Naval Officer