Social Media Harms

Trailblazing Advocate: LGBTQ+ Lawyer and Parent Champions Kids Online Safety Act (KOSA) of 2023

KOSA “Ferocious Parenting to Protect Kids”

Sharon Winkler
Social Media Harms


Credit: Eating Disorders Coalition

Laura Marquez-Garrett, a lawyer with the Social Media Victims Law Center who specializes in electronic evidence and forensic investigations, was recently interviewed by media and communications expert Joni Siani on the “Accountable” episode (Season 2) of the “No App for Life with Joni Siani” podcast about her views on the Kids Online Safety Act of 2023.

Marquez-Garrett compares today’s online platforms with those that existed when Section 230 of the Communications Decency Act was passed 27 years ago. She was in college when internet bulletin boards became widespread, providing her, as a member of the LGBTQ+ community, with resources and contacts she might not have had without online forums. At that time, Section 230 protected users by allowing platforms to set and enforce terms of use, including moderating content.

Today’s internet platform technologies, especially social media, are very different, she explains. Because federal consumer protection regulations have not been updated, protections have shifted from the user to the online platforms. Internet companies are not required to disclose design features or operational decisions, and parents and children are not given agency to decide what they see; the platforms make those decisions for them. This shifting dynamic changes the entire online experience and makes it radically different for parents and their children. As an example, Marquez-Garrett describes the experience of a typical girl under 16 years old who signs up for Snapchat: the quick-add process often recommends adult users the child does not know, exposing minor girls to predators. This is very different from her own online experiences as a teen and young adult. She states:

“what it [the internet] didn’t do, was: it didn’t target me, it didn’t exploit me, it didn’t harm me, it was a very different online world.”

Marquez-Garrett was skeptical of the first draft of KOSA in 2022 because she thought its early language was open to many different interpretations. She changed her mind when she read the revised 2023 bill.

“The co-drafters of KOSA meet with LGBTQ+ groups, get their input…and then this incredible thing happens…KOSA is reborn…they made real changes…I cried when I read it…because not only was this something I could 100 percent stand behind…as a parent, as a member of the LGBTQ+ community…every time I read it now, I see how this bill will save kids.”

Four KOSA provisions were highlighted by Marquez-Garrett as especially important for kids’ online safety. A big concern expressed by many is that kids won’t have access to information they need; KOSA ensures that LGBTQ+-identifying kids will still have online access to information. Section 2(3)(B) excepts not-for-profit organizations and public and private educational institutions (preschool through higher education) from the definition of “covered platform.” Marquez-Garrett explains that under this definition, if a social media company suppressed information on, for example, trans youth health, nothing in KOSA would stop a not-for-profit organization from creating its own website to provide evidence-informed information and clinical services.

Duty of Care. Marquez-Garrett points out that this section gives parents and children the right to hold covered platforms accountable for design decisions and operations that cause six specific types of harm to minors. This is necessary because companies are making content choices for users that prioritize engagement with their products over user safety. She points to the Facebook Papers as evidence of this business practice.

KOSA Section 3(a) states that “a covered platform shall take reasonable measures in the design and operation of any product, service, or feature that the covered platform knows is used by minors to prevent and mitigate” six specific categories of harm. Section 3(b) states that nothing in subsection (a) shall be construed to require a covered platform to prevent or preclude (1) any minor from deliberately and independently searching for, or specifically requesting, content; or (2) the covered platform or individuals on the platform from providing resources for the prevention of the harms described in subsection (a), including evidence-informed information and clinical resources.

Safeguards for Minors. Marquez-Garrett emphasizes several provisions in Section 4, Safeguards for Minors, which requires “readily-accessible and easy-to-use safeguards” for users under 17 years old to:

  • Limit the ability of unknown individuals to communicate with minor users. Peer-reviewed research has shown that direct messaging is the main method by which targeted youth are contacted online. A lawsuit brought by the Social Media Victims Law Center cites the Facebook Papers as disclosing that Meta’s “People You May Know” feature “contributed up to 75% of all inappropriate adult-minor contact.”
  • Enable the highest-level safety settings by default.
  • Prevent other users from viewing minor users’ personal data.
  • Opt out of personalized recommendation systems.
  • Limit features that extend a user’s time on the platform through addictive design.

Section 4 also:

  • Prohibits dark patterns: “design elements that deliberately obscure, mislead, coerce and/or deceive website visitors into making unintended and possibly harmful choices.”
  • Bans the advertising of illegal products.
  • Requires platforms to provide a dedicated reporting channel to alert the platform to harms to minors, and to respond substantively in a reasonable and timely manner. Covered platforms with more than 10,000,000 monthly active users (MAU) have 10 days after receipt of a report to respond; covered platforms with fewer than 10,000,000 MAU have 21 days. If a report involves an imminent threat to the safety of a minor, the platform must act as promptly as needed to address the reported threat.

KOSA requires that youth be notified if parental controls are turned on, and it DOES NOT require the disclosure of minors’ browsing behavior, search history, private messages, or other communications.

Section 6, Transparency. KOSA requires annual, independent, third-party risk assessments, the findings of which are made available to the public.

Marquez-Garrett emphasizes that technology companies currently control what minors view online. They also make it possible for children to open online accounts without their parents’ permission. KOSA is essential for minor users and their parents to regain agency over their online experiences. She ends by summing up the sentiment shared by many of the families she works with who have experienced online harms:

“I have to speak up because if it happened to us, it can happen to anyone.”

Don’t let another family suffer a tragic death or injury. Contact your elected officials now and ask them to support the Kids Online Safety Act of 2023. Fairplay has made it easy to contact your elected representatives here.

Social Media Harms was developed to provide a listing of peer-reviewed studies, books, and articles from authoritative sources that document the negative effects of social media use. It is meant to be a tool for people who are concerned about social media’s negative effects on people’s emotions, actions, and lives. We do not solicit donations; however, we are asking for additions to our lists of peer-reviewed studies and authoritative books and articles.

This work is licensed under the Creative Commons Attribution 4.0 International License. To view a copy of this license, visit or send a letter to Creative Commons, PO Box 1866, Mountain View, CA 94042, USA.



Sharon Winkler
Social Media Harms

Publisher/Editor Social Media Harms, Mother, Grandmother, Retired U. S. Naval Officer