New York’s Revised “SAFE for Kids Act” Still Faces a Legal Minefield

First Amendment Legal Challenge Is Likely

Jess Miers
Chamber of Progress
6 min read · May 30, 2024


Created in DALL-E

As New York approaches the end of its legislative session, the proposed New York SAFE for Kids Act (S7694/A8148) is a hot topic. As I wrote back in March, the bill, which forces social media platforms to forgo algorithmic curation of feeds for users under 18, has significant constitutional issues.

While the bill has undergone several changes since my initial analysis, the new draft in circulation does nothing to mitigate its legal infirmities.

Key Changes

In a notable effort to fix the bill’s initial problems, New York lawmakers recently made the following key changes in a working draft:

  • Removed Time Restrictions: The original version of the SAFE for Kids Act made it unlawful for platforms to send notifications to minor users between 12am and 6am. This restriction has been removed.
  • Removed Age Flag Requirements: The original bill required platforms to treat any user as a minor if the user’s device signaled or communicated that the user is a minor. This requirement has been removed.
  • Removed Private Right of Action: The original bill enabled private lawsuits against social media platforms for violations of the bill. The updated version limits enforcement to the New York Attorney General.
  • The Attorney General Shall Promulgate Rules: The original bill made AG rule promulgation voluntary. The updated version requires the AG to promulgate rules for enforcing the bill.

While these amendments represent meaningful progress, they do not address the significant constitutional issues that continue to plague the Act.

The Bill Still Runs Head First Into the First Amendment

The New York SAFE for Kids Act still maintains a preference for chronological feeds over personalized, algorithmically curated content, relying on the unsupported belief that the former is inherently safer. This view not only disregards compelling evidence to the contrary but also overlooks the significant advantages of algorithmic curation, which can greatly enhance online experiences by customizing content to individual preferences.

From a legal perspective, the creation and use of algorithmically curated feeds are central to the editorial functions of online platforms, which are protected under the First Amendment. This protection affirms platforms’ rights to determine their content presentation methods.

Recent legal decisions, such as in NetChoice v. Bonta, NetChoice v. Griffin, and NetChoice v. Yost, reinforce the principle that algorithmic curation, serving as a mechanism for speech delivery and user engagement, is protected by the First Amendment. Any legislation that prescribes specific content delivery methods, inherently favoring one type of feed over another, amounts to government regulation of protected speech.

For instance, the NetChoice v. Bonta decision critically noted:

“[T]he Act’s restrictions on the functionality of the services limit the availability and use of information by certain speakers and for certain purposes and thus regulate[s] protected speech.”

Echoing this sentiment, the NetChoice v. Griffin decision stated:

“If the State’s purpose is to restrict access to constitutionally protected speech based on the State’s belief that such speech is harmful to minors, then arguably Act 689 would be subject to strict scrutiny.”

As long as the New York bill continues to restrict how content is curated and displayed, its First Amendment problems will persist.

The Bill’s Age Verification Requirement is Still Unconstitutional

The revised version of the bill eliminates the requirement that online platforms identify users as minors based on device signals. However, the bill still mandates that platforms differentiate between users under 18 and those 18 or older in how content is displayed.

Practically, this means that any platform with a curated feed — be it YouTube’s recommended videos, Facebook’s personalized Feed, or Etsy’s content listings, which are ranked by a listing quality score — must implement age verification to ensure that minors receive only non-curated, chronological feeds. Failure to comply with this could lead to enforcement actions by the Attorney General.

As a result, the bill still compels platforms to use “commercially reasonable methods” for age verification, inadvertently leading them to collect extensive sensitive data from minor users. This increases the risk of security breaches. The recent NetChoice challenges also underscore how age verification impedes access to legally available information, creating a significant conflict with fundamental principles of free speech and privacy.

For example, the Court’s injunction against the California Age-Appropriate Design Code criticized the law for exacerbating privacy and security problems, rather than mitigating them, by compelling the collection of additional personal data:

“Based on the materials before the Court, the CAADCA’s age estimation provision appears not only unlikely to materially alleviate the harm of insufficient data and privacy protections for children, but actually likely to exacerbate the problem by inducing covered businesses to require consumers, including children, to divulge additional personal information.” — NetChoice, LLC v. Bonta, No. 22-cv-08861-BLF (N.D. Cal. Sep. 18, 2023).

In a similar vein, the U.S. District Court for the Western District of Arkansas in NetChoice v. Griffin underscored how age verification significantly impedes adults’ access to constitutionally protected speech, deterring engagement with online platforms:

“Requiring adult users to produce state-approved documentation to prove their age and/or submit to biometric age-verification testing imposes significant burdens on adult access to constitutionally protected speech and ‘discourage[s] users from accessing [the regulated] sites.’” (citing Reno v. ACLU).

As long as the bill mandates differential treatment of users based on age, it inherently requires social media platforms to implement age verification — a measure that is both constitutionally and practically problematic.

Teens Have First Amendment Rights Too

The bill still imposes access barriers on users under 18, undermining their access to critical online support and community.

Democratic FTC Commissioner Alvaro Bedoya spotlighted the significant chilling effect this legislation could have on teenagers seeking essential support for mental or physical health issues through social media communities like r/stopdrinking, r/mentalhealth, r/EatingDisorders, and r/AuntieNetwork. These communities often serve as vital lifelines, and for New York’s youth, the bill risks cutting off these crucial connections.

This concern is not hypothetical but grounded in legal findings. The U.S. District Court for the Northern District of California in NetChoice v. Bonta acknowledged these risks, particularly for LGBTQ+ youth in hostile environments, emphasizing the challenge they may face in accessing vital information online:

“LGBTQ+ youth — especially those in more hostile environments who turn to the internet for community and information — may have a more difficult time finding resources regarding their personal health, gender identity, and sexual orientation.”

Moreover, the bill fails to recognize the established rights of minors. As the Supreme Court reinforced in Brown v. Entertainment Merchants Association, minors possess a First Amendment right to access legally protected speech, including individually curated content, independent of parental or governmental oversight.

AG Enforcement Invites a New Array of Issues

Lastly, while the removal of the private right of action is a positive development, the reliance on enforcement by the New York Attorney General introduces a set of new concerns.

The updated bill assigns the AG responsibility for formulating rules on how the bill should be enforced, creating a potential conflict of interest. Tasking the AG with both creating rules (a legislative function) and enforcing them (an executive function) can produce biased interpretations that favor a particular enforcement agenda or align with political interests.

Additionally, this dual role can diminish the incentive to ensure that the rules are clear, fair, and reasonable. This can result in more aggressive or punitive enforcement actions than if the rules were established by an independent, neutral body.

Moreover, the concentration of power in the AG’s office could lead to abuses, especially in politically sensitive contexts where the AG could potentially target political adversaries or protect allies. This politicization of enforcement can make compliance a matter of political strategy rather than practical adherence, varying significantly between administrations. For instance, a Republican AG might approach enforcement quite differently from a Democratic AG, complicating the landscape for all platforms subject to the law.

In sum, New York’s SAFE for Kids Act still suffers from both legal and practical defects. If passed in its current form, it will almost certainly face a pre-enforcement legal challenge.

Chamber of Progress (progresschamber.org) is a center-left tech industry association promoting technology’s progressive future. We work to ensure that all people benefit from technological leaps, and that the tech industry operates responsibly and fairly.

Our work is supported by our corporate partners, but our partners do not sit on our board of directors and do not have a vote on or veto over our positions. We do not speak for individual partner companies and remain true to our stated principles even when our partners disagree.
