California’s SB 680: Another Content-Based Restriction Heading for a First Amendment Collision

Social media liability bill seeks to regulate online speech

Jess Miers
Chamber of Progress
Aug 10, 2023


Like the “Age-Appropriate Design Code” (AADC) legislation that became law last year, California’s latest effort to regulate online speech comes in the form of SB 680, a bill by Sen. Nancy Skinner targeting the designs, algorithms, and features of online services that host user-created content, with a specific focus on preventing harm and addiction risks to children.

SB 680 prohibits social media platforms from using a design, algorithm, or feature that causes a child user, under 16 years of age, to inflict harm on themselves or others, develop an eating disorder, or experience addiction to the social media platform. Proponents of SB 680 claim that the bill does not seek to restrict speech but instead regulates the conduct of the Internet services within its scope.

However, as federal Judge Beth Labson Freeman observed during a recent hearing in the lawsuit challenging last year’s age-appropriate design law, if content analysis is required to determine whether a restriction applies, the regulation is content-based. SB 680 faces the same problem.

Designs, Algorithms, and Features are Protected Expression

To overcome the formidable obstacle presented by the First Amendment, policymakers often resort to “content neutrality” arguments to justify their policing of expression. California’s defense of the AADC hinges on the premise that the law regulates conduct, not content. Sen. Skinner asserted the same about SB 680, emphasizing that the bill is focused solely on conduct:

“We used our best legal minds available […] to craft this in a way that did not run afoul of those other either constitutional or other legal jurisdictional areas. [T]hat is why [SB 680] is around the design features and the algorithms and such.”

However, courts have consistently held otherwise, and precedent shows that these bills are inextricably intertwined with content despite such claims.

The Supreme Court has long held that private entities such as bookstores (Bantam Books, Inc. v. Sullivan (1963)), cable companies (Manhattan Community Access Corporation v. Halleck (2019)), newspapers (Miami Herald Publishing Co. v. Tornillo (1974)), video game distributors (Brown v. Entertainment Merchants Association (2011)), parade organizers (Hurley v. Irish-American Gay, Lesbian and Bisexual Group of Boston (1995)), pharmaceutical companies (Sorrell v. IMS Health, Inc. (2011)), and even gas and electric companies (Pacific Gas and Electric Co. v. Public Utilities Commission (1986)) have a First Amendment right to choose how they curate, display, and deliver their preferred messages. This principle extends to online publishers as well, as the Court affirmed in Reno v. ACLU (1997), emphasizing First Amendment protection for online expression.

Moreover, courts have explicitly recognized that algorithms themselves constitute speech deserving full First Amendment protection. In cases like Search King, Inc. v. Google Technology, Inc. and Sorrell, courts held that search engine results and data processing are expressive activities, and that the algorithms used to generate them are entitled to constitutional safeguards.

More recently, in NetChoice v. Moody (2022), the U.S. Court of Appeals for the Eleventh Circuit declared certain provisions of Florida’s social media anti-bias law unconstitutional, affirming that social media services’ editorial decisions, even those made via algorithm, constitute expressive activity.

Further, the Supreme Court’s reasoning in Twitter, Inc. v. Taamneh (2023) supports the idea that algorithms are merely one aspect of a service’s overall publication infrastructure, and thus warrant the same First Amendment protection.

This precedent underscores the courts’ general reluctance to differentiate between the methods of publication and the underlying messages conveyed. In essence, courts have consistently acknowledged that the medium of publication is inextricably linked to its content. Laws like SB 680 and the AADC are unlikely to persuade courts to draw a different line.

SB 680’s Not-So-Safe Harbor Provision Is a Prior Restraint

Sen. Skinner also suggested at a legislative hearing that SB 680 is not overly burdensome for tech companies because it includes a “safe harbor” provision. The provision protects companies that conduct quarterly audits of designs, algorithms, and features that could harm users under 16 and that “correct” any problematic practices within 60 days of the audit.

However, the safe harbor provision is yet another First Amendment violation. In practice, it operates as a prior restraint, pressuring companies to abandon publication decisions that could later be deemed violations as to users under 16 and restricting their editorial choices before any speech occurs.

Recall that the AADC includes a similar requirement: mandatory data protection impact assessments (DPIAs). Although the State of California defended that provision by arguing it doesn’t require companies to alter the content they host, Judge Freeman disagreed, noting that the DPIA requirement forces social media services to create a “timed plan” to “mitigate” their editorial practices.

In reality, these provisions in both the AADC and SB 680 lead services to refrain from implementing any design, algorithm, or feature that could conceivably pose a risk to users under 16. That chilling effect extends even to features that would improve the online environment for parents and children, such as kid-friendly alternatives to products and services offered to the general public.

The online world, like the offline world, carries inherent risks, and services continually work to anticipate and mitigate them. Laws like the AADC and SB 680, however, make it too risky for services to invest in meaningful safety efforts, ultimately hindering progress toward a safer web.

SB 680 is a Solution in Search of a Lawsuit

Much as newspapers decide which stories run above the fold, which letters to the editor to publish, and which stories and speakers to feature, social media services make choices about how to disseminate user-created content. While newspapers rely on human editors to diligently apply their editorial guidelines, social media companies use algorithms to achieve the same objective.

Yet newspapers rarely face the kind of political scrutiny their online counterparts experience today. The idea of the government telling the New York Times how to arrange stories in its print edition seems inconceivable. For some reason, though, the same alarm doesn’t register when the government attempts to dictate how websites display user content.

Despite an abundance of legal precedent upholding First Amendment protection for the publication tools that deliver protected expression, California lawmakers persist with SB 680. The federal court’s skepticism toward the AADC should be a warning light: if SB 680 becomes law this fall, California will once again find itself embroiled in an expensive legal battle over online expression.

Chamber of Progress (progresschamber.org) is a center-left tech industry policy coalition promoting technology’s progressive future. We work to ensure that all Americans benefit from technological leaps, and that the tech industry operates responsibly and fairly.

Our work is supported by our corporate partners, but our partners do not sit on our board of directors and do not have a vote on or veto over our positions. We do not speak for individual partner companies and remain true to our stated principles even when our partners disagree.


Jess Miers is Senior Counsel, Legal Advocacy, at Chamber of Progress.