What to Expect as the Supreme Court Hears NetChoice v. Moody / Paxton

Andres Calzada
Chamber of Progress
Feb 23, 2024

On February 26th, the Supreme Court will hear arguments in NetChoice & CCIA v. Moody / Paxton. The case is among the most consequential of this Court term, with freedom of speech on the Internet squarely at stake. It is no exaggeration to say that its outcome has the potential to reshape the boundaries of online expression as we know them.

At issue are laws from Florida (SB 7072) and Texas (HB 20) aimed at restricting social media companies’ ability to moderate content on their platforms. Republican state lawmakers champion these bills as remedies to what they perceive as biased suppression of conservative viewpoints online.

NetChoice & CCIA, trade associations representing a coalition of social media companies, have rightfully challenged these laws, contending that they infringe the companies’ First Amendment rights. Should these laws go into effect, platforms would lose the ability to decide what speech to disseminate. Instead of ensuring the Internet remains a bustling marketplace of ideas where diverse voices can thrive, these bills would effectively compel platforms to carry violent, racist, pornographic, or otherwise unwanted content.

How did this case land in the Supreme Court?

Soon after the laws were signed in 2021, federal district courts in Florida and Texas enjoined them on First Amendment grounds. Florida then appealed to the Eleventh Circuit (Moody v. NetChoice), and Texas to the Fifth Circuit (NetChoice v. Paxton). The two circuits issued conflicting decisions. The Eleventh Circuit struck down the provisions of the Florida law that limit platforms’ power to moderate and curate content, along with its individualized-explanation requirement, while upholding most of the law’s general disclosure provisions. The Fifth Circuit, in contrast, upheld Texas’s law in its entirety. Given the disagreement between the circuit courts over these closely related laws, the Supreme Court decided to take up both cases at once.

What Must The Court Decide?

In their cert petition, the States have asked the Court to consider whether the laws’ content-moderation restrictions and individualized-explanation requirements comply with the First Amendment.

Preview of the arguments:

Argument 1: Social media platforms are common carriers and must refrain from arbitrarily discriminating against their customers’ speech

At the core of this challenge is the question of whether social media entities possess the First Amendment right to exercise editorial discretion. This issue gains complexity against the backdrop of States like Texas and Florida enacting laws that impose viewpoint-based regulations on these platforms, challenging their operational freedoms under the guise of preventing censorship.

Indeed, the States’ stance leans heavily on analogizing social media companies to traditional common carriers, which historically include utilities like telephone services, cable companies, and Internet access providers. On this view, social media platforms, by virtue of their ubiquity and role in public discourse, should similarly be restricted from exercising editorial judgment over content. The comparison is aimed at sidestepping the strict scrutiny typically applied to regulations affecting speech, by suggesting that social media platforms, due to their function, forfeit a degree of their editorial rights.

The argument hinges on the nature of common carriers, which are subjected to a different constitutional scrutiny due to their essential role in facilitating public communication, without imparting or altering the content conveyed. By drawing parallels to how telephone companies, which do not monitor or curate the content of conversations, are regulated, States aim to establish a foundation for treating social media platforms under a similar regulatory framework. However, this analogy falters when scrutinizing the active role social media companies play in content moderation. Unlike passive conduits of communication, these platforms engage in a form of expression through their moderation policies, curating and shaping the online world in ways that are fundamentally dissimilar to the non-expressive role of traditional common carriers.

Moreover, the argument further distances social media companies from the oft-cited PruneYard Shopping Center v. Robins, where private property was deemed subject to certain public speech activities. Social media platforms, by actively moderating content, demonstrate a level of expressive engagement with the content they host, distinguishing them from the passive, non-expressive operation of a shopping mall. This distinction underscores the platforms’ argument that they engage in protected First Amendment activity when making editorial decisions, a characteristic inherently at odds with the non-expressive, utility-based function of common carriers.

Finally, in recent legal maneuvers, States have endeavored to frame their push against social media platforms as a noble quest to eliminate online discrimination, leveraging civil rights rhetoric as a facade for their actions. This strategic positioning is deeply ironic, considering these States’ own legislative records which include measures that arguably undermine the rights of women, people of color, and LGBTQ+ communities. Their advocacy for online censorship under the guise of civil rights is disingenuous, aiming not to protect vulnerable groups but to suppress progressive views and perpetuate the ideologies they endorse. This conflation of content-based moderation with unlawful discrimination misrepresents the First Amendment’s safeguarding of editorial discretion, attempting to misuse civil rights law as an instrument for furthering censorship and oppression.

Argument 2: The States’ individualized-disclosure requirements are constitutional

Next, the States will argue that requiring platforms to provide detailed explanations to users whose content has been moderated complies with the First Amendment. They will cite Zauderer, which allows states to compel businesses to make factual and uncontroversial disclosures so long as the requirements are not “unduly burdensome.” Without individualized disclosures, the States contend, consumers are left in the dark about platform standards and policies.

The laws’ individualized-disclosure requirements are indeed unduly burdensome, a point the Eleventh Circuit has already affirmed. Social media companies make millions of editorial decisions daily, and algorithms play an ever-growing role in automating those decisions. Providing a detailed justification for each one would be operationally unfeasible for platforms, if not outright paralyzing. Further, the threat of being flooded with lawsuits and statutory damages would push platforms to play it safe in order to avoid massive liability; in effect, platforms would cease to make editorial decisions altogether. This, NetChoice will argue, would have a chilling effect on protected speech across the internet.

As discussed in the amicus brief by Professor Eric Goldman, the States’ arguments do not qualify for Zauderer analysis because compelling editorial transparency undercuts protected speech by targeting platforms’ decision-making processes. Additionally, Daphne Keller’s research on platform transparency laws warns that upholding such disclosure requirements empowers governments to “quietly reshape” platform editorial policies, exerting “new state control over ordinary people’s online speech.”

The threat that the States’ individualized-disclosure requirements pose to free speech protections is alarming, and the Court should uphold the lower court’s decision.

What’s At Stake:

Conflicting and Balkanized legislation:

A ruling favoring Texas and Florida could set off a domino effect, prompting other states to craft their own speech-based regulations. The result? A chaotic patchwork of laws that’s not just confusing, but downright contradictory (or worse, the possibility that a single state, like Texas, would be left to dictate national Internet policy; see amicus filed by Chamber of Progress).

Consider the stark contrast between Texas’s HB 20, which restricts the removal of content based on viewpoint, including hate speech, and New York’s A7865A, which mandates that websites adopt moderation policies against hate speech. This discrepancy creates a regulatory conundrum for websites, forcing them into a precarious position where adherence to one state’s law inevitably leads to non-compliance with another’s. This regulatory dissonance threatens to transform the digital environment into a landscape where platforms, caught between conflicting legal expectations, might minimize user-generated content to avoid liability, impoverishing online discourse.

Surge in Litigation:

If states are allowed to enforce these laws preventing websites from moderating content, the floodgates will likely open to a barrage of lawsuits against platforms. Aggrieved users, emboldened to wield discrimination laws, would seek to force platforms to host hateful speech that contradicts the platforms’ moderation policies and community guidelines.

This scenario stands in stark contrast to past legal challenges in which websites successfully defended their removal of harmful content, including Facebook’s removal of an account suspected of Russian interference in the 2016 U.S. presidential election and Twitter’s suspension of a user who was inciting hate against the LGBTQ+ community.

Polluting Communities:

Finally, and perhaps most pernicious of all, these laws are likely to amplify the most extreme voices on the Internet and unleash the spread of disinformation, hate speech, and other harmful content. This jeopardizes the health and safety of online communities, particularly those home to marginalized voices. Since its inception, the Internet has served as a vital platform for self-expression, connection, and learning about the wider world. The Justices have the opportunity to protect this marketplace of ideas and digital public square for generations to come.

To safeguard the internet’s rich mosaic of voices and ideas, it’s crucial for the judiciary to dismantle these censorship-driven statutes, channeling the spirit of liberty and foresight that underscored the Court’s 1997 holding in Reno v. ACLU, a decision that affirmed the web as a frontier for free expression.

Chamber of Progress (progresschamber.org) is a center-left tech industry association promoting technology’s progressive future. We work to ensure that all Americans benefit from technological leaps, and that the tech industry operates responsibly and fairly.

Our work is supported by our corporate partners, but our partners do not sit on our board of directors and do not have a vote on or veto over our positions. We do not speak for individual partner companies and remain true to our stated principles even when our partners disagree.
