What to Expect as the Supreme Court Hears Gonzalez v. Google

Jess Miers
Chamber of Progress
Feb 17, 2023
Image from the Gonzalez complaint. Complainants did not specify what the recommended videos displayed.

Next week the Supreme Court will hear the biggest Internet case in more than a decade: Gonzalez v. Google. At issue is Section 230, the law that empowers all of us to create and engage with online content. The stakes are higher than ever as the Court is poised to disrupt decades of precedent that underpin the modern web.

The case is brought by the estate of Nohemi Gonzalez, a 23-year-old U.S. citizen who was studying abroad in Paris. In 2015, Nohemi was tragically murdered by ISIS during the infamous Paris terrorist attacks. The Estate filed multiple unsuccessful lawsuits against YouTube, alleging that YouTube materially supported the Paris attacks by algorithmically recommending videos created by ISIS. The lower court dismissed their claims on Section 230 grounds.

What Must The Court Decide?

In their cert petition, the Estate asked the Court to consider whether Section 230 should be limited to “traditional editorial functions.” Specifically, the Estate — echoing Judge Katzmann’s partial dissent in Force v. Facebook — claims that Congress never intended to extend the immunity to the curation of user-created content.

The Supreme Court has unequivocally rejected “traditional editorial function” dicta. As Google pointed out in its merits brief, curation activities, such as promoting and displaying content, are absolutely protected by the First Amendment (e.g., Turner Broad. Sys. v. FCC; Manhattan Cmty. Access Corp. v. Halleck; Mia. Herald Publ’g Co. v. Tornillo).

In their reply, Petitioners concede that some forms of curation, including the mere display and recommendation of third-party content, are within Section 230’s scope. Changing their original question presented, Petitioners now ask SCOTUS to consider whether Section 230 protects those activities when they are performed “algorithmically.”

A Preview Of The Arguments

On Tuesday, Petitioners will attempt to show that YouTube fails to meet one or more prongs of the three-part Section 230 immunity test.

The three-part Section 230(c)(1) immunity test: a (1) provider or user of an interactive computer service (2) cannot be treated as the publisher or speaker (3) of information provided by another information content provider. (Source: Prof. Eric Goldman, Santa Clara University School of Law.)

Argument 1: YouTube is not a provider of an “interactive computer service”

Petitioners will argue that a website does not act as a provider of an interactive computer service (“ICS”) when it recommends content that the user did not request. Citing Dyroff v. Ultimate Software, Petitioners will suggest that “servers” (Section 230(f)(2)) are computers that respond to user requests. Petitioners will argue that because YouTube’s recommendations are not the result of a user request, YouTube is not an ICS.

This argument is meritless. Nothing in the text of Section 230 limits a ‘server’ to 1:1 communication. But even if the Court takes the bait, Respondents will argue that every time a user clicks on a website, that user is interacting with the website’s servers.
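To make that concrete, here is a minimal sketch (hypothetical endpoint, data, and field names; not YouTube’s actual architecture) of what a single click looks like to a website’s servers: the browser sends an HTTP request, and the server answers it, returning the requested item together with any recommendations.

```python
# Minimal sketch with hypothetical data: a user's click arrives at the
# server as an HTTP request, and any recommendations travel back as part
# of the server's response to that request.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

VIDEOS = {"abc123": "Requested video"}     # stand-in catalog
RECOMMENDATIONS = ["def456", "ghi789"]     # stand-in recommendation list

class WatchHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The server acts only because the user's click produced this request.
        video_id = self.path.rsplit("/", 1)[-1]  # "/watch/abc123" -> "abc123"
        payload = {
            "video": VIDEOS.get(video_id, "not found"),
            "recommended": RECOMMENDATIONS,      # returned with the response
        }
        body = json.dumps(payload).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), WatchHandler).serve_forever()
```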

Further, as discussed in the amicus brief by The Copia Institute et al., recommendations and the algorithms that drive them are not magic. In order for YouTube to recommend relevant content to a user, the user must interact with the service. In that case, the user is indirectly requesting recommendations by continuing to engage with the service.

“In reality algorithms need not be complex: simply listing in chronological or alphabetical order is an algorithmic rendering. What is also important to remember, especially here, is that what is at issue is not some sort of foreign magic but tools of varying complexity that humans deliberately choose to employ as suits their expressive interests.”
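Framed as code, the brief’s point is almost trivial. In the hypothetical sketch below, “recommending” videos is nothing more than sorting them, chronologically or alphabetically, and either ordering already counts as an algorithmic rendering that decides what a user sees first.

```python
from datetime import datetime

# Hypothetical catalog of user-uploaded videos.
videos = [
    {"title": "Cooking pasta", "uploaded": datetime(2023, 1, 5)},
    {"title": "Bike repair 101", "uploaded": datetime(2023, 2, 1)},
    {"title": "Acoustic cover", "uploaded": datetime(2022, 12, 20)},
]

# "Algorithmic" curation in its simplest forms: a chronological feed and an
# alphabetical listing are each just a sort, yet both decide what a user
# sees first.
chronological_feed = sorted(videos, key=lambda v: v["uploaded"], reverse=True)
alphabetical_feed = sorted(videos, key=lambda v: v["title"].lower())

for video in chronological_feed:
    print(video["title"])
```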

It should be undisputed that YouTube qualifies as an interactive computer service.

Argument 2: Petitioners’ claims do not treat YouTube as a “publisher”

The second prong of the test is a highly contentious issue, and we anticipate that most of Tuesday’s arguments will revolve around it.

Petitioners will assert that their claims do not treat YouTube as a ‘publisher’ because YouTube is not acting as a publisher when it performs algorithmic recommendations. Specifically, Petitioners argue that YouTube’s algorithms go beyond the mere dissemination of third-party content; in their telling, it is the algorithm itself that causes the harm.

This argument is also meritless. The harm does not arise from the code that displays content to the user; the content itself is what causes the harm. In fact, even if the Court granted Petitioners’ request and assessed only the underlying algorithm for proximate cause, the algorithm alone could not have proximately caused the harm, leaving Petitioners without a claim.

It is impractical to decouple the technology (i.e., the algorithms) from the display of content. Indeed, it is the algorithms that perform the publishing function. Every website that hosts user-created content relies heavily on curation technology (i.e., ‘algorithms’). Section 230’s authors, former Rep. Chris Cox and Sen. Ron Wyden, agree:

“Targeted recommendations are one such innovation in content presentation. A platform that offers targeted recommendations like those at issue in this case displays particular content to users by using algorithms that are designed to analyze user data and predict what the user might want to see. Targeted recommendations are now ubiquitous across the Internet and exist in fields from social media to commerce.” …

“Recommending systems that rely on such algorithms are the direct descendants of the early content curation efforts that Congress had in mind when enacting Section 230.” (Brief of amici curiae Senator Ron Wyden and former Representative Chris Cox, Gonzalez v. Google.)
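As a rough illustration of what the brief describes (the user data, topics, and scoring rule below are all invented), a targeted recommender at its core ranks candidate content by how well it matches signals drawn from the user’s own activity:

```python
# Toy sketch with invented data: analyze a user's history, then predict
# which videos that user might want to see next.
user_history_topics = {"cooking", "travel"}     # hypothetical user data

candidate_videos = [
    {"title": "Pasta from scratch", "topics": {"cooking"}},
    {"title": "Backpacking Lisbon", "topics": {"travel", "budget"}},
    {"title": "Keyboard review", "topics": {"tech"}},
]

def relevance(video):
    # Predicted interest: how many of the video's topics overlap with the
    # topics this user has already engaged with.
    return len(video["topics"] & user_history_topics)

recommendations = sorted(candidate_videos, key=relevance, reverse=True)
for video in recommendations:
    print(video["title"], relevance(video))
```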

Petitioners (and several of their amici) also argue that Congress never intended Section 230 to extend beyond defamation torts. Petitioners suggest that the statute borrows the “publisher or speaker” language from a classic defamation claim. But Congress expressly chose to include exceptions for non-defamation torts under 230(e) (e.g. exceptions for federal criminal law, intellectual property, sex trafficking), anticipating non-defamation claims to which the immunity should apply:

“Moreover, had Congress intended to limit immunity to defamation claims, it could have said so explicitly. But it did not. Indeed, it would hardly have made sense for Congress to limit immunity to defamation claims, as the objectives stated in the statutory preamble would be undermined if all claims other than defamation could be used to hold platforms liable for illegal content produced by others.” Id.

Argument 3: When it comes to personalized recommendations, YouTube is the “information content provider.”

The last prong is also highly contentious. But Petitioners’ argument is bizarre. To argue that YouTube is responsible for the content at issue, Petitioners point to YouTube’s URLs. Petitioners claim that the source of the harmful video is a YouTube-created URL (for example: youtube.com/funnycats). Petitioners contrast this with the “third-party” URLs displayed by Google Search.

This argument is nonsensical. Again, the harm derives from the underlying content, not from the URL itself. As the Government explained in its own brief:

“URLs are “inherent” in online publishing. All webpages have URLs, which let internet users access individual webpages. Petitioners’ ATA claims are not based on the content of any YouTube URLs. The random string of numbers and letters that make up YouTube URLs are not independently wrongful; they merely specify how users can access webpages.”
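The Government’s point can be seen by simply taking a YouTube-style URL apart (the video identifier below is made up for illustration). Every component is routing information that tells a browser where to send a request; none of it contains the video itself.

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical watch URL; the video ID here is made up for illustration.
url = "https://www.youtube.com/watch?v=EXAMPLE_ID"

parsed = urlparse(url)
video_id = parse_qs(parsed.query).get("v", [""])[0]

# Every component is an address, not content: it specifies where a browser
# should send its request, and says nothing about what the hosted video
# actually contains.
print(parsed.scheme)   # "https"
print(parsed.netloc)   # "www.youtube.com"
print(parsed.path)     # "/watch"
print(video_id)        # "EXAMPLE_ID"
```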

We might also hear Google invoke the majority opinions in Dyroff and Force, which held that recommendations and notifications are not “information” for purposes of Section 230(c)(1) but are instead tools to facilitate third-party communication. Those tools are expressly covered under 230(f)(4).

Imagine the consequences if the Court were to accept Petitioners’ URL argument. Any website that offers an internal search function would be ineligible for Section 230 protection. Meanwhile, plaintiffs could sneak generic third-party content claims around Section 230 simply by pointing out that the defendant created the URL where the content lives.

Under Petitioners’ theory, Section 230 would protect only static websites with no internal search capabilities, no hyperlinks to user-created content, and no push notifications. Surely, that was not Congress’ intent.

Pictured: Cameron’s World (cameronsworld.net), “a web-collage of text and images excavated from the buried neighbourhoods of archived GeoCities pages (1994–2009).”

The Ultimate Battle For Online Speech

Google faces an uphill battle next week. For starters, the case is technologically complex. There is a valid concern that the Court may simply not understand or appreciate the technical complexities that drive the modern web. If Google is to be successful, its lawyers will need to spend some time walking the Court through the content creation life cycle.

Further, there’s no doubt that the facts in this case are tragic. Google will need to keep the focus on the legal strengths of its case. Arguments that will appeal to the more conservative-leaning Justices will highlight the First Amendment concerns regarding a publisher’s right to curate content as it sees fit.

For the left-leaning Justices, Google will need to convey the unintended consequences this case could have for marginalized groups and for issues at the center of the GOP culture wars. For example, an adverse decision in this case would detrimentally impact the availability of reproductive healthcare information for women living in states with anti-abortion laws. As we noted in our letter to AG Garland last year:

“Should the Court curb Section 230’s protections for algorithmic curation, online services would face extreme threats of liability for promoting life-saving reproductive health information, otherwise criminalized by state anti-abortion laws.”

Worse, as we mentioned in our amicus brief, deterring content curation through a greater threat of liability would only amplify the volume of harmful content and the risks it poses. In other words, cleaving algorithmic curation from Section 230 could result in the widespread dissemination of more ISIS content on YouTube, not less.

It is not enough for Google to simply win here. A decision that affirms Section 230 for YouTube but carves algorithms out of the immunity would invite a flood of frivolous lawsuits, ones that would fail on First Amendment grounds anyway, using claims of “algorithmic harm” as a ready workaround. In fact, there are numerous social media addiction lawsuits waiting in the wings to do just that.

Indeed, the stakes are too high. Anything but wholehearted support for the decades of existing Section 230 precedent carries with it the risk of significant and far-reaching losses for all of us.

Chamber of Progress (progresschamber.org) is a center-left tech industry policy coalition promoting technology’s progressive future. We work to ensure that all Americans benefit from technological leaps, and that the tech industry operates responsibly and fairly.

Our work is supported by our corporate partners, but our partners do not sit on our board of directors and do not have a vote on or veto over our positions. We do not speak for individual partner companies and remain true to our stated principles even when our partners disagree.


Jess Miers is Senior Counsel, Legal Advocacy at Chamber of Progress.