If Congress Repeals Section 230, What Will that Mean for Wikipedia? Here’s an (Incomplete) List of Things that Will Break

Wikimedia Policy
Nov 1, 2023


This is the first installment in our three-part series about Section 230. (Read the second blog post and the third post.)

A broken BenQ computer keyboard. Image by Santeri Viinamäki, CC BY 4.0, via Wikimedia Commons.

Written by Stan Adams, Lead Public Policy Specialist for North America at the Wikimedia Foundation

“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” These “26 words that created the internet” are the first provision in Section 230 of the 1996 Communications Decency Act (CDA) (47 U.S.C. 230), the statute that protects the providers of websites and online services from lawsuits for harms caused by information that their users post.

Although famous for its brevity, the law protects an incredible variety of websites and online services. Wikimedians across the country rely on these services as the community of volunteers who create, edit, and curate open and free knowledge on Wikipedia and other Wikimedia projects.

Consider, for instance, a hypothetical series of events leading a Wikimedian to make an edit to the English language Wikipedia article on Section 230. (Note that the bold font indicates websites and online services that benefit from Section 230’s liability protections.)

They begin their day by connecting their computer to the internet via their home network or their neighborhood coffee shop’s Wi-Fi. They check their email, including their spam folder, which they then empty. Noting that the latest email on a Wikimedia community thread mentions a topic they are unfamiliar with — something called Section 230 — they do a web search to learn more. Their search provider filters and downranks irrelevant, potentially dangerous, or unresponsive websites, so they find some potentially helpful entries on the first page of the results. Skimming the snippets of text included alongside several of the entries, they click on a blog from a tech policy nonprofit that seems informative.

Intrigued by what seems like a hot topic in an ongoing debate, they return to the search results and navigate to the relevant Wikipedia article to see how recently it has been edited. Recognizing the username of a friend in the edit history, they use an encrypted messaging app to ask the friend to join other community members in an upcoming virtual meeting: the group will be using a videoconferencing service to discuss how changes to the law might impact Wikimedia projects. Eager to learn more before the meeting, the Wikimedian reads some scholarly articles hosted on open archive websites and checks some of the older court opinions archived in the Internet Archive’s Wayback Machine.

Rereading the most recently added portion of the article, they notice what appears to be an inaccurate framing of the recent Supreme Court cases in which Section 230 was raised. They click the “Edit” button, make a minor revision (citing a well-known news source) and add a note explaining the change. The editing interface and other parts of the tech infrastructure supporting Wikimedia projects are powered by software developed using open source, collaborative, codesharing spaces like Gerrit and Phabricator. Like so many other aspects of the Web, these spaces depend on cloud computing infrastructure to allow users from around the world to harness more computing power and digital storage than they have at home, and to easily and efficiently collaborate on shared projects.

So, in only a few hours, a Wikimedian might interact with a dozen or more parts of the internet ecosystem that depend on Section 230 protections. Websites, apps, and services like the ones mentioned above have become such an integral part of our internet experience that it can be easy to take them for granted. It’s even easier to overlook how Section 230’s liability protections for third-party content help to make these services and websites possible at all. In this blog post, we build on an idea proposed by Stanford University’s Daphne Keller and further developed by Mike Masnick at Techdirt: The internet is more than just Facebook, Amazon, and Google. Lawmakers should test their policy proposals’ impacts on a much wider variety of internet services than just the large social media companies that they want to regulate.

The following (incomplete) list is intended as a resource to which you can return time and time again. For instance, it can serve as a primer whenever you want to consider why some of the useful, if not critical, websites and services that we mentioned above depend on Section 230, and how they might change if Section 230 were reformed or repealed.

Excerpts from 47 U.S. Code § 230
also known as
Section 230 of the 1996 Communications Decency Act (CDA)

"(c) Protection for "Good Samaritan" blocking and screening of offensive material
(1) Treatment of publisher or speaker
No provider or user of an interactive computer service shall be treated as the
publisher or speaker of any information provided by another information content
provider.
(2) Civil liability
No provider or user of an interactive computer service shall be held liable on
account of -
(A) any action voluntarily taken in good faith to restrict access to or
availability of material that the provider or user considers to be obscene, lewd,
lascivious, filthy, excessively violent, harassing, or otherwise objectionable,
whether or not such material is constitutionally protected; or
(B) any action taken to enable or make available to information content providers
or others the technical means to restrict access to material described in
paragraph (1)

(f) Definitions
(2) Interactive computer service
The term "interactive computer service" means any information service, system,
or access software provider that provides or enables computer access by multiple
users to a computer server
, including specifically a service or system that
provides access to the Internet and such systems operated or services offered by
libraries or educational institutions.
(3) Information content provider
The term "information content provider" means any person or entity that is
responsible, in whole or in part, for the creation or development of information
provided through the Internet or any other interactive computer service."

[the emphasis is ours]
A Wikipedia mural covered with stickers, posters, and stencils in Aguascalientes, Mexico. Image by Luis Alvaz, CC BY-SA 4.0, via Wikimedia Commons.


Wikipedia

Learning about anything on the world’s most trusted reference website is made possible by Section 230.

Why it relies on Section 230: All of the content on Wikipedia and other Wikimedia free knowledge projects is created and edited by volunteer editors. Section 230 protects the Wikimedia Foundation from legal liability for content posted by users, and protects volunteer editors from liability when they edit or remove content posted by other users.

Potential sources of liability include defamation, such as a volunteer posting something that someone else believes is untrue and reputationally damaging about a notable living person. On average, more than 100,000 volunteers from across the globe edit Wikipedia every month, generating roughly 5 million edits in that time. To minimize the risk of liability for this content, Wikimedia volunteers and the Foundation would need to implement a legal review and clearance process like those used by newspaper and book publishers: an expensive and time-consuming proposition, not to mention a major structural change that would undermine the community-led Wikimedia model.

An often overlooked aspect of Section 230 is that it also protects the users of “interactive computer services” from liability for information provided by other users. For Wikimedians, this means that they are shielded from liability for their efforts to edit or remove contributions from other editors. Without Section 230’s liability protections, Wikimedia’s community-led model simply would not work: the act of editing could expose volunteers to lawsuits brought by either the subject of the article or the editor who contributed the content in question.

What would happen if Section 230 were repealed: Without Section 230, volunteers would be exposed to the risk of litigation based on the contributions of other volunteers. At the same time, volunteers would become sources of liability for the Foundation. As a result, the Foundation would need to shift away from Wikimedia projects’ transparent, community-led content moderation and editing model toward a more top-down, centralized system of managing content on projects like Wikipedia. The Foundation would have to play a much larger role in content moderation, imposing stricter editorial standards on the volunteer community and spending more money defending volunteers from things like SLAPPs (strategic lawsuits against public participation).

This transition would almost certainly impact volunteers’ interest in contributing to the projects, diminishing the quality of content and weakening many of the positive features that the community-led model provides. The Foundation’s budget, which is primarily funded by small gifts from individual donors, would have to allocate a larger portion toward legal defense and away from things like developing software and providing grants to the volunteer communities. A handful of malicious lawsuits in the United States could drain the annual budget entirely.

Public Library Computer Labs, Coffee Shop Wi-Fi, Internet Service Providers (ISPs)

Coffee shops, libraries, hotels, airports, employers, and others offering public internet access are protected by Section 230, as are the internet service providers who provide your connectivity at home.

Why they rely on Section 230: Providing access to the internet and websites is explicitly covered by Section 230. Providing such access allows users to upload, download, and view content from the internet, which could create liability for the provider. Section 230 shields providers from this liability, which is why places like coffee shops and hotels are comfortable providing such access. Potential sources of liability run the gamut of civil claims, including defamation, privacy torts, fraud, and the intentional infliction of emotional distress. ISPs recently were forced to settle with plaintiffs in several copyright cases, which are not blocked by Section 230, because they failed to terminate the accounts of users who were alleged to have infringed copyrights. Without Section 230, the scope of liability for ISPs based on their subscribers’ behavior would be much broader. ISPs would be more motivated to eavesdrop on every communication crossing their networks and to filter, block, or even terminate user accounts to avoid liability.

What would happen if Section 230 were repealed: Many providers would simply stop offering access. Others might require payment, liability waivers, or restrict user access to activities the provider considers “safe.” ISPs might block access to large portions of the Web, severely restricting everyone’s ability to access information, entertainment, and more. Uploading content via public Wi-Fi would almost certainly be blocked, since it would be too risky to allow without lengthy legal review and risk analysis of uploaded content.

Email, Videoconferencing, and Messaging Services

Whether it’s text, audio, or images, communicating is one of the main things we do on the internet. All of the apps and web-based services you use to communicate with other people on a computer or mobile device are protected by Section 230.

Why they rely on Section 230: You’re probably picking up on a theme here: letting people say and do things online creates liability for providers. That’s equally true for these vital services, since letting people connect in virtual spaces invites all kinds of liability risk based on the behavior of users. Section 230 just reinforces the obvious: those providers are neither the speaker nor publisher of the speech of their users and, therefore, shouldn’t be held liable for it. For messaging, especially end-to-end encrypted systems, legal claims based on negligence or recklessness would be a real problem. Plaintiffs would claim that providers “should have known” that their users could engage in harmful speech, but failed to stop it and are, hence, liable. The possibility of costly litigation–even if the providers would ultimately win–would discourage them from offering such services.

What would happen if Section 230 were repealed: As with many of the cases above, providers would either take measures to protect themselves or just stop offering services that expose them to liability. Hypothetical protective measures in this case include waivers and indemnity clauses (which still involve some risk, at least for the first several cases against each type of provider, because they would require litigation to determine their validity and enforceability in different contexts), but also time delays as well as monitoring and pre-screening all communications on the platforms. Some email providers might try to “close” their systems so that users could only send and receive messages to other users of the provider’s service, such that all users would be subject to contractual provisions that protect the provider.

A woman standing on a train platform uses her smartphone. Image by Daria Nepriakhina, CC0 1.0, via Unsplash.

Open Source Digital Archives

Knowledge and scholarship shouldn’t be locked behind paywalls. These digital archives provide free access to knowledge, various kinds of records, and other information covering everything from cutting-edge scientific research papers to copies of long-deleted webpages from the ‘90s.

Why they rely on Section 230: Websites like arXiv and the Wayback Machine are almost entirely composed of third-party content. Like many of the other services we’ve mentioned, this exposes them to liability for things like defamation, where being treated as the “publisher” or “speaker” of information could lead to civil liability. Archivists need to be able to preserve materials as they find them, because to do otherwise is to redact history. By shielding archivists from civil claims based on content created by others, Section 230 empowers the preservation and sharing of archival knowledge.

What would happen if Section 230 were repealed: Like Wikipedia and other websites, many digital archives and repositories would either stop allowing user contributions or implement rigorous legal review processes to minimize their liability risks. Lawyers would have to vet every new research paper submitted by a user or webpage preserved by Internet Archive for legal risk before allowing it to appear on the website, severely limiting the speed with which knowledge is shared with the world as well as the scope of archivists’ preservation efforts.

Browsers and Web Search Providers

Even if your internet experience is primarily app-based, you still probably use these two services from time to time. Along with the access providers, browsing and search providers offer some of the most fundamental and necessary services for making use of the internet.

Why they rely on Section 230: Browsers enable access to the Web, allowing you to navigate to and from nearly all of its hundreds of millions of websites and tens of billions of webpages. In doing so, they also display those webpages on your computer screen and allow your computer to upload and download content and software programs. Without Section 230, browser providers could be held liable for harms caused by the content or software their programs help you access.

Web search providers scour the web for other people’s content to identify, sort, rank, and display what they determine are the most relevant web pages for your query. In response to queries, providers display thumbnail images, snippets, and often include user reviews or ratings as part of the relevant search results. Each of these actions could otherwise expose them to liability without Section 230’s protections. However, each of these actions is also a form of protected speech guaranteed by the First Amendment, so cases brought against search providers for their editorial decisions would almost always fail anyway. The primary benefit of Section 230 is that it provides a more efficient way to dispose of these cases, saving plaintiffs, defendants, and courts both money and time.

What would happen if Section 230 were repealed: Although search might not disappear entirely, it likely would include more barriers and paperwork, such as liability waivers, and additional data collection. This might involve creating accounts—reducing anonymity and increasing privacy risks for users—or subscription fees. For browsers like Mozilla’s Firefox or web search providers like DuckDuckGo, which strive to protect user privacy, measures like these to protect the companies could negatively impact their users. The possibility for new browsers or web search providers to emerge and challenge dominant companies would disappear because only the largest, richest of them would be able to afford the risks of liability. Some types of queries might be too risky altogether, such as searches for abortion clinics in a state where such services are banned, and it’s easy to imagine some states drafting new legislation to enable lawsuits against web search providers.

Blogs and Other Personal Websites

One of the more successful evolutions of Web 2.0, blogs make up about a third of the Web. That essay, recipe, or movie review you read today is just one of more than 600 million. There are more than 30 million active bloggers in the US alone, and all of them are protected by Section 230.

Why they rely on Section 230: Unless you host your own blog or personal website on your own server, it is hosted by a provider who relies on Section 230. Blog hosting providers like Medium and Automattic’s WordPress depend on Section 230 to shield them from lawsuits over content posted by bloggers (or commenters on those blogs). Section 230 also protects the bloggers themselves from liability for comments on their blogs. Blogs are one of the long-lasting means by which people are able to create and share content online, but most people don’t have the technical expertise and time necessary to support their own privately hosted blog. Don’t forget that blogs are not just wonky policy articles like this one—they cover the spectrum of everything there is online, from journalism, cooking, gaming, and travel to cosplay, political commentary, knitting, and fashion.

What would happen if Section 230 were repealed: The availability of blog hosting providers would decrease significantly, and those who remained would impose restrictive policies on what kinds of content they host and what comments are allowed, along with lengthy legal review processes and other barriers to free expression.

An artistic representation of a web crawler/spider — i.e., an internet bot used by web search engines to index website content. Image by Haywood Beasley, CC BY 2.0, via Flickr.

Code Sharing Websites

Although you may not think of code sharing websites as being a core part of your internet experience, they are an important venue for the development of the software that defines that experience.

Why they rely on Section 230: If you haven’t visited a code sharing site and you aren’t a copyright buff, you may be wondering what code has to do with liability for speech. First, US courts have recognized computer code as speech protected by the First Amendment, whether many humans can understand it or not. Second, code and code sharing websites include lots of discussion in regular human languages, covering what the code does, how it works, problems it may have, and more. And, yep, you guessed it: both the discussions on collaborative coding sites and the code itself are sources of potential liability. Section 230 lets code sharing platforms host those discussions without being held responsible for what people say or what they do with the code.

What would happen if Section 230 were repealed: Opportunities for code sharing and collaborative development would be diminished, and subject to the same kinds of friction and barriers as many of the services already discussed. This would harm both innovation and transparency, with the impacts felt primarily by independent coders and small businesses, and it would significantly reduce opportunities for open source coding in general.

Cloud Storage and Computing Providers

The “cloud” is just a bunch of computers, somewhere else, which store, process, and allow access to information provided by others. If you use the internet, you use the cloud.

Why they rely on Section 230: Remember the “provides or enables computer access by multiple users to a computer server” phrase from the definition of “interactive computer service”? That is exactly what cloud computing is. Cloud providers let you use their servers to store and retrieve information as well as to perform computations on that information. The possibilities for liability are clear: hosting content provided by others includes the possibility that cloud customers will store or distribute content via the cloud that invites legal action. Imagine if you let a complete stranger use your personal computer: wouldn’t you want to be shielded from liability for whatever they might use it for? Now imagine if you opened a business where anyone on the internet can use your computer . . .

What would happen if Section 230 were repealed: As with many of the other services we’ve mentioned, cloud providers would need to protect themselves against liability for user content. They might do this in several ways, including: reviewing all content uploaded to, created on, or transmitted through their systems; implementing much more restrictive terms of use; and using indemnity clauses, along with collecting additional identity and contact information for customers, so that plaintiffs could more easily bring claims directly against other users. Such measures undermine the privacy of users and their opportunities for free expression, reducing the diversity of voices and viewpoints on the internet.

Spam Filters, Ad Blockers, and Security Services

These behind-the-scenes services make the internet experience safer and better for users by filtering and blocking unwanted content provided by other users and by performing various security functions, such as blocking malware and mitigating denial-of-service (DoS) attacks.

Why they rely on Section 230: Email is another one of those services that has become so ubiquitous that it’s hard to imagine life without it, but it, too, allows people to access and share information through a computer server. Many email providers also include features to make their services more usable or attractive for users, including tools to filter or block unwanted messages. Tools like spam filters and other software that shields users from certain kinds of content, such as ad blockers, are explicitly protected by Section 230(c)(2)(B). Without this protection, the providers of these services and software tools could face lawsuits if they block too much content, too little content, or even just the content of someone with enough money and lawyers to bully others through the legal system.

What would happen if Section 230 were repealed: Assuming, for the moment, that providers were able to protect themselves from liability well enough to offer email services, they still may not have enough legal certainty to offer spam or junk filters. That is, through liability waivers and terms of service, providers might be able to insulate themselves from claims from their own users, but they would face more difficulty insulating themselves from claims brought by those sending or receiving messages from outside their email domain. For example, a competitor might file suit against another email provider if its spam or junk filter blocked the competitor’s advertising emails. This hypothetical scenario is in fact very similar to the facts of the Malwarebytes, Inc. v. Enigma Software Group, USA LLC case. Without Section 230’s protections, many such services could simply vanish, leaving behind the flaming wreckage of your email inbox.

A computer server rack line in operation. Image by Tristan Schmurr, CC BY 2.0, via Flickr.


Lawmakers who call for repealing Section 230 should know what that would mean for the internet. Section 230 isn’t a “gift” to “Big Tech,” as some have claimed. It provides the fundamental legal certainty for providers to allow people to create, access, and interact with content on the internet—exactly how you might while waiting in line for a coffee you’ve ordered. It also empowers those providers to curate the services and virtual spaces they offer to meet the demands of their users, clients, and customers. This curation is protected by the First Amendment, but Section 230 makes it economically feasible by giving providers (and courts) a more efficient path to the same legal outcome.

Some lawmakers may see repealing Section 230 as a means to extract certain changes in policies and practices from a few large online social media platforms. However, that is an act of legislative extortion, holding the entire internet hostage until a few companies meet their demands. Instead of recklessly going down that road, we encourage Congress to engage a wider set of stakeholders who will be affected by the repeal—or substantial reform—of Section 230. Such dialogue is the first step toward developing a positive vision for the internet’s future, while acknowledging the real consequences of undermining the legal bedrock upon which the internet stands.

Stay tuned for more posts on Section 230 and its relationship with proposals to regulate online spaces!



Wikimedia Policy

Stories by the Wikimedia Foundation's Global Advocacy team.