Defending free knowledge ten years after Wikipedia went dark

Ten years ago, Wikipedia went dark to protest a pair of dangerous bills in the U.S. Congress. Today, we call on lawmakers everywhere to protect free knowledge.

Rebecca MacKinnon
Down the Rabbit Hole
5 min read · Jan 18, 2022


Image: Wikimedia Foundation, CC BY-SA 3.0.

On January 18, 2012, Wikipedia went dark for a day. For 24 hours, anyone, anywhere, who tried to access any article on English-language Wikipedia was confronted by a single black page with the message: “imagine a world without free knowledge.”

The decision to black out all of English Wikipedia, globally, was not made lightly. It took three days of non-stop debate and deliberation by more than 1,800 volunteers who edit and administer Wikipedia.

Their decision made Wikipedia an important focal point for a protest movement involving over 115,000 commercial and non-profit websites against proposed anti-piracy legislation in the United States — the Stop Online Piracy Act (SOPA) in the U.S. House of Representatives, and the PROTECT IP Act (PIPA) in the U.S. Senate.

As the Wikimedia Foundation’s then-Executive Director Sue Gardner put it at the time, by enabling censorship of online speech by private companies in the name of copyright protection, the bills “would seriously damage the free and open Internet, including Wikipedia.”

A positive vision for the internet’s future

Ten years later, the Wikipedia community has no current plans for another global blackout to protest legislation we oppose. But policy threats to free knowledge are pervasive.

Authoritarian efforts to block, control, or intimidate community-run free knowledge projects like Wikipedia are increasing and growing more sophisticated. Yet even in democracies, lawmakers seeking to hold the largest commercial internet platforms accountable for abuses of power, or for harms caused by toxic business models, must avoid inflicting collateral damage on free knowledge communities and the people who depend on their work.

If free knowledge communities and movements are to survive and thrive, we need regulators to be “for” an internet that supports open and free societies, not just “against” an internet dominated by a handful of powerful platforms.

Internet policymaking should advance a positive vision of a better future: What sorts of laws and policies will help people educate themselves, connect with the broader world, and make informed decisions?

We believe that informed and empowered communities require free speech protections that not only protect press freedom and academic freedom, but also enable free knowledge projects like Wikipedia to thrive.

A community-led approach to platform governance

Wikipedia’s distributed, community-led model of content governance reinforces information quality and accuracy. It seeks to prevent the spread of misinformation and disinformation without the algorithms and advertising that serve commercial interests.

For an eye-opening example with life-and-death implications, look no further than Wikipedia’s governance of online information related to COVID-19.

As one 2021 biomedical research paper put it, Wikipedia’s “coronavirus-related articles referenced trusted media sources and high-quality academic research.”

This high quality is the product of processes and rules developed and enforced by Wikipedia’s community of volunteer editors — not by people whose full-time job is website content moderation, or by algorithms. In fact, many of the people who edit and manage Wikipedia pages related to COVID-19 are doctors, scientists, and medical professionals. These volunteers collaborate closely through the WikiProject COVID-19 page, where rules about what belongs on a Wikipedia page and enforcement decisions about content are discussed and debated openly, for anyone to view.

In setting and enforcing content rules, Wikipedians are exercising their right to govern content on a digital platform that they not only contribute to but also help build and operate. In the United States, that right is protected by Section 230 of the Communications Decency Act.

Section 230 makes it possible for Wikipedia’s volunteer editors to enforce community rules for what content is or is not allowed on the platform. The law protects their right to delete content that is lawful but misleading, or low quality, or otherwise inconsistent with the community’s sourcing rules, without fear of being sued by whoever created it.

Today, this liability shield is under threat from both sides of the U.S. political spectrum. Politicians on the right want it abolished because they claim it allows platforms to discriminate against people who promote alternative versions of reality not backed by “mainstream” scientific research or journalistic sources.

Politicians on the left are calling for reform or revocation of Section 230 due to concerns that it shields social media from liability even when executives are aware that their platforms are being used to spread anti-vaccination viewpoints or organize violence against government institutions.

Smart regulation must protect free speech

Ten years ago, the message from the Wikimedia movement to lawmakers was “leave the internet alone.”

The world — and the internet — has changed dramatically in these ten years, and smart regulation is badly needed to address genuine harms. But it is clear that the public interest will not be well served by eliminating the free speech and content governance protections that Section 230 affords all types of digital platforms, including Wikipedia.

The good news is that there are many other ways for lawmakers and regulators to protect and support everyone’s right to build and govern technologies and online platforms that serve the public interest.

As many of our allies regularly point out, privacy law that curbs platforms’ power to track and profile users is urgently needed. Lawmakers and regulators can also take steps to give people more choice and control over what online services and platforms are available to them. They can require more transparency and accountability about whether, how, and why content on the platforms people use and depend upon is moderated, curated, prioritized, and amplified.

Wikipedia’s volunteer community has proven that the public is best served when people from all walks of life are treated not just as users or consumers of online platforms, but as empowered contributors, stewards, and community leaders.

We need smart laws and thoughtful public policies that support such a world — and make it easier for everyone, everywhere, to participate in the creation and sharing of knowledge regardless of who or where they are.

Please register to join me on Tuesday, January 18th, for a panel discussion hosted by Georgetown Law in Washington, DC, featuring remarks by Wikipedia founder Jimmy Wales. For more information on how and when to join, and to register, please click here.

Rebecca MacKinnon is Vice President for Global Advocacy at the Wikimedia Foundation. Follow her on Twitter at @rmack.


VP for Global Advocacy, Wikimedia Foundation. Author, Consent of the Networked. Co-founder, Global Voices. Founder, Ranking Digital Rights. Twitter: @rmack