Deep Dive: The United Kingdom’s Online Safety Bill

Wikimedia Foundation Policy
9 min read · Nov 17, 2022

Without Further Revision, Bill Will Harm Wikipedia and Other Open Knowledge Projects

The second deep dive in our series on online safety bills.

Palace of Westminster, meeting place for the two houses of the Parliament of the United Kingdom, at night. Image by JeyHan, CC BY-SA 3.0, via Wikimedia Commons.

Written by the Wikimedia Foundation’s Tina Butoiu, Legal Counsel; Aly Marino, Tech Law and Policy Fellow; Phil Bradley-Schmieg, Lead Counsel; and Miguelángel Verde, Senior Editor.

Governments worldwide are drafting and approving legislation to address online safety challenges and to hold technology platforms accountable for harmful content spread on their websites. The Wikimedia Foundation is reviewing several of these new laws and their consequences for free and open knowledge as well as for community-driven online projects. In this third blog post of our series (you can read the introduction here, and our review of the Australian Basic Online Safety Expectations here), we return to our discussion of the United Kingdom’s Online Safety Bill (UK OSB), building on our early impressions. Now that the OSB is due to return to Parliament, we share our concerns about its lack of protections for community-driven content moderation processes, the privacy implications of collecting user data to fulfill mandatory age assurance requirements, and its need for strong safeguards and clear definitions to protect freedom of expression.

Wikipedia and free and open knowledge in the UK

Within minutes of the death of Queen Elizabeth II, Wikipedia volunteers had updated her Wikipedia page. Even small edits to the pages of the late Queen and the new King were hotly debated, sometimes for days, as anyone can see from the publicly visible “Talk” pages and edit history logs. To enforce Wikipedia’s rules for reliable, neutral content on this and other pages related to a solemn moment in UK history, Wikipedia editors, all volunteers from around the world, organized a task force that ran like a traditional newsroom. Monitoring developments 24 hours a day, they were able to quickly remove instances of upsetting vandalism at a moment of national grief.

The work of hundreds of volunteers allowed millions worldwide to consult the Queen’s reliably updated and sourced article 19.9 million times in a single day. During September 2022, English Wikipedia had more than 784 million page views from the UK alone. Because volunteers edit in real time, pages are continuously updated so that readers access the latest information. This speedy editing response to breaking events is well known and motivated by a public service mission. Recent examples of this commitment include the articles on the Prime Minister of the UK, Rishi Sunak, and on Sunak’s recent predecessors, in addition to those mentioned above.

Wikipedia’s volunteer-driven governance model is what allows all of this to work, since it facilitates decentralized decision-making about content on the website. This model of curation of free and open knowledge is led by volunteers who collaborate to expand the encyclopedia and maintain high quality information that is freely accessible around the world. It depends on strong protections for the right to freedom of expression and privacy, and in turn it furthers the right to participate in culture and science, as well as the right to education.

The Wikimedia Foundation, the nonprofit host of Wikipedia, together with affiliated organizations such as Wikimedia UK and the larger movement of volunteers, supports efforts to make the internet safer. When people are harassed or otherwise feel unsafe communicating online, their ability to access, create, or share knowledge is diminished. We believe online safety can only be achieved when adequate safeguards for privacy and freedom of expression are in place.

Unfortunately, the UK OSB not only threatens freedom of expression and privacy for readers and volunteers alike, but also threatens Wikipedia’s volunteer-driven governance model. In order to “make the UK the safest place to go online,” the legislation seeks to impose numerous duties on platforms hosting user-generated content, including requirements to implement processes that limit or prevent access to illegal or harmful content. Such duties, as currently drafted, will interfere with the way Wikipedia works.

While the OSB as it stands in early November 2022 has been revised to address serious concerns about who has the power to define and order deletion of “lawful but harmful” content affecting adults, many aspects of the bill remain highly problematic. Chief among them is its failure to protect freedom of expression and community-driven content moderation processes. We are also deeply concerned about the privacy implications of collecting user data for mandatory age verification. With the shared goal of making the internet better and safer for all while also protecting Wikipedia and other Wikimedia projects, we offer our recommendations for revising the OSB.

Protect Wikipedia users’ privacy

The Foundation and the community of volunteers care about the safety of children online. Unlike many commercial services, we do not target people of any age with advertisements, or profile them in order to amplify personalized content. Mandatory age verification or assurance (i.e., “age-gating”) would force the Foundation to collect far more user data than it does today in order to know a reader’s age, exposing adults and children alike to new security and privacy risks. Worse, if a legislative precedent is established that forces us to collect such data about UK users, we can expect many other governments around the world to impose similar requirements, seriously exposing our community to security and human rights risks.

For that reason, the OSB should not require all platforms to automatically shield children from certain content in ways that would actually weaken their privacy and security. Currently, in order to safeguard the privacy and personal safety of readers and of volunteer contributors to Wikimedia projects, the Foundation collects very little personal information about people who visit our websites, and retains that information for only a short time. This is critical for the many people who face threats of political surveillance and retribution, including those sharing or accessing information on Wikipedia about the invasion of Ukraine from Belarus. Our firm commitment to protecting the privacy of our large international user base is necessary so that volunteers and readers alike can trust that they will not be tracked in their activities on Wikimedia platforms.
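
To make this data-minimization principle concrete, here is a minimal, purely illustrative sketch of a time-limited retention job: identifiers are deleted once records age past a retention window. The field names and the 90-day window are assumptions for illustration only, not a description of the Foundation’s actual systems.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention window; real policies vary by data type.
RETENTION_WINDOW = timedelta(days=90)

def scrub_expired_identifiers(records, now=None):
    """Drop personal identifiers (here, a hypothetical 'ip_address'
    field) from records older than the retention window, keeping
    only non-identifying fields."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - RETENTION_WINDOW
    for record in records:
        if record["timestamp"] < cutoff:
            # Delete the identifier outright rather than masking it,
            # so an expired record cannot be re-linked to a person.
            record.pop("ip_address", None)
    return records
```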

A mandate to collect user information in order to protect users would be not only counterproductive, but potentially ineffective: even the best age assurance tools have been shown to be inaccurate. At the same time, international human rights standards stipulate that states have a duty to protect children’s right to form and express their opinions without interference from automated processes of information filtering and profiling. We urge the UK government to uphold this duty, using the law as an important tool to ensure that internet platforms respect and protect users’ rights.

Given the serious human rights and security issues of the OSB’s current approach to age verification, policymakers should instead consider giving some platforms the option to develop open source tools for users to have more control over whom they interact with and which content they access. Such an approach would be particularly suitable to avoid placing disproportionate requirements for data collection and responsibilities for data security on platforms that are not designed or equipped for them, and which lack the capacity and resources to maintain security for large amounts of personal data.
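To illustrate the kind of user-side tool this paragraph envisions, here is a minimal sketch of a client-side content filter, assuming hypothetical category tags and a blocklist chosen and stored entirely on the reader’s own device. It is not an existing Wikimedia feature; the point is that the filtering decision never requires the platform to learn the reader’s age, identity, or preferences.

```python
# Hypothetical sketch: the category blocklist lives on the reader's
# own device, so the platform never learns who the reader is or what
# they chose to hide. The category labels below are invented examples.

BLOCKED_CATEGORIES = {"graphic-violence"}  # chosen locally by the user

def should_display(item, blocked=BLOCKED_CATEGORIES):
    """Return True unless the item carries a self-declared category
    tag that appears in the user's locally stored blocklist."""
    return not set(item.get("categories", [])) & blocked

articles = [
    {"title": "Local history", "categories": ["history"]},
    {"title": "War photography", "categories": ["graphic-violence"]},
]
visible = [a for a in articles if should_display(a)]
# visible contains only "Local history"; the filtering decision
# never leaves the reader's device.
```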

Protect community-driven content moderation

To protect Wikipedia, the OSB should explicitly recognize and support community-governed content moderation systems, which are highly effective at countering harmful speech and protecting human rights. Obligations placed on nonprofit, public interest platforms with decentralized, volunteer-run content moderation models like Wikipedia should be different from those required of for-profit platforms, whose top-down, centrally directed content moderation systems support advertising-driven business models designed to maximize profit for shareholders. Wikimedia projects’ successful model of community collaboration and deliberation empowers volunteers to consider the context and sourcing of every sentence or image. This allows them to make nuanced and thoughtful decisions, and to avoid the mistakes and over-censorship common to the automated flagging and removal processes used by commercial platforms. New obligations to automatically remove, block, or filter certain content, or to respond to complaints within timeframes so short that they prevent meaningful community decision-making, are simply not compatible with community governance models like Wikipedia’s.

Instead, the OSB’s drafters should seek to align with the European Union’s Digital Services Act (DSA), which recognizes the difference between centralized content moderation carried out by employees, and community-governed content moderation systems. The DSA also explicitly prohibits general monitoring obligations — i.e., rules that would require platforms to screen and monitor all activity and content. Like the DSA, the OSB should not impose a new duty to limit or “prevent” access to harmful content in relation to people of any age. Such a duty poses an impossible challenge for volunteer-driven systems due to almost infinite potential interpretations of what does and does not constitute “harm.”

Digital spaces like online encyclopedias and libraries should, furthermore, be exempt from such duties because they provide the public with access to diverse and reliable sources of educational content and information. Wikipedia and other Wikimedia projects are designed to make information easily accessible and freely available. Unlike profit-oriented platforms, Wikimedia projects provide information to individuals without exploiting their data or attention, and without targeting them with ads.

Policymakers should narrow the scope of the OSB by carving out “harmful” content (regardless of whether that scope seeks only to protect children from such content) and by specifying the definitions of targeted content more precisely. This would reduce the compliance burden for nonprofit organizations like the Foundation as well as for other hosts of smaller platforms, and protect freedom of expression as well.

Protect freedom of expression

We are pleased that the UK government explicitly requires platforms to consider the right to freedom of opinion and expression. However, the OSB presently lacks the strong safeguards and clear definitions necessary to ensure that it does not cause the removal of educational material, medical information (e.g., documentation of the COVID-19 pandemic), and other reliably sourced, fact-based content appearing across Wikimedia projects.

The OSB’s most recent publicly available draft includes requirements to remove or prevent access to content that is “harmful,” which could mean losing an accurate historical record as well as access to reliable information. Broad requirements to remove this content, as currently defined, would be detrimental to freedom of expression: What is and what is not “harmful” can greatly depend on an individual’s point of view and preferences or, more worryingly, the views of the government of the day. Marginalized voices in particular are at risk of being further silenced by top-down removal and content suppression requirements for “harmful content.”

Therefore, the definitions of “harm,” “harmful content,” and “illegal content” should be updated to ensure that platforms are not incentivized or required to over-censor content that has educational, journalistic, artistic, religious, cultural, health, or other public interest value, which would violate users’ rights to freedom of expression under Article 19 of the Universal Declaration of Human Rights and Article 10 of the European Convention on Human Rights.

Furthermore, to mitigate at least some of the risks to freedom of expression and privacy, the OSB’s drafters should remove clauses related to criminal liability, specifically the risk of imprisonment under Clause 97 of the OSB.

Lastly, any requirements around keeping internet users safe from harm should also protect end-to-end encrypted communications. The OSB should neither discourage nor prohibit their use, nor disincentivize platforms and other service providers from offering them to safeguard the privacy and safety of their users.

Conclusions

The Wikimedia Foundation has long recognized its responsibility toward society to provide a safe environment online, collaborating with volunteers to address the challenges they face through human rights commitments, due diligence, and our Universal Code of Conduct. We do so to ensure that Wikimedia platforms are safe and inclusive for everyone, promote knowledge equity, and protect freedom of expression and privacy.

The UK OSB should be revised to effectively address the safety challenges that the internet and platforms face worldwide. Without significant amendments to protect community-driven content moderation and privacy, it will merely become a blueprint for legislation that targets a handful of large for-profit platforms by adding policing (i.e., “safety duties”) to their responsibilities. Unfortunately, such legislation and its burdensome requirements could end up breaking our successful model of content moderation, which has allowed a global community of volunteers to create the world’s largest online encyclopedia.

As a nonprofit organization with a mission to make educational content and information freely available to everyone, we urge UK policymakers to amend the OSB to protect Wikipedia as well as safe and free access to online knowledge.
