A Chilean bill would prohibit community-based content moderation. It could outlaw the work of Wikipedia editors.

Wikimedia Foundation Policy
5 min read · Mar 10, 2022

Written by Wikimedia Foundation’s Amalia Toledo, Lead Public Policy Specialist for Latin America and the Caribbean & Franziska Putz, Movement Advocacy Community Manager

Image of the Chilean Senate hall: many desktop computers sit on long wooden desks arranged in a semicircle in front of a podium, which features men in suits and the Chilean flag.
Senado República de Chile from Flickr (CC BY-NC-ND 2.0)

The Chilean Congress has been considering a bill to regulate digital platforms since September 2021. The bill aims to curb the enormous power large digital platforms have over users’ content and data — a laudable and necessary objective. However, the bill’s means to achieve this goal could significantly disrupt the sustainability of the free knowledge movement that Wikimedia projects represent.

Below we summarise some of the bill’s problems, which the Wikimedia Foundation and Wikimedia Chile have also raised with the Senate Committee studying the bill.

Ambiguous definitions

The tendency to focus on a small subset of platforms is evident in how "digital platforms" and "digital platform providers" are defined in the bill. The definition of "digital platforms" makes no mention of the elements that make each platform unique, such as its operating practices, business model, number of active users, website visitors, or purpose.

Ignoring these differences means that community-led projects are forced to comply with obligations that have been designed to target commercial platforms.

The same is true when a "digital platform provider" (Article 3b) is understood as "the natural or legal person that offers and manages a digital platform." This definition covers both whoever offers a digital platform and whoever manages one. In the case of Wikimedia, the phrasing would bind multiple parties, including the Foundation, its affiliates, and volunteers, as decision-makers responsible for the platform.

The one-size-fits-all approach to regulation, which we have already seen elsewhere, such as in the European Union's Digital Services Act, is also prominent in one of the bill's principles (Article 4), which states that digital platforms must guarantee the provision of a universally accessible, quality, and non-discriminatory service. Making knowledge accessible for everyone is an essential principle at the core of Wikimedia projects. Yet again, vague wording diminishes the power of this principle: it assumes a universal "quality" standard and that platform providers create content. Nothing could be further from the truth for the Wikimedia projects.

The Foundation develops and maintains the infrastructure that allows people to create and use the knowledge available to meet their needs. The users build the content and improve its quality through deliberative processes publicly documented on the projects’ websites and based on established neutrality and reliable source standards. While today the parameters of “quality” content are determined by volunteers, passing this bill would transfer this power to the courts. Forcing platforms like Wikimedia to adhere to this provision would restrict users’ freedom of expression and agency to contribute to the sum of all knowledge.

Impractical content moderation requirements

The bill's language around content moderation is equally problematic. First, imprecise language may establish inordinate obligations. Article 6 creates contradictory obligations by stating that user-generated content "may not be removed unless they might be considered civilly injurious, libellous, or they constitute threats or constitute crimes established by other legal bodies or that incite to commit a crime."

On the one hand, content moderation is prohibited; on the other, certain content must be moderated. Not only is this confusing and a source of legal uncertainty, it also has negative consequences for users' freedom of expression and privacy.

The mandate to monitor certain content could legally oblige editors to monitor contributions and push the Foundation to constantly intervene so that no controversial content is hosted on its sites. Persistent surveillance of this kind could have a chilling effect on participation as individuals who may feel watched could choose to silence themselves.

Second, the language in the bill ignores the different content moderation practices deployed across digital platforms. Instead of applying nuance, it presumes that all forms of content moderation correspond to the vertical structure typical of social networks. Prohibiting content moderation, for example, would undermine the collaborative editing mechanics that have enabled Wikipedia to grow as a resource for verifiable and accurate information.

Ignores the international nature of community-led online platforms

Three additional elements of the legislative initiative could erode the viability of community-led digital platforms:

  • The norms on extraterritorial application of the law (Article 2);
  • the broadness of the right to be forgotten (Article 7); and
  • the vagueness of the user’s right to information (Article 10).

The territorial scope of the bill means that the law may apply to entities outside Chilean territory. As mentioned before, Wikimedia projects such as Wikipedia are conceived so that anyone, anywhere in the world, can access and contribute to the same version of an article.

The bill threatens to extend the right to be forgotten beyond user data to article contents. If this is not addressed, the right to be forgotten could be abused to interfere with articles such as biographies of living persons, which are already subject to specific community policies and notability requirements. Wikimedia Chile has already been involved in a costly and time-consuming legal process after an individual complained about biographical facts included in the Wikipedia article about them. By enabling tampering with factually accurate articles, the provision threatens freedom of expression.

The sustainability of community-led sites is also at risk. The mechanisms through which the bill empowers users to understand how their content is moderated and their data is used apply only to one type of platform: one where users have limited ability to participate in decisions about their content and data. This design ignores how community-led platforms function, where all users participate in decisions about their content and data. Strict compliance with this article would mean that a public justification has to be provided for every decision made about users or their content. Such nonsensical administrative burdens can become barriers that prevent individuals from participating in the free exchange of knowledge.

Conclusion

We appreciate the genuine intentions expressed in this bill to further the public interest. We also welcome regulations that encourage the development of digital environments that protect free access to knowledge and foster broad participation and productive collaboration among communities on the Internet.

However, responding to big tech's enormous power over people's communication by effectively prohibiting community-based content moderation can have the unintended effect of reducing participation by the very communities that create and use digital content, the participation that models such as Wikimedia's enable. Information is an ecosystem, and regulation everywhere must consider online participation in all its forms.
