What You’ve Heard Is True. The EARN IT Act is catastrophic for free speech and for privacy — and for Wikipedia.
Written by Kate Ruane, Lead U.S. Public Policy Specialist, Wikimedia Foundation
The EARN IT Act is a counterproductive attempt to address digital safety. It’s intended to protect children from harm, but in reality it will undermine our privacy rights, restrict free expression, and make it harder for community-led platforms like Wikipedia to engage in content moderation. Rather than pursuing the EARN IT Act, we call on lawmakers to work with civil society, impacted groups, and companies to create a comprehensive approach to regulating the online ecosystem that supports human rights, enhances privacy, and protects civil liberties.
Tomorrow, February 10, the United States (US) Senate Judiciary Committee is set to vote on a bill called the Eliminating Abusive and Rampant Neglect of Interactive Technologies Act (EARN IT Act). The bill has a noble and necessary aim: to eradicate child sexual abuse material (CSAM) online. However, it would eliminate Section 230’s free speech protections in some circumstances, meaning platforms could be legally responsible for any content created by their users if it violates state or federal criminal or civil laws regarding advertisement, promotion, presentation, distribution, or solicitation of CSAM. This approach to eradicating CSAM endangers essential digital rights that must be protected if we want an online ecosystem in which everyone can safely participate in the free exchange of trustworthy, quality information. As we highlighted alongside allied organizations, including the Center for Democracy and Technology, Public Knowledge, and Ranking Digital Rights, in an open letter, the proposed approach will:
- Endanger encryption, which will undermine the privacy many in our communities need to participate in the exchange of free knowledge;
- Weaken free speech protections in a way that will disproportionately harm vulnerable and marginalized communities.
Most concerningly, these dangers could interfere with the operation of Wikipedia itself and take power away from the community of volunteers, editors, and moderators who make Wikipedia the valuable global, reliable, free knowledge resource it is.
The EARN IT Act and its shortcomings are similar to other proposals made in the US Congress to amend Section 230, which generally protects online intermediaries from liability for content created by their users. The EARN IT Act’s form is reminiscent of SESTA/FOSTA, which carved out Section 230’s free speech protections for content related to sex trafficking and wound up harming the sex workers it purported to protect. Other current proposals are also designed to eliminate Section 230’s protections, albeit via diverse approaches. Some would reduce Section 230’s protections for categories of speech that can be difficult to define in practice, like hate speech or terrorist content. Others take almost the opposite approach, eliminating Section 230’s protections if a platform decides to take down vaguely defined categories of speech, like political speech.
These proposals all attempt to incentivize certain content moderation decisions by exposing speech intermediaries, like Wikipedia, to liability risk for hosting or refusing to host certain speech. However, they ignore the vital role that Section 230 plays in empowering the Wikipedia community to moderate content responsibly, safely, and accurately.
To understand what that means for Wikipedia and its related projects, it’s helpful to briefly explain how content moderation on Wikipedia works. Unlike other platforms with centralized and at least partially automated content moderation practices, the Wikimedia Foundation does not set editorial policy for Wikipedia or any other Wikimedia sites. Those policies are created by our community of volunteer editors, and content moderation decisions are made by those same human volunteers, not by automated systems. The community that edits and creates Wikipedia is the community that sets the rules for what can and cannot be on Wikipedia, and implements the processes for making those decisions. And those rules and processes are remarkably effective. Vandalism on Wikipedia is usually removed within six minutes. Wikipedia’s community content moderation practices are a model for handling and combating COVID-19 misinformation.
Yet, proposals like the EARN IT Act could disrupt those very effective processes and rules developed by community-led governance models and push us toward less effective methods of content moderation, including error-prone automated systems.
For all of its good intentions, the EARN IT Act will be ineffective at achieving its worthy goal. It also ignores the fact that, much as we wish it were otherwise, it is impossible to eliminate all criminal activity, including CSAM, from the internet. The real problem with the EARN IT Act and other proposals that create broad carve-outs to Section 230’s intermediary protections is that they would make it harder to moderate content the way Wikipedia’s community does. It is Section 230 that gives our community the freedom to develop its own rules and enforce high standards while preserving our community’s privacy and supporting freedom of expression. Without it, that community would not exist. Section 230 should not be amended without careful consideration of its impacts on community-governed platforms like Wikipedia. From this perspective, the EARN IT Act would do far more harm than good.
That doesn’t mean Congress should forgo all attempts at regulation. We agree that there are significant problems with the online ecosystem right now and that Congress should address them. For example, Congress could create comprehensive data privacy rules that would prevent companies from gathering people’s sensitive personal information and using it in unexpected ways, potentially even selling it to governments. Congress could also examine automated systems intended to capture our attention in order to extract more data for profit, and consider policy changes that would encourage collaborative spaces that benefit everyone. Congress could consider ways to provide meaningful transparency into the implementation of social media content moderation rules and practices, which could shed more light on the automated content moderation practices that disproportionately harm marginalized communities and help to devise ways to address those harms. And when it comes to the scourge of CSAM, the Justice Department is woefully under-resourced in its efforts to prosecute the people who create and distribute it, and Congress could act to rectify that.
Everyone needs an online ecosystem that operates in the public interest for the benefit of all, just as Wikipedia currently does. Lawmakers should be enacting policies that don’t simply aim to fix the problems listed above, but create a legal and economic environment that encourages participatory digital spaces where free expression is supported and privacy is protected. The Wikimedia Foundation looks forward to working with Congress to build that future.