By Jason Pielemeier
The German “Law to Improve Law Enforcement in Social Networks” (Netzwerkdurchsetzungsgesetz, or NetzDG) was the outcome of a multiyear process led by the German Government to address hate speech and other illegal content being disseminated online. After a period of “voluntary” collaboration with major social media platforms was deemed not to have worked to the government’s satisfaction, the law was introduced in Parliament on May 16, 2017 and passed the very next month. In addition to criticizing the relatively short period for debate, critics voiced concerns that vague definitions and the rapid timeline for company responses would negatively impact freedom of expression, complained about the “outsourcing” of content adjudications to private platforms, and warned of potential unintended economic and international impacts.
Since NetzDG went into effect on January 1, 2018, it has served as a key reference point for those advocating both for and against explicit regulation of content decisions by social media platforms around hate speech, terrorist content, and “fake news”. Given this interest, it is worth examining how NetzDG has worked to date. For example, as Theresa May’s government develops legislation to make the UK “the safest place in the world to be online,” what lessons can be drawn about the potential unintended consequences of regulating online speech? Should the proposed EU regulation on preventing the dissemination of terrorist content online adopt NetzDG’s approach of creating a safe harbor for smaller platforms? And what can we learn from the way NetzDG has been used by illiberal states about the possible precedents and pretexts that other similar efforts could create?
In a recent report, “Germany’s NetzDG: A key test for combatting online hate,” Olivia Knodt of the Counter-Extremism Project (CEP) and William Echikson of the Centre for European Policy Studies (CEPS) offer a thoughtful analysis of the law’s implementation thus far, characterizing NetzDG’s impact as “somewhere in between” supporters’ view of the law as a “necessary and efficient response” to online extremism and hate, and critics’ view of “an attempt to privatise a new draconian censorship regime.” While this overall assessment is fair, some of the authors’ conclusions are unsubstantiated and risk skewing objective assessment of both NetzDG and other, similar legislative proposals under consideration elsewhere. It may be too soon to draw definitive conclusions about NetzDG’s impact, but we should all agree on the importance of getting the facts right in order to facilitate informed, cogent debate about how best to effectively address hate speech and other forms of controversial content online.
When NetzDG was proposed, critics including my organization, the multi-stakeholder Global Network Initiative, or GNI, raised concerns about its short timeline for removal decisions, the vagueness of the categories of content prohibited, and the specter of major financial penalties for failure to demonstrate compliance. The result, they warned, would be the “over-removal” of content that is not prohibited under community standards, and may even be protected under German law.
Knodt and Echikson cite the first batch of NetzDG-specific transparency reports as evidence that the law has not led social media platforms to adopt a “take down, ask later” approach and that “little evidence exists of widespread blocking.” However, a careful reading of the first set of transparency reports makes it clear that the evidence presented in support of this claim is at best incomplete, and that the concerns about over-removal remain quite real.
NetzDG requires social media and content-hosting companies with over two million German users to publish transparency reports every six months documenting their compliance with the law. According to the reports published by Change.org, Facebook, Google, and Twitter covering the first six months of 2018, over 500,000 unique pieces of content were reported to the platforms under the law, and over 100,000 of those were removed.
Unfortunately, there is not sufficient publicly available information to determine how much of this kind of content was being reported to and acted upon by companies before the law went into effect (NetzDG did not change existing speech prohibitions under German law). Even as to the content removed since the law came into effect, the transparency reports do not help us understand how much of that to “attribute” to NetzDG. As Echikson and Knodt acknowledge, “[i]t remains unclear whether NetzDG content would have been reported via community guidelines in the absence of the law.”
Further complicating efforts to ferret out the impact of NetzDG is the fact that when governments or users flag content as “illegal”, platforms often review it against their community standards first, and only consider whether it violates local law if and when they determine that it does not violate those standards. As the report notes, “all reported content is first reviewed under Facebook’s community standards. If these are violated, then the piece of content is removed globally and is not included in the specific NetzDG transparency report.” As a result, the relatively small number of “NetzDG removals” in Facebook’s transparency report likely does not represent a true reflection of the content that would not have been taken down “but for” NetzDG.
What we do know from the transparency reports and other reporting is that all of the major platforms have increased the number of staff focused on German language content referrals. That fact, combined with the likelihood that the law has increased awareness and reporting of content by users, would suggest that there is almost certainly more content being taken down in Germany (under both community standards and the law) since NetzDG, even if we can’t determine precisely how much.
Of course, an increase in reporting and removal of content is exactly what was intended and does not in itself prove that the law has resulted in the take down of content that should be protected under company standards and/or German law. On this point, Echikson and Knodt seem to suggest that the relatively high percentage of requests that were rejected by platforms should provide some reassurance. For instance, despite the large amount of content reported in Google’s NetzDG report (214,827 “items”), the authors claim that “little evidence exists to demonstrate over removal. During the first six months of 2018, Google rejected the majority of NetzDG complaints (about 73%).”
Rather than cause for reassurance, however, the fact that Google rejected more than 7 in 10 requests, notwithstanding the significant financial penalties it might face for getting these decisions wrong, actually signals that users are not very discerning in terms of what they are flagging. Combining that insight with what we know about how difficult these determinations can be should raise concerns. Mark Zuckerberg recently noted that “depending on the type of content, our review teams make the wrong call in more than 1 out of every 10 cases” (the percentage of erroneous takedowns in the copyright context has at times been found to be even higher). Even if we assume an over-removal “error rate” of just 5%, this would mean there were almost 3,000 items that were over-removed from YouTube in Germany in just six months.
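The back-of-the-envelope arithmetic behind that figure can be sketched as follows, using only the numbers cited above from Google’s first NetzDG report; the 5% error rate is, as stated, an illustrative assumption, not a measured value:

```python
# Rough estimate of over-removals implied by the figures above.
# items_reported and the ~73% rejection rate come from Google's
# first-half-2018 NetzDG report as cited in the text; the 5%
# error rate is a hypothetical assumption for illustration only.

items_reported = 214_827       # items flagged under NetzDG
rejection_rate = 0.73          # share of complaints Google rejected
assumed_error_rate = 0.05      # assumed over-removal rate among removals

items_removed = items_reported * (1 - rejection_rate)
estimated_over_removed = items_removed * assumed_error_rate

print(f"Items removed: ~{items_removed:,.0f}")
print(f"Estimated over-removals at 5%: ~{estimated_over_removed:,.0f}")
```

At roughly 58,000 removals, a 5% error rate yields about 2,900 items, hence “almost 3,000.”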
The significance of this over-removal from a free expression perspective is impossible to assess without being able to examine the specific content and context of these items. Still, anecdotal evidence taken from court cases brought by aggrieved users, as well as instances highlighted in the media, demonstrates that the impact is real and not insignificant. This in and of itself is concerning, even before one considers whether specific types of users or content are more likely to be unjustly impacted, or factors in the second-order effects that this impact may cause, such as chilling effects, speech or opinion “martyrs”, or the conflation of satire or humor with hate speech.
However one chooses to interpret the available evidence, we should all be able to agree that more transparency — including from the German government — would be helpful and should be encouraged. Until we have more data, commentators and policy makers should be circumspect about drawing conclusions about NetzDG’s impact on freedom of expression.
The Accountability Gap
In an April 2017 statement on NetzDG, GNI Board Chair Mark Stephens noted that “[t]he practical effect of this bill would be to outsource decisions on the balance between the fundamental right of freedom of expression and other legally protected rights to private companies.” This concern has also been articulated by UN Special Rapporteur on freedom of opinion and expression David Kaye and others. Echikson and Knodt note these concerns but do not engage with them directly. Yet this is a fundamental point that bears underscoring and unpacking, especially as more and more States are expressing interest in laws that effectively outsource content decisions to private platforms.
In democratic systems, decisions by States to adjudicate content are generally transparent and the State is ultimately accountable to both publishers and the public for specific decisions (via judicial challenge), as well as for its broader approach to content (via elections and other democratic feedback mechanisms). However, today the majority of online content policing, adjudication, and removal decisions are made and executed by companies pursuant to their community standards. As Prof. Jack Balkin has pointed out, this creates an accountability gap, because companies are generally not judicially or democratically accountable (at least in any direct sense) for their decisions or their broader approach.
NetzDG forces companies to either expand their community standards to ensure they cover all categories of content declared illegal under German law, or to add separate criteria and reporting mechanisms to cover any delta between the two. It also compels companies to increase the resources they commit to policing, adjudicating, and removing content. This not only widens the accountability gap between the public and platforms, it can also weaken the claims of individuals and public authorities attempting to challenge instances or patterns of over-removal. Indeed, we have seen companies citing NetzDG, entirely appropriately, as an explanation and defense in actions brought against them for alleged over-removal.
To the extent the law could have done more to provide definitional clarity, procedural guidance, and requirements for redress, there is room for improvement. Although insufficient to address this “accountability gap,” the transparency requirements in NetzDG are positive, and the efforts that platforms are making to create more transparency and redress around their content decisions should be welcomed. The recent decision by Facebook and Google to take advantage of the law’s invitation to work with the “Voluntary Self-Control for Multimedia Service Providers (FSM),” an independent association that works as an intermediary between media companies and the Commission for the Protection of Minors, to address the hardest content decisions could be productive. Regardless of how that goes, it will help illustrate the potential for, as well as the challenges of, independent mechanisms for content review — an idea that has received significant attention of late.
Unnecessary and Disproportionate
GNI and others raised concerns about the law’s consistency with international human rights law principles. In practice, international human rights law (and European law) requires that laws which infringe on free expression satisfy a three-part test of legality, legitimate purpose, and necessity. Echikson and Knodt express some sympathy for these underlying concerns, noting for instance that “NetzDG fails to differentiate between terrorist incitement and counterfeit products” and that the German government has so far failed to “offer a clearer definition of the law’s vague description of ‘obviously illegal’ content or systematic failure of compliance”.
While others have underscored legitimate concerns about the speed with which the law was debated and passed and the vagueness of relevant terms, the primary challenge to the legal legitimacy of NetzDG that I wish to unpack relates to the issue of “necessity.” The principle of necessity is generally understood to mean that any expression-restricting measure by a State (i) must be appropriate to achieve a recognized, legitimate purpose, (ii) must be the least intrusive means for achieving that end, and (iii) must be proportionate to the interest being protected.
While it is beyond the scope of this essay to explore each of the underlying categories of illegal content referenced in the law, at a fundamental level, it is not clear how a deliberative process focused on a specific set of concerns related to “hate speech” can be reconciled with the government’s decision to address 21 very broad categories of illegal speech, most of which have nothing to do with hate speech. In addition, to the extent that effectiveness is an indicator of whether a law is “appropriate”, Echikson and Knodt’s conclusion that “NetzDG seems to have done little to advance the goal of eradicating extremist content from the internet” should trigger careful reflection. Finally, the decision to outsource determinations regarding illegality without stipulating rights of review and redress (i.e., the “accountability gap” outlined above) further undermines any argument that the method of speech restriction outlined in NetzDG is the “least intrusive” means for achieving the underlying objective of addressing hate speech, or that it is proportionate to that interest.
The report closes with a few conclusions and recommendations, which seem to be targeted at the European Union and others who may be looking at NetzDG as an example to follow. One particular recommendation is worth further scrutiny. The report calls on the European Union to “implement binding obligations for platforms to work with Europol to build up a comprehensive database of hashes, to prevent re-uploads of known harmful content.”
There may be some value in implementing systems to prevent re-uploading of already adjudicated content, as well as in sharing information about those efforts across platforms. However, there are also real and often under-appreciated limitations on the effectiveness of existing technology to search, analyze, compare, and filter content online. These limitations can lead to erroneous content removal that may be compounded by re-upload filters. Beyond that, any system that gives governments an explicit say in determining what constitutes “known harmful content” or making related adjudications, and then allocates enforcement to private platforms would underscore the accountability gap concerns articulated above. Finally, it is easy to see how the mandated use of upload filters for “re-uploads” might quickly lead to similar mandates for “proactive” content removal, effectively automating prior-restraints on speech.
There are two other arguments that were not directly cited in GNI’s original statement but which GNI and others have often made about speech-restrictive laws enacted in democratic countries. The first is that the law undermines the broader, multistakeholder effort to push back against speech restrictions around the world. The point here is not to draw any lines of causality between laws like NetzDG and bad laws in autocratic States, but rather to illustrate how the former impacts the fight against the latter. By enacting a law that forces platforms to comply with domestic speech laws under threat of heavy penalties, accompanied by requirements that companies subject themselves to personal jurisdiction by placing representatives in country, NetzDG makes it more difficult for platforms, civil society organizations, international institutions, and governments (especially the German government, whose leadership on international human rights issues is increasingly important) to criticize similar laws in non-democratic countries. This point has been brought home concretely by the fact that a NetzDG clone has been introduced and is moving forward in the Russian Duma.
The second concern is about the extent to which outsourcing resource-intensive, content-adjudication functions to private platforms creates barriers to entry and/or growth for smaller companies. To its credit, and in contrast to the draft regulation on online terrorist content being considered at the EU-level, NetzDG attempts to address this concern in part by limiting its reach to platforms with at least two million German users. Notwithstanding this carve-out, the law covers Change.org, an online petition site not typically considered to be a concerning platform for hate speech. While Echikson and Knodt note that “the expense of implementing NetzDG was high” for Change.org, they also cite the company as stating that in its opinion NetzDG does not restrict freedom of expression. However, Change.org’s Head of Global Policy Sunita Bose clarified to me in an email that: “In our experience to date, NetzDG has only seen us remove content that would have otherwise violated our existing policies, but that’s not to say this will always be the case. There are many areas of uncertainty around how a platform applies NetzDG, which give it a high potential to restrict freedom of expression as it encourages an ‘if in doubt, take it down’ approach. That’s a particular concern for smaller companies that may not have the legal resources or time to challenge every unreasonable request made under the law.”
And of course the reports do not tell us about the ways NetzDG may be influencing decision making by companies that are approaching the two-million-user compliance threshold. It certainly seems logical that such a company (say Snap or the German platform Xing.com) might at least hesitate before growing their German-language presence or their German-market targeting, if they deem the cost of compliance significant in comparison to the potential returns from such expansion.
NetzDG has attracted attention from advocates, scholars, and government officials, in part because it is seen as one of the first significant tests of the previously fashionable, but increasingly contested proposition that imposing liability on internet intermediaries would undermine both freedom of expression and economic incentives for enhanced investment in related products and services. Echikson and Knodt’s report contributes to this debate, which has critical implications for the future of the Internet. While the report offers a number of valuable observations, it does little to disprove the concerns that GNI and others expressed. Looking ahead, it is crucial that the companies and the German government continue to be transparent about the requests and decisions made under this law, the steps they are taking to implement it, and the consequences, both intended and unintended, that these efforts are having. Civil society organizations, academics, companies, and governments need to continue having informed, evidence-based debates that take into consideration the immediate and eventual implications of NetzDG, not only for the sake of German users, but for the sake of global digital rights.