Thinking Through Transparency and Accountability Commitments Under the Digital Services Act

Encouraging meaningful, innovative, and rights-respecting transparency frameworks for both companies and governments.

By Spandana Singh, Policy Analyst, New America’s Open Technology Institute

Global Network Initiative
The GNI Blog
Jul 20, 2020


The Digital Services Act (DSA) is a fundamental component of the European Commission’s roadmap to rethink “Europe’s digital future” and revise the existing legal framework for intermediaries and their responsibilities related to user content and conduct. As previously outlined in this blog series, whether the DSA yields binding regulations, a directive that guides member states, or a hybrid of both, these revisions could significantly influence freedom of expression and privacy around the world. As a result, it is vital that these approaches to content regulation be grounded in a strong, rights-respecting framework that emphasizes transparency and accountability from both Internet platforms and governments.

Over the past several years, Internet platforms have stepped up their efforts to provide meaningful transparency and accountability to policymakers, their users, and the public. Following the Snowden revelations in 2013, many global Internet and telecommunications companies began publishing transparency reports, which outline the volume, and more recently the types, of government demands for user data they receive.

In addition, over the past few years, many of these companies have expanded these reports to include the volume and types of legal requests for content removal they receive. In 2018, YouTube took the first step toward broader takedown reporting by publishing a comprehensive transparency report explaining how the platform enforces its own content policies. Companies such as Facebook and Twitter have since followed suit, broadening their reports to include data across products, as well as data on the impact of these moderation decisions and on their appeals processes. Further, a small number of Internet platforms, including Facebook, Google, and Reddit, have begun producing ad transparency libraries, which aim to provide insight into the scope and scale of digital advertising in categories such as housing, employment, and political ads. These corporate transparency efforts are valuable. However, as OTI highlighted in its Transparency Reporting Toolkit on Content Takedown Reporting and in a recent report series on how Internet platforms use algorithmic decision-making to shape the content we see and engage with online, platforms can — and must — do more.

The DSA can help encourage platforms to implement rights-respecting transparency and accountability mechanisms in their policies and procedures. The three own-initiative reports (“OIRs”) drafted by rapporteurs from the European Parliament’s Internal Market and Consumer Protection (IMCO), Legal Affairs (JURI), and Civil Liberties, Justice, and Home Affairs (LIBE) committees provide a valuable foundation for how such transparency and accountability commitments should be conceptualized. In particular, they emphasize the need for greater transparency and accountability around content moderation, digital advertising, and the use of algorithms. They also recommend the creation of an independent body that would oversee the implementation of these transparency commitments, among other things. As European policymakers think through the transparency commitments and obligations under the DSA, there are a few key considerations they must keep in mind.

First, policymakers must balance the desire for standardization with consideration of how specific platforms operate. Through our previous work, we have outlined how a lack of standardization in transparency reporting often makes it difficult to compare company efforts and understand what the larger content moderation ecosystem looks like. However, variations in transparency reporting, particularly around the metrics companies report on, also reflect the unique contexts and roles of different products and companies in the digital ecosystem, and show how different companies conceptualize meaningful transparency. As the European Union deliberates on transparency commitments under the DSA, officials should balance the need for a degree of standardization with an appreciation that different companies host varied kinds of users and content. Allowing flexibility for this type of platform-specific transparency also encourages companies to innovate around the metrics they produce and to explore different components and definitions of meaningful transparency.

The conversation around transparency and accountability in content moderation and algorithm use is continuously evolving, and any policies and requirements in this space should encourage innovation rather than stifle it. Further, any transparency-related obligations should recognize the associated cost and capacity needs and ensure that these requirements do not create barriers to competition for smaller or newer platforms. Finally, conversations around transparency commitments should also consider whom each transparency effort is intended for (e.g., users, researchers, or regulators) and how companies can provide adequate transparency to a range of parties while appropriately addressing data protection, competition, and other relevant concerns.

In addition to encouraging greater transparency and accountability from Internet platforms, the DSA should also create commitments for government transparency. This is particularly important as the content moderation landscape has expanded over the past several years to include Internet Referral Units (IRUs). These IRUs, operated by Europol as well as a number of EU member states, are tasked with identifying potentially violating content and flagging it to Internet platforms. However, IRUs often flag content for removal on the basis of violations of company content policies, rather than submitting formal legal requests to remove content for violating local laws. This practice raises significant concerns about the transparency and accountability of government intervention in the content moderation process, and the DSA should include provisions to address this fundamental gap.

As the European Parliament continues to deliberate on the format and structure of the DSA, it is vital that it ground these discussions in a rights-respecting transparency and accountability framework. As outlined above, these efforts should set a baseline for transparency that is not overly restrictive and that encourages innovative thinking around meaningful transparency and accountability measures. Finally, these commitments should ensure that as Internet platforms and governments play an increasingly interconnected role in the content moderation and algorithmic curation landscape, both are required to be transparent about their plans, procedures, and impacts.
