Remedy and Enforcement in the Digital Services Act

Molly Land
Published in The GNI Blog
Aug 4, 2020

Embodying Principles of Legality, Comity, and Subsidiarity. Written by Professor Molly Land, UConn Human Rights Institute.

Credit: Sébastien Bertrand / Flickr under a CC BY 2.0 license.

In the field of human rights, remedy and enforcement are essential but often elusive. Access to remedy, for example, has been called the “forgotten pillar” of the UN Guiding Principles on Business and Human Rights. And while enforcement is the ultimate goal of human rights law and practice, it often seems far beyond reach. By including provisions on remedy and enforcement with respect to content on social media platforms, the European Union’s new Digital Services Act (DSA) will be an important step forward in promoting business accountability for human rights harms.

Although the DSA’s focus on remedy and enforcement is essential, the EU will need to establish the right degree and scope of supervision for it to be effective. All three own-initiative reports (OIRs) from committees of the European Parliament discuss enforcement and remedy to varying degrees. Two reports, from the Internal Market and Consumer Protection (IMCO) and Legal Affairs (JURI) committees, call for the establishment of a dispute resolution mechanism to ensure access to remedy, and the report from the Committee on Civil Liberties, Justice and Home Affairs (LIBE) appears to require access to judicial redress for users. The JURI report notes that dispute resolution mechanisms should be composed of independent legal experts with competence over both legal and regulatory requirements, as well as platform rules. All three envision the creation of an EU authority with the power to remedy procedural and transparency failures. These proposals should incorporate greater attention to the principles of legality, comity, and subsidiarity in order to remain consistent with human rights and international law.

1. Legality

First, as Richard Wingfield recommends in his post in this series, the new DSA must clearly distinguish between illegal content and content that is harmful but not illegal. With respect to harmful but legal content, the principle of legality would require that the DSA limit itself to, at most, providing mechanisms for the enforcement of contractual obligations. To the extent that a user is aggrieved by a post that is offensive but not illegal, the DSA could not, consistent with the principle of legality, require the platform to remove the content. The DSA could, of course, require a platform to provide a grievance procedure for any contractual remedies a user may have against the platform. Because platforms can draft their terms of service in extremely broad ways, however, users may not always have such remedies.

Limiting enforcement to illegal rather than merely harmful content is all the more important because online enforcement is more likely to be disproportionate than its offline equivalent. Offline, we have a range of institutional features, such as prosecutorial and judicial discretion, that help ensure that enforcement is directed at the speech that is most harmful. Enforcement by platforms will not have the same kinds of built-in discretion and considered human judgment that can be instrumental in ensuring proportionality, particularly as platforms increasingly rely on automation to detect certain kinds of harmful speech. This risk of disproportionate enforcement counsels restraint in the scope of speech the DSA seeks to regulate.

2. Comity

Second, provisions for enforcement and remedy should incorporate the principle of comity. The DSA need not be limited only to content uploaded from within a member state (as the JURI OIR seems to suggest), since even content originating from outside the EU can have an impact on EU member states. Nonetheless, the DSA should be cautious in defining the scope of enforcement power entrusted to the EU regulator, and should limit its authority to content that is directed toward a member state. Furthermore, global takedowns such as those sought by the French Data Protection Authority intrude deeply on principles of comity and respect for the laws of other jurisdictions. Respect for the principle of comity serves the interests of all states because it helps balance the needs of different national jurisdictions. Allowing global takedown orders under the new DSA would also set a precedent that other states could use to impose their content preferences extraterritorially.

3. Subsidiarity

Third, the new DSA might usefully draw on the concept of subsidiarity in determining how best to account for the problem of scale while also preserving innovation and flexibility. The principle of subsidiarity in the Treaty on European Union, which governs the relationship between the EU and its member states, requires consideration of the level at which decisions are most appropriately taken, in light of both the scale and effects of the proposed action. Similar considerations can inform the respective competencies of platforms and regulators in providing enforcement mechanisms and remedies. For example, requiring that platforms provide independent review of all user grievances would be impossible to apply at scale and could also be subject to abuse and manipulation. Moreover, regulatory requirements that are overly detailed may not be able to evolve to take account of changes in platform use and technology.

Consistent with the animating principles of subsidiarity, an approach that sets broad expectations regarding due process and remedy, and that requires platforms to demonstrate the steps they are taking to resolve disputes, would be better suited to leveraging platforms’ knowledge and expertise in innovating to respond to these challenges. In finding this balance, it may be useful to draw insights from an emerging movement in business and human rights aimed at requiring companies to engage in due diligence to identify, mitigate, and remedy human rights harms they are responsible for or with which they are linked. Regulators will also need to consider any overlap between requirements in the context of digital rights and these new due diligence laws, such as France’s 2017 duty of vigilance law and the proposed EU regulation on mandatory due diligence.

This approach would also counsel giving platforms discretion regarding the types of remedies provided to users. Removal of content is not the only possible remedy, and removal may be disproportionate when applied broadly, even to illegal speech. Instead, the DSA should encourage platforms to consider and deploy a range of remedial actions, including greater user choice regarding the content they see, counter-speech, friction, and data portability (so that users can vote with their feet). Access to a greater variety of remedies allows platforms to ensure more proportional responses.
