Fundamental Rights Impact Assessments (FRIAs) under EU Legislation Related to the Digital Environment


January 11, 2024

By Sabine K. Witting and Emma Day, human rights lawyers and CEOs, Tech Legality

© Tech Legality 2023

Relevant EU Legislation Requiring FRIAs

Fundamental Rights Impact Assessments (FRIAs) are an important tool for businesses to identify and subsequently mitigate negative impacts of their business activities on fundamental rights as set out in the EU Charter.

In recent years, FRIAs have become an increasingly popular tool in EU legal instruments in the context of regulating the digital environment. FRIAs are not a new invention, but are derived from the United Nations Guiding Principles on Business and Human Rights (UNGPs) which provide a framework for responsible business conduct in the technology sector, in accordance with international human rights law.

Below, we briefly outline the existing and upcoming pieces of EU legislation in which FRIAs are, or will become, relevant. When conducting a FRIA, it is essential to move beyond a tick-box exercise and make FRIAs meaningful in protecting and advancing the rights in the EU Charter.

Digital Services Act (DSA) 2022

As of 25 August 2023, very large online platforms (VLOPs) and very large online search engines (VLOSEs) are required by the EU’s Digital Services Act (DSA) to assess systemic risks stemming from the design or functioning of their services and their related systems (Art 34 (1) DSA). This mechanism is essential for understanding and mitigating any actual or foreseeable negative effects on the exercise of fundamental rights. When conducting risk assessments, VLOPs and VLOSEs shall take into account, in particular, the design of their recommender systems and any other relevant algorithmic system, their content moderation systems, the applicable terms and conditions and their enforcement, systems for selecting and presenting advertisements, and the data-related practices of the provider.

The assessments shall also analyse whether and how the risks are influenced by intentional manipulation of the VLOP’s/VLOSE’s service, including by inauthentic use or automated exploitation of the service, as well as the amplification and potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions (Art 34 (2) DSA). It is important to highlight that negative effects stem not only from the overall design of systems and technical functionalities of the platform, but also from policies and practices intended as mitigation measures (e.g. content governance systems). Further, the company has to assess the impact of its services not only on all its users, but also on non-users who might be affected by its business activities.

FRIAs under the DSA must be carried out on an annual basis, and additionally whenever the VLOP/VLOSE introduces functionalities on its service which are likely to have a critical impact on fundamental rights (Art 34 (1) DSA).

Proposed CSA Regulation (2022)

On 11 May 2022, the European Commission published its proposed Regulation laying down rules to prevent and combat child sexual abuse. The proposed Regulation aims to establish a clear and harmonised legal framework to better identify, protect and support victims of child sexual abuse (CSA), notably through a clarification of the rules and responsibilities of online service providers when it comes to online CSA. It seeks to provide legal certainty to providers as to their responsibilities to assess and mitigate risks and, where necessary, to detect, report and remove known child sexual abuse material (CSAM), new CSAM or solicitation of children on their services.

The proposed CSA Regulation does not foresee a holistic FRIA similar to the one in the DSA, but requires providers of hosting services and providers of interpersonal communication services to identify, analyse and assess the risk of use of their services for the purpose of online child sexual abuse (Art 3 (1) proposed CSA Regulation). Responding to the identified risks, providers have to take reasonable mitigation measures. Such mitigation measures must be effective, targeted and proportionate in relation to the identified risk, and applied in a diligent and non-discriminatory way, having due regard to the fundamental rights of all affected parties (Art 4 (1)–(2) proposed CSA Regulation). The risk assessment is hence centred around the impact of the service on the rights of the child relevant to their protection from all forms of sexual violence, such as the right to the integrity of the person (Art 3 EU Charter), the prohibition of inhuman and degrading treatment (Art 4 EU Charter), the right to private and family life (Art 7 EU Charter), and the rights of the child more broadly (Art 24 EU Charter).

The European Parliament (EP) has recently concluded its position on the proposed CSA Regulation. Amongst fundamental changes across the entire draft, the EP also proposed a holistic FRIA, similar to that in the DSA. The FRIA would be included in Art 3 (2) (aa), which obliges platforms to assess ‘any implications for the exercise of fundamental rights or possible infringement of EU law’. The Council has not yet concluded its position on the proposed CSA Regulation. Once it has done so, trilogue negotiations will start (approximately Q1 2024). It therefore remains to be seen whether a holistic FRIA as proposed by the EP becomes part of the final text of the CSA Regulation. Tech Legality strongly supports the requirement for a holistic FRIA, so that the full range of human rights is considered from the outset and any risks mitigated, and so that conflicts of rights can be avoided further down the line.

Proposed AI Act (2021)

On 21 April 2021, the European Commission proposed the first EU regulatory framework for AI. Under the proposed framework, AI systems that can be used in different applications are analysed and classified according to the risk they pose to users, with the different risk levels attracting more or less regulation. The draft regulation aims to ensure that AI systems placed on the European market and used in the EU are safe and respect fundamental rights and EU values.

The EP concluded its position on 14 June 2023 and included in the text a mandatory FRIA for high-risk AI (Art 29a proposed AI Act). On 8 December 2023, the Council presidency and the European Parliament’s negotiators reached a provisional agreement on the proposed AI Act. MEPs successfully managed to include the mandatory FRIA, among other requirements, in the political deal, applicable also to the insurance and banking sectors.

While this is indeed good news, the final text of the proposed AI Act will only be negotiated in the coming weeks. A legal opinion on the scope and accountability mechanism of the FRIA in the proposed AI Act will therefore have to await the final text.

Tech Legality’s approach to FRIAs

At Tech Legality, we provide services to companies, developing FRIAs in the context of relevant EU legislation and digital technologies. Our approach goes beyond compliance, ensuring that every FRIA is meaningful and impactful, not simply a tick-box exercise.

Tech Legality’s methodology for carrying out FRIAs is based on the UNGPs and existing international best practice. The benchmark for FRIAs under EU law is primarily the EU Charter, though reference will also be made to the European Convention on Human Rights and international human rights law.

Where possible, FRIAs should assess all negative impacts on all rights related to a digital product or service. As part of a fundamental rights mapping, Tech Legality specifically focuses on the rights of various vulnerable groups, such as children, women, racial or ethnic minorities, migrants, refugees, and members of the LGBTIQ community, amongst others. If of interest to the client (especially social impact companies), our FRIAs can also highlight potential and actual positive impacts on fundamental rights, to help the client identify which areas to continue, scale, or expand.

Tech Legality draws from our consultancy roster of highly specialised experts, many of them lawyers, from different jurisdictions, geographies, backgrounds, and experiences. We can put together a stellar team tailored to assess the digital product/service at hand, paying special attention to ensuring representation of the various affected communities within the Tech Legality team.

Contact us at info@techlegality.com if you would like to learn more about how we can help you with FRIAs for your company.


Emma Day
Tech Legality

Human rights lawyer, specialist in law and technology, and co-founder of Tech Legality. UK raised, worked for 20 years in Asia, Africa, and North America.