Privacy Talk with Kris Shrishak, Technology Fellow at the Irish Council for Civil Liberties: What is your concern with AI regulation?

Kohei Kurihara
Published in Privacy Talk
Jun 27, 2022

“This interview, recorded on 19 May 2022, covers technology policy and privacy technology.”

Kohei had a great time discussing technology policy and privacy technology with Kris Shrishak.

This interview outline:

  • Introduction
  • What is your role at Irish Council for Civil Liberties?
  • What is your concern with AI regulation?

Kohei: Thank you, everybody, for coming to Privacy Talk. I’m quite honored to have Dr. Kris as a guest. Today we’re going to talk about facial recognition AI and related privacy technology as well.

So Kris, thank you for coming today.

Kris: Thank you for inviting me.

Kohei: Thank you. First of all, I’d like to share his profile. Dr. Kris Shrishak is a Technology Fellow at the Irish Council for Civil Liberties (ICCL), where he works on technology policy with a focus on algorithmic decision making. Previously, Kris was a researcher at TU Darmstadt, where he worked on applied cryptography, privacy-enhancing technologies (PETs), and Internet infrastructure security.

Again, I appreciate you joining this call. I’d like to start today’s agenda. The first question is about your motivations. You have been working in this field from both a technical and a political perspective. What is your motivation to work on technology and policy?

  • Introduction

Kris: So, my PhD work, of course, was in a very technical field. During my PhD, I was always thinking about the policy side of things, although I wasn’t making any headway. So after the end of the PhD, it made sense for me to actually try working in the field of technology policy.

Part of the reason being, I felt that having a lot more technology knowledge in the policy area is important. And that’s something I could bring. And that’s essentially the motivation to work in this field.

Kohei: Thank you. I think it is very rare to work in both technical and political areas at this moment. You work at the Irish Council for Civil Liberties right now. What is your main role and work at the organization at this moment?

  • What is your role at Irish Council for Civil Liberties?

Kris: So my primary work at Irish Council for Civil Liberties is in the area of artificial intelligence, and in particular, the regulation of artificial intelligence systems in the European Union.

For context, the European Commission has proposed a draft text on regulating artificial intelligence in the European Union.

That was in April 2021. The regulatory process, essentially, is that the European Parliament and the European Council now propose amendments to this text; eventually they find compromises, and we have a regulation.

And so I’m essentially working to identify potential issues in the existing texts and help the regulators correct them before it becomes regulation.

Kohei: Thank you.

Kris: Yeah. So, just for context, as I come from a technical background, one of the approaches I use is to look for mistakes in the regulation from a technical point of view. Errors and mistakes of that kind are not necessarily political issues.

So they can actually be corrected much more easily.

Kohei: Thank you. I think you published a letter in response to that political announcement. Europe seems to have many amendments to data-related policy drafts at this moment.

You have been working on the enforcement part of the AI regulation, and you made a proposal to the Commission in this letter. Was there any response from them? And could you share any insights from the letter?

  • What is your concern with AI regulation?

Kris: Certainly. I think the letter you’re referring to is a letter from a couple of months ago, where myself and my colleague, Johnny Ryan, wrote to the European Commission about specific issues regarding ex post enforcement.

Ex post means: once an AI system is already deployed, how do you check and enforce the requirements that are set forth in the regulation?

In that regard, the primary role falls on an entity known as a market surveillance authority. This comes from the product safety regulation framework in the European Union, and AI is being regulated within that framework.

And our main concern was that even at the level of the text, even on paper, the enforcement is very weak, even weaker than the GDPR’s. The issues with the GDPR were more about enforcing in reality what is on paper; here, even on paper, it’s very weak.

So we made specific suggestions, and we followed up the letter with specific amendments. Some of them have been embraced by the European Parliament and the lead committees, known as the internal market committee and the civil liberties committee, which have taken up some of the amendments that we presented to them.

In particular, one of our concerns is that we might run into the issue that we don’t have enforcement powers at the Union level. What that means is that the current enforcement framework puts a lot of emphasis on the member states of the European Union.

Think of Germany, Spain, Ireland: different countries. But there is no clear framework in the Commission’s text that allows for resolving issues if the member states do not take action.

That’s one aspect, so we wanted to place a certain amount of emphasis on having Union-level enforcement as well. The second thing was that the current regulation does not allow individuals like you and me to complain to the authorities, to the regulators, if something goes wrong.

So that’s another thing: essentially giving individuals a right to complain if something goes wrong with an AI system, including cases that are not about personal data. If it’s only about personal data, we might be able to cover it under the GDPR; but when it is not about personal data and the rights of individuals are still affected, then we need those cases to be included here.

To be continued..

Thank you for reading, and please contact me if you would like to do an interview together.

Privacy Talk is a global community of diverse experts. Contact me on LinkedIn below if we can work together!
