Privacy Talk with Kris Shrishak, Technology Fellow at the Irish Council for Civil Liberties: How effective is it to adopt privacy-enhancing technology?
“This interview, recorded on 19 May 2022, covers technology policy and privacy technology.”
Kohei had a great time discussing technology policy and privacy technology with Kris Shrishak.
This interview's outline:
- What is privacy-enhancing technology?
- Who uses privacy-enhancing technology?
- How effective is it to adopt privacy-enhancing technology?
- Message to listeners
- What is privacy-enhancing technology?
Kris: Certainly. Privacy-enhancing technologies, or PETs for short, are essentially a number of different techniques, and different techniques provide different kinds of guarantees.
You did mention, for instance, homomorphic encryption. A related, essentially broad field is what is known as secure multi-party computation.
Both of these are primarily concerned with what is known as input privacy. To give you an example of what this means: let’s say you have some information, some data that you feed into a computing environment, and you want to compute something on it.
Maybe you feed in ten individual salaries and everyone is interested in the average salary. But what you want to protect is each individual salary, so no one apart from the individual should know the salary amount, while at the end everyone learns the average, right?
You can do something like that here. You are protecting the privacy of the individual inputs; that is input privacy.
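To make that input-privacy example concrete, here is a minimal sketch (my own illustration, not something from the interview) of additive secret sharing, one building block of secure multi-party computation: each salary is split into random-looking shares, the computing parties only ever see shares, and only the aggregate is reconstructed.

```python
import random

PRIME = 2**61 - 1  # all arithmetic is done modulo a large prime

def share(value, n_parties):
    """Split `value` into additive shares that individually look random."""
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def average_salaries(salaries, n_parties=3):
    """Each party accumulates one share per salary, never a salary itself."""
    party_totals = [0] * n_parties
    for salary in salaries:
        for i, s in enumerate(share(salary, n_parties)):
            party_totals[i] = (party_totals[i] + s) % PRIME
    total = sum(party_totals) % PRIME  # only the aggregate is reconstructed
    return total / len(salaries)

print(average_salaries([52_000, 61_000, 48_500, 70_000]))  # -> 57875.0
```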
And the second part is output privacy. This is the case where you want to publish certain information, think of statistics, and you want to make sure no one is identifiable.
- Who uses privacy-enhancing technology?
At a computational level, there’s a technique known as differential privacy, which provides fairly strong guarantees. One example of that would be the US Census, which recently adopted differential privacy for this purpose. That’s one.
Yeah, so these are, let’s say, two different angles: one is input privacy, the other is output privacy. So it’s about understanding what privacy guarantees you need, and of course you can combine these techniques when you need both characteristics.
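For the output-privacy angle, here is an equally minimal, illustrative sketch (my own, not the US Census Bureau's implementation) of the Laplace mechanism from differential privacy: a count is released with calibrated noise, so any single person's record changes the published answer only slightly.

```python
import math
import random

def laplace_noise(scale):
    """Sample Laplace(0, scale) noise via inverse transform sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(records, predicate, epsilon=0.5):
    """Release a count under epsilon-DP; the sensitivity of a count is 1."""
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(scale=1.0 / epsilon)

ages = [34, 29, 41, 56, 23, 67, 38]
# Noisy answer to "how many people are over 40?"
print(dp_count(ages, lambda age: age > 40, epsilon=0.5))
```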
Yeah, so I think it’s very interesting. I also did a bunch of my research on that during my PhD. But yes, there are different issues involved with techniques like homomorphic encryption and secure multi-party computation.
A lot of time is now being put into how to scale them, because they come with a lot of computational overhead. So there are challenges, but it’s getting there, and of course the field itself did get a lift because of the GDPR.
I would say that when there is a regulation that requires, in this case, data protection, a lot of these techniques can be very beneficial, right? But understanding exactly what specific guarantees you get is also important.
Kohei: Yeah, absolutely. I think some of these technical solutions are still in the R&D, research and development, phase right now, but there is probably already demand for this privacy tech.
Is there any field where you expect privacy-enhancing technology to be used, even if it takes a long time, that could become a big market in the next decades? Do you have any ideas on that?
- How effective is it to adopt privacy-enhancing technology?
Kris: I think it is getting picked up. I can’t immediately come up with specific names, but for instance, I think there are companies which are developing specific kinds of secure multi-party computation protocols, in particular what is known as threshold signatures, which is being considered for blockchains, for example, for signing. I’ve worked on that in the past as well. So that’s one example.
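To give a feel for the "threshold" idea behind such protocols, here is a minimal Shamir secret-sharing sketch (my own simplification; real threshold signature schemes sign without ever reconstructing the key, but the sharing idea is similar): a signing key is split so that any two of three parties can jointly recover it, while any single share reveals nothing.

```python
import random

PRIME = 2**61 - 1

def make_shares(secret, threshold, n_shares):
    """Split a secret (e.g. a signing key) so any `threshold` shares recover it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    # Share i is the polynomial evaluated at x = i (i = 1..n_shares).
    return [(x, sum(c * pow(x, k, PRIME) for k, c in enumerate(coeffs)) % PRIME)
            for x in range(1, n_shares + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 recovers the secret from enough shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = 123456789
shares = make_shares(key, threshold=2, n_shares=3)
print(recover(shares[:2]) == key)  # any 2 of the 3 shares suffice -> True
```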
But there are other cases. One might think of images, or any kind of information, from multiple institutions. You might want to pool them and actually identify patterns, for example, right? So for instance, machine learning, where you can learn from different hospitals.
You want to learn patterns to understand what’s happening, but you don’t actually want to take the individual information out of the hospitals, because that’s private information, not only the hospital’s but also the patients’, right?
Those kinds of scenarios are where these techniques could be very useful. Those are a couple of examples that come to mind, but there are definitely multiple benefits there and potential uses.
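As a rough illustration of that cross-institution scenario (a simplified federated-averaging sketch with hypothetical data, not a production system and not something discussed in detail in the interview), each hospital fits a tiny model on its own data, and only the model parameters, never the patient records, are shared and averaged.

```python
# Toy federated-averaging sketch: each hospital keeps its data local,
# trains a small linear model, and only the model weights are pooled.
def local_fit(xs, ys, steps=2000, lr=0.01):
    """Fit y ~ w*x + b by gradient descent on one hospital's private data."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        grad_w = sum((w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum((w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Hypothetical private datasets (e.g. dosage -> recovery score); they never leave the site.
hospital_a = ([1.0, 2.0, 3.0, 4.0], [2.1, 3.9, 6.2, 7.8])
hospital_b = ([1.5, 2.5, 3.5, 4.5], [3.2, 5.1, 6.9, 9.0])

local_models = [local_fit(xs, ys) for xs, ys in (hospital_a, hospital_b)]

# Only the parameters are shared; a coordinator averages them.
avg_w = sum(w for w, _ in local_models) / len(local_models)
avg_b = sum(b for _, b in local_models) / len(local_models)
print(f"global model: y = {avg_w:.2f} * x + {avg_b:.2f}")
```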
Kohei: Thank you. Yeah, I also expect this kind of technology will be integrated, so that personal data can be better protected during processing rather than our identifiable data simply being released to unknown parties. That would be a significant choice for us.
- Message to listeners
So yeah, that was very informative, thank you for taking this interview. Lastly, I’d like to ask for a message for the listeners, because I think facial recognition, AI, and privacy tech are very important fields, and a lot of listeners should know what is happening and what they should do. So could you give us a message on that?
Kris: Well, I mean, there are a couple of things. One is, when using privacy-enhancing technologies, make sure you understand what privacy guarantees you get from specific techniques.
So don’t just buy whatever you see on the market. The second is from a regulatory standpoint: the EU’s regulation of AI systems is essentially the first of its kind, so one might expect that other countries would potentially rely on the text or maybe even copy it to an extent; that’s going to happen.
So just keeping an eye on what’s going on in the EU in general is also a good thing. Then there is facial recognition; there are multiple angles there.
So we spoke about CCTVs for example.
One thing I always like to emphasize is that it’s not only about the possibility of facial recognition on the CCTV device itself; you can also run facial recognition once these images are captured, on servers and computers. So that’s an important point to keep in mind.
Because you might have companies that sell you a CCTV and say, “Oh, this doesn’t have facial recognition,” although you can run facial recognition once you’ve captured the images and stored them on a server, for example. So yeah.
Kohei: Thank you.
Kris: Yeah, those three would be the important ones, I’d say.
Kohei: Thank you. That was a very kind message to help the listeners understand the key points, so thank you for sharing. It has also been a great time with Dr. Kris, who is very knowledgeable and very experienced in this field. The boundary between technology and politics is quite ambiguous now, but your work does a lot to push a more democratic voice into political decisions, and as a stakeholder I’m happy to support that work in this space somehow.
So thank you for joining this time.
Kris: Thank you for having me.
Kohei: Thank you.
Thank you for reading, and please contact me if you would like to join an interview.
Privacy Talk is a global community of diverse experts. Contact me on LinkedIn below if we can work together!