Will the GDPR save you from killer robots?
Automated decision-making & profiling in a near future dystopia
The GDPR seems really cyberpunk to me. Like some lost William Gibson novel plot outline. With all the talk of data controllers, processors, personal data, rights of data subjects… it sorta sounds like the backdrop to a dystopian future/parallel reality where technologically marginalized and oppressed people scurry from shelter to shelter on the outskirts of a “smart city,” under the all-seeing dominion of an augmented, pervasive, command-and-control system.
Which has got me wondering, with all the rights accorded to natural persons under this regulation with respect to data about them, is it possible that the GDPR might just one day save you from a killer robot attack?*
* I mean if you’re an E.U. citizen that is. Because the rest of us seem more or less f*&%ked.
EU playing hardball
The EU’s General Data Protection Regulation, when it comes into force next year on the 25th of May, 2018, looks set to be the most far-reaching and comprehensive privacy and data protection regime out there (that I’ve seen, anyway). Companies both inside and outside the Union that want to target EU citizens will have to play ball or be excluded from the European Economic Area. In exchange, the regulation sets up a digital “one-stop shop” for companies doing business in the EU, so they don’t have to navigate a nest of varying rules in each member state.
Still not convinced that EU stuff really matters to tech companies? Well, the EU just today fined Google/Alphabet $2.7 billion in an antitrust case:
Will Google end up having to pay? No idea. But it’s safe to say that “Something is happening…”
Automated processing and you
So anyway, in Chapter 3: Rights of the Data Subject, Article 22, paragraph 1, the GDPR reads:
“The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.”
I’ve been trying to grasp what is probably meant by “legal effects” here in a more mundane context. Given that the relationship between users (data subjects) and service providers (data controllers) is generally a legal agreement or contract like a Terms of Service, or an End-User License Agreement, how should we interpret this clause? Is it just anything that touches on the ToS or EULA?
Recital 71 sheds a little light (excerpted below), though it is non-binding:
“The data subject should have the right not to be subject to a decision, which may include a measure, evaluating personal aspects relating to him or her which is based solely on automated processing and which produces legal effects concerning him or her or similarly significantly affects him or her, such as automatic refusal of an online credit application or e-recruiting practices without any human intervention. Such processing includes ‘profiling’ that consists of any form of automated processing of personal data evaluating the personal aspects relating to a natural person, in particular to analyse or predict aspects concerning the data subject’s performance at work, economic situation, health, personal preferences or interests, reliability or behaviour, location or movements, where it produces legal effects concerning him or her or similarly significantly affects him or her.”
It goes on to exclude from these restrictions automated processing necessary for purposes like fraud and tax-evasion prevention, the provisioning of services (security, reliability, etc.), and so on and so forth.
I mean, I know there’s a big difference between being turned down for an online credit application and being killed without legal consequence by a corporate police robot, but my mind always leaps to these extreme outrageous scenarios to try to make sense of the whole structure and function of these regulations.
Because I'm a weirdo
Recital 71 continues later, explaining that appropriate safeguards should be in place, such that data subjects have:
“…the right to obtain human intervention, to express his or her point of view, to obtain an explanation of the decision reached after such assessment and to challenge the decision.”
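If you squint, those safeguards read almost like a guard clause in code. Here’s a toy sketch of that logic in Python — purely my own invention, to be clear: the `Decision` model and every name in it are hypothetical, not anything from the regulation or from a real compliance library.

```python
from dataclasses import dataclass

@dataclass
class Decision:
    outcome: str
    solely_automated: bool   # no human in the loop?
    has_legal_effect: bool   # legal or similarly significant effect?

def requires_human_review(d: Decision) -> bool:
    """Article 22-style check (toy version): a decision based solely on
    automated processing that produces legal or similarly significant
    effects triggers the data subject's right to human intervention."""
    return d.solely_automated and d.has_legal_effect

# Hypothetical examples, echoing Recital 71's credit-application scenario:
credit_refusal = Decision("refuse_credit", solely_automated=True, has_legal_effect=True)
spam_flag = Decision("flag_spam", solely_automated=True, has_legal_effect=False)

assert requires_human_review(credit_refusal)   # kick it to a human
assert not requires_human_review(spam_flag)    # automation can proceed
```

Of course, the hard part in real life isn’t the boolean logic — it’s who gets to decide what counts as a “similarly significant” effect in the first place.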
In actual practice, we know from ample experience that rules and laws are often followed only where and when expedient. I suspect killer robots won’t be much different.
Maybe it would go something like this, IRL:
[Man runs down a dark alley between warehouses at night.]
[Private security robot appears.]
ROBOT: (to Man) Stop, or I will shoot!
[Man stops, slowly turns around, raising his hands in the air.]
MAN: I--I’m an EU citizen. I have rights.
[Robot scans the biometrics of the man, processing through a proprietary database…]
MAN: I do not consent to be scanned. (looks around at ambient cameras filming the alley)
ROBOT: Scanning is obligatory in this area for purposes of public safety.
[Robot continues scanning.]
MAN: Come on, man. Just let me get my ident card. I’ll show you — (seems to reach for wallet).
[Robot shoots the man.]
Would it be possible to effectively balance the rights, freedoms and interests of human data subjects with the ‘legitimate interests’ of public and private powers with a mandate for public safety activities?
Without a doubt such a balancing act will be necessary, but it’s explicitly outside the “Material Scope” of the GDPR as laid out in Article 2:
“This Regulation does not apply to the processing of personal data: […]
by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, including the safeguarding against and the prevention of threats to public security.”
So, based on this exclusion, I’m guessing that any corporation that could bring killer robots to market would probably end up either being a “competent authority” itself, or having its products used by one in the protection of public safety.
In other words, I think the GDPR probably won’t save you from killer robots, unfortunately. Sorry about that. Still worth learning about anyway, if you ask me.