Data protection is not the only answer in an era where connected devices, sensors and invisible cameras are increasingly present in our homes, on our bodies, in our workplaces and in public spaces. It is one tool to ensure that individuals are protected from the excesses of data exploitation, profiling, unaccountable artificial intelligence and insecure devices, to name but a few problems.
Rather than just considering whether GDPR changes the landscape, we need to consider how technology changes it, and why it should therefore be welcomed that privacy advocates such as Privacy International seek to uphold individual and consumer rights and protect the right to privacy. The principles that GDPR updates are not wrong, but they are not being enforced, and we continue to see huge data breaches. GDPR is evolution, not revolution.
If you haven’t been affected by a data breach, data leak, insecure device or exploitation of your data, you just don’t realise it. We are affected whether we are dating, whether we are cheating, whether we are parents, or whether we are all three.
The Norwegian Consumer Council tested several smart watches for children and found significant security flaws, unreliable safety features and lack of consumer protection. Through a few simple steps a stranger can take control of the watch and track, eavesdrop on and communicate with the child.
In relation to the internet-connected toys My Friend Cayla and i-Que, it was found that with a few simple steps anyone can take control of the toys through a mobile phone. This makes it possible to talk and listen through the toy without having physical access.
The terms and conditions allow personal data to be used for targeted advertising and shared with unnamed third parties. Anything a child tells the doll is transferred to the U.S.-based company Nuance Communications, which specialises in speech recognition technologies.
At Privacy International we are conducting our own investigations, looking under the hood of connected devices both physically and through subject access requests. This project was live-streamed the other day via Periscope and involved dissecting a Furby. The findings will be published in due course.
In addition to security, a growing cause of major concern is profiling. We are profiled whether it’s our music taste, our use of apps, how we interact with social media and even based on our voices for job prospects.
A company called Jobaline offers “voice profiling” to predict job success based on how candidates sound; its algorithm identifies and analyses over one thousand vocal characteristics and uses them to categorise job applicants by suitability.
Profiling is nothing new, but what technology enables is a granularity of data so fine that companies or political campaigns know individuals’ deepest secrets. This is well documented in the US, for example, where the Republican National Committee provides all Republican candidates with free access to a database that covers 200 million voters and includes over 7 trillion micro-targeting data points.
Omer Tene and Jules Polonetsky compare the relationship between data companies and individuals to a “game of poker where one of the players has his hand open and the other keeps his cards close.”
Targeted profiling presents a challenge to society in many disparate ways. In Germany the radical AfD party publicly promised to stop sharing offensive posters, yet continued to target specific audiences with the same images online.
As part of, or in addition to, profiling, we are monitored without our knowledge, for example via Wi-Fi and Bluetooth tracking.
This slide from a TfL (Transport for London) document describes the use of monitoring in Hyde Park and in retail outlets, and how free Wi-Fi enables access to even more of an individual’s data.
Our world is one in which more and more of what we do is traceable, where aggregated data can reveal a great deal about a person, and where sophisticated automated decision-making takes place. This can range from refining what we read, watch or listen to, to making decisions that are discriminatory because, for example, they rest on biased data, or to making life-changing decisions without human input.
Proprietary software, such as the COMPAS risk assessment whose use was sanctioned by the Wisconsin Supreme Court in 2016, calculates a score predicting the likelihood that a defendant will commit a future crime. Even if the final decision is made by a judge, the software’s automated assessments can be decisive, especially if judges rely on them exclusively or have not been warned about their risks, including the risk that the software may produce inaccurate, unlawful, discriminatory or unfair decisions.
In the quantified society, profiling ranges from decisions about your credit-worthiness to job applications and even university admissions, from manipulation of your shopping decisions (so-called behavioural targeting) to health and insurance decisions, and increasingly the way we vote. Companies and governments do not only profile us; they share and sell data about us to third parties, whether it concerns our political opinions, our health and fitness, or our online behaviour.
Added to this, we are affected by serious data breaches and large-scale identity theft.
“At times it seems we have lost control over our data.”
Companies collect your data in vast quantities
Information you give or are asked to give is only a small fraction of it. In almost anything you do nowadays you are followed, tracked and categorised by a myriad of little and big brothers, most often without your knowledge or consent.
Your information comes from public records, location data from your mobile phone, shopping loyalty cards, passenger name records, internet browsing habits (cookies), credit card usage, smart meters, black boxes for car insurance, TVs, smart appliances in your home, the list goes on.
Large amounts of information come from so-called ‘metadata’, the data about the data: what device you called from, its exact location, the screen resolution, the battery life, to whom you sent a message and their exact location. This metadata is retained by ISPs and gives an incredibly detailed insight into your private life.
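To make concrete how revealing such metadata can be even without any message content, here is a minimal illustrative sketch in Python. The field names and values are hypothetical assumptions chosen for illustration; they are not any ISP’s actual schema.

```python
# Illustrative sketch only: a hypothetical metadata record of the kind
# described above. No message content is present, yet the record alone
# reveals who contacted whom, when, and from where.
call_metadata = {
    "caller_device": "smartphone",
    "caller_location": (51.5074, -0.1278),    # latitude/longitude
    "screen_resolution": "1080x1920",
    "battery_level_pct": 23,
    "recipient": "+44XXXXXXXXXX",             # number elided
    "recipient_location": (51.5007, -0.1246),
    "timestamp": "2017-11-01T09:15:00Z",
}

def summarise(record):
    """Show how much can be inferred from metadata alone."""
    return (f"Device at {record['caller_location']} contacted "
            f"{record['recipient']} at {record['timestamp']}")

print(summarise(call_metadata))
```

Aggregated over weeks, records like this trace daily routines, relationships and habits, which is why metadata retention is so intrusive.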
There are good and positive aspects to society from Big Data. But the concerns regarding privacy are immense.
Why is this taking place?
When a rapidly growing number of daily interactions and behaviours undergo unrestricted digital monitoring, analysis and assessment, corporate actors can systematically abuse their resultant unprecedented data wealth for their economic advantage. It is clear that information from and about us drives the commercial and political marketplaces.
Corporations want to know as much about us as possible, including how to influence us at any point in what they call the “consumer journey”.
Increasingly such methods are used by political parties.
Despite wide reporting on mass data breaches, insecure connected toys and internet-of-things devices shipped with default passwords, and despite the evidence that companies are collecting more and more data on individuals, consumers are largely unaware. There are not many individuals with both the knowledge of the harms and the commitment to take on giants like Facebook.
We shouldn’t simply blame consumers for not challenging poor practice, particularly when they are told by companies that everything is fine, when terms, conditions and privacy policies are lengthy, and when they don’t realise how information about them is being shared.
If terms are very difficult to read, because they are very long, ambiguous or written in overly technical, complex or vague language, there is a strong case to be made that informed consent is not possible for most consumers.
Factors hampering the effectiveness of existing remedy mechanisms include a persistent lack of knowledge about the protection of personal data: individuals often do not understand what constitutes a data protection violation.
In addition, when individual complaints are made, they are often hampered by the lack of adequate resources and powers of national Data Protection Authorities (DPAs).
So what role does PI play in this?
You would not expect everyone who drives a car to be a mechanic. The internet and the digital economy are even more complex. Not all companies respect the law; watchdogs like PI and consumer rights organisations can therefore investigate in depth, using legitimate research methods, and can support individuals and help ensure the legislation is enforced properly.
On behalf of consumers and citizens we can hold these companies to account.
Under GDPR, organisations must be fair and transparent about how they process personal data and must demonstrate accountability against the data protection principles. They must provide information about how they are processing personal data in a concise, transparent, intelligible and easily accessible form, using clear and plain language. GDPR also provides new rights for individuals, such as erasure and portability, which to an extent seek to empower individuals and address the imbalance of power between individuals and those controlling their personal data.
These tools only go a short way towards doing so; however, they are a start, and civil society has a role both in educating and empowering data subjects and in holding data controllers and processors to account.
Article 80.1 of GDPR is non-derogable and gives individuals the right to mandate a not-for-profit to represent them. I am going to focus, however, on Article 80.2. Unfortunately, exercising the derogation available to member states, the UK Data Protection Bill does not currently provide for qualified non-profit organisations to pursue data protection infringements of their own accord, as provided for by Article 80.2 of GDPR. This is a significant missed opportunity.
We along with UK digital rights and consumer organisations are strongly recommending that the Bill is amended to include this provision. We also hope to see this right enacted throughout Europe.
It would enable qualified NGOs to take up collective actions on behalf of all consumers and citizens affected by data breaches and other illegal activities that may cause them financial or other detriment, of which they may not even be aware, and against which they may be unable to act on their own. Properly constituted organisations could act without consumers each having to bring, or appoint a body such as PI to bring, an individual case against the company involved.
This is particularly appropriate in the context of data breaches, given the many thousands of consumers who may be affected by a single breach but who individually may have suffered a relatively small loss. Such a mechanism would therefore have the potential to save significant administrative and court time, in that it would avoid a myriad of individual claims.
Finally, implementing Article 80.2 is nothing new: a robust collective redress regime was recently introduced under the Consumer Rights Act 2015 for infringements of competition law.
Whilst we may be enthusiastic, we know companies are resistant, and perhaps this was one reason why GDPR attracted over five thousand amendments. That resistance, however, is actually a good sign. Companies do not want this because it would be an extra deterrent to unlawful activities and would provide effective additional oversight at a time when we know Data Protection Authorities lack resources and expertise.
In its quarterly report filed on 27 May 2016, Acxiom states:
“With the growth of online advertising and e-commerce, there is increasing awareness and concern among the general public, privacy advocates, mainstream media, governmental bodies and others regarding marketing and privacy matters, particularly as they relate to individual privacy interests and global reach of the online marketplace.
Negative publicity and/or increased restrictions on the collection, management, aggregation and use of information could result in reduced demand for our products or services, decreased availability of certain kinds of data and/or a material increase in the cost of collecting certain kinds of data.”
Companies will not all change their practices as a result of GDPR, but if Privacy International or Which? commence an investigation, they might, and that could benefit thousands, even millions of individuals.
Powers of collective redress are vitally important since personal data has become such an essential part of the national and global economy, while the imbalance of power and the information asymmetry makes it particularly difficult for individuals to claim their rights effectively.
Data protection is a vital framework for the protection of human dignity in the digital age. But GDPR and its implementation are not the end game; they are just the start. We haven’t grappled with questions of non-personal data, or with how we are protected when organisations are exempt from data protection safeguards, relying upon national security certificates to avoid rights and protections when they collect, store, process and share our data. Nor do we know whether, in light of UK mass surveillance legislation and the national security provisions in the Bill, the UK will get an adequacy decision from the EU Commission when it leaves the EU.
Privacy International is excited about the future of technology and what it can offer. But we also know that there is a vast ecosystem of players who want to exploit our data. That’s why we need to support those who can use data protection for the benefit of society, to protect our rights and protect our data.