When “YES” means “NO” or the trouble with consent to the use of our data

Published in Golden Data · Dec 15, 2018
Image from page 109 of “Platform echoes: or, Living truths for head and heart” (1890) — IABI

Authored by Elizabeth M. Renieris — December 11, 2018

Let’s talk about consent. You’d have to be living under a rock to miss the paradigm shift happening around our personal data — a shift from “ownership” and control by large organizations to “ownership” and control by the individual (see here for the problem with a property law-based ownership approach to personal data). It’s clear that things are trending towards greater personal independence, liberty, agency, and autonomy. We see that, for example, in the rising interest and development around “self-sovereign identity.” But I would caution that — to the extent this trend continues to rely on the flawed application of “consent” as a lawful basis for processing our data — we will never achieve this new paradigm. The reality is that we live in a post-consent world and must find a way to adapt.

While there are competing definitions and standards for “consent” under existing laws and regulations, let’s talk about the obvious one — the standard under the GDPR. Like the 1995 Data Protection Directive (Directive 95/46/EC) before it, the GDPR still largely relies upon a framework of “notice and consent” as the basis for processing our data. The theory of “notice and consent” is simple — an individual is presented with the terms of an agreement for the use of his or her data (i.e. notice) and expresses his or her agreement to those terms (i.e. consent). Consent was also a lawful basis for processing under the Directive, but where the Directive allowed for implicit or opt-out consent (“any freely given specific and informed indication of [the data subject’s] wishes”), the GDPR standard is significantly higher.

Article 4(11) of the GDPR defines “consent” of the data subject as “any freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her.” The GDPR requires an affirmative act for each instance of consent. Moreover, the consent must be specific, informed, and unambiguous. Finally, it must be freely given — consent is not freely given where the data subject has no meaningful choice (e.g. where there is no alternative service provider). This sets a pretty high bar. So, what’s the problem?
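
Before getting to the problem, it helps to make those elements concrete. Below is a minimal, purely illustrative sketch of a consent record capturing the Article 4(11) elements; the field names and the validity check are my own invention, not anything the GDPR prescribes.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ConsentRecord:
    """Hypothetical model of one instance of GDPR-grade consent.

    Illustrative only: the GDPR prescribes the qualities of valid
    consent, not a data model.
    """
    data_subject_id: str
    purpose: str             # "specific": one narrow purpose per record
    notice_presented: str    # "informed": the plain-language notice shown
    affirmative_action: str  # "unambiguous": e.g. ticking an unchecked box
    freely_given: bool       # False if refusal carries a penalty or there is no alternative
    given_at: datetime

    def is_valid(self) -> bool:
        # Consent fails if any single element is missing or not freely given.
        return bool(
            self.purpose
            and self.notice_presented
            and self.affirmative_action
            and self.freely_given
        )

record = ConsentRecord(
    data_subject_id="user-123",
    purpose="weekly newsletter delivery",
    notice_presented="We will email you our newsletter once a week.",
    affirmative_action="ticked unchecked box",
    freely_given=True,
    given_at=datetime.now(),
)
assert record.is_valid()
```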

The original underpinnings of the “notice and consent” framework stem from contract law, but that contract is now broken. Due to an array of quantitative and qualitative challenges, GDPR-grade consent is near impossible to achieve in our current digital ecosystem. When the Directive was adopted in 1995, fewer than 50 million people, or less than 1% of the world’s population, were online. Today we are closer to 4 billion active users. Much of that growth has occurred since the GDPR was first introduced in draft form in 2012. According to the ITU, 2018 will be the first year in history in which more than half of the world’s population is online.

Not only are there significantly more of us online, but we are doing more than ever before. In just one minute, we send approximately 156 million emails and 16 million text messages, watch more than 4 million YouTube videos, and swipe nearly 1 million times on Tinder. Google processes more than 3.5 billion searches per day. Our growing online population, combined with our increased activity, has produced an explosion of data: we currently create some 2.5 quintillion bytes each day, and the digital universe roughly doubles in size every two years. This will only accelerate as an expanding internet-of-things (or “IoT”) universe brings more devices online (we have effectively run out of IPv4 addresses already).
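
Back-of-the-envelope arithmetic makes the scale vivid. The sketch below takes the round figures above at face value and adds one assumption of my own, a rough count of connected devices, for the IPv4 point:

```python
# Rough arithmetic on the figures above; every input is an approximation.
BYTES_PER_DAY = 2.5e18   # ~2.5 quintillion bytes of data created daily
PEOPLE_ONLINE = 4e9      # ~4 billion internet users
DEVICES_ONLINE = 1e10    # assumption: ~10 billion connected devices (estimates vary widely)

per_person = BYTES_PER_DAY / PEOPLE_ONLINE
print(f"~{per_person / 1e6:.0f} MB of new data per person per day")  # ~625 MB

# Why IPv4 is exhausted: a 32-bit address space is smaller than the
# population of devices it is supposed to address.
ipv4_space = 2 ** 32
print(f"IPv4 address space: {ipv4_space:,} (~{ipv4_space / 1e9:.1f} billion)")
print("More devices than IPv4 addresses?", DEVICES_ONLINE > ipv4_space)  # True
```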

Clearly, we are already at a point, and moving ever deeper into a future, where we cannot realistically deliver a push notification providing adequate notice and obtaining specific consent for every action requiring the use of our data. We would get thousands of push notifications a day (likely tens of thousands once you include our mobile phones, wearables, smart devices, self-driving cars, etc.). And even if we could implement such an approach, who would want to live in a world like that? Imagine the nuisance of cookie banners and popups, but everywhere and all of the time (this nuisance is part of what is motivating an upgrade to the EU’s cookie rules in the draft ePrivacy Regulation).
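
Where do “thousands” of notifications come from? Here is a rough estimate in which every input is a guess of mine rather than a measured figure:

```python
# Back-of-the-envelope: consent prompts per day under a strict
# per-use notice-and-consent regime. Every input below is a guess.
DEVICES = 5                   # phone, laptop, wearable, smart TV, car
APPS_PER_DEVICE = 10          # apps or services actively processing data
EVENTS_PER_APP_PER_HOUR = 4   # distinct data uses needing fresh consent
WAKING_HOURS = 16

prompts = DEVICES * APPS_PER_DEVICE * EVENTS_PER_APP_PER_HOUR * WAKING_HOURS
print(f"~{prompts:,} consent prompts per day")                            # ~3,200
print(f"~{prompts / (WAKING_HOURS * 60):.1f} prompts per waking minute")  # ~3.3
```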

In addition to the quantitative challenges posed by this data explosion, there are new qualitative hurdles to consent: a lack of transparency driven by increasingly complex and proprietary algorithms, the rise of machine learning and artificial intelligence (AI), and related techniques like deep learning and neural networks whose workings even technologists themselves cannot fully explain. This qualitative complexity is further amplified by the rise of IoT and machine-to-machine communications.

The GDPR hints at these challenges in two key innovations over the Directive. The first is the requirement of data protection by design and by default (Article 25), whereby the rights of the data subject are protected and promoted by default and through the proactive design of technologies such as web browsers. The second is the requirement to articulate to the data subject, in clear and plain terms, the logic involved in profiling and automated decision-making when they are at play (in the future, this will almost always be the case). But the GDPR didn’t anticipate the scale of the data explosion or the use of technologies that technologists themselves cannot explain.
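
What might “by default” look like in practice? A hypothetical sketch, with setting names invented for illustration, in which every optional form of processing starts out off and turning anything on requires an explicit act by the user:

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    """Hypothetical illustration of data protection by default:
    every optional processing purpose starts disabled."""
    share_location: bool = False
    personalized_ads: bool = False
    third_party_analytics: bool = False
    address_book_upload: bool = False

    def opt_in(self, setting: str) -> None:
        """Enabling anything requires a deliberate, named action;
        there is no pre-ticked box."""
        if not hasattr(self, setting):
            raise ValueError(f"unknown setting: {setting}")
        setattr(self, setting, True)

settings = PrivacySettings()        # a new user starts fully opted out
settings.opt_in("share_location")   # explicit, auditable opt-in
```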

Given these quantitative and qualitative challenges, how can we possibly be expected to give specific, informed, and unambiguous consent to things we cannot understand? We rarely have the “meeting of the minds” required to form a valid contract that could provide the basis for an informed consent model. Instead, we end up saying “yes” when we would likely say “no” if we had full transparency and awareness of the true consequences of our decisions (e.g. we might not download that app that is scraping our address book, accessing our microphone, and tracking our every move through our location data). In the digital world, large online intermediaries, ad tech providers, and other corporates are happy to ignore that our “yes” actually means “no” in many cases. They are enabled by lawyers who craft privacy policies and user terms that purport to obtain our “consent” to unethical practices like corporate surveillance and behavioral engineering (in my view, it’s time to hold those gatekeepers accountable too).

Going forward, these challenges will make a meaningful standard of consent even harder to achieve. We can anticipate what will happen — just as we have used browser settings to express preferences about our online behavior, we will end up relying on hyper-charged intelligent agents that decide for us. And we’ll have to trust those agents, right? Wrong. Just as law and policy are trending towards requiring privacy and data protection by design and default for browsers and other digital intermediaries, these agents should be held to the same “by design and default” requirements. In fact, given their inevitable prominence in our lives, we should require more than mere data protection, security, and privacy by design — we should also insist on “ethics by design and default.”
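
As one illustration of what a “by design and default” requirement could mean for such agents, here is a hedged sketch, with classes and policy vocabulary entirely invented, of an agent whose default answer to every data request is “no” unless the user has affirmatively approved that exact use:

```python
from typing import NamedTuple

class DataRequest(NamedTuple):
    requester: str   # e.g. "adtech.example.com"
    data_type: str   # e.g. "location"
    purpose: str     # e.g. "navigation"

class ConsentAgent:
    """Hypothetical intelligent agent whose default posture is denial."""

    def __init__(self) -> None:
        # Only (data_type, purpose) pairs the user has explicitly approved.
        self._granted = set()

    def grant(self, data_type: str, purpose: str) -> None:
        """The user affirmatively opts in to one narrow use."""
        self._granted.add((data_type, purpose))

    def decide(self, request: DataRequest) -> bool:
        """Default deny: anything not explicitly granted is refused."""
        return (request.data_type, request.purpose) in self._granted

agent = ConsentAgent()
agent.grant("location", "navigation")
print(agent.decide(DataRequest("maps.example.com", "location", "navigation")))      # True
print(agent.decide(DataRequest("adtech.example.com", "location", "ad targeting")))  # False
```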

So where does that leave us? We can accept the status quo and persist with the notice and consent framework as currently in place (ignoring the illegitimacy of this approach in most cases and the resulting consequences of surveillance capitalism) — or — we can create an alternative framework that stacks the deck in favor of individual rights. In this alternative framework, we would move “consent” up one level (i.e. out of the level of unconscionable terms and conditions and contracts of adhesion) by expanding the reach of “by design and default” legal, technical, and ethical standards. We would also restore the fundamental principles of contract law embedded in the origins of notice and consent, whereby the individual has sufficient bargaining power to negotiate an enforceable contract (this may require collective action and coordination across the “human sector” in ways we’ve never seen before).

For a small window, the choice is still ours to make.

Founder @ hackylawyer | Fellow @ Berkman Klein Center for Internet & Society | Fellow @ Carr Center at Harvard | CIPP/E, CIPP/US | Privacy, Identity, Blockchain