We need to talk about consent

Alan Mitchell
Published in Mydex
Dec 20, 2021 · 10 min read

Aside from promoting innovation and growth (for which read ‘business models based on data surveillance’), the UK Government is justifying its proposed data protection ‘reforms’ on the need to address ‘consent fatigue’.

It’s quite right to raise this issue. Consent in relation to the collection and use of personal data is in a mess that needs sorting out, most visibly and annoyingly with cookies. The question is how best to sort it out.

The UK Government’s cynical solution is to simply abolish consent in all but name; to replace citizen rights over their data with corporate rights. But a different, better solution is possible.

This, the fourth blog in our series prompted by the Government’s proposed data protection ‘reforms’, explores how the consent mess came about and how to clear it up.

How we got here

Our current problems with consent are caused by a combination of philosophical, regulatory and administrative factors. We need to understand them all to see a way forward.

Philosophical differences

Two philosophical differences underlie the great consent debate: what sort of thing is personal data, and what’s the most appropriate way of treating it?

What sort of thing is personal data? Some simply see personal data as a ‘resource’ like any other: to be mined, extracted, processed and monetised. Others see personal data as something that is highly personal — an extension and expression of a human being.

The heart of the UK Government’s approach is to treat personal data as a thing — a resource whose value is to be extracted and maximised by ‘innovating’ corporations. It treats the people whose data it is like rats in a lab, to be observed and manipulated, and like sheep in data fields, to be herded and shorn of their valuable data wool. Hence the Government’s goal of ‘securing the UK’s status as a global hub for the free flow of personal data’ where personal data is treated as a commodity like any other.

Others, including Mydex, see personal data as an extension and expression of a human being, which therefore needs to be treated with the respect that we would afford another human being.

This isn’t just a warm, fuzzy sentiment. It has practical implications. It means operationalising, in rules and processes, respect for human beings’ agency (the desire to pursue goals), their autonomy (their freedom to do so), and their dignity (their right to have their views and feelings taken into account rather than being treated as a thing and ridden roughshod over). Ultimately, it boils down to the Golden Rule: Treat others as you would wish to be treated.

In each case, these different forms of respect should manifest themselves in two key areas: having rights a) over how our data is collected and used (which includes ‘consent’), and b) over what it is used for.

What’s the most appropriate way of treating it? Current data protection law (e.g. GDPR) opts for a rights-based approach (a plus point). But it muddies the waters by mixing the pure assertion of rights with treating personal data as the subject of a contract negotiation.

At one level, consent as a contract negotiation fits with the notions of human agency, autonomy, dignity and mutuality described above. People agree to contracts when there is mutual benefit in them.

But contract law is based on an absurd fiction: that there are two equal parties negotiating with each other. That an individual citizen and a global titan such as, say, Google, Facebook or a nation state are equals in the negotiation.

Contract law is also based on the mythologies of ‘rational economic man’ that underpin modern economics, which tell us that individuals are (or should be) ‘rational’ in their decision-making. This may sound harmless, but economists use the word ‘rational’ to mean something completely different to its meaning in everyday language.

In economics-land, a ‘rational’ decision-maker always assesses every detail of the information there is about their decision, weighing pros and cons that ricochet indefinitely into the future with exact probabilities, and does so for every decision, big and small, in an instant, without any cost in terms of time or effort.

According to this view, 60 pages of small print legalese to accompany an attempt to access a service is entirely ‘rational’: to make a rational decision the individual has to be ‘fully informed’.

And, thanks to the mythology behind contract law, it is assumed that an individual presented with 60 pages of small print legalese when seeking to access a service is perfectly free to sign the contract or not. They are free to walk away if they want to.

And if they sign it, it is their responsibility to face the consequences because they freely decided to agree to it.

But as we all know, in the real world, fully understanding acres of small print is a burden and interruption that gets in the way of our doing the things we want to do. We don’t always want to be ‘fully informed’ — often, we would simply prefer not to have to think about it. And we’re not really free to walk away if we want to access the service in question. So we end up signing contracts we have not read — signing our rights away, in a way which is deemed perfectly lawful.

This is how consent became a stick to beat citizens with — where ‘consent’ is used to get citizens to ‘agree’ to all manner of things that they don’t really want, and where they are blamed for the result because ‘you agreed to it!’

A regulatory mess

Once the regulations were set up in this way, with this unclear mixture of rights and contracts, three things happened. First, risk-averse corporate lawyers started using ‘consent’ as a blanket protection for their corporate employers.

Aside from consent, European data protection laws established five other grounds for lawful processing of an individual’s data: Contract (e.g. use of data needed to perform a contract, such as delivering a service), Legal obligation (e.g. to comply with the law), Vital interests (e.g. to protect someone’s life), Public task (e.g. in the public interest) and Legitimate interests (a vague area which the UK Government is seizing on to deny citizens their data rights).

Practically speaking, virtually every use of data by bona fide service providers can be, and is, covered by these five grounds. But ‘consent’ could cover anything, so the lawyers added it in, just to be on the safe side. Very soon, it became the main practical legal ground for the processing of personal data … which was never the intention.

Second, wily operators on the make saw the opportunity to game the consent system, and game it they did. Knowing full well that the ‘fair contract between free and equal parties’ was a myth, they deliberately constructed terms and conditions and ‘privacy policies’ by which individuals could be asked to consent to all manner of things. Knowing that a) most people wouldn’t read or understand the small print and b) that even if they did, they didn’t have much choice but to agree if they wanted to access the service in question, they used ‘consent’ to open up a data free-for-all.

The third thing that happened was that with ‘consent’ now being gamed to unleash a tidal wave of abuse, the regulators … did nothing. What else could they do? The law was clear: if someone had freely consented, that was it; they had consented. It was lawful. End of story.

Finally, recognising that this had become a problem, lawmakers and regulators sought to tighten the rules surrounding consent with the introduction of GDPR and related legislation. But they did so by going even further down the contract negotiation route, thereby compounding the problem. Hence the ultimate absurdity of cookie pop-ups with mountains of manipulative small print getting in our way every time we want to visit a website.

Badly designed processes

To turn the car crash into a fully fledged motorway pile-up, lawmakers and regulators based the processes by which consent would work on an organisation-centric assumption: that organisations are, and always should be, the ones controlling individuals’ data.

Two things followed. Individuals were presented with small print written by the organisation, for the organisation. (It never entered anybody’s mind that organisations could or should agree to terms set by the individual.) And individuals had to deal with each organisation separately (because each ‘contract’ was separate), thus multiplying the burden of reading and understanding obscure small print potentially hundreds of times over.

Result? The ‘consent fatigue’ that the UK Government talks about.

A positive way forward

The Government’s cynical solution to the problem of consent fatigue is to simply abolish it by expanding organisations’ right to collect and use data without consent so widely that consent becomes irrelevant.

A different approach is needed, one which:

  • fully embraces the notion of personal data as an extension and expression of a human being (the philosophy bit)
  • interprets and enforces the law in a way that practically supports this approach (the regulatory bit)
  • does so in a way that is safe, easy and efficient (the administrative process bit).

There is a way to do this. Here it is.

Getting the philosophy right

On the philosophical front, we need to recognise the power imbalances that lie at the heart of contract negotiations between individuals and organisations and discard the fiction that humans are ‘rational economic agents’. We need to design systems that take account of how the real world works and respect the ways that real people behave.

We humans are ‘cognitive misers’. We don’t want to invest huge amounts of time and effort turning every transaction into a contract negotiation. We don’t want to invest hours of our precious time trying to understand small print unless it’s really necessary. When we buy something in a shop or online, we don’t enter into a contract negotiation about honest, factual descriptions about the product concerned, the safety of its materials and construction, and liability if something goes wrong. We don’t have to worry about these things because we are more or less ‘safe by default’, thanks to a web of protection legislation that was painstakingly constructed over decades.

We need the same for personal data. When a citizen shares data with a service provider, the citizen should have a right to feel confident that their data (and therefore they) will be kept safe. The philosophical approach that should be adopted is ‘Safe By Default’.

In this case, there should be a standard (not individually negotiated) assumption that any data that is collected will only be used to provide the service in question; that it will not be shared with anyone not needing to be involved in this service provision; and that the service provider won’t collect more data than is needed to provide the service and won’t hold the data longer than is needed for its provision.

All of this is already enshrined in European data protection law, and the processing of data on these grounds is already deemed lawful as ‘necessary for the performance of a contract’. There is no need to change the law for Safe By Default to become the default. It is already there. It just needs to be enforced.
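
To show what this could mean in operational terms, here is a minimal, purely illustrative sketch of Safe By Default expressed as a machine-readable default processing policy. The class, field and function names are assumptions made for this illustration only; they are not defined by GDPR or by any Mydex system.

```python
from dataclasses import dataclass

# Illustrative only: 'Safe By Default' expressed as a default processing policy.
# None of these names is defined by GDPR or by Mydex; they are assumptions.
@dataclass(frozen=True)
class SafeByDefaultPolicy:
    purpose: str                      # the service the data is collected to deliver
    data_items: tuple                 # only the items needed to provide that service
    share_with: tuple = ()            # nobody outside those needed for provision
    retention_days: int = 0           # 0 = keep only as long as the service requires
    further_use_requires_consent: bool = True  # anything beyond the purpose needs consent


def is_within_defaults(policy: SafeByDefaultPolicy, requested_use: str,
                       requested_items: set, recipient: str | None = None) -> bool:
    """Check whether a proposed use of data stays inside the Safe By Default boundary."""
    if requested_use != policy.purpose:
        return False                  # a different purpose: consent would be needed
    if not requested_items <= set(policy.data_items):
        return False                  # asking for more data than the service needs
    if recipient is not None and recipient not in policy.share_with:
        return False                  # sharing with someone not involved in provision
    return True


# Example: a hypothetical repair-booking service collecting only what it needs.
policy = SafeByDefaultPolicy(
    purpose="boiler repair booking",
    data_items=("name", "address", "appointment_time"),
    share_with=("engineer",),
    retention_days=90,
)
print(is_within_defaults(policy, "boiler repair booking",
                         {"name", "address"}, recipient="engineer"))  # True
print(is_within_defaults(policy, "marketing", {"name"}))              # False: consent needed
```

The point is that the defaults do the work: anything inside the boundary needs no consent dialogue at all, and anything outside it is flagged as requiring a separate, explicit agreement.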

Enforcing the regulations

For Safe By Default to work, one simple thing needs to happen. Regulators need to stop being spineless: they need to interpret and enforce the law as it was originally intended.

This would require three things. First, regulators should interpret the existing law relating to the six lawful grounds for processing (outlined above) so that ‘consent’ is relied on only when none of the other five grounds applies. In particular, consent should only be required if an organisation wants to use data for something other than the performance of a contract. Organisations should be punished for using consent unnecessarily, because if the data is genuinely being used in performance of a contract then consent is unnecessary.

Second, if consent is used, organisations should be required to keep to the spirit as well as the letter of the law. For example, they should not be allowed to bundle different things into the same contract, so that individuals have to agree to things they don’t want in order to access the things they do want.

Third, the law needs to be enforced. Organisations should be punished for any use of data that breaches the boundaries of Safe By Default.

Reforming processes for consent

The processes by which consent is managed need to be reformed.

  1. Individuals should be able to present their own terms and conditions for access to data, which organisations should be able to agree to (thus turning current processes on their head). (This does not require every individual to become a data lawyer. The vast majority of use cases for data sharing fall into a small number of categories (e.g. for insight and research, for advertising, for rent or sale, etc) which can be covered by a small number of standard contracts.)
  2. Individuals should be provided with their own consent management dashboards where they can see and manage all consents from one place (including revoking consents or exercising their rights under data protection law). Organisations should be required to connect to and operate their processes via these dashboards.
  3. These processes should be automated as far as possible. Think of a direct debit. You spend some time setting it up at the beginning (knowing that its subsequent operation will be Safe By Default). Then, once it is set up, it just happens automatically, with you remaining in full control because you can always revoke it if you want to. This is how data sharing, and any consent needed for such data sharing, should work (a simple illustrative sketch follows below).
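
To make this concrete, here is a minimal, purely illustrative sketch of how standard consent terms, a personal consent dashboard and direct debit-style revocation might fit together. Everything in it (the StandardTerms categories, the class names and the methods) is an assumption made for illustration only; it does not describe the actual Mydex platform or any existing standard.

```python
from dataclasses import dataclass, field
from datetime import datetime
from enum import Enum


# Hypothetical standard contract categories covering most data-sharing use cases.
class StandardTerms(Enum):
    SERVICE_PROVISION = "service provision only (Safe By Default)"
    INSIGHT_AND_RESEARCH = "insight and research"
    ADVERTISING = "advertising"
    RENT_OR_SALE = "rent or sale"


@dataclass
class Consent:
    organisation: str
    terms: StandardTerms
    granted_at: datetime
    revoked_at: datetime | None = None   # None while the consent is live

    @property
    def active(self) -> bool:
        return self.revoked_at is None


@dataclass
class ConsentDashboard:
    """One place where an individual can see, grant and revoke every consent."""
    consents: list[Consent] = field(default_factory=list)

    def grant(self, organisation: str, terms: StandardTerms) -> Consent:
        consent = Consent(organisation, terms, granted_at=datetime.now())
        self.consents.append(consent)
        return consent

    def revoke(self, organisation: str) -> None:
        for consent in self.consents:
            if consent.organisation == organisation and consent.active:
                consent.revoked_at = datetime.now()

    def active_consents(self) -> list[Consent]:
        return [c for c in self.consents if c.active]


# Example: set a consent up once; it then operates automatically until revoked.
dashboard = ConsentDashboard()
dashboard.grant("Acme Energy", StandardTerms.SERVICE_PROVISION)
dashboard.grant("Acme Energy", StandardTerms.INSIGHT_AND_RESEARCH)
dashboard.revoke("Acme Energy")          # the individual withdraws both at any time
print(len(dashboard.active_consents()))  # 0
```

The point of the sketch is simply that once the terms are standardised and the records live in one place, granting and revoking consent can be as routine as managing a direct debit.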

Conclusion

The UK Government is right on one thing. The operations of consent and ‘consent fatigue’ are real problems that need sorting. But its cynical ‘so let’s abolish consent’ response is unacceptable. Citizens did not create this problem. Policy makers, regulators and organisations did.

A positive solution is possible: one that protects citizens’ rights, respects their dignity and makes data sharing and use Safe By Default. It does this by developing new infrastructure that simplifies, standardises and automates data sharing and consent processes, stripping out unnecessary complexity (and the abuses that go with this complexity).

This is infrastructure that positively empowers citizens rather than denying them their rights and it is all perfectly do-able. Mydex CIC is already building it — the rest boils down to ‘where there is a will there is a way’, especially on the part of regulators.

There is no need to reform data protection law to address the challenge of ‘consent fatigue’. There is a pressing need to empower citizens with their own data.

Other blogs in this series are:

  • Why is Data Valuable? Exposes misconceptions about the value of data, and therefore where its biggest economic potential lies.
  • Five Myths about Data, Innovation and Growth Explains why the Government’s claims that its ‘reforms’ will promote innovation and growth are without foundation.
  • AI: The Emperor’s New Clothes? Shows how the Government’s claims that these ‘reforms’ are needed to promote Artificial Intelligence are without foundation, and based on deep misunderstandings of AI itself.
  • Data: A New Direction — But Which Direction? Analyses the core proposals made by the Government’s initiative to ‘reform’ data protection regulations.
