Notice & Consent: the Words and the Way

William Heath plenary keynote for the world’s data protection and privacy commissioners at the 39th ICDPPC conference

27 Sept 2017

Hong Kong

— — — — — —

The one time I visited Beijing was when the Young Foundation was invited by the then Chinese Prime Minister’s think tank to discuss how to help the PRC establish a thriving NGO sector as part of the harmonious society. The night before we flew out we were invited to dinner by the British Council to wish us good luck. But they had a warning for us: they said “while in China, do not criticise China.” We said of course not; we were fascinated by China and happy to be there as guests. The British Council then warned us again: “furthermore, while you are in China, do not criticise Britain either.” We said that’s ridiculous. They said “just don’t criticise Britain, because it would embarrass your hosts.”

So we flew out to Beijing and heard two days of extraordinarily dedicated South American, Indian and other social entrepreneurs doing incredible things for no money, and Chinese deputy mayors shouting at us about their cities’ extraordinary growth and environmental performance. After two days I’d made friends with the deputy head of the think tank and explained our problem: we’re trying to share how you get a vibrant NGO sector, but we’ve been told not to criticise Britain. But why do you think these people do what they do? They’re drawn to brokenness, and they’re insanely critical of their own governments. But we’ve been told not to tell you that. The secret is you have to find a way to protect this ecology of dissent.

His reply, after careful thought, was: “In modern China it’s OK to be critical. As long as you do it in a constructive way.”

It’s excellent advice.

Please accept what follows as a provocation, not a prescription.

On my trip out I’ve been reading the Chinese Daoist teacher Chuang Tzu. I’d like to propose that the ICDPPC adopt him as its first guiding philosopher: wise, funny and extraordinarily contemporary, he has some spot-on messages for our topic today.

Chuang Tzu teaches us to deconstruct the language we use. He says the Tao that can be talked about is not the true Tao.

The words we use: privacy, consumer, data subject, consent, and notice — these are not “the way”.

The 4th-century BC philosopher Chuang Tzu (or Zhuang Zhou / Zhuangzi)

They’re simply not adequate to construct a human foundation of trust in global online relationships.

You know the basics better than anyone in the world, but let’s restate them as a start point. The first problem word is “Notice”.

There are too many notices — hundreds for each of us.

The notices are too long. Instagram or Apple iTunes notices are twice as long as my talk to you today.

They’re too complicated. They cover every eventuality for the company, leaving the individual naked.

Each service however trivial and commonplace has to have its own notice.

It’s fine for the other side’s lawyer to talk to my lawyer. It’s not ok for me routinely to sign whatever someone else’s lawyer sticks in front of me or my child.

It’s fine for my machine to deal with someone else’s machine. It’s not OK for me to spend my valuable human time engaging in dialogue with machines that can’t engage, adjust or respond meaningfully.

The notice is presented in an instant when I want to get something done. But the consequences last forever.

Relied on as a universal cure-all, notice is less a panacea and more a laughing stock.

Pilbrow, drawing here for the UK satirical magazine Private Eye, is concise and truthful, more so than the notice and consent process.

Next problem word: Consent.

If the notice is inadequate (and it is, as we’ve seen, in a whole variety of ways) then the consent is pretty meaningless.

It’s not limited, I’ve no choice, it would take valuable time to think about it, and it’s not negotiable anyway.

Meant to fix everything, used everywhere it becomes devalued, like antibiotics applied universally.

The result is that what should be healthy online relationships are instead built on devalued and degraded contracts which undermine trust and mutual respect.

You could imagine “ideal” consent, or at least adequate consent. Europe’s GDPR makes a serious effort at that, requiring consent to be

  • specific, granular, clear, prominent, opt-in, properly documented and easily withdrawn

Beyond these GDPR criteria it would be great if consent were

  • as standardised as possible
  • issued by my machine to your machine
  • actionable, so there are remedies for breach
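To make the aspiration concrete, here is a minimal sketch in Python of what a consent record meeting these criteria might look like, issued by my machine to your machine. The `ConsentRecord` class and its field names are illustrative assumptions for this talk, not any standard’s schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional


@dataclass
class ConsentRecord:
    """Illustrative machine-readable consent: specific (one purpose per
    record), granular (named data items), opt-in by construction,
    documented (timestamped), and easily withdrawn (one call)."""
    data_subject: str            # who is consenting
    controller: str              # who receives the consent
    purpose: str                 # specific: exactly one purpose
    data_categories: tuple       # granular: exactly which data items
    granted_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))
    withdrawn_at: Optional[datetime] = None

    @property
    def active(self) -> bool:
        return self.withdrawn_at is None

    def withdraw(self) -> None:
        # Withdrawal is as easy as granting, and is itself documented
        self.withdrawn_at = datetime.now(timezone.utc)


# My machine issues one narrow record per purpose, per controller
consent = ConsentRecord(
    data_subject="alice",
    controller="example-insurer",
    purpose="quote-calculation",
    data_categories=("postcode", "vehicle-type"),
)
assert consent.active
consent.withdraw()
assert not consent.active
```

The point of the sketch is structural: because each record is narrow and timestamped, it can be documented, audited and revoked individually, rather than buried in one take-it-or-leave-it notice.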

But fully implementing even just the basic GDPR requirements presents technical, legal and commercial challenges. (For the benefit of our translators, “presents challenges” is British English understatement for “this is pretty much impossible”.)

Consent that takes any real effort is only worth seeking for the limited occasions where there’s a really significant decision, where the decision and the considered consent actually matter.

Can we, as a start point for our discussion today, reject the notion that we gave adequate consent to what is now being done with our personal data?

GIVE ME THE MACHINE!

It seems odd to rely so heavily on consent when it’s human nature that we all make bad decisions all the time. Zach Weinersmith nails it here, in a cartoon he shares with his compliments.

That won’t stop. A legal construct that fails to recognise that won’t put us on the right path.

From a data contract point of view we’ve got the information age off to a bad start. The words are stuck, while the reality is changing fast. And of course it’s about to get much worse.

The amount of data we generate is increasing faster than Moore’s Law.

As we sign up to ever more services we enter ever more agreements barely any of which ever get read.

Now we add an order-of-magnitude increase to the problem, as not just complex devices like cars and mobile phones but also what should be straightforward items such as toasters, lightbulbs and fish-tanks are sold as internet-connected devices wrapped up with terms and conditions or the manufacturer’s privacy policy.

And by now we do everything digitally; we work, we socialise, learn, create, share culture and relax. It’s a large percentage of our human lives, all of it observed and increasingly manipulated on this flimsy legal basis.

Yes we buy goods and services, but we’re not merely consumers. We’re producers and agents and creators and friends and family and much more besides.

And yes, there’s data about us. But that doesn’t reduce us to being data subjects. Subjects are acted upon; patronised or infantilised. Subjects have little meaningful agency or control.

“Data subject” is a legal term devised for a specific purpose. But the term belittles us as humans in an information age.

Chuang Tzu teaches that we don’t have words for things we don’t experience. And we do not experience what happens to our data.

Of course regulators have to put up with their fair share of ridicule. Grizelda, who drew this for Private Eye, sends best wishes to you all, by the way, for a successful conference.

In her picture they’re cold-shouldering this regulator, for now. But I’d like to see her draw their faces again when he’s just issued them a fine of 4% of turnover.

There’s a change of gear here. Notice and consent was born out of well-intentioned US consumer protection. But in drafting the GDPR, Europeans see a fundamental right at stake here — the right to privacy, constitutionally protected in Germany and elsewhere, and perhaps now in India too.

GDPR may offer superior protection but — I suggest — it still can’t make consent entirely fit for purpose. Even in doing what it does (still not enough to please consent purists; already far more than the corporate lobby wanted) it may make consent unviable as the lawful basis for a whole swathe of activities such as insurance.

This may push businesses to use one of the other half-dozen lawful bases for the processing of data (still within your remit but beyond the scope of this session).

This diagram by Marty Abrams is very telling. It suggests some progress at regulating the data individuals provide, but little progress in regulating observed data such as CCTV and none to speak of in the valuable and fast growing areas of derived and inferred data.

Regulators are still at step one. It may feel like this is getting away from us.

Contracts based on lying that you’ve read them are broken. Marketing is broken, certainly if the basis is meant to be consent. Given the chance — the legal right — an overwhelming majority will choose to end that relationship.

This brokenness perverts the business model of the media we depend on to reward creators and keep society informed. The New York Times CEO recently described digital advertising as a “nightmarish joke”, a “total mess”, “out of control” and a “dangerous environment for brands”. He then warned: “There is also a negative side.”

Digital marketing isn’t just annoying and increasingly creepy; it’s based on data of very dubious quality. As businesses and governments use the same data for risk management, turning to the same discredited data brokers to manage fraud or make recruitment decisions, these vital processes too are tainted.

We’re like flood victims trying to survive in a toxic soup of our leaked personal details. Detailed profiles of us are now in the hands not just of companies and governments but also of hackers and crooks, based on these shabby data contracts. It’s unprecedented. The head of Britain’s MI6 spoke in this context of “Profound risks; fundamental threat to our sovereignty”.

We’ve seen extremely serious consequences. Malevolent actors have tilted our public discourse to strike a mortal blow against the EU and put a delinquent narcissist in the White House.

It’s as bad as the widespread delinquency that preceded the fall of Jerusalem 2500 years ago.

The prophet Jeremiah warned the Jews in stark terms of the wrongness of their ways. The same words could ring out today in the boardrooms of Equifax and other data businesses operating behind our backs.

But no-one paid any attention to Jeremiah, and what came next was Old Testament fire and brimstone on a Biblical scale, followed by a major rethink on ethics.

This toxic soup, the foetid water leaking from cracked cisterns is our problem, here in this room today. Very few of us will shed a tear if Equifax and others go the way of Lehman Brothers and Enron, but we have to try to fix it before our world collapses, and it feels like all we have are broken words and fatuous check boxes with the wording the wrong way round.

The wonderful German word “Weltauffassung” evokes how we grasp or understand the world, drawing on every discipline available to us: science, philosophy, culture and faith.

It feels like our Weltauffassung has broken down. We’re left crying out — like the hero of a mid-19th-century German tragedy — “Ich versteh die Welt nicht mehr!” — I no longer understand the world.

In Daoist terms it feels like we’ve lost our way.

Perhaps that’s not surprising. We’re in the process of connecting the entire world in all its diversity of nation, culture and world view. This is an insanely ambitious and optimistic undertaking. Technology has made the connection; ethics, law and regulation need to catch up.

If we’re to have a frame of reference for how we understand and regulate global life online — how we codify relationships and keep ourselves and our societies healthy — that will need a deep and broad world view. That’s the scale of the question that lies behind the apparently intractable problem in this session.

I’m not sure we can solve it today in this forum, but we have to raise it here, now, given not just your responsibilities and professional skills, but also the wide diversity of cultures, backgrounds, principles and beliefs represented here.

This concerns more than the artificial constructs of government and business; it’s not arm-wrestling between free-market advocates and the regulator.

Chuang Tzu teaches a profound hatred of anything that seeks to enslave or control the human spirit. You can just imagine the cryptic post-modernist anecdotes — probably involving fish — with which he would have ridiculed the governments that came up with the Aadhaar card, the surveillance state, the Citizen Score or an open market in our browser histories.

In a similar vein the psychologist Josh Cohen gives a great explanation of why privacy matters in his book Why we live our lives in the dark. Whether you look at the human condition via science, religion or his own discipline of psychology, you have known externalities: the body of scientific knowledge, sacred texts or buildings, or the problematic behaviour of the patient.

But in each case, Cohen says, the part that matters is the part we do not know and cannot put into words: the undiscovered knowledge, the inexpressible divine, or for the Freudian, the id which is unknowable for therapist and patient alike.

So to try routinely to intrude on people’s private lives and codify their essence — the meat and drink of surveillance capitalism and the surveillance state — goes, Cohen suggests, right against the grain of human nature however we understand it, and is doomed to failure.

The issue at stake isn’t limited to our fundamental right to privacy any more than it is to consumer protection. The way we need to follow leads to the richness and complexity of global connected individuals treating each other with dignity, growing trust and friendship.

That’s why these words consumer, data subject, notice and consent are too limiting. They’re valid in a way, but soulless, devoid of anything that people actually care about.

This will not be a monoculture any more than Hong Kong is one. Scale that up to the whole world.

Clearly whatever basis we end up with for our global data protection governance will have substantial US, European, Chinese and Indian elements. It will reflect free-market aspects, government concern for security and well-intentioned bureaucratic interventions to protect enlightenment. We may find some shared values; inevitably we’ll find a lot that is distinctive and different. Connecting with different cultures of the world awakens us to the full range of the human personality: this is a wonderful thing. Also terrifying of course, but wonderful.

My own journey here is as an entrepreneur. My first business researched the computerisation of public services; I was shocked to find this was being done with no effort to base it all on the needs of the individual.

After I sold that business I co-founded more businesses anticipating an inevitable shift in control to the individual.

One is Ctrl-Shift, the successful research and advisory business which does the unmissable annual review of the Personal Information Economy. Another was Mydex CIC.

Big businesses now want to understand what happens when customers take charge, and how the commercial and compliance implications play out. Regulatory pressure has been a clear and direct contributor to the demand for these services, so thank you.

Understanding the shift is one thing; actually doing it, offering individuals not just lip service but meaningful control over their data is going to take a little longer. There are to my knowledge now a few hundred entrepreneurs with a range of serious offerings of this sort.

It’s a tough undertaking, but entrepreneurs are resilient and we believe this sector — personal information management systems, personal data stores or vaults, personally controlled identity, trust frameworks — will eventually be transformative, and the rewards considerable.

So my hypothesis is that we’ve made a fundamental error. In the way we’ve codified these online relationships we’ve inhibited the growth of the human spirit.

What can individuals do? Withdraw, opt out, try to unravel who holds what data from where. It may seem a hopeless task.

SLIDE: JLM inc

One woman, a student at the Royal College of Art, got so fed up with the privileged status of corporations as people and their encroachment on our lives that she incorporated herself.

JLM Inc now transacts with the rest of the world as a corporation, exploring aspects of quantified self, personal data control and tax breaks. If it’s a success I suggest we all follow suit.

What we need is to empower individuals to get stuff done, regain control, responsibility and agency.

SLIDE: BLACK

I don’t want some sinister Fancy Bear configuring my social bubble of fake news. I want to work with curators I trust to configure my own bubble.

I want to control and use my own digital credentials: proofs of the relationships I have, my track record, my customer data, so I can assert claims and manage and share my own preferences.

The crucial next step is the introduction of the individual as a node in a person-centric data architecture, instead of data simply being shared between organisations in a continuing organisation-centric mindset and architecture.
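As a sketch of that architectural shift only, and assuming a hypothetical `PersonalDataStore` rather than any real product’s API, the individual-as-node model might look like this: the person holds the data, and organisations receive only the permissioned subset.

```python
class PersonalDataStore:
    """Hypothetical person-centric node: the individual holds their own
    data, and each organisation sees only what has been permissioned."""

    def __init__(self, owner: str):
        self.owner = owner
        self._attributes = {}    # the individual's own data
        self._permissions = {}   # organisation -> permitted attribute names

    def set_attribute(self, name: str, value) -> None:
        self._attributes[name] = value

    def grant(self, organisation: str, *attribute_names: str) -> None:
        # The individual, not the organisation, decides what is shared
        self._permissions.setdefault(organisation, set()).update(attribute_names)

    def revoke(self, organisation: str) -> None:
        self._permissions.pop(organisation, None)

    def share_with(self, organisation: str) -> dict:
        # An organisation receives only the permissioned subset
        allowed = self._permissions.get(organisation, set())
        return {k: v for k, v in self._attributes.items() if k in allowed}


store = PersonalDataStore("alice")
store.set_attribute("email", "alice@example.org")
store.set_attribute("date_of_birth", "1980-01-01")
store.grant("retailer", "email")
assert store.share_with("retailer") == {"email": "alice@example.org"}
store.revoke("retailer")
assert store.share_with("retailer") == {}
```

Contrast this with the organisation-centric model, where copies of the full record flow between organisations and the individual never touches the transaction.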

Without underestimating the challenges of culture change, or human and structural inertia, I and the other entrepreneurs I speak with are confident that market solutions based on an individual-centric model and privacy-friendly services are set to have a transformative effect on our online relationships, trust and the information economy.

I believe they can be more convenient for the individual, offer businesses more valuable trusted relationships with richer data on a permissioned basis, and help governments focus public safety efforts where they are most needed.

SLIDE — CHINESE TRADITIONS

Trying to express this in terms of the great Chinese traditions: if we empower individuals, giving them agency and control, then

  • a tiny delinquent fraction of a percent will abuse it and require authoritarian legalism — close surveillance and control;
  • a certain percentage will respond to Ren — firm but compassionate Confucian guidance,
  • but the vast vast majority will be much better off and find their own way with wonderful consequences.

Including the individual and letting them use their personal data to accumulate trust and evidence supports personal development and better information logistics. That helps everyone.

There’s a key regulatory role in this. Once holding personal data costs real money, businesses will either want a solid return, or to cut their cost by transferring the responsibility to the individual. Everything else has gone self-service; the same will happen to personal data management.

It’s not just a cost and regulatory saving for business; there’s a sales upside. The most economically valuable data is not in the toxic soup, the foetid pool. It lies in decisions yet to come: future buying intentions. That is the sweet water.

Let’s be French about it and treat the provenance of our personal data as an appellation d’origine contrôlée. The wine industry shows how a race to the top is worth more than a race to the bottom. That’s why GDPR means growth.

We’re in the wrong place, and regulators are being asked to do an impossible task with inadequate resources. But we can get out of this mess, and your role is key.

The great thing is you do not have to deliver a perfect regulatory solution. A strong regulatory thicket is enough to force up the cost of bad practices, to expose bad actors and help us find a digital Tao, an information age Way of Nature, with the emergence of new market solutions.

I hope the conference sees that as an eminently feasible and incredibly worthwhile challenge.

Happy to meet after and take questions and offer any follow up references. Many thanks for your invitation and your attention.

ENDS