Here’s what NNG got wrong about Privacy (+ Lessons learned from Game of Thrones Season 8)

Nathan Kinch
Greater Than Experience Design
Jun 10, 2019

Image credit: HBO

Recently, Nielsen Norman Group published an article titled “Creepiness-Convenience Tradeoff”.

Summary: As people consider whether to use the new “creepy” technologies, they do a type of cost-benefit analysis weighing the loss of privacy against the benefits they will receive in return.

Source: NNG Group

Definition: The creepiness–convenience tradeoff refers to people’s willingness to accept the downsides of a technology that invades privacy for the sake of its benefits.

I called this out on LinkedIn. Glasses, beanie and all.

Specifically, I was (and remain) concerned that this ‘insight’ was situated in a largely psychological, rather than sociopolitical context.

It would seem Game of Thrones Season 8 suffered a similar fate.

This is problematic, particularly when the conclusion of this article proposes considerations (some of which, like the mention of progressive disclosure, are actually very useful by the way) for how to design for such scenarios. Even more so given this came from NNG. People pay attention to what they publish. Heck, these are the people who came up with the usability heuristics (1994) so much of the industry relies on today!

Let me be clear. We should never design creepy technology. Ever. Privacy and security should be baked into the core of all products and services from the outset.

It’s worthwhile noting that privacy is a right. Having it or not can directly impact our health. It’s often misconstrued and misunderstood. Here’s a little clarification in the context of data protection.

Many people working directly in this space, including myself, are actually arguing that we challenge the existing client–server architecture of the web.

“Solid changes the current model where users have to hand over personal data to digital giants in exchange for perceived value. As we’ve all discovered, this hasn’t been in our best interests. Solid is how we evolve the web in order to restore balance — by giving every one of us complete control over data, personal or not, in a revolutionary way.”

— Tim Berners-Lee in his blog announcing Solid

It’s worth noting that Solid is a platform built using the existing web. The SAFE Network is a different story. This might well be ‘a’ version of the new internet we’ve been envisaging. Check it out!

“Privacy and security should be available to everyone, not just those with deep pockets.

That’s why the SAFE Network encrypts all data by default, automatically splitting it into many pieces which constantly move to locations around the globe that cannot be traced. And your access to them is untraceable too.

No more hacked data, no more stolen passwords, no more eavesdropping. Private. Secure. Anonymous.”
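
To make that less abstract, here’s a rough Python sketch of the general idea behind content-derived chunk encryption: split data into pieces, derive each piece’s key from its own content, and address each encrypted piece by its hash. It’s purely illustrative (a toy XOR keystream and a made-up chunk size), not SAFE’s actual self-encryption implementation.

```python
import hashlib
import os

CHUNK_SIZE = 1024  # illustrative chunk size, not SAFE's actual parameters


def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream by chaining SHA-256.
    Illustration only -- a real system would use an authenticated cipher."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]


def self_encrypt(data: bytes) -> list[tuple[str, bytes]]:
    """Split data into chunks, derive each chunk's key from its own hash,
    and return (chunk_id, ciphertext) pairs ready to scatter across nodes."""
    chunks = [data[i:i + CHUNK_SIZE] for i in range(0, len(data), CHUNK_SIZE)]
    encrypted = []
    for chunk in chunks:
        key = hashlib.sha256(chunk).digest()           # content-derived key
        cipher = bytes(a ^ b for a, b in zip(chunk, keystream(key, len(chunk))))
        chunk_id = hashlib.sha256(cipher).hexdigest()  # network address of the chunk
        encrypted.append((chunk_id, cipher))
    return encrypted


if __name__ == "__main__":
    pieces = self_encrypt(os.urandom(5000))
    print(f"{len(pieces)} encrypted chunks, none of which is readable on its own")
```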

I digress.

Back to the real problem. The Creepiness-Convenience Tradeoff relies on a series of assumptions:

  1. People are rational: Sometimes(ish), but it’s way more complicated
  2. There is equal information available to both parties: There isn’t. Organisations know WAY more
  3. Both parties are in an equal position of power: They’re not. Organisations have FAR more power
  4. Organisations are incentivised to do what is right, rather than what is economically viable: They’re not. Read our Rebuilding Trust in Financial Services Playbook for more
  5. People haven’t (as a result of the system) learned distinct behaviours: They have. Specifically, people have learned to ‘tick and forget’. There simply isn’t another way

There are a variety of others. Let’s remain concise.

NNG’s perspective is limited and limiting. It fails to account for the broader context in which we operate.

Let’s get real about how a few key aspects of the information economy work today.

The architecture of the web

The web is architected in such a way that whenever we interact with digital services, data is shared, whether or not we take any explicit action. When combined with the impact of network effects, this creates information monopolies.
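
For a concrete sense of what “shared whether we act or not” means, here’s an illustrative sketch of the metadata an ordinary page load hands over before you’ve typed a single character. The values are invented; the field names are standard HTTP headers.

```python
# A rough illustration of what a browser hands over on an ordinary page load,
# before any form is filled in. Values are made up; field names are standard HTTP.
implicit_request_data = {
    "ip_address": "203.0.113.42",               # visible to the server and any CDN in front of it
    "User-Agent": "Mozilla/5.0 (...)",          # browser and OS, often enough for fingerprinting
    "Referer": "https://news.example/article",  # the page you came from
    "Cookie": "uid=a1b2c3; consent=dismissed",  # identifiers set on earlier visits
    "Accept-Language": "en-AU,en;q=0.9",        # locale
}

# Every third-party script or pixel embedded in the page repeats this exchange
# with its own domain, so one visit can fan out into dozens of such requests.
for field, value in implicit_request_data.items():
    print(f"{field}: {value}")
```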

This is a day in the life of our data. It’s the reality of the information economy today. There is a fundamental power imbalance. We, as individuals, are on the (negative) receiving end.

Business models and incentive structures

Organisations value what they measure. They don’t necessarily measure what they value. This is problematic.

Let’s take programmatic or ‘RTB’ advertising as an example.

Ad exchanges run an auction to determine which ad (and which version of it) will be shown to the visitor of a given website. Exchanges share the data they process about a website’s visitors with several hundred prospective advertisers. This enables them to decide (based on specific criteria) whether or not they’ll place a bid to capture the visitor’s attention.

Source: fixad.tech

In fact, research from the New Economics Foundation estimates that ad exchanges broadcast intimate profiles about an average UK internet user 164 times per day. This is not a prerequisite of effective advertising. There are direct alternatives to this model.
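
Here’s a rough sketch of what one of those broadcasts can look like. It’s loosely shaped like an OpenRTB bid request, heavily simplified and with invented values, just to show how much of a profile fans out to bidders on a single page view.

```python
import json

# A heavily simplified, loosely OpenRTB-shaped bid request -- invented values,
# trimmed fields -- showing the kind of profile an exchange broadcasts to
# hundreds of bidders for a single page view.
bid_request = {
    "id": "auction-8c72",
    "site": {"page": "https://news.example/health/anxiety-treatments"},  # what you're reading
    "device": {
        "ua": "Mozilla/5.0 (iPhone; ...)",
        "ip": "203.0.113.42",
        "geo": {"lat": -33.87, "lon": 151.21},  # often derived, sometimes precise
    },
    "user": {
        "id": "9f4e...",  # cross-site identifier
        "data": [{"segment": ["expecting-parent", "poor-credit", "frequent-gambler"]}],
    },
}

# Conceptually, the exchange fans this out to every registered bidder,
# whether or not they end up winning -- or bidding at all.
for bidder in ["dsp-a.example", "dsp-b.example", "dsp-c.example"]:
    print(f"POST https://{bidder}/bid -> {len(json.dumps(bid_request))} bytes of profile")
```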

To be clear, we’re optimising an entire ecosystem that just so happens to be leaking data all over the place, because we literally place a value on clicks…

Let’s take map services as another example.

Map services train models based on our collective usage. They deliver marginal value back to us in the form of ‘free’ services. The companies backing them generate billions in revenue as a result of the models they deploy. We’re effectively working for Big Tech.
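
As a toy illustration of that dynamic, here’s a sketch (invented field names and values) of how individual location pings, contributed by everyone running the app, get aggregated into the traffic model the provider then monetises.

```python
from collections import defaultdict

# Individual location pings, one per user per moment of driving.
# Purely illustrative data -- the point is who contributes it and who owns the result.
pings = [
    {"road_segment": "A4-west-221", "speed_kmh": 12},
    {"road_segment": "A4-west-221", "speed_kmh": 15},
    {"road_segment": "M1-north-07", "speed_kmh": 96},
]

speeds = defaultdict(list)
for p in pings:
    speeds[p["road_segment"]].append(p["speed_kmh"])

# The aggregate asset, built from everyone's movements.
traffic_model = {segment: sum(v) / len(v) for segment, v in speeds.items()}
print(traffic_model)
```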

When did we have a discussion about this ‘value exchange’? When did we decide it was appropriate?

We didn’t. The discussion hasn’t been had. We’re all just locked into the model. Again, this is social. It’s political. It’s our current context.

Manipulation and (lack of) choice

Deceived by Design. Heard of it? You should have. We’re all victims of it.

Dark Patterns are everywhere. Behavioural design is being used to ‘hook’ us. Choice architectures are increasingly defined by a smaller and smaller group of hugely influential companies.

When it comes to terms and conditions, or any other form of agreement you enter into, it gets worse. 33,000 words. Grade 16 readability. Almost three hours of (forced) reading time. And it’s all hidden nicely within a simple process that takes just a few minutes… This is a take-it-or-leave-it approach. It’s a zero-sum game. The unfortunate reality is we’re all playing along.
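
A quick back-of-the-envelope check on those numbers, assuming an average adult reading speed of roughly 200 words per minute:

```python
# Sanity-check the reading-time claim, assuming ~200 words per minute.
words = 33_000
words_per_minute = 200

minutes = words / words_per_minute
print(f"{minutes:.0f} minutes ≈ {minutes / 60:.1f} hours of reading")  # ~165 min, close to three hours

signup_minutes = 3  # the few minutes the actual sign-up flow takes
print(f"Roughly {minutes / signup_minutes:.0f}x longer than the flow it's buried in")
```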

So where does that actually leave us? If people care so much (about their right to privacy), why do they keep participating and sharing (so much data)?

It comes down to a simple concept: the tradeoff fallacy.

“The findings, instead, support a new explanation: a majority of Americans are resigned to giving up their data — and that is why many appear to be engaging in tradeoffs. Resignation occurs when a person believes an undesirable outcome is inevitable and feels powerless to stop it. Rather than feeling able to make choices, Americans believe it is futile to manage what companies can learn about them. The study reveals that more than half do not want to lose control over their information but also believe this loss of control has already happened.”

People are not actively engaging in rational tradeoff choices where the costs and consequences are truly understood. They share because they have no other choice. This is the reality of the modern web. We’ve backed this up again and again empirically. This is supported by existing bodies of research globally (albeit mostly in the West). In fact, we consistently observe a six-stage process that situates people’s attitudes and behaviours in a broader, sociopolitical context.

But it doesn’t have to be this way. This model is being challenged by entrepreneurs, major companies like Microsoft and Apple, and of course, policy decision-makers globally. There is more and more consensus that we need to design a humanity-centric information economy that protects people’s fundamental rights and freedoms in such a way that innovation and competition truly thrive. After all, trust disproportionately impacts business performance. We may as well make what is right for humanity great for business.

If you’d like to dive deeper, get in touch. I do value a good coffee :)


A confluence of Happy Gilmore, Conor McGregor and the Dalai Lama.