Why we need to invent data ethics. And soon.

The princess, the rock star, the actor, the football player. All jealously guard their privacy and build fortresses around details of their private lives, dodging telephoto lenses and Twitter trolls. Meanwhile, those of us less rich and famous freely donate our personal data to anyone, including those who do not have our best interests at heart. So, how can businesses develop profitable relationships with customers without exploiting them via their data? And how can consumers navigate the jungle of data rights and access?

Published in Designit Matters
4 min read · May 18, 2020


One good person to ask about this is Pernille Tranberg, expert and advisor from Data Ethics. Pernille’s description of data privacy is refreshingly simple and she offers it quickly and confidently. “It’s the right to decide who knows what about you and when.”

The ‘who’ in question is the invisible organization behind the app on our phone. The software engineers who have access to our Facebook profile. Or the ingredients of the cookies we allow, with a perfunctory click. The reality is, in order to benefit from the advantages offered by technology, we share details which become part of the Big Data people can profit from.

Non-personal data is benign. Examples include the stats informing the weather forecast or the traffic density flashing up on our GPS maps. But personal data is another story and Pernille is deeply concerned.

“I’m suddenly offered a book on Alzheimer’s, based on my browsing history. It’s creepy. It’s design that is deceiving.”

She looks forward to a world where data is anonymized by design and by default, where personal information is not linked to a particular individual. It is a process she believes could take a decade, because a universal standard needs to be developed and agreed upon. And most countries, with the notable exception of Germany, are rushing rather than reflecting. “Germany is way behind the rest of Europe, digitally, but this is because they take time to think.”

Pernille believes there are parallels with what happened in our supermarkets. “Look at organic food. Years ago it was unregulated but now there is one, universally accepted, certified standard. And, as with organic food, which started out expensive, perhaps the first people to benefit from data anonymity will be the people who can afford to pay for it.”

We need to know what we are getting ourselves into when we interact with technology.

“Terms and conditions should be so simple, a twelve-year-old should be able to understand them,” she comments.

Although a twelve-year-old is unlikely to be a PayPal customer, PayPal’s T&Cs run to 36,275 words, the same length as Shakespeare’s Hamlet. Even Apple, which Pernille spotlights as a company with an admirable data reputation, asks you to read 19,972 words (Macbeth length) before using iTunes. Is there a reason these documents are so long and complex? “Because they don’t want you to read them,” she says.

So, for now at least, it will be down to the conscience of the individual business to articulate its own definition of ethical behavior. And there is evidence that the conversation is breaking out in boardrooms across the globe.

“Businesses are worried about their decisions about data affecting their reputation. Tabloid newspaper headlines. Cambridge Analytica is still fresh. The hardest dilemma is balancing making profits with doing things that are human-centered. Partnering with people, not deceiving them,” Pernille comments.

This will require a long-term, patient perspective, with no quick payback. The first businesses that adopted a green agenda and embraced sustainability did not make money overnight.

So, how can we decide who to trust, when there is no trust mark out there yet? Pernille offers some tips.

“First, ask yourself how the company you’re dealing with makes money. Are they collecting your data simply to make money from it, or is it just something they’re doing as part of something else? Where are their headquarters? Europe has better data laws. Are their terms and conditions designed so you don’t understand them? And search whether anyone is bitching about this company online.”

The gravity of the subject is immense to Pernille: “If we can’t save privacy, we can’t save human beings or democracy.”

This is a comment that resonates at a particularly high frequency in COVID-19 times, when personal data is being collected in the name of collective wellbeing and the common good. As countries begin to reopen and stay-at-home orders relax, it is more front of mind than ever.

When biometric data is harvested to safeguard the public against potential spreaders, the line is quickly blurred between what Pernille labels “good and bad data” or, if you like, citizen empowerment versus totalitarian surveillance. How permanent and deep-set that line becomes is a matter of ethics. Yuval Noah Harari also argued in his recent FT article that a choice between health and privacy is in itself a false choice. Both are things we should enjoy.

So, if we’re seeing large-scale social experiments being pushed through at breakneck speed by emergency decree, rather than emerging from rigorous, time-consuming debate, then when does it end? For how long is it acceptable to invoke the current pandemic to justify ever-greater collection of personal data? At what point do we ensure that the data is not only collected, but collected ethically, with people’s right to privacy at the core?

Having said all this, Pernille sees no inevitability to a depressing outcome. “Even though there are some nasty bastards out there, you have to be optimistic about humanity.”

You can draw your own conclusions about the likelihood of meaningful data ethics by listening to Pernille’s interview in Episode 01 of YELLO, Designit’s podcast.



Designit is a global strategic design firm, part of the leading technology company, Wipro.