The meaning of privacy by design: a panel discussion at UCD Bristol

UX used to come at the end of the design process, but times have changed, along with the notion of privacy.

UCD Bristol
8 min read · Jun 24, 2019

Back in January, we had computer scientist and design thinker Lon Barfield talk to us about privacy by design with a focus on the modern concept of privacy and data. It was a popular talk with plenty of food for thought around how we as UXers need to consider privacy with our designs and processes moving forward.

This month, we invited Lon back to UCD Bristol, this time as part of a panel, to talk about this topic in more detail.

Let’s introduce the panel:

  • Lon Barfield — computer scientist and design thinker at SimpleWeb.
  • Rita Cervetto — Service and Visual Designer at Ovo Energy.
  • David ‘Shef’ Barker — User Experience expert at Immersive Labs.
  • Maria Santos — Digital Marketing Manager at People for Research.

Adam Babajee-Pycroft from Natural Interaction was moderating the panel and he started with a broad question:

What does privacy by design mean to you?

Our panel’s answers varied but one thing was clear — privacy by design is an ideal — a common goal that we all need to be aiming for. For Maria, it’s about communication: keeping your users informed of what you’re doing with their data. For Shef, it means compliance: he feels that privacy has always been a low priority for businesses but now, with GDPR and a heightened awareness, it’s becoming more and more important for UXers and designers to build it into their processes. In fact, he likened the challenge to accessibility — both have to be thought about early in the design process to make a site or product fully usable.

Lon then gave us a brief recap of his talk from January, explaining that UX used to come at the end of the design process, but that now it simply cannot work that way.

Addiction design & dark patterns

There’s been plenty of talk recently about the dangers of too much screen time, but where do we draw the line when a brief encourages engagement and habit-forming? What is healthy and who is responsible?

Maria quoted former Google design ethicist Tristan Harris:

“We’re upgrading the machines, but we’re downgrading humans, downgrading our attention span, downgrading our mental health”.

He talks publicly about how urgent action is needed to address the dark side of technology, something our panel all agree with.

Lon suggested that the difficulty with addiction and dark pattern design is that often, games and app designers are in fact giving people what they want. It might not be good for them, but it’s what they want. Rita agreed, but felt that as a designer you should be careful about which behaviours you’re encouraging.

Shef, like many of us, was not surprised by any of this. He pointed out that many online platforms implement gamification patterns, and while some of these feel harmless (for example, learning streaks in Duolingo), other times there are loot boxes and random rewards designed specifically to keep a user engaged for as long as possible.

Addictive design is often a result of the performance metrics used and measured by a business; Lon thinks this is where the problem may have originated, since rewards are built around simplistic measures of behaviour. It’s not all bad news though, as there is some evidence of big businesses starting to take a more ethical approach to over-engagement. Take Instagram, for example: they now monitor your usage, allowing you to track how long you’ve been on the app.

How do decision makers empathise with user needs?

For Shef, it all comes down to transparency; Lon agreed, saying that “if you ask for permission and explain what you do with each piece of data you collect” you are likely to end up with a better user/brand relationship.

Maria suggested that talking to users at the very beginning of their journey with your business is the best approach. Working for a company which recruits people for research, she often speaks to audiences with specific needs and believes it’s imperative that the consent forms used across the industry are easy to understand, provided upfront and accessible to all.

An ethical approach is the only way to go. Being transparent, upfront and clear about why you’re collecting data is the best way to gain trust and understanding from your users.

“Doctors take the Hippocratic oath, perhaps we could have something similar,” a great suggestion from Lon. In heavily regulated industries like finance and energy, data collection and transparency have been a key part of design and customer experience for far longer than elsewhere. Could the wider world learn from them? If not, are we at risk of burying ourselves under layers of rules, cookie notifications, forms and text which hamper design performance and creativity?

Privacy is a modern concept

Next we turned to the broader concept of privacy. Rita was clearly shocked at the speed at which data greed and collection has grown, to the point where it has become a huge industry. Adam suggested this could be related to the fact that data laws are still drafted by people who don’t understand technology: the cookie law, for example, requires huge swathes of text to be displayed on a website and yet, usability testing shows that most people either shut down the notification without looking or, worst case scenario, struggle to read around it.

Alongside giant cookie pop-ups and notices, we have privacy settings hidden away on apps. Surely it’s our responsibility as designers to make these tools easier to find and use? Rita’s concern here was that some people may not even be aware of what they’re giving away.

Our views on the importance of privacy can vary by age and socio-economic factors, so we also spent some time discussing how privacy is perceived by different groups and all agreed that future generations will probably see things differently to us.

For example, we’re pretty new to all this, whereas our children will grow up as ‘no privacy natives’. Shef thinks it’s likely that future generations will simply be so used to sharing all their personal details, that it might not seem like such a big deal to them. Bringing the conversation back around to trust, Lon suggested that whilst we no longer know who has our data and can’t trust ‘them’, the future is likely to hold a more equitable and trustworthy relationship between givers and holders of data.

Alan, a member of the audience, asked a poignant question: “do younger generations care about privacy or do they assume they have none and therefore don’t care?” This led to a conversation about whether younger generations willingly give up data for convenience, something Adam doesn’t believe they do. He mentioned how there is currently a trend for school kids to chat on Google Docs rather than on apps which collect their data. It can be used anywhere and the chat function is ephemeral. Are they being taught to do this by their teachers and parents, or is it their own awareness?

Another important question: in the future, could data privacy be only for those who can afford it? Maria worried that vulnerable people could be at risk and might end up giving away their data in exchange for rewards or discounts. Right now, for example, you can buy the Amazon Fire tablet cheaper with adverts on it than without. It looks like devices that “protect” you will cost more in the future, so again privacy could become a privilege exclusive to those who can afford it.

GDPR is a blunt instrument, but it has actually helped make a shift

Lon reminded us that if you follow the legislation properly and collect less data, the quality will almost certainly go up. Honesty and communication will build trust and result in less fake, unusable data.

You have to treat privacy as a trust relationship and not a data collection tool for your sales team. Two examples here:

  1. Apple are increasingly serious about privacy as a service so perhaps their conscientiousness will lead as an example for others.
  2. Monzo are open in their communications and have a number of ethical features (such as gambling limits), listening to customers and an open feedback loop.

It feels as though some businesses are guided by humanity rather than by laws or rules. At the other end of the spectrum, in China, your social value is now based on your behaviour and your online persona. But whilst data has no borders, laws do. This is something to think about, because as your data is collected, you could find that you break the law in some countries and not others.

Does blockchain offer any privacy protection?

Adam asked the panel whether blockchain and peer-to-peer networks could provide a solution to the seemingly endless stream of digital data collection we all face on a daily basis. As an expert in this area, Lon suggested it may help because it stops data sitting in one single place ready to be hacked.

Thank you to our panel for tackling some big, challenging topics this month at UCD Bristol. We’re not sure we answered everything, but one thing is clear: as UXers and designers, we should be pushing for ethical communication and transparency around data. Only by building trust between companies and users can we truly turn something which is perceived as negative into a positive.

We’ll be sharing a video of the discussion soon. For now, why not check out some of our previous speakers? We’ll be back soon with more details about the next UCD Bristol.

We are currently accepting talk submissions, so if you would like to feature in one of our next meetups as a speaker, complete this short form. We can’t wait to hear about your awesome ideas!


UCD Bristol

If you're interested in the fields of UX, Product and Service Design, Customer Research and beyond, this hands-on monthly meetup in Bristol is the one for you.