Designing for Privacy

It’s time for digital products to be more respectful of users’ data. How can designers advocate for users’ growing need for better privacy?

Christin Roman
Dec 19, 2019 · 7 min read
Credit: Erica Peterson

As a UX designer working in the tech industry for almost 14 years, I’ll admit that I hadn’t quite owned up to my own complicity in how the digital products I’ve designed collect and store data. As someone whose job it is to advocate for the user, I’m no stranger to picking battles — with founders, marketers, developers, graphic designers, anyone whose decisions can have an impact on how somebody uses or experiences a digital product. The battleground is well-trodden, often revolving around what we think our users will and won’t do, what they do and don’t need. Whether things are clear enough, clickable enough, useful enough. The common opt-in vs. opt-out debate, which, I’ll admit, I have lost many times. But aside from the most egregious and obvious violations of privacy, suggestions like challenging the common design patterns of account creation and data storage, or the catch-all business model of collecting as much data as we can about our users now and figuring out what to do with it later, were non-starters.

That is, until I did some work for Canopy, a new company founded by ex-Spotify employees that is dedicated to the idea that the products we use today were built on an unfair value exchange that has dangerously become the norm, and should be reimagined from the ground up. A team of designers at Type/Code, myself among them, had a chance to work on a product where the idea that we shouldn’t collect users’ data was taken as a given, and the task of designing a personalized experience for people who are very much used to giving their data away fell to us. We joked many times that if we had been working on this project 12 years ago, our job would have been so much easier. (Remember when people were skeptical about giving away their email addresses, creeped out by the idea of using social networks, and would never dream of “linking” their bank accounts to anything?) But, in this current climate of complacency-turned-resentment, explaining to people that you don’t want their private data is actually a harder sell than convincing them to give it to you.

We did a lot of research as part of this project, mostly because we wanted to make sure we were solving the right problem, partially to find the best way of explaining the technology, but also because we feared any missteps. The app we designed — Tonic — does, in fact, collect some of your data (as with any recommendation product, the way you use it is factored in to further improve your recommendations); it just doesn’t store any of it in a way that would be meaningful. The data is still yours in the sense that it stays on your phone and isn’t tied to any specific user account that Canopy can see. (As founder Brian Whitman likes to put it, “we can’t be evil.”)
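To make “the data stays on your phone” a little more concrete, here is a minimal sketch of what on-device personalization signals can look like: interactions written to a local file with no account identifier attached, so nothing meaningful ever reaches a server. This is purely my own illustration in Swift, not Canopy’s actual implementation, and the names (InteractionSignal, SignalStore) are hypothetical.

import Foundation

// Hypothetical sketch only (not Canopy's code). Interaction signals are written to a
// local JSON file in the app's documents directory; no user or account ID is attached,
// and nothing here is sent off the device.
struct InteractionSignal: Codable {
    let itemID: String    // the piece of content the signal is about
    let liked: Bool       // a simple positive/negative signal from the user
    let timestamp: Date
}

final class SignalStore {
    private let fileURL: URL

    init(filename: String = "signals.json") {
        let documents = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
        fileURL = documents.appendingPathComponent(filename)
    }

    // Append a signal to the on-device file.
    func record(_ signal: InteractionSignal) throws {
        var signals = (try? load()) ?? []
        signals.append(signal)
        let data = try JSONEncoder().encode(signals)
        try data.write(to: fileURL, options: .atomic)
    }

    // Read back everything stored so far; this file is the only place the data lives.
    func load() throws -> [InteractionSignal] {
        let data = try Data(contentsOf: fileURL)
        return try JSONDecoder().decode([InteractionSignal].self, from: data)
    }
}

In a setup like this, a recommendation model running on the device could read those signals directly, so personalization never has to depend on a server-side profile.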

As a result, we learned a lot about the state of data privacy — how people feel about the products they either continue to use in spite of, or abandon (though only in part) because of the unfair value exchange; which data they consider to be “private” and which they will share as a matter of course; which steps they’ve taken to preserve their privacy and how effective they think those steps have been (short answer: not very); and much more. We got to know our users and were delighted to learn that we had succeeded in earning their trust, because when it comes down to it, image is everything, really. We were concerned to find that, despite the belief that more transparency will lead to more trustworthy products, too much transparency can actually be quite off-putting. And we were relieved to learn that, by making the right choices about which data we actually need, making our algorithm more transparent (i.e., how the data is interpreted and used), and putting more control in the user’s hands, we could both live up to our promise and keep them on our side.

It was a great experience to work on a product where we didn’t have to pick our battles for the user. There was no talking our client out of dark design patterns, no weighing decisions that would be helpful to the business but harmful to the customer, no debates over whether to opt in or opt out by default. It was a given that we couldn’t truly design a useful, usable product as long as it was built upon an architecture that continues to contribute to what is becoming the tech industry’s most existential threat.

But, having had this opportunity, it can be hard to go back. Knowing what we know, and feeling what we ourselves feel, it’s time to figure out the new battles worth picking. How do we use what we know about the concerns people have about data privacy to design products that serve them better? How can we advocate on behalf of our users’ needs for privacy, the way we advocated for their need to perform tasks from their mobile phones, their need to manipulate information, or even their (now very obvious-seeming) need to find the information most relevant to them 5, 10, 15 years ago?

Designers have the ability, the responsibility even, to make a difference. But we have to figure out how to design for privacy even when we are working on products that aren’t specifically founded on the idea of privacy. Where we were once the champions of usability, then usefulness, we now have to take on the mantle of ethicality.

I hope that 2020 will bring more discussion in the design community about how we can begin to contribute to the solution. I found Baratunde’s Tech Manifesto to be a helpful starting place, but as designers and researchers, we need to share more practical approaches so we can start to move the needle using the tools that we have at our disposal.

Here, for now, is what I’ve learned from my own experiences in designing for privacy:

  • Transparency alone isn’t the answer: The problem isn’t that people can’t see how the sausage is being made; it’s that the sausage is being made at all. As designers we have to remember what we know about selectively displaying information, even while we are trying to help people understand how their data is being collected and used.
  • Understand what your users are comfortable with: So then what amount of transparency is acceptable? I’m sure this changes from one audience to another, and depends on the type of product or service you are talking about, but it helped us tremendously to have a baseline for what our users considered acceptable in terms of their data. And in testing, when we saw our users balk at certain information, we paid attention, and changed the way the app worked under the hood so that we didn’t need that information at all. Rather than continue with the accepted practice of collecting any and all data we could, we were able to streamline it to only the data we actually needed. If the data you need is the data that people are comfortable with sharing, then you no longer have a transparency conundrum.
  • Business model first: For a long time the accepted order of business has been to collect as much user data as possible now and figure out the business model later, with the result being that it’s the data itself that tends to get monetized. We can do better than that.
  • Privacy is more than just settings: If the best precedents we have are Facebook and Google privacy settings, then it’s a long way up from here. How do we build better privacy into our products in such a way that we can forgo making people take these extra steps, which they don’t think really solve the problem anyway? We’ve managed to accomplish this in other areas. Let’s do it for privacy.
  • Content is still king: People aren’t leaving Facebook because of privacy concerns alone. They are leaving because they are unhappy with the content. The scandals just make them feel better about their decision to leave.
  • There is an opportunity to differentiate based on being privacy-respectful: The data fatigue is real. The doubt is beginning to weigh on people to the point that they are less inclined to try new products, and even those who are still signing up only do so because they’ve learned to suppress that little nagging voice in the back of their heads. The less invasive your product can be, the more people you may find willing and happy to take the leap. And it helps to talk about this up-front, to make it part of your brand. There is a reason why Facebook is always the scapegoat while Google gets away with the same things. The brand matters.
  • Everyone is a potential customer: Products that offer better data privacy as a selling point aren’t just for tinfoil hats or people who are “concerned” about data privacy. Everyone is open to alternatives, even the cynics.
