Image by Mark del Lima

A Message to Companies That Collect Our Data: Don’t Forget About Us

Arianna McClain
Published in Design x Data
7 min read · Sep 7, 2016


The Importance of Designing for Trust, Transparency, and User Control

By Arianna McClain

When it comes to privacy, I question using the word “data” because it covers so much. There’s automotive data, financial data, and scientific data. For now, let’s focus on customer data, also known as human data.

I’ve noticed that when it comes to digital products, services, and experiences, companies often forget that the data they’re collecting (e.g., clicks, search terms, responses to surveys) represent people’s attitudes and needs. They’re traces of human behavior. And these data points represent people’s stories.

So let’s think about data as a metaphor for people.

For example, imagine that today I have a one-on-one conversation with a friend and she casually asks me my salary. At first, I feel uncomfortable telling her. But after she promises to keep it private, I tell her. Then she goes and tells someone else that she knows an IDEO designer who makes $X. And then that person goes and tells someone else.

Two things happen. First, she breaches my trust. My salary is private and here she is telling people?!

To that, she says, “Don’t worry! It’s anonymous!”

Second, even though she never told anyone my name, saying she has a friend who makes $X, is female, black, and an IDEO designer means the information is no longer anonymous. Together, all of those data points become personally identifiable information. People can easily figure out she’s talking about me.
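To make that concrete, here is a minimal sketch in Python of how this kind of re-identification works. The records and attributes below are invented for illustration; the point is that a handful of “anonymous” attributes, combined, can act as a quasi-identifier that singles out one person:

```python
# A toy "anonymized" dataset: no names, just attributes (all records invented).
records = [
    {"salary": 95_000,  "gender": "female", "race": "white", "employer": "IDEO"},
    {"salary": 120_000, "gender": "male",   "race": "black", "employer": "Google"},
    {"salary": 110_000, "gender": "female", "race": "black", "employer": "IDEO"},
    {"salary": 88_000,  "gender": "male",   "race": "white", "employer": "IDEO"},
]

# Each attribute alone matches several people; combined, they act as a
# quasi-identifier that can single one person out.
matches = [
    r for r in records
    if r["gender"] == "female"
    and r["race"] == "black"
    and r["employer"] == "IDEO"
]

print(len(matches))          # prints 1: only one record fits all three attributes
print(matches[0]["salary"])  # the "private" salary is now exposed
```

No record stores a name, yet three attributes are enough to isolate one row. This is why combinations of attributes deserve to be treated as personally identifiable information.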

Now think about U.S. Census data or the posts you like on Facebook. These data points are more than numbers; they are the stories of real people. Facebook apparently labels me “very liberal” and lets marketers target me based on that. But what if the label were not just about my politics, but about something more personal, like my sexual or gender identity? Wouldn’t it be considered unethical to sell that story? That information has the potential to embarrass, hurt, or harm someone (e.g., Peter Thiel).

I feel that calling human stories “data” makes you forget the differences between machines and humans.

Calling it all “data” makes customers forget what they’re giving up — their identity.

Calling it all “data” transforms human stories into objects that companies can own and sell.

So when designing for privacy, it’s important for data-collecting companies to ensure that people are giving informed consent and understand what they’re giving away. It’s important to be intentionally human-centered and ask:

How might companies clearly communicate that they own and could sell any story that’s shared with them?

This is where the law and privacy notices are supposed to help. Unfortunately, the law is nearly impossible to understand. People don’t know how the law protects them, and where it doesn’t.

Most privacy policies are overwhelming and written to protect the company, not the user. At Facebook’s Privacy@Scale Conference in May, Professor Omri Ben-Shahar shared a photo of himself holding Apple’s iTunes privacy policy. It was 30 pages in 8-point font!

User policies today feel more like an encyclopedia than a guide to help make a decision. And even if people do try to read them, they find unfamiliar words that prevent them from understanding everything they’re giving away.

So when designing a privacy policy, how might we design for trust, transparency and control?

Here are some user-centered design principles to consider:

GIVE ME THE GIST

I remember when I first started at IDEO. I was hired to design with data. Over time I learned that even though I needed to know the details of building a statistical model and validating that its results were statistically significant, the designers I worked with just needed to know the gist. Specifically, all they needed to know was how the data impacted the final design.

When it comes to designing for privacy, I’ve noticed that lawyers often make a similar mistake. They worry about being accurate or thorough, but forget the person who will be reading and applying the policy.

What if, instead, privacy policies were designed for extreme users, such as people with low literacy? For example, when IDEO designed voting machines for the city of Los Angeles, we designed for all kinds of voters: blind people, people in wheelchairs, people with low or no literacy, people with cerebral palsy. If we could make the voting machine work for them, it would work for everyone. In that vein, what if the first page of a typical privacy policy simply stated, “The three things you really need to know”? Terms of Service; Didn’t Read shows how companies could do this well.

SPEAK LIKE A HUMAN

In the legal profession, there’s a movement to change from legalese to plain language. What if we went one step further, from plain language to human language, using words and statements that most people understand?

Above is an example from a recent automotive project, for which we needed an online participant consent form. As we created it, our designers realized we were missing the human voiceover that always accompanies the form when we share it in the field. Our solution was to add “human language annotations”: little speech bubbles that explained the legal paragraphs in human language.

SHOW ME VS. TELL ME

Looking at pages of text can be intimidating and exhausting. A few people might choose to read through the text, but a clear picture is worth a thousand words.

Above is an example of a template IDEO might use if we were designing a legal agreement for the users of a health app. We kept it visual. We used language the people using it would understand. This is human-centered design.

GIVE ME A CHOICE

Last, but not least, give people a choice. Most products and services either say you have to agree to everything in their legal agreement, or you can choose to not use their product. In reality, that’s not really a choice at all. Many people will choose to give up all of their privacy for the convenience of an iPhone, or so they can be invited and RSVP to friends’ events on Facebook.

What if companies gave people a choice between sharing data or not? What if there were different versions of a product based on what people were willing to share?

Facebook offers a variety of sharing options. People can share each post with as broad or narrow an audience as they like. But what about the anonymized data that companies like Facebook sell to other companies? I haven’t seen any good examples of companies giving people real control over how that data is sold and shared; from what I can tell, no one is doing this well right now.

The best example I can offer is something similar to Spotify. Right now Spotify has a free membership with targeted advertisements and a fee-based membership with no advertisements. What if, as part of paying for no advertisements, people could also restrict what data the company shares? This would give users three choices: share their data unconditionally, pay and restrict the use of their data, or not use Spotify at all. I do not believe that paying for privacy is the best or most equitable option, but at least it is an option.
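To sketch what that three-way choice might look like in product terms, here is a minimal Python model. The tier names, data categories, and function below are hypothetical, invented purely for illustration; they are not Spotify’s actual plans or API:

```python
from dataclasses import dataclass
from enum import Enum

class Tier(Enum):
    FREE = "free"                        # ads shown, data shared by default
    PREMIUM = "premium"                  # paid, no ads, data still shared
    PREMIUM_PRIVATE = "premium_private"  # paid, no ads, user controls sharing

@dataclass
class SharingConsent:
    listening_history: bool
    location: bool
    third_party_ads: bool

def effective_sharing(tier: Tier, consent: SharingConsent) -> SharingConsent:
    """Return what the company may actually share, given the user's tier."""
    if tier is Tier.PREMIUM_PRIVATE:
        # Honor the user's field-by-field choices.
        return consent
    # The other tiers trade data for the service (or for a lower price).
    return SharingConsent(listening_history=True, location=True, third_party_ads=True)

# A privacy-tier user who opts out of everything except listening history:
user = SharingConsent(listening_history=True, location=False, third_party_ads=False)
print(effective_sharing(Tier.PREMIUM_PRIVATE, user))  # user's choices honored
print(effective_sharing(Tier.FREE, user))             # everything shared
```

The design choice worth noticing is that consent is recorded field by field rather than as a single all-or-nothing checkbox, which is what makes the choice real rather than nominal.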

In Conclusion…

Privacy is becoming an increasingly important currency. Even though companies might not be able to entirely prevent large-scale cyber attacks, they can prevent people from unintentionally sharing data or stories that have the potential to harm and embarrass them. Take Target, which collected and used credit card data to figure out that a teenage girl was pregnant before her father did, and then unintentionally shared her story by sending coupons for baby items to her parents’ home.

But it won’t happen by chance. Companies need motivation to act on this. So what do they get out of designing for privacy? In a word, trust. They want to be seen as protecting the interests of their customers. And those that succeed in protecting those interests will be considered the preferred choice. Apple is the best example of this. They’re betting a lot on data privacy to position themselves as a Google alternative.

So how might companies get there? By understanding that the charts, graphs, and tables with sample sizes in the millions represent real people. By making sure that those who handle and make decisions around data regularly meet and talk with users.

When designing privacy policies, those who write them should think about how they would convince someone to trust them. They could give their users the overall gist of the policy, write in “human” language, draw things out to make them more visual, and add more options for sharing choices.

So if companies remembered that data weren’t just senseless numbers, but people and their stories — would they treat it differently? I think they would. I believe that if we change the language we use to describe collecting people’s life stories, and differentiate it from other non-human forms of data, we can take a step toward making companies more accountable and intentional about story collection. It’s a matter of trust.

Special thanks to IDEO colleagues Sean Hewens, Matt Cooper Wright, and Tina Barseghian for helping me write this post and Mark Del Lima for the illustration.

Arianna McClain
Director of Research @Cruise | Formerly @IDEO and @DoorDash