Starting Conversations about Customer Privacy and AI

A guide for UX professionals

Derek DeBellis
Microsoft Design
5 min read · Dec 3, 2019

By Derek DeBellis, Penny Marsh Collisson, Angelo Liao, and Mar Gines Marin

I love when AI makes a recommendation that accounts for me, my goals, and my context. I love when AI automates part of my workflow by recognizing where I’m going and cutting out some of the work required to get there. I also love when companies respect my privacy. I’m not alone. I’ve heard this countless times in user interviews: people want personalized AI-driven experiences that cater to their specific needs while also respecting their privacy.

When we operate with shared values and communicate about how to put them into practice, researchers and product teams can deliver both personalization and privacy to our customers. At Microsoft, we’ve been compiling privacy practices that we think every UX professional should know and understand. The list below isn’t exhaustive, but we’ve found that the ideas it contains help UX professionals who are exploring AI and privacy. We also include questions you can ask your product and data engineers to kickstart a conversation about AI privacy and design.

Collecting the right data

Much of the beauty of advanced statistical approaches lies in their ability to handle rich, multidimensional sources of data. The more features a dataset has, however, the more effort it takes to make sure that no one can be identified from it. Take care, also, to collect and use data in a manner that aligns with your company’s values and with your customers’ desires and values.
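
To make that tradeoff concrete, here is a minimal sketch, in Python with synthetic data and made-up column names, of one way to check whether a potentially identifying feature actually earns its place in a model before anyone commits to collecting it:

```python
# A minimal sketch, not a production pipeline: the data, column names, and model
# below are illustrative assumptions. The idea is to measure how much a
# quasi-identifying feature (here, "age") actually contributes before collecting it.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2_000
X = pd.DataFrame({
    "docs_opened_last_week": rng.poisson(5, n),   # behavioral signal
    "feature_clicks": rng.poisson(3, n),          # behavioral signal
    "age": rng.integers(18, 80, n),               # quasi-identifier: do we need it?
})
# Illustrative target: whether the person accepts an AI-driven suggestion.
y = (X["docs_opened_last_week"] + X["feature_clicks"] + rng.normal(0, 2, n) > 8).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# Permutation importance: how much does accuracy drop when each column is shuffled?
result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)
for name, score in zip(X.columns, result.importances_mean):
    print(f"{name:>22}: {score:.3f}")
# If shuffling "age" barely changes accuracy, that's evidence we can avoid collecting it.
```

If the risky feature contributes little to the model’s performance, the privacy question often answers itself.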

Conversation starters
• What features would this dataset contain?
• How important are these features for the model’s performance?
• Do we have a justification for needing that piece of information?
• Does having that information increase the odds we compromise someone’s anonymity?
• Are we (and our partner teams) selecting and using data in a manner that our customers have both understood and agreed to of their own volition?
• Are we collecting and using the data in a way that reflects our customers’ values?

Exploring the shape of data without exploring the content or individuals

AI systems don’t need to know much about individuals to make useful predictions for them. Current approaches allow data to be aggregated, featurized, and encoded so that individuals remain anonymous, without detracting from our ability to run computations on it. The important patterns can be retained even after noise is added to the data, and that noise makes it extremely difficult to trace results back to specific content or individuals. There are also techniques that make sure queries and models return statistical or aggregate results, not raw or individuating ones.
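
To make the noise idea tangible, here is a minimal sketch in the spirit of differential privacy; the query, the synthetic dataset, and the epsilon value are illustrative assumptions, not a recommendation for any particular product:

```python
# A minimal sketch of adding noise to an aggregate result so that no single
# person's data is traceable. The synthetic data and epsilon are assumptions.
import numpy as np

rng = np.random.default_rng(42)
minutes_in_app = rng.gamma(shape=2.0, scale=15.0, size=10_000)  # synthetic usage data

def noisy_count(values, threshold, epsilon=0.5):
    """Count how many people exceed `threshold`, plus Laplace noise.

    A count changes by at most 1 when any one person is added or removed
    (sensitivity = 1), so Laplace noise with scale 1/epsilon masks any single
    individual's contribution while keeping the aggregate useful.
    """
    true_count = int(np.sum(values > threshold))
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

print("True count :", int(np.sum(minutes_in_app > 60)))
print("Noisy count:", round(noisy_count(minutes_in_app, threshold=60), 1))
```

The reported aggregate stays useful for spotting patterns, but the exact contribution of any one person is hidden.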

[Image: a table with two columns, one showing words, the other showing those words represented as vectors]
Words can be represented as vectors of numbers. These sets of numbers appear meaningless to us, but they often contain patterns valuable to the AI system.
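
The vectors in the figure could come from learned embeddings; as a simpler, hedged illustration of the same idea, here is a sketch that uses scikit-learn’s hashing trick to turn short text into fixed-length vectors of numbers that a human can’t read back:

```python
# A minimal sketch: text becomes a fixed-length vector via a one-way hashing
# trick, so downstream computation sees numbers and patterns, not readable words.
# The tiny n_features value and the example strings are for illustration only.
from sklearn.feature_extraction.text import HashingVectorizer

vectorizer = HashingVectorizer(n_features=8, norm=None, alternate_sign=False)
vectors = vectorizer.transform(["quarterly budget review", "team offsite agenda"])
print(vectors.toarray())  # rows of numbers, meaningless to a human reader
```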

Conversation starters
• If I run a query on this data, is it possible that the results will be associated with a small subset of individuals?
• Do we have a way to make sure our queries return statistical, aggregate, and de-identified results?
• Is it possible to determine whose data was in this initial training set?
• How are we anonymizing and encoding data to ensure privacy?

Handling customer data

Modern technology allows us to address many concerns about how, when, where, and for how long data is being handled. For example, a customer’s information doesn’t always need to travel to the cloud for AI to work. Advances have made it possible to put sophisticated models on a customer’s device without consuming all of the device’s memory or processing power. Once on the device, AI can function offline, without needing to constantly connect to the cloud. There are, in addition, many ways to maintain privacy within the cloud.
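
As a deliberately tiny stand-in for whatever compact model actually ships with an app, here is a hedged sketch of scoring a suggestion entirely on the device, with no network call and no raw features leaving it:

```python
# A minimal sketch of on-device inference. The weights, bias, and feature values
# are made up for illustration; a real product would ship a compact trained model.
import numpy as np

WEIGHTS = np.array([0.8, 0.3, -0.5, 0.1], dtype=np.float32)  # shipped with the app
BIAS = -1.2

def suggest_next_action(features: np.ndarray) -> bool:
    """Score a suggestion locally; the raw features never leave the device."""
    score = float(features @ WEIGHTS + BIAS)
    probability = 1.0 / (1.0 + np.exp(-score))  # logistic link
    return probability > 0.5

# Features computed from on-device activity only (illustrative values).
print(suggest_next_action(np.array([3.0, 1.0, 0.0, 7.0], dtype=np.float32)))
```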

Conversation starters
• If we want personalized models, how do we build, store, and update them?
• Are we housing our AI models in the cloud or the device? Why?
• How do we update our general models?
• Who, if anyone, can look at the data? When? How? What data exactly?
• How long is the data being stored?
• Where is the data being stored?

Providing customers with transparency and control

Ultimately, you’re asking these questions so you can give customers what they want, which our research shows is transparency and control. You want people to have the information they need to decide whether they want to use the AI-driven features. Make sure you’re presenting this information in an easily understandable way. And if customers decide they don’t want to use AI-powered features, they should have the controls to make the necessary adjustments.
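
One hedged sketch of what “nuanced enough” controls might look like in code: per-feature, opt-in settings that gate each AI-driven experience rather than a single global switch. The setting names and defaults below are assumptions for illustration, not a description of any shipping product.

```python
# A minimal sketch of per-feature privacy controls. Names and defaults are
# illustrative assumptions.
from dataclasses import dataclass

@dataclass
class AIPrivacySettings:
    personalized_suggestions: bool = False         # off until the customer opts in
    cloud_processing: bool = False                 # prefer on-device by default
    share_usage_data_for_improvement: bool = False

def can_show_suggestion(settings: AIPrivacySettings) -> bool:
    """Only surface AI-driven suggestions if the customer has opted in."""
    return settings.personalized_suggestions

settings = AIPrivacySettings()
print(can_show_suggestion(settings))  # False until the customer changes it
```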

Conversation starters
• Do we have answers to the questions users are asking?
• Do customers have the information they need to determine if using our AI is worthwhile?
• Do customers have the controls necessary to manage their experience? If so, are these controls nuanced enough? Are they too nuanced?

The real UX work begins after you sift through these questions

We hope that these questions help open conversations with the people on your team who are building AI-driven experiences. This communication reinforces a shared objective and leads to an understanding of how you can help protect user privacy. That knowledge, in turn, empowers you to help customers navigate privacy in AI-driven products and to communicate these intricacies in ways that are better, simpler, and clearer.

Authors

Angelo Liao is a program manager working on AI in PowerPoint.
Mar Gines Marin is a program manager working on AI in Excel.
Penny Collisson is a user research manager working on AI in Office.
Derek DeBellis is a data scientist and user researcher working on AI in Office.

With special thanks to Simo Ferraro, Zhang Li, Curtis Anderson, Josh Lovejoy, Ilke Kaya, Ben Noah, Bogdan Popp, and Robert Rounthwaite.

Want to join us in building trustworthy AI at Microsoft? We’re hiring!
