SDXD: Exploring Data Privacy and Artificial Intelligence in UX

Alex Stolzoff · Published in SDXD · Feb 7, 2020 · 9 min read
Attendees watching the event at Blink UX.

In the ever-changing world of technology, artificial intelligence and data privacy are topics looming over everyone's head, right?! Or perhaps you're a software developer who works with AI and believes it is the future.

Regardless, San Diego Experience Design (SDXD) made sure its first event of 2020, Exploring Data Privacy and Artificial Intelligence in UX, addressed top-of-mind issues and provided insights on excellent UX practices to implement when dealing with them. The event, which was sponsored by ConveyUX, was held at Blink UX in Downtown San Diego. General Assembly provided delicious 30-inch pizzas, and Topo Chico served the finest sparkling water known to man.

The event consisted of talks from both Burton Rast, UX Lead of Google's Privacy Data Protection team, and Laura Coburn, Senior UX/UI Designer on ServiceNow's Vulnerability Management product. Rast provided an engaging 45-minute presentation on his unorthodox journey into UX and design, as well as some insight into the state of artificial intelligence and data privacy at Google.

Afterward, both Rast and Coburn shared their perspectives and answered attendees’ questions in a fireside chat facilitated by Brent Summers, Marketing Director at Blink UX.

Burton Rast’s Eye-Opening Perspective

When Burton Rast began his talk, I expected him to take a deep dive into artificial intelligence, data privacy, and where UX plays a role in all of it. Instead, I was pleasantly surprised. Rast spent the first half of his talk explaining his unorthodox path to becoming a design lead at Google.

While discussing his unconventional path, he mentioned a point in his life when he felt creatively drained. He felt as though everything in design was constantly evolving, but he didn't know what to do about it. He called this design FOMO (and even wrote a definition for it):

“Anxiety that an emerging technology may elsewhere be evolving the design industry, and you’re missing out on learning all the new skills”

This is a feeling that I believe most designers can relate to at one point or another in their career, and I think it’s why Rast felt it was so significant to discuss.

He said that when he hit this point, the only things that brought him back from feeling creatively drained were experimentation and boldness. He went on to talk about how he began experimenting with a different style of photography, which eventually got him published on Apple's (yes, Steve Jobs' Apple) Instagram account. This bold experimentation, and putting himself out there for months, is what led to what he called his creative rejuvenation.

Now, onto artificial intelligence and data privacy.

Is Artificial Intelligence Going to Become Our Skynet?

According to Burton Rast, it is quite the opposite. In fact, he is enthusiastic about the future of artificial intelligence and the role it will play!

When Rast started speaking about AI, he began with a brief overview of the current state of artificial intelligence and its common uses. He used automatic speech recognition as an example, since it is the most widely used form of AI at the moment. Automatic speech recognition, or talking to Siri, Alexa, or Google, can be seen as a new interface for technology. No need to deal with excessive tapping; simply ask your device to do exactly what you need it to do, and voilà.

Rast explained the basic process of automatic speech recognition:

  1. The algorithm takes in your verbal input.
  2. Once your speech is recorded in a .wav file, the algorithm chops it up into tiny pieces: 25 milliseconds each, to be exact. The pieces are this small so the algorithm can pick up the specific tonal qualities that differentiate one voice from another, such as between a man's and a woman's voice. This is how Siri is able to learn whether or not it is you speaking.
  3. Once it has these 25-millisecond sound bites, the algorithm runs them through a Fourier transform, which breaks a complex waveform down into the individual sine waves that make it up. This allows the algorithm to differentiate what is speech and what is background noise.
  4. It then runs the “cleaned up” version of what was recorded through a neural network, a machine learning model loosely inspired by the human brain (don't worry, we haven't achieved Skynet just yet!).
  5. Now, it can predict what you most likely just said.

This is essentially how speech-to-text works on your phone, according to Rast.
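To make the first few steps more concrete, here is a minimal Python sketch of the framing and Fourier-transform stages described above. The sample rate, frame handling, and function names are my own assumptions for illustration; this is not Rast's or Google's actual pipeline, and real speech recognizers add windowing, mel filterbanks, and the trained neural network from step 4 on top of this.

```python
import numpy as np

SAMPLE_RATE = 16_000                         # assumed 16 kHz mono recording
FRAME_MS = 25                                # the 25 ms slices Rast described
FRAME_LEN = SAMPLE_RATE * FRAME_MS // 1000   # 400 samples per frame

def frame_audio(samples: np.ndarray) -> np.ndarray:
    """Chop a 1-D signal into consecutive, non-overlapping 25 ms frames."""
    n_frames = len(samples) // FRAME_LEN
    return samples[: n_frames * FRAME_LEN].reshape(n_frames, FRAME_LEN)

def frame_spectra(frames: np.ndarray) -> np.ndarray:
    """Fourier-transform each frame to expose its component frequencies,
    which is what lets later stages separate speech from background noise."""
    return np.abs(np.fft.rfft(frames, axis=1))

if __name__ == "__main__":
    # Stand-in for a decoded .wav file: one second of random noise.
    audio = np.random.randn(SAMPLE_RATE).astype(np.float32)
    spectra = frame_spectra(frame_audio(audio))
    print(spectra.shape)   # (40, 201): 40 frames x 201 frequency bins
```

In a real system, these per-frame spectra (usually converted into mel-scale features) are what get fed into the neural network that predicts the words you spoke.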

UX is crucial to this technology as more AI programs emerge that serve as a new kind of interface. That includes the overall experience, from the UX writing of what the device says, to how it executes these actions, to how it mitigates errors.

The whole process seems very harmless and, according to Rast, it is. This type of interface allows users to accomplish tasks through a seamless experience.

While this was all very impressive, it still didn’t answer the main concern most users have: “How can I have more control over my data?”

In Order For AI To Exist, All Our Data Must Be Available

Well, not exactly. I mean, let's be honest: as much as we love the advantages of technology, the ability for a device or company to know us better than we know ourselves is a bit creepy. Companies, developers, and UX designers have taken note of this and created a few ethically viable solutions.

When most people think of artificial intelligence, their mental model is something reminiscent of automatic learning: automating machine learning to take in endless amounts of data in order to constantly improve itself. Because of this, fear of artificial intelligence has begun to spiral towards dystopian views of the future. An alternative to automatic learning is what is called augmented learning, Rast advised. At its core, augmented learning aims to put control over what is fed to the algorithm back into users' hands.

A recently open-sourced example of this is a learning model known as federated learning. Federated learning is a decentralized learning model where only the results of local training are uploaded to the cloud, as opposed to the more popular centralized model, where everyone's raw data is uploaded to the cloud. It can provide a more personalized experience across your operating system and applications while keeping your data from ever being uploaded. Rast explained how this is accomplished:

  1. A machine learning model is installed on your device (phone, tablet, laptop, computer, etc.).
  2. You set what data the algorithm has access to.
  3. The learning model trains on the data you've granted it permission to use.
  4. The learning model returns your results as statistics, as opposed to personal information.
  5. When the learning model is done, it sends those results to the cloud (for storage efficiency).

A diagram of federated learning

So, instead of having Facebook know every website you’ve visited, it will just know how frequently you visit sites of a certain category. You keep your data private while still getting the enhanced experience of artificial intelligence. The best part is you can turn it off whenever you see fit! If you would like to learn more, check out this article.
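To show what "only statistics leave the device" can look like, here is a toy Python sketch of a federated training round. The linear model, the update-averaging scheme, and all of the names are assumptions made purely for illustration; this is not Google's implementation or any real federated learning library.

```python
import numpy as np

def local_update(weights, features, labels, lr=0.1):
    """Train on data that never leaves the device and return only the
    resulting weight change (a statistic), not the raw examples."""
    preds = features @ weights
    grad = features.T @ (preds - labels) / len(labels)
    return -lr * grad

def server_round(global_weights, device_datasets):
    """Each device computes its update locally; the server only ever
    sees and averages those updates."""
    deltas = [local_update(global_weights, X, y) for X, y in device_datasets]
    return global_weights + np.mean(deltas, axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_w = np.array([2.0, -1.0])

    # Three "devices", each holding private local data the server never sees.
    devices = []
    for _ in range(3):
        X = rng.normal(size=(50, 2))
        y = X @ true_w + rng.normal(scale=0.1, size=50)
        devices.append((X, y))

    w = np.zeros(2)
    for _ in range(100):
        w = server_round(w, devices)
    print(w)   # approaches [2, -1] without raw data ever leaving a device
```

In production systems such as Google's Gboard keyboard, the updates themselves are further protected (for example with secure aggregation), but the basic shape is the same: training happens on the device, and only aggregated statistics are sent to the cloud.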

While AI is a significant technological breakthrough, we need to always keep the user in mind when using it. What is the point of creating a technological innovation if so many users feel uncomfortable using it? We must always put our users first.

Data Privacy and the Importance of UX

In the information age, data privacy seems to be a never-ending concern. Users need to believe their personal data is kept private in order to feel comfortable engaging with any application. Because of this, data protection policies, and how we intend to collect, store, share, and delete a user's data, must be a main concern of all UX designers.

“Why are we still asking users for their email?” — Laura Coburn

While data protection is important, making your users fully aware of what is being done with their data is just as important. Even more, we need to question what data we actually need and how long we need to hold onto it.

“Addressing data privacy is basically working against the entire system of what the internet is.” — Burton Rast

The current standard for forming trust is, by design, very subtle. Everyone has agreed at one point or another to a terms of service (TOS) contract they didn't read, and why is this? It's because TOS are written in a way that makes them difficult for the average person to understand. Additionally, they are extremely long. Therefore, the consensual agreement to hand over your data is not designed with the user in mind, which is a real issue. It is the job of a designer to advocate for the user, even if that means going against the status quo and fixing a broken system.

So, how can we fix this?

One possible solution is to walk users through how their data is being used as they're using the application. It's such a simple idea, yet it's something rarely seen. Instead, it is common for an application to walk users through its features and nothing more.

Another solution is to apply UX writing to the TOS. While this is an ideal solution, it would require someone who can interpret the whole terms of service to work alongside someone well versed in UX writing who rewrites it in plain language.

Rast provided an excellent example. He asked everyone to raise their hand if they had ever used an incognito window. Not surprisingly, every single person in the room raised their hand. He then followed up by asking how many people had read the six bullet points of text that appear after opening an incognito window; only four people raised their hands. This is the crux of the issue: while we are technically informing users of what an application is doing, we are not doing it effectively.

“Our goal in design is to build trust” — Laura Coburn

As designers, we are often so pressured into finding the fastest way to finish a design sprint that we tend to overlook these hard-hitting questions, the ones that have the greatest impact on our users. Questioning why we need a specific piece of information, and whether we really need to hold onto it, is a good start. It may seem harmless to keep a user's data in what you believe to be a secure system, but the reality is that we must take precautions to protect our users' data.

Takeaways

My main takeaways from this event are:

  1. When it comes to UX, we need to stop thinking only of the surface-level experience and advocate for the user wholeheartedly. We can't simply fight for the proper designs to be implemented based on user research; we need to fight for what is ethical for the user.
  2. There are many fields to which you can apply UX, such as artificial intelligence and data privacy. For example, you can apply UX writing to make terms of service more comprehensible.
  3. Consider what will give the user the most pleasant, carefree experience when using your application. In the end, it is our job as designers to make sure our users have an enjoyable experience. This includes mitigating any privacy concerns the user may have during the experience.

Most of all, my favorite takeaway was Rast’s insight on how to get out of that creative rut he labeled design FOMO. We need to always experiment and be bold in our designs, in the policies we advocate for, and in how we determine the data that needs to be collected when making an application. Just because there is a current standard doesn’t mean it’s ethical. And, just because everyone is “agreeing” to a terms of service contract doesn’t mean that everything within that contract is best practice.

I got involved with SDXD because I believe it is one of the best design organizations in San Diego. SDXD holds some of the most informative events and always attracts an excellent crowd to network with. This combination results in events that truly resonate with designers from all corners of UX: research, design, development, and content creation. This is why I decided to volunteer, and I would encourage anyone who is interested in UX to do the same!

If you enjoyed this article and would like to get in touch you can reach me on Twitter, LinkedIn, or my website.

About SDXD: San Diego Experience Design is a catalyst for a vibrant San Diego experience design community. A professional networking and education organization, they serve primarily UX research and design practitioners but welcome anyone who works in, or is simply interested in, the various experience design disciplines and techniques (UX, IxD, usability, prototyping, HCI, service design, industrial design, etc.).

Find us at: http://www.sdxd.org/ and http://www.meetup.com/s-d-x-d/

Volunteer, sponsor, or just plain get involved in this community. Find out how by visiting us at: http://www.sdxd.org/ These events are made possible by great people and by the companies that put us to work. If you or your company would like to sponsor us, we'd love to talk. Download our one-pager about SDXD and the types of events we host: http://www.sdxd.org/sdxd-community/
