What is Machine Learning + UX?

Michelle Carney
Machine Learning and UX
Nov 6, 2020 · 11 min read


“What is machine learning + ux?” — from Salesforce UX’s Data Driven Personas talk at MLUX in 2019

Since founding the Machine Learning and User Experience Meetup three years ago, I still get asked one question time and time again: “What is Machine Learning + UX?”

I love this question because it seems to change as Machine Learning (“ML”), Artificial Intelligence (“AI”), and Data Science (“DS”) start touching every aspect of our lives. Just in the last few years, we’re increasingly seeing a shift from experiences ‘powered by AI’ to AI as the experience itself. The MLUX meetup continues to be a community to share these case studies and best practices as we figure out how to make machine learning/AI/data science approachable and inclusive for everyone.

How did MLUX get started?

When I started the Machine Learning and UX Meetup in 2017, I thought: “wouldn’t it be nice to have a quarterly pizza party with others interested in this field?” Evidently, a lot of folks were also interested in this topic and eager to learn. By our second event, we had over 200 attendees at the Autodesk Gallery, and we continued to grow over the next three years with fantastic events with Salesforce, Spotify, IDEO, PARC, and more!

From a 60 person pizza party at Quid to a 200 person lecture at the Autodesk Gallery — we have seen it all!

Personally, I have a passion for helping others understand and build better AI for everyone. While studying at the UC Berkeley School of Information**, I focused on using data science techniques on UX problems and building better experiences around data-intensive AI. However, I would routinely get comments from colleagues and in job interviews like: “If you’re interested in both ML and UX, you must not be good at either.” I knew my specialization in ML+UX was an asset, which is why I formed the MLUX meetup in hopes of meeting others interested in this topic. Needless to say, I am so grateful and humbled to have met so many talented and kind folks through the MLUX community who were also passionate about this area, and who have helped us grow to what we are today, all while remaining free to everyone!

Now MLUX has more than 2,400 members on Meetup, and even more on our LinkedIn, Twitter, YouTube, and mailing list. We’ve had events in the SF Bay Area, NYC, and Seattle, with more to come 🤖✨

Big thank you to all of our previous speakers and sponsors for helping make MLUX such an amazing community, and supporting us to keep it free for all!
The loop between data*** (i.e., ML, AI, stats, etc.) and UX (i.e., more than just ‘making things look pretty,’ but the design of the overall experience) is incredibly important as we move towards a world where AI is the experience.

So, what is ML+UX? 🤖🌟

I tend to think of ML+UX in two parts:

  1. How might we use data science techniques to inform and drive UX design decisions?
  2. How might we use UX and design to make ML models more transparent and approachable?

In this article, I will expand on some of my favorite examples of ML+UX from these two perspectives, and there are many more out there (check out our favorites)!

“How do we use data science techniques to inform and drive design decisions, while also creating an experience that makes it transparent to the end user what the model is (or isn’t) doing and how they can give it feedback?”

- Michelle Carney, MLUX founder

How might we use data science techniques to inform and drive UX design decisions? 🤖➡️🎨

There are many parts of data science that can be used to inform and drive design decisions, from quantitative research and forecasting to unsupervised learning (a type of ML) that finds patterns at scale. Three of my favorite examples are: Data Driven Personas, data visualizations of ML, and exploring uniquely AI capabilities.

Data Driven Personas 👤

Personas are often developed qualitatively by User Experience Researchers (“UXRs”). Data driven personas, i.e. personas created via unsupervised ML techniques, can be more robust and reveal underlying trends and patterns within user groups.

One of my favorite examples of data informing UX is the Salesforce Data Driven Personas talk from MLUX SF’s April 2019 event.

The Salesforce team describes how they went from using K-Nearest Neighbors (“k-NN”) to Principal Components Analysis (“PCA”) on self-reported survey data from their users at scale to find underlying patterns and trends about different user types, and how that had a tangible impact on their product. Unsupervised learning techniques like this are not always viewed as a part of the UXR toolkit, but Salesforce’s team of UXRs did a fantastic job of being both data scientists and qualitative researchers — following up the data-driven insights with qualitative validation work.

MLUX x Salesforce April 2019 talk on Data Driven Personas

Data Driven Personas have also been created with unsupervised learning on UI click metric data, and I am excited to see this become the norm in the field, supporting previously “fuzzy” personas with quantitative insights (like how many users match Persona A, or how one user might be 60% Persona A and 40% Persona B in certain contexts).
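The exact Salesforce pipeline isn’t public, but the general pattern (dimensionality reduction over survey answers, then clustering to form personas and counting how many users fall into each) can be sketched with synthetic stand-in data. Everything below — the survey columns, group sizes, and two-persona split — is an assumption for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical self-reported survey data: rows = users, columns = Likert-scale
# answers (1-5). Two synthetic user groups stand in for real survey responses.
group_a = rng.normal([4, 4, 1, 1], 0.4, size=(60, 4))
group_b = rng.normal([1, 1, 4, 4], 0.4, size=(40, 4))
X = np.vstack([group_a, group_b])

# Step 1: PCA via SVD to surface the main axes of variation in the answers.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T          # project users onto the top two components

# Step 2: a tiny k-means in the reduced space to group users into personas.
centers = scores[[0, -1]]        # seed with one user from each extreme
for _ in range(10):
    labels = np.argmin(((scores[:, None] - centers) ** 2).sum(-1), axis=1)
    centers = np.array([scores[labels == k].mean(axis=0) for k in range(2)])

# The quantitative insight the article mentions: how many users fall in each persona.
share = np.bincount(labels, minlength=2) / len(labels)
print({f"Persona {'AB'[k]}": round(float(share[k]), 2) for k in range(2)})
```

The persona shares printed at the end are exactly the kind of quantitative backing (“how many users are Persona A”) that can support a qualitative persona.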

Data Visualizations 📊📈

Similarly, I am inspired by the data visualization and understanding work being done to help ML be more approachable and less of a ‘black box.’ Once we can understand how and what the ML models are doing, we can start approaching design thinking for machine learning in new and interesting ways.

Some of my favorite examples include the Sunburst Sequences visualizations, R2D3: A Visual Introduction to Machine Learning, Stitchfix’s Algorithms Tour, and Machine Learning for Visualization (see more below). This type of work allows anyone, whether a data scientist or a designer, to better understand trends in the data and learn a little more about what exactly machines are learning.

Exploring uniquely AI capabilities 🤖

Getty Images’ Andrea Gagliano explains AI’s metric of “distance” between two images. The left side reflects the current representation of women among nurses (88% reported in the US census); the right side reflects a more balanced future we could strive towards.

Finally, we need data science to help highlight uniquely AI capabilities and show the pros and cons of different design approaches. We need to approach AI as an artifact of our culture and background, including the labelled data that goes into the model and the predictions that come out. How can we use uniquely AI capabilities while also allowing for human sentiment to change over time?

Andrea Gagliano, Head of Data Science at Getty Images, discussed diversity in image data sets as a mathematical calculation of distance at our MLUX Seattle February 2020 event. In her talk, Andrea explains how images labelled by people might not be enough to capture the nuance of lived experiences, and AI trained on these labels might exacerbate already existing biases. She also raises the question: do we show images that are a mirror of our current lived reality (for example, if 88% of the population of nurses are female-identifying, should our images of nurses also be 88% female?), or should we strive for a more representative future in the images Getty surfaces? Should we let AI find trends and patterns between faces and predict the distances? What are the potential harms that might arise? Consider these questions when you choose how to build and display the underlying AI. You can watch Andrea’s talk on our YouTube channel.
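One way to make “diversity as a calculation of distance” concrete: if each image is represented by an embedding vector from a vision model, the average pairwise distance within a result set is one rough proxy for its visual diversity. The embeddings below are random stand-ins, not real Getty data or Getty’s actual method:

```python
import numpy as np

rng = np.random.default_rng(1)

def mean_pairwise_distance(embeddings: np.ndarray) -> float:
    """Average Euclidean distance between all pairs of image embeddings.
    A larger value suggests a more visually diverse set of images."""
    diffs = embeddings[:, None, :] - embeddings[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(-1))
    n = len(embeddings)
    return dists.sum() / (n * (n - 1))   # exclude self-distances (zeros)

# Hypothetical embeddings standing in for image-model features:
# a tight cluster (homogeneous results) vs. a spread-out set (diverse results).
homogeneous = rng.normal(0.0, 0.1, size=(50, 8))
diverse = rng.normal(0.0, 1.0, size=(50, 8))

print(mean_pairwise_distance(homogeneous) < mean_pairwise_distance(diverse))  # True
```

A metric like this can quantify the gap between “mirror of today” and “more representative future” result sets, though which target to optimize for remains the human design decision Andrea raises.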

How might we use UX to make ML models more transparent and approachable? 🎨➡️🤖

Just as we can use data to inform and drive UX design decisions, we can use design to inform the user as to what is going on in the data and make a transparent experience for them. By having AI and ML experiences be transparent and approachable, we avoid the user mental model of ‘AI is something that happens to me’ and get closer to the notion that ‘AI is a tool that I can use.’ It is also important to mention that UX is much more than just visual design or re-skinning an AI experience to look ‘pretty’ — it is designing the entire experience as a whole, from the data used to the words displayed, to how the user gives the model feedback.

Some of my favorite examples of this are: allowing the AI to help surface things of interest or complete a task, and creating AI experiences that are fun and approachable to everyone. (There are so many others, so check out more on our MLUX Resource list!)

Allowing the AI to help surface things of interest or complete a task 📝

AI can be a powerful tool for users to connect the dots, find patterns, or complete tasks faster. This may seem like a no-brainer, but this is the majority of the ways that the average user experiences AI, so it is important to build these experiences the best we can!

Pinterest’s visual discovery that allows you to re-query the search based on a single image, right within your search!

Two of the major domains of data science are Computer Vision (“CV”) and Natural Language Processing (“NLP”). One of my favorite CV examples of this is Pinterest’s Visual Discovery tool. In the gif you can see a sample query (who knows what they were searching for — I still don’t know!), but something sparked their interest — a pizza 🍕! Right within the same experience, the user can tap the lower right corner of the image to re-query the search and populate the same screen with new photos of pizzas — no need to scroll back up to the search bar! I think this is a wonderful example because, while we don’t know exactly what is happening on the back end, we do know that AI is great at finding similar images, and that tends to be a use case for folks scrolling on Pinterest, so this AI-powered, easy-to-use search-right-in-your-search lets anyone do the task they planned. (Maybe it was “Find inspiring dishes” or “Recipes for easy weeknight food”?)
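Pinterest’s back end isn’t public, but image-to-image re-query is commonly built as nearest-neighbor search over embedding vectors. A minimal sketch of that pattern, using made-up catalog embeddings in place of real image features:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical catalog of image embeddings (e.g., from a vision model),
# L2-normalized so a dot product equals cosine similarity.
catalog = rng.normal(size=(1000, 16))
catalog /= np.linalg.norm(catalog, axis=1, keepdims=True)

def requery(tapped_embedding: np.ndarray, k: int = 5) -> np.ndarray:
    """Return indices of the k catalog images most similar (by cosine)
    to the image the user tapped: the 'search within your search' step."""
    q = tapped_embedding / np.linalg.norm(tapped_embedding)
    sims = catalog @ q
    return np.argsort(-sims)[:k]

tapped = catalog[42]          # the user taps an image (say, the pizza)
results = requery(tapped, k=5)
print(results[0])             # the tapped image itself ranks first
```

The UX insight sits on top of this retrieval step: surfacing the results in place, without a trip back to the search bar, is what turns a similarity lookup into a discovery experience.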

Google’s Gboard uses NLP to parse the text and suggest emojis or gifs, and the suggested words (similar to autocomplete) to help users send messages.

Similarly, NLP techniques can help users type faster (or at least spell better!), but they can also be used to bring joy. One example is Gboard’s feature to suggest emojis, stickers, and gifs based on the text written. In this example, someone replies to a text with ‘Good Morning’ and adds a sticker with breakfast foods. It might seem a little silly, but examples like this help set users’ expectations about what the AI can do and about other features offered by the product.
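Production keyboards like Gboard use trained NLP models, but a toy keyword-to-emoji lookup shows the shape of the interaction. The hint table below is invented purely for illustration:

```python
# Toy sketch of text-to-emoji suggestion; a real keyboard would use a
# trained model rather than this hand-written lookup table.
EMOJI_HINTS = {
    "morning": ["☀️", "🍳", "☕"],
    "pizza": ["🍕"],
    "party": ["🎉"],
}

def suggest_emojis(text: str) -> list:
    """Return emoji suggestions for any hint words found in the message."""
    suggestions = []
    for word in text.lower().split():
        suggestions.extend(EMOJI_HINTS.get(word.strip(".,!?"), []))
    return suggestions

print(suggest_emojis("Good morning!"))  # ['☀️', '🍳', '☕']
```

Even this trivial version illustrates the design question: the suggestions must feel like an offer the user can ignore, not a correction forced on them.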

Creating AI experiences that are fun and approachable to everyone 🎉

Last but not least, we need to create AI experiences that are fun and approachable so that everyone can understand what is going on with the AI (at least at a high level)!

While these examples might seem trivial, they’re approachable and easy to understand. They can help people from any background understand what AI and ML are doing and create frameworks for them to think about AI they encounter in the future.

Google’s Quick, Draw! is a game that allows the user to try to draw a computer-labelled image in a matter of seconds, and collects the vector patterns used to draw it.

Time and time again I find myself going back to the Google Quick, Draw! and Autodraw examples for fun and approachable AI. Quick, Draw! is a game where you are assigned to draw something in under 20 seconds, and an AI Guesser tries to guess what you are drawing (see the gif, or try it for yourself!). This fun and engaging game allows people to see what an AI “sees” — a circle might be a nose or a moon or a donut — and which features matter most for the AI to classify it correctly. One doodle of a donut on its own is cool; hundreds of thousands of doodles at scale make for an amazing dataset of labelled doodles!

Google’s Autodraw builds off of the Quick, Draw! Dataset to “auto complete” drawings and suggest clipart for those of us who are less artistically inclined. One can imagine making the coolest “YARD SALE” signs, or invites to a Pizza Party 🍕

The many brilliant minds behind this then took it a step further — what can we do with a huge dataset of labelled doodles? Maybe we can make it easier for people to find the right image right off the bat! If the AI is able to guess that you are drawing a Pizza, maybe it can suggest artistically drawn different pizzas for your Pizza Party 🍕 Invitations! This is a great way of creating AI that is fun and approachable, and anyone can try it to see the limitations or understand where the edge cases are.

(If this topic interests you, I highly recommend reading Machine Learning for Visualization as well 🤖)

Fun and approachable AI would not be complete without the amazing example of Google’s Teachable Machine.

Teachable Machine is a great way for anyone to see how to train a model — no coding required! (see their CHI 2020 paper) It all happens in your browser, with your webcam or microphone, and you can play around to see what the machine is learning, adjust parameters, retrain it, and more. You train a simple classifier (of audio, images, or poses), and you can even export it to integrate with other tools. Training a machine learning model in minutes and trying it out for yourself allows anyone to have confidence in approaching and understanding AI and its creative potential.
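A Teachable-Machine-style classifier can be as simple as nearest-neighbor matching over features from a pretrained model. In this hedged sketch, random vectors stand in for the extracted features of webcam frames, and the two classes are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

# Stand-ins for features a pretrained model would extract from webcam frames:
# 20 training examples each for "Class A" and "Class B".
class_a = rng.normal(1.0, 0.3, size=(20, 8))
class_b = rng.normal(-1.0, 0.3, size=(20, 8))
train_X = np.vstack([class_a, class_b])
train_y = np.array([0] * 20 + [1] * 20)

def predict(features: np.ndarray, k: int = 3) -> int:
    """Classify a new frame by majority vote of its k nearest training examples."""
    dists = np.linalg.norm(train_X - features, axis=1)
    nearest = train_y[np.argsort(dists)[:k]]
    return int(np.bincount(nearest).argmax())

print(predict(rng.normal(1.0, 0.3, size=8)))
```

Because nothing is “trained” beyond storing examples, adding or removing a class is instant — which is part of what makes the in-browser retrain-and-try loop feel so approachable.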

What’s next for ML+UX?🤖💖✨

So to recap: ML+UX is understanding how humans and machines come together to create better AI experiences for everyone, and the MLUX meetup is a community for like-minded practitioners to share these best practices!

There are so many other examples out there and tons of folks doing really interesting things. It’s important to remember that one company isn’t going to “solve” MLUX — there are going to be many different design patterns for AI and they’ll change with time — and that’s ok! I’m excited to see how this field has grown and where it will go in the future.

In the meantime, check out some of my favorite resources here, including Google’s People + AI Research (“PAIR”) Guidebook and PAIR Explorables!

** The UC Berkeley School of Information is also home to awesome organizations like the Algorithmic Fairness and Opacity Working Group (AFOG) and the Center for Technology, Society and Policy (CTSP), which first funded the MLUX Meetup, allowing us to make it free for everyone. Thank you for the support!

*** In this article, I use AI, Machine Learning (“ML”), and Data or Data Science interchangeably. A lot of these experiences blur the lines of Data becoming ML becoming AI, and I want the language I use to reflect that.

Thank you to everyone who helped me prepare the original outline and edit this article!


Founder, Machine Learning and UX @mluxmeetup. Member @feministai. UXR @GoogleAI. Lecturer @Stanforddschool. Former @CTSPBerkeley @AFOGBerkeley @BerkeleyISchool