Machine Learning and UX

It Doesn’t Exist Yet

Byron Houwens
Designer Hangout
7 min read · Jan 29, 2017


If you’re in the tech industry at this point in human history, you probably hear something about machine learning and artificial intelligence at least once a day. Many companies that have been around the technological block for a while, as well as a garden of new ones popping up, are using machine learning as a hammer for all kinds of nails. Oftentimes the hammer works really, really well.

Naturally, in our endeavours to apply this newfound magic to all the things, we come around to asking how we can use it in user experience design, something I’ve received a number of messages about recently. These generally come from designers who are looking to use machine learning to improve their designs or workflow, and sometimes they’re hoping to get a peek at what UX design might look like in the years to come. They generally take the form of a few related questions:

  • How can I learn more about machine learning in UX?
  • What tutorials or articles are good resources?
  • How will machine learning change UX in the future (or will we still have jobs)?

The problem is that, aside from a few articles like here and here, there isn’t a whole lot of meshing going on between machine learning and UX, at least not in a way that’s accessible to a lot of designers. I’m approaching machine learning from the perspective of a programmer, so my path is not exactly an accessible one for the average designer either.

What I can do, though, is point out spots where machine learning might help the UX we can provide to our users now and in the near future, as well as how we could start using it in our workflows as designers.

Opportunities for the User

The recent HBO series Westworld had a theme park full of highly advanced robots that catered to every whim of the generally depraved guests who entered it. It’s a great show, but what’s interesting from a psychological perspective is how the encounters the guests have are often manufactured to their specific, individual desires.

No two people ever really have the same experience.

Anthony Hopkins standing menacingly in front of a wall of disembodied and sleepy heads in Westworld

Some version of “if you told me that would be possible 10 years ago I would’ve said you were crazy”, accompanied by a disbelieving shake of the head, is normally what people say in response to a technological development like that.

Disney, of course, is already tailoring the theme-park experience to the individual via its very expensive wristbands. The wristbands do this by collecting mounds and mounds of data (the raw resource of machine learning) on park visitors and relaying relevant data points to staff: things like your name or table number are known without any physical contact or speech, making the park more and more anticipatory by design.

Theme park designer Dave Cobb, in a Westworld retrospective, labelled this only a first step though. The eventuality here would be a system that has gathered so much data about a user over such a long period of time that the anticipatory design creates a completely different experience for each person, despite the fact that all the users are in the same physical location.

This is the kind of power that machine learning provides for users. The light that dims to that concentrated glow you love to read by. The music that plays to lift your spirits when you walk through the door after a difficult day at work. The car that informs you of a back road to take due to unexpected traffic before you’ve gotten to your garage. The fridge that tells you to include the avocado in that salad because it has a given probability of reducing the risk of the diabetes your genetics might steer you towards.

If the data exists — if it’s specific to a person’s geographic, social, historical or even genetic context — then the machine can create an experience wholly unique and individual.

This idea of the individual, as opposed to the target group, is what defines user experience and its inevitable marriage to machine learning.

It’s this way of thinking, this focus on the singular, that I think designers need to start looking into and mulling over.

But there are problems with that.

Working Together to Save the User

Right now, UX designers have a hard enough job creating a compelling experience for groups of people, never mind individuals. How, then, do we handle the experience for each person?

Sometimes there’s just too much going on

Again, this is where machine learning comes to our rescue. Designers tend to erroneously believe that they can’t grasp data and logic-driven concepts because “their brains aren’t wired that way” or “they’re the creative ones”, but data scientists themselves often have plenty of trouble making sense of complex and enormous datasets.

They get around this problem in a number of ways. Sometimes they reduce the number of variables by looking for relationships between them and remodelling the data (dimensionality reduction), and sometimes they look for clusters of data points that belong together (clustering), which they can then reason about.

Clustering, an unsupervised machine learning process for grouping data
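To make that concrete, here is a minimal sketch of the clustering idea, assuming scikit-learn and a toy dataset of per-user behaviour metrics; the feature names, numbers and cluster count are all illustrative, not a real pipeline.

```python
# A minimal clustering sketch using scikit-learn (assumed available).
# Rows are users, columns are hypothetical behaviour metrics:
# [sessions per week, average session minutes, support tickets filed]
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

users = np.array([
    [12, 35, 0],
    [2,  5,  3],
    [11, 40, 1],
    [1,  4,  4],
    [13, 30, 0],
    [3,  6,  5],
])

# Put the features on a comparable scale before clustering.
scaled = StandardScaler().fit_transform(users)

# Group the users into two clusters; k=2 is just a guess for this toy data.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scaled)
print(labels)  # e.g. [0 1 0 1 0 1]: heavy, happy users vs. light, struggling ones
```

The point for designers isn’t to write this ourselves; it’s that the output is a handful of groups rather than thousands of rows, which is something we can actually reason about and design for.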

In the future we might see insight-generation tools built specifically for designers, but for now a good workaround is to engage with the data scientists on our teams and in our companies, and to find ways of incorporating their abilities and findings into our designs.

Designers and data scientists live in an unexpected Venn diagram of intent.

I think the cohesion of these two groups will produce interesting and pleasantly surprising experiences for users, certainly in the near future (Invisible Design, a concept put forward by Amber Cartwright and linked further up the page, argues along similar lines).

Opportunities for the Designer

What about us designers though? How can we take advantage of machine learning in our own processes? This is where I’ll get selfish and focus on where I think we should go with it.

Powering up with machines

It’s clear, I think, that machine learning will only increase the workload placed upon designers in the future, but we can use that same machine learning to help us make sense of it all.

Systems that extract and suggest insights from user data will become normal and necessary, particularly as datasets get larger and more granular. The sensible interface is one that is natural for us as humans: natural-language queries over datasets are something I’d like to see, possibly even in the form of bots that recognise speech (like Jarvis).

I’d like to see these bots designed specifically to extract and reason about the kinds of things designers care about, such as the sentiment of user feedback or the particular pain points that surface in usability tests (a small sketch of what that might look like under the hood follows below). SQL queries and Python scripts can only take us so far.
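As a taste of what such a bot might do behind the scenes, here is a small sketch of scoring the sentiment of usability-test comments, assuming NLTK’s VADER analyser is installed; the comments and the pain-point threshold are made up for illustration.

```python
# A small sentiment-scoring sketch using NLTK's VADER analyser (assumed installed).
# The lexicon download is a one-time setup step.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)
analyzer = SentimentIntensityAnalyzer()

# Hypothetical comments pulled from a usability session.
feedback = [
    "I love how fast the new checkout feels.",
    "I couldn't find the settings page at all, really frustrating.",
    "The onboarding was fine, nothing special.",
]

for comment in feedback:
    # The compound score runs from -1 (very negative) to +1 (very positive).
    score = analyzer.polarity_scores(comment)["compound"]
    flag = "possible pain point" if score < -0.3 else "ok"
    print(f"{score:+.2f}  {flag:20}  {comment}")
```

A designer-facing bot could wrap something like this behind a plain-language request (“show me the comments where people sounded frustrated”) rather than a script.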

We need systems that interact like humans but think like machines

This will go a long way toward removing the psychological barrier that many designers currently experience when confronted with walls of code, while still giving us the power of that code, geared specifically to our needs.

One Big Thing

If you’re a designer looking for resources on how machine learning is changing design, stop. There just aren’t enough around.

What this means for us, though, is that we have the opportunity to define how this relationship will play out, at a time when most people jump straight onto Google expecting it to have all the answers.

It’s an opportunity for us to collaborate with data scientists (and others) to test and experiment and, hopefully, create unexpected, exciting and even beloved experiences that befit the kind of future most of us would love to be a part of.

It’s an unexplored frontier waiting, not to be discovered, but created by you and me.
