Measuring Emotions to Improve UX

Deborah Ko
PALOIT

--

I’m a big advocate of understanding users’ emotions to make products and services that users love. When I present this in talks or to clients, the biggest question that always comes up is, “I’m totally on board — but how do you measure people’s emotions?” And the answer that I give is not always what people want, but it’s this: “in as many ways as possible.”

Is there a scientific way to precisely measure emotion?

No… not yet.

One of the most promising and intriguing approaches[1] is affective computing, which is based on decoding the minute facial muscle movements that twitch unconsciously in response to an emotional reaction. There is great hope (and possibly creepiness) in a computer being able to decipher how you feel about something you’ve just seen or experienced.

But there is a lot that affective computing can’t answer yet.

  • How good is it at capturing complex emotions? Perhaps some basic emotions are more hardwired than others, but many of the important emotions we might want to look out for are more nuanced [2]. You might want to know if someone is bored, curious, frustrated, confused, or relieved. Computers (and sometimes people) are not always good at detecting these more complex emotions.
  • What is the source of the emotional response? Oftentimes when we do user testing, people may do something and produce a confused face, but we don’t necessarily know what confused them. Perhaps they expected something else to happen — but what? Perhaps they didn’t understand the label of a button they just saw.
  • How generalizable are these emotional measurements? Most of the emotion literature, especially on facial expressions, comes from Western countries. Studies show that people from Asian cultures may express some emotions differently, and other studies show that context can play a strong role in how people perceive emotions.

Measure emotion with as many vantage points as possible

So there are no quick and easy ways to measure emotion (yet?). Like most things in psychology, if you want to be more sure of something that is probably a bit fuzzy, you measure it in many different ways and see if they all point to the same thing.

Here are some techniques that we employ in combination to get a better picture of people’s emotions. Note that none of them on its own can definitively identify an emotion, but their results as a whole provide a clearer understanding:

Self-report

  • Interview — used in user testing; we can ask people, or invite them to volunteer, what they liked most and least and which parts were frustrating or confusing. We can ask what their expectations were when they make errors. This feedback is prey to subjectivity (we often don’t know what causes our own enjoyment of or frustration with a product), to users lying (knowingly or unknowingly), and to users’ limited awareness of their own reactions.
  • Customer support/call center data — we can get a marker of where the biggest problem spots in a journey are based on how many people report bugs or complain. This doesn’t get at finer problems of design, but it can surface the big ones. It is subject to user personalities — some users are more prone to make the effort to complain.
  • Ratings/Reviews online — an easy and public way of getting an idea of how people think of a service/product overall, and sometimes which specific features are tied to that rating. This is subject to user personalities, as with the customer support data, and it may be subjective for reasons similar to those in the interview process.

Behavioral markers

  • Rage clicking — when a user clicks multiple times on the same thing, possibly to make it go faster or to make it work. Usually indicates frustration.
  • Abandonment/Drop-off/Bounce rate — when people leave a location they were looking at. Can mean many things, but it can indicate either goal completion or frustration.
  • Dwell time — when a user stays on a page for a long time. Can indicate high engagement or confusion (or sometimes abandonment).
  • Scrolling behavior — very quick scrolling up and down a few times often means the user is looking for something but cannot find it (and the information they are seeing is irrelevant or unengaging).
  • Mouse movement — oftentimes, individuals will move their mouse where their eyes are moving. Rapid jumps back and forth to the same things may indicate confusion or information processing. This, coupled with inaction, often means individuals are confused about what they should do or where they should go. Rapid mouse movement while waiting for something to load can often signal impatience.
  • Using search — deciding to use search after several navigation attempts often means individuals are frustrated and are using search as a last-ditch effort to get what they want.
  • Backtracking — an individual goes back and forth between different pages/links in rapid succession, usually meaning they looked for something, didn’t find it, and went back to “home base” to start again. Doing this repeatedly often leads to frustration, though some people adopt this strategy to quickly learn a system, so it can also signal curiosity.
  • Body language — some of the usual signs of frustration: sighing, tongue clicking. Some usual signs of disengagement: leaning back when the user was sitting forward before, or clicking without thoroughly reading when they were more diligent about reading before (although this can also show mastery).
  • Facial markers — most of these are part of the Facial Action Coding System (FACS), which affective computing uses. Duchenne smiles (authentic smiles), eye rolling, pursed lips, furrowed brows, brows pointed upwards, etc. can be indicators of surprise, joy, fear, anger, or disgust.
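Some of these behavioral markers can be approximated from ordinary interaction logs. As an illustrative sketch — the event format, threshold values, and function name here are my own assumptions, not any analytics tool’s standard — rage clicks might be flagged as bursts of clicks on the same element within a short time window:

```python
from collections import namedtuple

# A minimal click event: timestamp in seconds, and the element clicked.
Click = namedtuple("Click", ["t", "element"])

def find_rage_clicks(clicks, min_clicks=4, window=2.0):
    """Flag runs of >= min_clicks on the same element within `window` seconds.

    The thresholds are illustrative; real tools tune them empirically.
    """
    bursts = []
    i = 0
    while i < len(clicks):
        j = i
        # Extend the run while successive clicks hit the same element
        # and stay inside the time window of the run's first click.
        while (j + 1 < len(clicks)
               and clicks[j + 1].element == clicks[i].element
               and clicks[j + 1].t - clicks[i].t <= window):
            j += 1
        if j - i + 1 >= min_clicks:
            bursts.append((clicks[i].element, j - i + 1))
        i = j + 1
    return bursts

events = [Click(0.0, "submit"), Click(0.3, "submit"), Click(0.6, "submit"),
          Click(0.8, "submit"), Click(5.0, "help")]
print(find_rage_clicks(events))  # → [('submit', 4)]
```

The same windowing idea extends to backtracking (rapid repeated page-pair transitions) or erratic scrolling; the point is that each heuristic only hints at an emotion, which is why they are read together rather than alone.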

Visualizing the emotional journey isn’t a perfect science

It’s not a “perfect science”, but it is the scientific method at its core. As I was saying, the idea is to take as many of these measures as you can to get an overall estimate of the emotional journey. Some instances will stick out more strongly than others in terms of severity, so the emotional journey that I map out is not a hard-core quantitative exercise where the numbers are absolute. Instead, the points are created relatively — ranked against each other in terms of really good and really bad experiences. It’s not perfect, but no measure of emotion is perfect, not even in the ivory towers of academia.
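To make the relative-ranking idea concrete, here is a minimal sketch; the journey steps, marker names, and equal weighting are invented for illustration. Each step is scored by summing min-max-normalized markers, so the output is an ordering of rough versus smooth spots, not an absolute emotion measurement:

```python
# Hypothetical per-step markers collected during testing: counts of
# negative signals (rage clicks, complaints) and a positive one (praise).
steps = {
    "landing":  {"rage_clicks": 0, "complaints": 1, "praise": 5},
    "signup":   {"rage_clicks": 7, "complaints": 4, "praise": 0},
    "checkout": {"rage_clicks": 2, "complaints": 2, "praise": 1},
}

def relative_scores(steps):
    """Rank steps against each other: higher score = rougher emotional spot.

    Each marker is min-max normalized across steps so no single scale
    dominates; the result is ordinal, not an absolute measure of emotion.
    """
    signs = {"rage_clicks": +1, "complaints": +1, "praise": -1}
    scores = {name: 0.0 for name in steps}
    for marker, sign in signs.items():
        values = [s[marker] for s in steps.values()]
        lo, hi = min(values), max(values)
        span = (hi - lo) or 1  # avoid division by zero on flat markers
        for name, s in steps.items():
            scores[name] += sign * (s[marker] - lo) / span
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

for name, score in relative_scores(steps):
    print(f"{name}: {score:+.2f}")  # signup ranks roughest, landing smoothest
```

In practice the markers would come from the self-report and behavioral sources above, and the ranking would be sanity-checked against what researchers observed in session.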

The emotional journey is still a powerful tool

It’s a visual understanding of an experience that helps our teams and our stakeholders understand how to prioritize, empathize, and problem solve better.

It’s powerful because it offers guidance on what direction our improvements should take.

At the end of the day, emotions move people — and that’s what we want to pursue with our creations.

[1] What about other biological measures like galvanic skin response, EEG, or eye tracking? Why didn’t I talk about these? Because galvanic skin response and eye tracking don’t measure emotion; they measure arousal. EEG is interesting, but I’m not convinced by our ability to accurately measure the brain yet. Even with fMRIs that record brain activity — our brain is so complex that we are really guessing what it means when part of it lights up. Emotions can also be expressed by which parts of the brain are quieted down, and we don’t measure that part well yet.

[2] Now, some psychologists will probably get pissed at me because they’ll say these are “moods”, not “emotional states”, but as a designer, you just care about how someone experiences your product — moods or emotional states be damned.
