Artificial Intelligence and User Experience

While it’s not often perceived as a problem, a limitation of user experience design is that it optimizes among competing interests to find the most palatable solution for potential users. This is a growing limitation: our technology fails to adapt to who we are and, more importantly, to how we’re feeling when we use a device or an app. As long as UX is static, our interactions with technology will be limited.

In the next twenty years, we expect to see these limitations lifted: the technology we use will adapt to our behavior and our personality. It will not only learn from our past, much the way Big Data predicts what we’ll buy and what we want to listen to and watch, but will also detect our emotional state and physical condition, allowing the interface design to adapt to our immediate situation.

We expect technology to connect humans and machines in a “smart” way and improve the user experience holistically. Emotions are always part of the picture, and they shape our daily experience. Every use of technology is a user experience: whether we brew a coffee, play a game, or use an app on our smartphones, at the end of the day we want the satisfaction of a good experience, which in turn makes us happy.

How do we detect emotions?

As a foundation we have basic emotions such as happiness, sadness, and anger. All three are relatively easy for human beings to detect and are the foundation of how we relate to others in our daily lives. Detecting them on a smartphone is a much harder challenge, because you can’t simply ask users about their state of mind; the answers would be subjective.

Smartphones already have, and will increasingly have, enormous power to collect data that can be processed into accurate estimates of emotion. Modern smartphones are equipped with several sensors that allow us to record and store sound, motion, pictures, and light intensity. By putting this data into the context of our bodies, we can identify emotions that deviate from our status quo. A faster gesture, a louder voice, or an unusual facial expression can signal that a user is in an angry mood. Based on this, we might deliver different designs that calibrate the user experience to his or her current state of mind.
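To make the baseline-deviation idea concrete, here is a minimal sketch in Python. The sensor fields, the ten-sample minimum, the two-sigma threshold, and the “agitated” label are all illustrative assumptions, not an existing API; a real system would use trained models rather than simple statistics.

```python
from dataclasses import dataclass
from statistics import mean, stdev

@dataclass
class SensorSample:
    gesture_speed: float   # hypothetical: swipe velocity in px/s
    voice_loudness: float  # hypothetical: microphone RMS level in dB

def deviates(history: list[float], current: float, threshold: float = 2.0) -> bool:
    """Flag a reading more than `threshold` standard deviations above the user's baseline."""
    if len(history) < 10:  # need enough samples to form a personal baseline
        return False
    baseline, spread = mean(history), stdev(history)
    return spread > 0 and (current - baseline) / spread > threshold

def estimate_state(history: list[SensorSample], current: SensorSample) -> str:
    """Combine two channels: faster-than-usual gestures plus a louder-than-usual voice."""
    speeds = [s.gesture_speed for s in history]
    volumes = [s.voice_loudness for s in history]
    if deviates(speeds, current.gesture_speed) and deviates(volumes, current.voice_loudness):
        return "agitated"
    return "neutral"
```

The point of the sketch is that the baseline is personal: the same swipe speed that is normal for one user can be a signal for another.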

Possible use cases

A basic scenario looks like this: the smartphone detects a user’s emotion and warns him or her against performing a particular action, to avoid potential harm to others; research has shown, for example, that driving while angry can be dangerous. Recommendations based on changes in emotional state have a downside if relied upon too heavily, as users can become overly dependent on them, but in the initial phase of this technology such examples are necessary to begin refining its accuracy.
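Such a warning could start as a simple check before a risky action. The sketch below builds on the hypothetical “agitated” state from the earlier example; the action names and wording are assumptions for illustration only:

```python
# Hypothetical action identifiers; a real app would define its own.
RISKY_WHEN_AGITATED = {"start_navigation", "post_public_review"}

def maybe_warn(action: str, emotional_state: str) -> str | None:
    """Return a gentle warning when the detected emotion makes an action risky."""
    if emotional_state == "agitated" and action in RISKY_WHEN_AGITATED:
        return "You seem upset. Consider taking a moment before you continue."
    return None

print(maybe_warn("start_navigation", "agitated"))
```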

Let’s examine a few potential scenarios that could be built with today’s existing technology.

1. Customer Satisfaction

A happy and satisfied customer is more likely to come back and use a service again than an unsatisfied one. Call centers already use voice recognition to detect angry customers who might cause problems. What about smartphone users who have missed a connecting flight? They may be angry and want to check their smartphone for the next flight, or simply want to complain about the time wasted waiting at the airport. Imagine an app that detects the user’s emotional state and serves the right design to generate a positive user experience. He or she is then more likely to remember the positive turn of the situation and walk away feeling satisfied.
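As a sketch of how an app might serve a different design per detected state, consider the mapping below. The variant names and layout rules are invented for illustration and are not taken from any real framework:

```python
# Hypothetical UI variants keyed by detected emotional state.
UI_VARIANTS = {
    "agitated": {
        "theme": "calm_blue",
        "show_promotions": False,      # hide upsells, reduce clutter
        "primary_action": "rebook_next_flight",
    },
    "neutral": {
        "theme": "default",
        "show_promotions": True,
        "primary_action": "browse_flights",
    },
}

def select_variant(emotional_state: str) -> dict:
    """Pick a layout for the detected state, falling back to neutral."""
    return UI_VARIANTS.get(emotional_state, UI_VARIANTS["neutral"])
```

The design choice here is to simplify rather than add: an angry traveler sees fewer options and a direct path to rebooking.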

2. Buying Decisions

People’s purchasing behavior can be influenced by their emotional state. Many products directly target consumers in a certain mood, as they are developed precisely to arouse that mood. Other products or services, on the other hand, are ruled out when the consumer’s current state of mind conflicts with the emotion the product or service is designed to create. If a smartphone can detect emotions, it can offer the right product for the user’s state of mind, which would increase customer satisfaction in online shops and could even guide users through a physical store.

3. Storytelling

The intent of storytelling is to arouse emotions that increase the likelihood of a consumer making a purchase. Which story fits which user’s emotions is a question that will be addressed in the future. Will a smartphone one day recognize a user’s emotions well enough to serve ads with the right ‘story’, ensuring the best fit, the strongest interest in a product or service, and a positive user experience?

4. Recommendations

Which movie do you want to watch? You may have no idea, but you can describe your mood precisely. If your TV could detect your emotions, it could recommend movies that fit your current state of mind and that you would actually want to watch. The same applies to music and any other streaming service that provides recommendations to users.
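A mood-aware recommender could start as simple filtering. In the sketch below, the mood tags and the tiny catalog are invented for illustration; a real service would learn these mappings from viewing data:

```python
# Invented catalog with hypothetical mood tags.
CATALOG = [
    {"title": "Road Trip Comedy", "moods": {"sad", "bored"}},
    {"title": "Slow-Burn Drama",  "moods": {"calm", "reflective"}},
    {"title": "Action Thriller",  "moods": {"bored", "energetic"}},
]

def recommend(detected_mood: str, catalog: list[dict], limit: int = 2) -> list[str]:
    """Rank titles so those tagged for the viewer's current mood come first."""
    ranked = sorted(catalog, key=lambda item: detected_mood not in item["moods"])
    return [item["title"] for item in ranked[:limit]]

print(recommend("bored", CATALOG))  # ['Road Trip Comedy', 'Action Thriller']
```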

Thanks to Daniel Young for editing my English.