Feeling Good About Emotion AI

The marvels of emotion-detection technology

Four eggs with various emotive faces drawn on in Sharpie
Photo by Tengyart on Unsplash

Emotion is central to the human condition. During conversation, we detect emotion through tone, body language, and facial expressions. Our voices deliver not just content but, more importantly, context to every interaction. Sentiment analysis is a technique that processes language to identify subjective information in conversation, such as attitudes or opinions about a business or topic.
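As a toy illustration of the idea, a minimal lexicon-based scorer can label a comment as positive or negative by counting opinion words. The word lists below are invented for this sketch; real systems use far larger lexicons or trained models.

```python
# Minimal lexicon-based sentiment scoring (illustrative only).
# These word lists are made up for the example.
POSITIVE = {"great", "love", "helpful", "fast", "friendly"}
NEGATIVE = {"terrible", "slow", "rude", "broken", "frustrating"}

def sentiment_score(text: str) -> float:
    """Return a score in [-1, 1]; 0.0 means no opinion words found."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

print(sentiment_score("The support agent was friendly and helpful"))  # 1.0
print(sentiment_score("Terrible, slow service"))                      # -1.0
```

Real deployments would layer negation handling, intensifiers, and context on top of this, which is exactly where the machine learning approaches discussed below come in.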

While sentiment analysis has been used in marketing for decades, audio sentiment analysis remains mostly conceptual. Even so, it is making a splash in the AI realm. Rather than focusing solely on what people say, audio sentiment analysis examines how they say it. Dubbed “emotion AI,” this technology can, to an extent, detect human emotion. The process is far from perfect, but its use cases blur the line between reality and science fiction.

The Customer Experience

Knowing what people think about a brand is crucial to maintaining a positive customer experience. Every interaction can provide information, and sentiment analysis is one tool businesses can utilize to monitor the current customer experience.

AI sentiment analysis reveals more than meets the eye or ear. In contact centers, this technology can analyze customer comments from phone calls, emails, text messages, and chatbot sessions. If the AI deems a customer frustrated, for example, it could route that person to an agent skilled at handling such situations. Beyond routing, customer feedback provides insight into how well a brand markets to target audiences, promotes brand awareness, or compares to competitors.
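Routing on detected frustration can be sketched as a simple threshold rule over a sentiment score. The thresholds and queue names below are invented for illustration; no real contact-center API is assumed.

```python
# Hypothetical routing rule over a sentiment score in [-1, 1]:
# frustrated callers go to agents trained in de-escalation.
def route_customer(sentiment: float) -> str:
    if sentiment < -0.5:
        return "de-escalation-specialist"  # clearly frustrated
    if sentiment < 0.0:
        return "senior-agent"              # mildly negative
    return "general-queue"                 # neutral or positive

print(route_customer(-0.9))  # de-escalation-specialist
print(route_customer(0.4))   # general-queue
```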

Sentiment analysis is not new, but its applications are becoming more widespread. Historically, the tool has been used to process text-based data. IBM Watson’s Tone Analyzer uses linguistic analysis to understand the emotions behind words. Through tweets, online reviews and customer service messages, a company can paint a picture of its customers’ perceptions of its brand or product.

Nowadays, machine learning-based approaches attempt to delve deeper into the context behind the data. The same words can mean different things when delivered with sarcasm or accompanied by nuanced context clues. To better understand the voice of the customer, other technology is taking a more literal approach.

Three people at a table in the middle of conversation
Photo by Yura Timoshenko on Unsplash

Sounds Good To Me

Rather than extract meaning from what people write, why not use what people say? Some companies want to decode the human voice itself. One such effort, the DeepTone model, is trained on hundreds of thousands of emotional speech samples and provides insights for a variety of use cases. Content moderation is one of the most compelling: wherever there is voice chat, such as in video games or social audio rooms, DeepTone can help flag unruly behavior or underage participants.
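DeepTone’s internals are proprietary, but the kind of signal-level measurements such models build on can be illustrated with two crude acoustic features: RMS energy (loudness) and zero-crossing rate (a rough proxy for pitch and noisiness). The sketch below computes both on synthetic tones standing in for a calm versus an agitated voice; real emotion models learn far richer representations than this.

```python
import numpy as np

def rms_energy(signal: np.ndarray) -> float:
    """Root-mean-square amplitude: a simple loudness measure."""
    return float(np.sqrt(np.mean(signal ** 2)))

def zero_crossing_rate(signal: np.ndarray) -> float:
    """Fraction of adjacent samples whose signs differ: a rough pitch proxy."""
    signs = np.sign(signal)
    return float(np.mean(signs[:-1] != signs[1:]))

# Synthetic stand-ins: a quiet low-pitched tone vs. a loud high-pitched one.
sr = 16_000
t = np.arange(sr) / sr
calm = 0.2 * np.sin(2 * np.pi * 120 * t)
agitated = 0.8 * np.sin(2 * np.pi * 300 * t)

print(rms_energy(calm), rms_energy(agitated))                    # louder when agitated
print(zero_crossing_rate(calm), zero_crossing_rate(agitated))    # higher pitch proxy
```

Higher energy and a faster zero-crossing rate loosely track vocal arousal, which is one reason tone of voice is usable as a signal at all.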

Call centers also benefit from DeepTone software through tone coaching. Similar to how companies have used text sentiment analysis to review customer feedback, emotion AI technology measures tone of voice to detect customer satisfaction.

Airlines and retail companies use similar software to improve quality assurance at call centers. This machine learning tool acts as an assistant to customer service agents: it can remind agents to slow down their speech or suggest responses to customer complaints in real time. Beyond live help, AI eases the burden on quality managers by reducing the time spent reviewing call logs.

These instances of emotion AI are the tip of the iceberg. Smart Eye recently acquired Affectiva, an emotion-detection software company, for $73.5 million. The company is aptly named after affective computing, the field concerned with machines’ ability to understand human affect, or emotion. Affectiva’s software will be combined with Smart Eye’s driver monitoring systems to improve the driving experience. The AI-based eye-tracking technology pairs well with Affectiva’s ability to detect drowsiness through frequent yawning or troubling body language. This “interior sensing” market is yet another example of how seemingly small interactions can make a big impact on the user experience.

Hand reaching out to pick up phone with no one around
Photo by Elena Koycheva on Unsplash

Listening Closely

As with all emerging technology, many remain cautiously optimistic about emotion AI. Companies like Affectiva have produced promising results, but determining human emotion is hardly ever 100% accurate. Think about the last time you incorrectly guessed whether a friend was being serious or joking. Gauging someone’s emotions requires more than in-the-moment context clues; it requires a deep understanding of their personality and experiences.

Some professionals think this existing technology uses a reductionist approach to emotion recognition. From their perspective, emotion in language requires more data points than voice alone can provide. However, when combined with other modalities like facial recognition or interior sensing (like Smart Eye’s use case), this technology may glean more accurate insights.

From car makers to call centers, the market for emotion AI is constantly expanding. To become more widespread, businesses must address technological constraints and ethical considerations. Machine learning is only as good as its training data. Voice data contains a myriad of features that need to be extracted to determine whether someone is happy or angry. But what if the data are only coming from one country or one audience segment? People express emotion differently across cultures, presenting a challenge for this new technology.

Nevertheless, emotion AI’s capabilities are poised to grow as new training data emerge and inclusive design comes to the forefront. Our perceptions of this technology will likely shift as AI continues to remind us what it means to be human.

Like this article? Let’s connect on LinkedIn or chat over virtual coffee





Julia Anderson

UX designer exploring how technology can make us better humans
