The Dawn of Emotion AI

Jennifer Clemente
Inside Machine Learning
2 min read · Apr 24, 2018
Shutterstock image

Chatbots not quite stealing your heart? Soon, they very well may be.

At IBM Think 2018, keynote speaker and futurist Dr. Michio Kaku outlined a future of an emotional internet characterized by the sharing of visceral memories and feelings using a brain pacemaker.

This ‘brain net,’ as he calls it, will undoubtedly change the way we interact with machines and with each other. “Eventually, we’ll upload whole experiences. We’ll be sending emotions, memories, feelings of our first kiss…on the internet,” he said in this short interview, captured shortly before he took the stage.

The notion of machines learning to understand our very feelings, and adapting their responses to satisfy our needs (and the needs of the businesses they serve), brings up a whole host of scenarios, challenges and opportunities.

The customer service industry has been among the first to experiment with ways to increase emotional intelligence using intent classification, sentiment analysis and natural language processing.

Pioneering companies are experimenting with machine learning programs that can help clue call center reps in when, say, customers are stressed out. The resulting increase in empathy might just make the difference in a company’s bottom line.
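To make the idea concrete, here is a deliberately simplified sketch of what “flagging a stressed customer” might look like. This toy keyword-lexicon scorer is purely illustrative and is not how the production systems mentioned above work; real call-center tools rely on trained sentiment and intent models rather than hand-written word lists, and the lexicon, weights and threshold below are all invented for the example.

```python
# Toy stress scorer: illustrative only, not a production sentiment model.
# The lexicon, weights and threshold are invented for this sketch.

STRESS_LEXICON = {
    "frustrated": 2, "angry": 3, "unacceptable": 3,
    "waiting": 1, "again": 1, "cancel": 2, "ridiculous": 2,
}

def stress_score(utterance: str) -> int:
    """Sum lexicon weights for stress-related words in a customer utterance."""
    words = utterance.lower().split()
    return sum(STRESS_LEXICON.get(w.strip(".,!?"), 0) for w in words)

def flag_for_empathy(utterance: str, threshold: int = 3) -> bool:
    """Alert the rep when the cumulative stress signal crosses a threshold."""
    return stress_score(utterance) >= threshold

print(flag_for_empathy("I've been waiting an hour and this is unacceptable!"))
# prints True
```

A real system would replace the lexicon with a trained classifier and score a rolling window of the conversation, but the shape is the same: turn language into a stress signal, then nudge the rep toward empathy when it spikes.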

As other industries prepare to dip their toes into the world of “Emotion AI”, some are asking whether bots really need to be emotionally aware, or whether we are asking too much of algorithms.

While it may be another decade before physical spaces manage to discreetly monitor and respond to human feelings, movements and gestures, ideally for consumer benefit, early signs of emotion AI can be found in the virtual personal assistants people use every day.

One can look to startups such as Soul Machines to get a glimpse of what’s next. The New Zealand-based startup has set its efforts on putting a “human face” on AI with interactive artificial humans built using a combination of neural networks and brain models powered by IBM Watson and IBM Cloud.

In February, Soul Machines announced it had partnered with Daimler Financial Services to demo Sarah, a digital human designed to improve their customers’ experiences with car financing, leasing and insurance. Sarah features artificially generated empathetic facial gestures and natural voice intonation that not only feel more humanlike but will eventually let her recognize nonverbal behavior in real time using face recognition.

Learn about the latest trends and considerations of emotion AI from Sensing Feeling’s Jag Minhas and WayBlazer’s Andrei Faji in this second piece in the Thoughts on AI series, originally published on IBM Big Data and Analytics Hub.

Connect with Jennifer Clemente on Twitter @jenn_clemente
