Adding an Emotional Face to Machine Learning

By Paula Klein

In the evolution to humanize technology, Affectiva is carving a niche. Its software development kit (SDK) and cloud-based API let developers enrich digital experiences by adding “emotion awareness” to apps ranging from games to medical devices. That means machines can collect data and respond to users’ emotions in real time, based mostly on facial recognition techniques. It’s what the company calls Emotion AI.
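Loosely speaking, an emotion-aware app consumes per-frame expression scores and reacts to them in real time. The snippet below is a minimal illustrative sketch of that pattern, not Affectiva's actual API: the expression names, score format, and threshold are invented for illustration.

```python
# Hypothetical sketch of an "emotion-aware" app loop.
# The expression names, scores, and threshold are invented;
# Affectiva's real SDK/API surface differs.

def react_to_frame(scores, threshold=0.7):
    """Pick an app response when any expression score crosses a threshold."""
    reactions = {
        "smile": "unlock bonus level",
        "frown": "lower difficulty",
        "yawn": "suggest a break",
    }
    for expression, action in reactions.items():
        if scores.get(expression, 0.0) >= threshold:
            return action
    return "no change"

# Example: a frame where the viewer is clearly smiling.
print(react_to_frame({"smile": 0.92, "frown": 0.03}))  # -> unlock bonus level
```

In a real integration the scores would arrive continuously from a camera feed, and the app would smooth them over several frames before reacting.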

As noted in a recent Forbes article: “Affectiva’s technology has proven transformative for industries like automotive, market research, robotics, education, and gaming, but also for use cases like teaching autistic children emotion recognition and nonverbal social cues.”

Demand is flourishing: Fortune 500 brands such as Kellogg’s want a more data-driven way to test advertising, consumer behavior, and marketing campaigns beyond the limitations of focus groups or surveys. Affectiva’s technology can collect, store, and analyze viewers’ reactions and measure their facial responses: Did they frown, smile, or yawn, or did their attention wander? Autonomous car companies are interested in determining driver moods and attentiveness, both for conversational dashboards and to improve safety.

To address these requirements, Affectiva has been partnering with robotics, AI, and marketing companies to augment its 30-person staff.

Burgeoning Demand

But it wasn’t like this when Rana el Kaliouby began her research more than a decade ago as a doctoral student at the University of Cambridge and then as a post-doc at the MIT Media Lab. “I was interested in the application of emotion as it related to autism,” el Kaliouby said last week. Emotion AI was an untapped market, not getting much attention. “It was so new, then. Now, people just say: I need it!” Joining the burgeoning “empathy economy” is a way for AI companies to distinguish themselves and their products, she said.

Rana el Kaliouby at the May 25 MIT IDE conference. Photo by Andrew Kubica

El Kaliouby, co-founder and CEO of Boston-based Affectiva, returned to MIT to give the keynote address at the IDE annual conference on May 25. She described the company’s progress and the rising demand for facial and voice recognition technologies. By 2009, she said, so many requests for commercial apps were flooding the MIT Media Lab that she was encouraged to spin out a company and seek venture funding. “It’s challenging to work in academia and scale a project,” she said, and she was determined to explore how human-machine interactions could be applied commercially.

Affectiva was launched that year, and so far has raised $25 million in funding from leading investors including Kleiner Perkins Caufield & Byers, Horizon Ventures, and WPP. The company has been ranked one of the country’s fastest-growing startups, and el Kaliouby has won many accolades, including induction into the “Women in Engineering” Hall of Fame and Technology Review’s “Top 35 Innovators Under 35” award.

El Kaliouby is still intrigued by the possibilities of AI and machine learning to identify human emotions. “Emotions influence so much in our personal and social lives, but it’s largely missing in the digital world.” According to a March KBV Research report, the global emotion detection and recognition market is forecast to grow at a CAGR of 27.4% and to reach nearly $30 billion by 2022. Competitor Emotient was acquired by Apple last year.

Distinguishing Smiles from Smirks

Currently, Affectiva’s emotion-recognition products can detect 20 facial expressions and seven emotion categories, as well as age, gender, and ethnicity, with over 90% accuracy. Based on deep learning algorithms, the cloud-based system can read the “nuances of facial expressions,” such as a squint or a smirk.
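To make the “seven emotion categories” idea concrete, here is a hypothetical sketch of how a client might reduce per-category confidence scores to a single label. The category names follow the common “basic emotions” set, and the score format and confidence cutoff are assumptions for illustration, not Affectiva's actual output.

```python
# Hypothetical reduction of per-category emotion scores to one label.
# The category names and response format are assumptions for
# illustration; they are not Affectiva's actual output schema.

EMOTIONS = ["joy", "anger", "disgust", "fear", "sadness", "surprise", "contempt"]

def dominant_emotion(scores, min_confidence=0.5):
    """Return the highest-scoring category, or None if nothing is confident."""
    best = max(EMOTIONS, key=lambda e: scores.get(e, 0.0))
    if scores.get(best, 0.0) < min_confidence:
        return None  # too ambiguous to label
    return best

print(dominant_emotion({"joy": 0.84, "surprise": 0.10}))  # -> joy
print(dominant_emotion({"anger": 0.2, "fear": 0.3}))      # -> None
```

The confidence cutoff is the interesting design choice: returning no label for an ambiguous face is usually safer than guessing, which matters when a smirk and a smile score similarly.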

Affectiva is developing APIs specifically for kids, for healthcare robots such as Catalia Health’s Mabu robot — designed for chronically ill patients — and for retail kiosks.

Mabu health robot at work.

El Kaliouby is well aware of possible customer resistance over monitoring, privacy concerns, and legal issues associated, for instance, with unwanted screenings at borders or airports. So far, the company has avoided the surveillance and security markets and looks for social value in the technology deployments it supports. Additionally, the apps are provided on a consent-driven, opt-in basis.

Nevertheless, public acceptance is something that evolves over time, as do new markets. Maybe the next killer app will be for dating, or for assistive robots that help at home. Certainly, as the pace of AI accelerates and social/camera apps like Instagram and Snapchat proliferate, el Kaliouby expects emotional AI to become ubiquitous within the next three years. “As long as we find examples to align with consumers’ comfort level, it will gain ground,” she said.

And that will make any smart device happy.



The IDE explores how people and businesses work, interact, and prosper in an era of profound digital transformation. We are leading the discussion on the digital economy.