The 3 most widespread myths about emotions that we teach AI

Emotions are playing an increasingly central role in business. At some point we ran into the idea that purchase decisions are massively influenced not only by what consumers think about a product, but by what they actually feel about it. That is why companies today try to integrate emotional aspects into everything they do: customer analytics, services, and technologies.

For humans, the era of emotionless rationalism ended long ago, but machines are only at the dawn of their emotional intelligence (EI). Over the last decade we have seen rapid development of emotion technologies, a field usually called Affective Computing. But where emotions are involved, there have always been misconceptions.

Some erroneous clichés, though, come up more often than others. Here we will consider the three most widespread myths about emotions in Affective Computing that keep being replicated in the media and in the business world.

Myth 1: Paul Ekman’s legacy

The foundation of Paul Ekman’s theory can be summarized by the thesis that, as far as facial expressions are concerned, people display and recognize certain emotions, which he called ‘basic’, in universal ways. No matter where we are or whom we are talking to, we will always recognize when our interlocutor is expressing one of five* emotions: anger, fear, disgust, happiness, or sadness.

*Surprise was excluded from the list of basic emotions after the revision of the theory.

Paul Ekman’s basic emotions (plus a neutral state) as portrayed by Tim Roth, who played the protagonist in the TV series ‘Lie to Me’.

One of the first critics of Ekman’s theory was psychologist James Russell. He rejected the idea of the universality of emotions, arguing that the emotion-face link is not as straightforward as Ekman initially supposed: the same facial expression may have different meanings in different contexts. Later, Beatrice de Gelder claimed in her book ‘Emotions and the Body’ that fMRI experiments have found no neurological grounds in the human brain to confirm that the various emotions are basic or universal.

More recently, the neuroscientist Lisa Feldman Barrett, one of the most prominent critics of Ekman’s basic emotions theory, claimed that emotions are not inborn but rather learned through experience. Emotions differ from culture to culture and from person to person. In a series of experiments, a research team went to Namibia to check how an isolated tribe, the Himba, would recognize happy, sad, fearful, angry, disgusted, and neutral faces. While there were no complications with positive facial expressions, the Himba tended to confuse negative expressions such as disgust and anger. When the experiments were repeated with other tribes, similar results were obtained. Thus, Barrett concluded, the way we explain emotions is shaped by cultural stereotypes: we give the same names to what may in fact be different phenomena.

Even though in 2011 Ekman revised his definition of emotion to include cultural and individual specifics, and even excluded one emotion from the basic list, many companies working in Affective Computing still base their methods on the old version of Ekman’s theory. They still build notions like ‘basic’ emotions into their datasets. With this approach, as Lisa Feldman Barrett has put it, they are going to fail. But, she says, if external and internal context is added, the technology holds the potential to revolutionize the science of emotion.

Laboratories and companies that work with emotion analytics should avoid slipping into the bias of universal emotions. First, the affective data used to train algorithms should be specific: it should take into account culture, language, gender and even age when determining which emotions are expressed. Second, emotion recognition algorithms should be sensitive to context. Notably, some labs have made attempts to integrate context (for instance, here), but no ‘big’ Affective Computing company has done so far.
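As a minimal sketch of what such context-aware training could look like, consider feeding a classifier the cultural context alongside the facial features, so it can learn culture-specific expression-to-emotion mappings instead of one universal mapping. Everything here is an illustrative assumption: the random data stands in for real facial embeddings, and the context categories and labels are invented, not taken from any real pipeline.

```python
# A minimal sketch of context-aware emotion recognition, assuming we already
# have a facial feature vector (e.g. from a CNN) for each sample. All data
# below is synthetic and for illustration only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.preprocessing import OneHotEncoder

rng = np.random.default_rng(0)

n_samples = 500
face_features = rng.normal(size=(n_samples, 128))                 # stand-in for facial embeddings
context = rng.choice(["himba", "us", "jp"], size=(n_samples, 1))  # hypothetical cultural context
labels = rng.choice(["anger", "disgust", "happiness"], size=n_samples)

# Encode the cultural context and concatenate it with the facial features,
# so the classifier can condition its decision on culture rather than
# assume one universal expression-to-emotion mapping.
# (sparse_output requires scikit-learn >= 1.2; use sparse=False on older versions)
encoder = OneHotEncoder(sparse_output=False)
context_onehot = encoder.fit_transform(context)
X = np.hstack([face_features, context_onehot])

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X, labels)
print(clf.predict(X[:1]))  # predicted emotion for the first sample
```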

Myth 2: A smile indicates happiness

Ekman’s theory naturally led to the conclusion that emotional expressions can be mapped directly onto what people feel inside.

For instance, a smile, one of the most easily detected expressions, has many meanings attached to it: feeling happy, pleased, satisfied, supportive, appreciative, and so on. This raises the question: what is the function of a smile?

In a recent study [1], subjects were asked to solve nine difficult tasks presented on a monitor. When they managed to give the right answer, they smiled, even though their smile had no addressee other than the computer. At the same time, the theory of social displays states that the function of smiling differs depending on whether a person is in a social environment or alone.

In Affective Computing, at least in its commercial version, current emotion recognition techniques analyze emotions separately from their social context. Thus, to truly understand the meaning of a smile, we should teach machines to distinguish emotional expressions that arise in different situations, social or not. There is a difference between a smile we share with another person and a smile in front of a monitor.

This is also precisely why we should take into account the more complex nature of emotions. Analyzed facial expressions can be supplemented with acoustic parameters, body movements and physiological signals. This approach is known as the multimodal analysis of emotions.
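One common way to combine modalities is decision-level (late) fusion. Here is a minimal sketch, assuming we already have per-modality classifiers that output emotion probabilities; the modality names, weights and label set are invented for the example.

```python
# A minimal sketch of late (decision-level) multimodal fusion: each modality
# votes with a probability vector, and the votes are averaged with weights.
import numpy as np

EMOTIONS = ["anger", "disgust", "happiness", "sadness"]

def fuse_predictions(per_modality_probs: dict[str, np.ndarray],
                     weights: dict[str, float]) -> str:
    """Weighted average of per-modality probability vectors."""
    total = np.zeros(len(EMOTIONS))
    for modality, probs in per_modality_probs.items():
        total += weights[modality] * probs
    return EMOTIONS[int(np.argmax(total))]

# Toy outputs: the face alone says 'happiness', but voice and physiology disagree.
preds = {
    "face":       np.array([0.05, 0.05, 0.80, 0.10]),
    "voice":      np.array([0.10, 0.10, 0.20, 0.60]),
    "physiology": np.array([0.15, 0.05, 0.25, 0.55]),
}
weights = {"face": 0.3, "voice": 0.35, "physiology": 0.35}

print(fuse_predictions(preds, weights))  # -> "sadness"
```

In this toy case the face alone would be read as happiness, but the voice and physiological channels shift the fused decision to sadness, which is exactly the kind of correction multimodality is meant to provide.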

Myth 3: Body ‘language’?

So far, we have come to the conclusion that emotions are not universal, that the concept of basic emotions is debatable, and that emotional expressions seem to be culturally, individually and contextually dependent. The situation gets even more complicated because emotional expression is not limited to the face: it includes voice, body movements, interpersonal distance and physiological signals.

That is why, just as people have tried to tell whether a person is lying by observing facial cues, they have tried to do the same with body movements. Body postures have been associated with almost everything: probably most famously, touching one’s mouth with lying, or an expansive posture with feeling secure. The idea has become so widespread that it echoes in stress management, security screening and even cinematography.

At airports, security has always been a top priority. The first automatic behavior detection systems were deployed in U.S. airports at the end of the 20th century and have since been used across the world. Typically, the probability that a passenger belongs to a risk category is calculated from a set of key features associated with high risk. Yet many researchers have noted that, to date, no particular psychological attribute has been found that describes a ‘personality’ distinctive of terrorists [2]. The connection between how a person moves and whether he or she is lying is not as straightforward as folk psychology supposes.
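For illustration only, a toy version of such feature-based risk scoring might look like the sketch below: a weighted sum of behavioral ‘key features’ squashed into a probability. The features and weights are invented and do not come from any real screening system; as argued above, such features have weak scientific grounding in the first place.

```python
# A toy sketch of feature-based risk scoring: sigmoid of a weighted sum of
# hypothetical behavioral cues. Weights and features are invented.
import math

WEIGHTS = {
    "avoids_eye_contact": 0.8,
    "touches_face":       0.5,
    "erratic_movement":   1.1,
}
BIAS = -2.5

def risk_probability(observations: dict[str, float]) -> float:
    """Logistic score: sigmoid of the weighted feature sum."""
    z = BIAS + sum(WEIGHTS[name] * value for name, value in observations.items())
    return 1.0 / (1.0 + math.exp(-z))

passenger = {"avoids_eye_contact": 1.0, "touches_face": 1.0, "erratic_movement": 0.0}
print(f"risk = {risk_probability(passenger):.2f}")  # ~0.23 for this toy input
```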

All in all, whether a folk version of body ‘language’ can reveal what a person feels remains debatable. It is, of course, possible to establish connections between non-verbal signals and a person’s emotional behavior. Today, a separate technology called body tracking deals with massive amounts of data about body movements. In Affective Computing it aims to find statistical dependencies between bodily expression and particular emotions.
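A minimal sketch of that statistical approach, on purely synthetic data: correlating one hypothetical body-tracking feature with a self-reported rating. In practice the feature would come from a pose-estimation pipeline rather than a random generator, and the relationship below is built into the toy data by construction.

```python
# A minimal sketch of finding a statistical dependency between a body
# feature and an emotion rating. The data is synthetic: 'posture expansion'
# is generated to correlate with a self-reported confidence rating.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

n = 200
posture_expansion = rng.normal(size=n)                                   # toy body-tracking feature
reported_confidence = 0.4 * posture_expansion + rng.normal(scale=1.0, size=n)

r, p_value = stats.pearsonr(posture_expansion, reported_confidence)
print(f"Pearson r = {r:.2f}, p = {p_value:.3g}")
```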

To sum up

Affective Computing is an exciting yet complicated area in both the scientific and business worlds, and it is indeed at the frontier of high tech. Still, in many cases commercial applications of emotion recognition are based on outdated approaches for comfortable reasons: the big name behind a famous approach, or the limited goals these applications aim to achieve.

Obviously, everybody would love to ‘read’ emotions just like the main character of the TV series Lie to Me. It is important, though, to keep in mind that the nature of emotions is far more delicate and complex, lest we sink into phrenology and palmistry.

We have named a few of the most widespread myths about emotion science in Affective Computing. It is necessary to get rid of them so that this technology can serve humanity; otherwise its predictions will remain inaccurate and biased.

References:

[1] Harry J. Witchel et al. A trigger-substrate model for smiling during an automated formative quiz. Proceedings of the 36th European Conference on Cognitive Ergonomics (ECCE ’18), 2018. DOI: 10.1145/3232078.3232084

[2] Yu-Jun Zheng et al. Airline Passenger Profiling Based on Fuzzy Deep Machine Learning. IEEE Transactions on Neural Networks and Learning Systems, 2016. DOI: 10.1109/TNNLS.2016.2609437

Authors:

Olga Perepelkina, Chief Research Officer at Neurodata Lab

Kristina Astakhova, Evangelist at Neurodata Lab, MA in Cognitive Science and Language