Emotional recognition — judging AI’s ability to judge human emotions

David Lee Smith
Product AI
Published in
4 min read · Jul 29, 2021

Facial recognition is old news. My smartphone's camera can identify my face (within a reasonable margin of error), and that can be used for security purposes. With this technology more or less mainstream already, what's next? Modern iPhones have done away with thumbprint identification in favor of facial recognition, and even lower-end smartphones use an "eye-scanner" to keep their screens unlocked while users are still actively looking at them. Let's explore the newer, deeper evolution of this technology: emotional recognition.

Have you ever had an interaction with the police, or at least watched a clip of one? How many times have you heard something to the tune of, "You look nervous", or, "You're acting nervous"? For a human, such as you or me (hopefully), this kind of emotional intelligence is more or less instinctual. Humans, just like many other animals, are able to innately pick up on sometimes minuscule physical cues that indicate certain feelings. As useful as this ability might be on a date, think about how practical it could be during that aforementioned police interaction, or at a job interview.

In recent years, AI has been dipping its toes into the field of "emotional recognition". The idea hardly demands explanation: an AI-assisted technology examines a person's facial cues and renders a verdict on their emotional state.

This kind of approach first received criticism as being nothing more than a novelty. Take emojify.info, for example. While it technically used AI to recognize human emotions, this translated into little more than smiles equaling happiness and frowns equaling the opposite. If AI-assisted emotional recognition is to really break into the mainstream, it's got to be smarter, way smarter.
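To see why critics dismissed this as a novelty, the "smiles equal happiness" approach can be caricatured in a few lines of Python. The sketch below is a toy illustration only; the mouth-curvature feature, thresholds, and labels are invented for the example and are not how emojify.info or any production system actually works.

```python
# Toy caricature of naive "smile = happy" emotion recognition.
# mouth_curvature is an invented feature: positive means the corners of
# the mouth turn up, negative means they turn down, near zero is flat.

def naive_emotion(mouth_curvature: float) -> str:
    """Classify emotion from a single facial feature -- the naive way."""
    if mouth_curvature > 0.2:
        return "happy"
    if mouth_curvature < -0.2:
        return "sad"
    return "neutral"

# A nervous interviewee forcing a polite smile still registers as "happy":
print(naive_emotion(0.5))   # -> happy
print(naive_emotion(-0.4))  # -> sad
print(naive_emotion(0.0))   # -> neutral
```

The obvious failure mode is the point: a single surface cue says nothing about context, masking, or mixed emotions, which is exactly the gap a "way smarter" system would have to close.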

Let's take 4littletrees as a more practical implementation. For the past year, young girls at a school in Hong Kong have not only been studying from home, but have also been studied from home. AI tools from 4littletrees monitor the students' subtle facial movements and cues and judge their emotional reactions and state. As sinister as that might sound, the tool's main purpose is to give teachers and educators the ability to accurately identify their pupils' emotions in real time, so they can tailor lessons and approaches accordingly and create the most appropriate and effective learning environment possible.

Now, let's move on to the corporate world. As a recent article from CNN pointed out, AI may be just another hurdle you'll have to clear during the sometimes rigorous interview process for a new job. Are you nervous? Are you lying? Did you really not steal a pen from your most recent employer? With the use of emotional recognition technology, or ERT, these questions might not even be left up to a human.

So you may be asking, "Sure, this is interesting, but haven't polygraph tests been producing these kinds of results for decades?" In fact, there is very little evidence that these "lie detectors" are accurate at all. Two main issues underlie this type of approach. First, there is no universal psychological baseline against which such tests can be calibrated, so the results typically come not from evidence but from inference. The second issue is obvious: these tests are most often interpreted by a human. To err is human, sure, but in the case of a criminal trial or a multimillion-dollar executive position, can we really take that chance?

With any kind of AI-assisted technology like this, it is absolutely imperative that the issue be approached with caution. That is evident both in the philosophical ramifications of AI in general and in the uncomfortably real truth that any tech will be designed to suit the needs of people, and people can harbor any number of prejudices or biases. In China, for example, ERT is being tested on citizens without their consent, and it has the potential to be used in the future as a kind of "future crime"-style prosecution measure.

The fact is, emotional intelligence, or the ability to accurately judge the emotional state of another, is not something that we are born with. Rather, it is something that we must develop through social interaction. Teaching AI to do the same might very well prove to be the same kind of ongoing educational process. As we better understand ourselves and each other, so too might AI. Where we go from there is anyone's guess.


As a tech researcher and author, I've seen everything from the advent of the internet to the release of the smartphone. Let's see what's next!