AI Scholar: Facial Expression Recognition Research Based on Deep Learning

This research summary is just one of many that are distributed weekly on the AI scholar newsletter. To start receiving the weekly newsletter, sign up here.

If you think about it, emotions are the essence of what makes us human. They shape our attention, perception, memory, daily routines, and social interactions. Because our faces are among the most reliable indicators of emotion, facial expression analysis is a natural foundation for emotion recognition research.

As deep learning advances, convolutional neural networks (CNNs) keep improving object recognition performance. However, the classification mechanism of a CNN remains a long-standing challenge: the sheer number of parameters makes it difficult to analyze what the network has actually learned.

Designing and Training a CNN for Facial Expression Recognition

Researchers recently designed and trained a CNN for facial expression recognition and explored its classification mechanism. Using a deconvolution visualization method, they projected the extremum points of the CNN's feature maps back to the pixel space of the original image. They also designed a distance function that measures the distance between the location of a facial feature unit and the maximal response on a CNN feature map.

Figure: deconvolution visualization of activation values.
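As a rough illustration of that distance idea, the sketch below assumes we already have a single feature map and the pixel coordinates of a facial feature unit (for example, from landmark detection). It approximates the paper's deconvolution-based projection with a simple coordinate rescaling back to image space; the function names and the scaling shortcut are assumptions, not the authors' implementation.

```python
import numpy as np

def peak_to_image_coords(feature_map, image_shape):
    """Locate the maximal activation of a feature map and map it back to
    (row, col) pixel coordinates by plain rescaling. This is a crude
    stand-in for the deconvolution projection described in the paper."""
    fh, fw = feature_map.shape
    ih, iw = image_shape
    r, c = np.unravel_index(np.argmax(feature_map), feature_map.shape)
    return (r + 0.5) * ih / fh, (c + 0.5) * iw / fw

def response_distance(feature_map, unit_rc, image_shape):
    """Euclidean distance between a facial feature unit's pixel location
    and the projected peak response of the feature map."""
    peak = np.array(peak_to_image_coords(feature_map, image_shape))
    return float(np.linalg.norm(peak - np.asarray(unit_rc, dtype=float)))

# Toy usage: a 6x6 feature map from a 48x48 face crop, with a hypothetical
# action-unit location (row=12, col=30) taken from facial landmarks.
fmap = np.random.rand(6, 6)
print(response_distance(fmap, unit_rc=(12, 30), image_shape=(48, 48)))
```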

The mapping between feature maps and facial feature units is determined by comparing these distances across all facial feature units for each feature map: a feature map is considered sensitive to the facial feature unit whose location lies closest to its maximal response. With this analysis, the researchers verified that the network forms detectors for facial action units during training, which is what lets it recognize facial expressions.
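Given such distances for every (feature map, facial action unit) pair, one plausible way to read off the mapping is to assign each feature map to the unit its peak response lands closest to. The sketch below is a minimal, assumed illustration of that assignment (the threshold and the number of maps and units are made up); it is not the authors' code.

```python
import numpy as np

# dist[i, j]: distance between feature map i's projected peak response and
# facial action unit j's location (e.g., computed as in the sketch above).
rng = np.random.default_rng(0)
dist = rng.random((16, 5))          # 16 feature maps, 5 hypothetical action units

closest_unit = dist.argmin(axis=1)  # unit each feature map responds nearest to
closest_dist = dist.min(axis=1)

# Assuming a small distance indicates sensitivity, feature maps whose peak
# lands close enough to a unit behave like detectors for that action unit.
threshold = 0.2                     # purely illustrative cutoff
for i, (u, d) in enumerate(zip(closest_unit, closest_dist)):
    if d < threshold:
        print(f"feature map {i} acts as a detector for action unit {u} (distance {d:.2f})")
```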

Potential Uses and Effects

Facial expression recognition is one of the best ways to test the impact of any content, product, or service that is meant to elicit emotional arousal and facial responses. Systems built on it can detect faces, code facial expressions, and recognize emotional states in near real time.

Many applications, including consumer neuroscience and neuromarketing, multimedia advertising, psychological research, clinical psychology and psychotherapy, and artificial social agent (avatar) engineering, can benefit greatly from this research.

Read more: https://arxiv.org/abs/1904.09737v1

Thanks for reading. Please comment, share and remember to subscribe to our weekly newsletter for the most recent and interesting research papers! You can also follow me on Twitter and LinkedIn. Remember to 👏 if you enjoyed this article. Cheers!