Facial Emotion Detection Using Deep Learning

Afaq Iftikhar
3 min read · Aug 1, 2023


Figure: Emotion Recognition

Introduction

Facial emotion recognition is an emerging and significant research area within pattern recognition. Non-verbal communication plays a major role in daily life and is often estimated to carry more than 70% of the information we exchange. Facial emotion analysis feeds a wide range of applications: surveillance video analysis, expression and gesture recognition, smart homes, computer games, depression treatment, patient monitoring, anxiety and lie detection, psychoanalysis, paralinguistic communication, operator-fatigue detection, and robotics. Major companies such as Google, Apple, Alibaba, and Facebook are constantly innovating in this field and lead much of its development, and the growth of the technology has drawn more and more researchers to apply it to real-world problems. In this tutorial, we build a simple facial emotion detector using a convolutional neural network. You can find the code for this at the [github link]

Dataset Used

We use the FER2013 dataset. It contains roughly 35,000 grayscale facial images of size 48×48, each labeled with one of seven expressions: 0=Angry, 1=Disgust, 2=Fear, 3=Happy, 4=Sad, 5=Surprise, 6=Neutral. The Disgust class is heavily under-represented, with only around 600 images, while the other classes each have several thousand samples.

Libraries Used

TensorFlow, OpenCV, DeepFace, NumPy, Pandas.

Methodology

We use a convolutional neural network with three convolutional layers and a learning rate of 0.0001.

For training, we split the dataset 90/10 into training and test sets. We trained for 20 epochs and observed that model accuracy increased with the number of epochs, i.e., the number of times the full dataset was passed through the network.
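The 90/10 split can be done with a simple shuffled index split; this NumPy sketch is one way to do it (scikit-learn's `train_test_split` would work equally well):

```python
import numpy as np

def split_90_10(X: np.ndarray, y: np.ndarray, seed: int = 42):
    """Shuffle once, then take the first 90% for training and the rest for testing."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))  # random order of sample indices
    cut = int(0.9 * len(X))
    train, test = idx[:cut], idx[cut:]
    return X[train], y[train], X[test], y[test]
```

Training then follows the usual Keras pattern, e.g. `model.fit(X_train, y_train, epochs=20, validation_data=(X_test, y_test))`, matching the 20-epoch run described above.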

Figure 1: Model Accuracy vs Epoch
Figure 2: Model Loss vs Epoch

Extending training to 50 epochs shows the same trend of accuracy improving with epoch count. In practice, the number of epochs depends on the computational power available, and a trade-off is made between training cost and the accuracy level the model needs to reach.

Output

The model outputs softmax scores over the seven emotion classes; the class with the highest score is taken as the predicted emotion.
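Decoding a prediction is a single argmax over the score vector, using the label order from the dataset section (0=Angry … 6=Neutral); the helper name here is ours:

```python
import numpy as np

# Index order matches the FER2013 labels listed earlier.
EMOTIONS = ["Angry", "Disgust", "Fear", "Happy", "Sad", "Surprise", "Neutral"]

def decode_prediction(scores: np.ndarray):
    """Map a 7-way softmax score vector to (label, confidence)."""
    i = int(np.argmax(scores))
    return EMOTIONS[i], float(scores[i])
```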

We also implemented real-time emotion detection using DeepFace and OpenCV. DeepFace applies transfer learning with a pre-trained model to output the dominant emotion in each webcam frame.

Conclusion

We have provided a basic framework for understanding this complex and exciting technology. Emotion detection has proven very useful in healthcare and surveillance, and it is a big step towards a new generation of robots able to semantically understand non-verbal cues.


Afaq Iftikhar

I am a passionate AI/ML professional and provide services in AI model development and deployment. You can contact me at afaqiftikhar7@gmail.com.