EmotionCues: AI Knows Whether Students Are Paying Attention

Synced · Published in SyncedReview · 4 min read · Jan 16, 2020

Facial recognition technology was introduced in the 1960s, languished through the AI winter, and in recent years has taken off — boosted by increasingly powerful deep neural networks. Facial recognition has been applied in Face ID device unlocking functions, public security services, smart payment systems and more. During Taylor Swift’s 2018 “Reputation” tour, the American singer-songwriter’s security team utilized the tech to safeguard her from stalkers.

Now, a research team from the Hong Kong University of Science and Technology and Harbin Engineering University has adopted facial recognition technology to analyze students’ emotions in the classroom through a visual analytics system called “EmotionCues.”

Paper co-author Huamin Qu says the system “provides teachers with a quick and convenient measure of students’ engagement level in a class. Knowing whether the lectures are too hard and when students get bored can help improve teaching.”

But is it really that simple?

The proposed EmotionCues system comprises a data processing phase and a visual exploration phase. In phase one, the system processes raw video inputs, using computer vision algorithms to extract emotion information through steps such as face detection, face recognition, emotion recognition, and feature extraction. In phase two, the interactive visual system applies granular visual analysis to classroom videos to infer each student's emotional state and how it evolves over time, for example, whether Lily is losing interest.
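The phase-one pipeline can be sketched in outline. This is a minimal illustration only: `detect_faces` and `recognize_emotion` below are hypothetical stand-ins for the trained computer vision models the paper uses, and the data format is invented for the example.

```python
from collections import defaultdict

def detect_faces(frame):
    # Placeholder: a real system would run a face detector on image pixels.
    # Here a "frame" is already a list of (student_id, dummy_feature) pairs.
    return frame

def recognize_emotion(face_feature):
    # Placeholder classifier: maps a dummy numeric feature to a coarse label.
    return "happy" if face_feature > 0 else "neutral"

def build_emotion_timelines(frames):
    """Aggregate per-student emotion labels over time (the phase-one output
    that the phase-two visual exploration interface would then summarize)."""
    timelines = defaultdict(list)
    for frame in frames:
        for student_id, face_feature in detect_faces(frame):
            timelines[student_id].append(recognize_emotion(face_feature))
    return dict(timelines)

frames = [[("lily", 1), ("tom", -1)], [("lily", -1), ("tom", -1)]]
print(build_emotion_timelines(frames))
# {'lily': ['happy', 'neutral'], 'tom': ['neutral', 'neutral']}
```

The per-student timelines are what make questions like "is Lily losing interest?" answerable: the visual interface plots each timeline rather than a single classroom-wide score.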

The research team tested EmotionCues at the Hong Kong University of Science and Technology and at a Japanese kindergarten. Results show that EmotionCues performs better at detecting "obvious emotions," such as the joy students show during a particularly interesting or engaging moment in class. The system's ability to interpret "anger" or "sadness," however, still needs improvement. Students who are actually very focused on class content may, for example, purse their lips in contemplation, which the system might easily misinterpret as "anger."

The new study is not the first use of tech to analyze students’ emotional states. Last year, students in a primary school in Jinhua, Zhejiang wore smart headbands which measured electric signals of brain neurons and translated the collected information into an attention score. The headbands on students who were focused displayed a red light, while the less focused students’ headbands glowed blue. Student attention scores were sent to the teacher’s laptop every 10 minutes and synchronized to a WeChat group so parents could remotely monitor their child’s status at any time.
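The headband system's score-to-light mapping reduces to a simple threshold rule. The sketch below is purely illustrative: the article does not state the score scale or the cutoff, so the 0–100 range and the threshold of 60 are assumptions.

```python
def headband_light(attention_score, threshold=60):
    # Hypothetical cutoff: scores at or above the threshold glow red
    # (focused), scores below glow blue (less focused).
    return "red" if attention_score >= threshold else "blue"

def classroom_report(scores):
    """Summarize the per-student scores sent to the teacher's laptop
    (and synced to the parents' WeChat group) every 10 minutes."""
    return {student: headband_light(score) for student, score in scores.items()}

print(classroom_report({"amy": 82, "bo": 45}))
# {'amy': 'red', 'bo': 'blue'}
```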

Although the aim of the project was to help students study more efficiently and help teachers improve their teaching quality, concerns were raised regarding both student privacy and system effectiveness.

The paper EmotionCues: Emotion-Oriented Visual Summarization of Classroom Videos is available on IEEE.

Author: Yuqing Li | Editor: Michael Sarazen

