Now, computers can tell how you’re feeling

By Roxanne Bauer

World Bank
World of Opportunity
3 min read · Oct 15, 2015


Imagine watching a commercial, and the TV or mobile phone you're watching it on immediately knows whether you'd like to buy the product being advertised. Imagine feeling stressed out while driving, and your car automatically starts talking to you and adjusting the air and radio controls. Or imagine a video or film that changes its storyline based on your reactions to the characters. This is the future, in which devices react not just to our behavioral and physiological cues, but also to our emotions.
Affective computing is the study and development of systems and devices that can recognize, interpret, process, and simulate human emotional states. It is an interdisciplinary field spanning computer science, psychology, and cognitive science.

Photograph of software tracking facial cues © MIT News/Limited license

Most of the software in the field of affective computing tracks emotions, such as happiness, confusion, surprise, and disgust, by scanning an environment for a face and identifying the face's main regions: mouth, nose, eyes, and eyebrows. The software then assigns points to each region and tracks how those points move in relation to one another. Shifts in the texture of the skin, such as wrinkles, are also tracked and combined with the information on facial points. Finally, the software identifies an expression by comparing it with those it has previously analyzed.
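As a rough illustration of that landmark-and-comparison step, here is a minimal Python sketch that treats an expression as the displacement of facial points from a neutral baseline and labels it by nearest-neighbor comparison with previously stored examples. It runs on made-up data: the face and landmark detection itself, the skin-texture features, and the trained classifiers that real systems use are assumed to come from elsewhere, and the point layout and thresholds are purely hypothetical.

```python
import numpy as np

# Hypothetical landmark layout: (x, y) coordinates for points around the
# mouth, nose, eyes, and eyebrows, as produced by some external face and
# landmark detector (not shown here).
N_POINTS = 20

def expression_features(landmarks, neutral):
    """Describe an expression as point displacements relative to a
    neutral-face baseline, normalized by the distance between two reference
    points (assumed here to be the eye centers) so the features do not
    depend on how close the face is to the camera."""
    scale = np.linalg.norm(neutral[0] - neutral[1])
    return ((landmarks - neutral) / scale).ravel()

def classify(features, templates):
    """Label an expression by nearest-neighbor comparison against
    previously analyzed expressions."""
    best_label, best_dist = None, np.inf
    for label, template in templates.items():
        dist = np.linalg.norm(features - template)
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

# Toy usage with random stand-in data.
rng = np.random.default_rng(0)
neutral = rng.uniform(0, 100, size=(N_POINTS, 2))
templates = {
    "happiness": expression_features(neutral + rng.normal(0, 2, (N_POINTS, 2)), neutral),
    "surprise":  expression_features(neutral + rng.normal(0, 2, (N_POINTS, 2)), neutral),
}
observed = neutral + rng.normal(0, 2, (N_POINTS, 2))
print(classify(expression_features(observed, neutral), templates))
```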

The applications of affective computing are both numerous and varied. Affective computing can be used in online learning applications to adjust the pace or style of a computerized tutor when a learner is bored, interested, frustrated, or pleased. Marketing agencies can observe and record consumer reactions to advertisements and products to test their wider popularity on the market. Apps may one day select music, news stories, or games based on their owner's mood, and fitness trackers may suggest certain types of movement to clear our heads, overcome frustration, or encourage play.

Affective computing relies on measuring two main components: the sympathetic nervous system and the parasympathetic nervous system. The sympathetic nervous system controls our “fight or flight” responses, increases our heart rate, and can be measured by monitoring skin conductance. The parasympathetic nervous system controls our “rest and digest” responses, lowers our heart rate, and is measured through heart rate variability. Through these signals, computers can tell when someone is interested in something, as their heart rate increases and their skin conductance spikes. Likewise, computers can tell the moment someone loses interest: their gaze wanders, their heart rate drops, and their skin becomes a little less conductive.
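As a toy illustration of those two signal streams, the Python sketch below computes a standard heart rate variability measure (RMSSD) from made-up inter-beat intervals and applies a deliberately simplistic interest heuristic based on skin conductance trend and heart rate. The thresholds, readings, and function names are illustrative assumptions, not taken from any real affective computing system.

```python
import numpy as np

def rmssd(ibi_ms):
    """Root mean square of successive differences between inter-beat
    intervals (in milliseconds), a common time-domain measure of heart
    rate variability linked to parasympathetic activity."""
    diffs = np.diff(np.asarray(ibi_ms, dtype=float))
    return float(np.sqrt(np.mean(diffs ** 2)))

def mean_heart_rate(ibi_ms):
    """Beats per minute derived from the average inter-beat interval."""
    return 60000.0 / np.mean(np.asarray(ibi_ms, dtype=float))

def engagement_hint(ibi_ms, scl_microsiemens):
    """Toy heuristic: rising skin conductance (sympathetic arousal) together
    with an elevated heart rate is read as interest; falling conductance and
    a slower heart rate as waning interest. Thresholds are illustrative."""
    scl = np.asarray(scl_microsiemens, dtype=float)
    conductance_rising = scl[-1] > scl[0]
    hr = mean_heart_rate(ibi_ms)
    if conductance_rising and hr > 75:
        return "likely interested"
    if not conductance_rising and hr < 65:
        return "likely losing interest"
    return "unclear"

# Example with made-up readings.
ibi = [780, 770, 790, 775, 785]   # milliseconds between heartbeats
scl = [2.1, 2.3, 2.6, 2.8]        # skin conductance level, microsiemens
print(rmssd(ibi), mean_heart_rate(ibi), engagement_hint(ibi, scl))
```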

While some social scientists argue that context plays a far greater role in reading emotions than affective computing allows, laboratory studies have demonstrated that context-blind computers are often more accurate than humans in deciphering emotion. Computers often outperform people in distinguishing polite, social smiles from those triggered by spontaneous joy and in differentiating fake pain from genuine pain. One reason is that computers can watch with tireless attention and register expressions so fleeting that even the person making them is unaware of them.
Rosalind Picard, who coined the term “affective computing,” shares more on how and why affective computing will change media, marketing, education, and many other industries, and explains why she thinks it’s important to integrate emotions into the machine environment.

Read more World Bank blogs.

