Why is affective computing important? Emotional states and other uses.

VU Token · May 19, 2018

Affective computing is a concept proposed by Dr. Rosalind Picard, Founder and Director of the Affective Computing Research Group at MIT, who laid out the framework in a book she published in 1997.

Put simply, affective computing is a field in which the hardware or software driving a process can sense how the human user is responding, through any number of data inputs such as a camera, a microphone, or something more bespoke, and can interpret that information. This goes well beyond the typical canned responses or tooltips triggered by a keystroke or button click: the system picks up on cues like tone of voice, micro-expressions, body language, or any electrodermal activity (EDA) data it has access to.

For example, a software product might recognize that a user is under stress by monitoring skin conductance or the tone of their voice, and tailor its support or suggestions based on what can be deduced from that data. Similarly, a game that emphasizes tension and fear could fine-tune the experience based on the user's responsiveness to it, measured through physiological signals such as breathing and heart rate.
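
To make that concrete, here is a minimal Python sketch of the kind of rule-based adaptation described above. The sensor fields, thresholds, and adaptation labels are illustrative assumptions, not values from any shipping product:

```python
from dataclasses import dataclass

@dataclass
class BiometricSample:
    skin_conductance_us: float  # electrodermal activity, in microsiemens
    heart_rate_bpm: float       # beats per minute
    breath_rate_bpm: float      # breaths per minute

def estimate_stress(sample: BiometricSample) -> float:
    """Return a rough 0..1 stress score from a single reading.

    The thresholds are illustrative placeholders, not clinical values.
    """
    score = 0.0
    if sample.skin_conductance_us > 8.0:   # elevated EDA
        score += 0.4
    if sample.heart_rate_bpm > 100.0:      # elevated heart rate
        score += 0.4
    if sample.breath_rate_bpm > 20.0:      # rapid breathing
        score += 0.2
    return min(score, 1.0)

def adapt_experience(stress: float) -> str:
    """Map the stress score to a simple adaptation decision."""
    if stress >= 0.7:
        return "ease_off"      # dial back tension, surface support options
    if stress <= 0.2:
        return "raise_stakes"  # user appears under-stimulated
    return "hold_steady"

reading = BiometricSample(skin_conductance_us=9.2, heart_rate_bpm=108.0, breath_rate_bpm=22.0)
print(adapt_experience(estimate_stress(reading)))  # prints "ease_off"
```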

Advancements in affective computing are not limited to entertainment. Academia, healthcare and marketing have all used affective computing to great effect in learning how a person responds to a particular stimulus. An e-learning system that was ‘emotionally intelligent’ enough to distinguish between confusion and self-confidence, for example, could create the ideal environment for learning. Researchers at MIT are working on a system that uses wearable sensors and smart equipment to pinpoint the early signs of depression, analyze responsiveness to treatments and provide a voice for those who cannot communicate. This has the potential to spot relapses quickly and allow preventative treatment to intervene early, made possible by the in-depth insight gained from each person’s personalized monitoring.

As the technology continues to evolve, more data can be gathered, and affective computing becomes a tool to aggregate and analyze the signals generated by bodily responses, especially now that wearables have become mainstream. Such intimate tech allows a range of signals to be detected, all of which can be interpreted by computers to introduce new levels of responsiveness in human-computer interactions. Before long, these technologies should be able to reliably detect more subtle indicators, such as pupillary response and eye movement, that generic equipment would miss.
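
To illustrate the kind of signal fusion this makes possible, here is a small, hypothetical Python class that keeps a rolling window for a few wearable channels and reports a single arousal estimate. The channel names, window size, and weights are assumptions for the sake of the example, not a prescribed model:

```python
import statistics
from collections import deque

class SignalFuser:
    """Fuse several wearable channels into one rolling arousal estimate."""

    CHANNELS = ("eda", "heart_rate", "pupil_diameter")  # assumed channel names

    def __init__(self, window: int = 30):
        self.history = {name: deque(maxlen=window) for name in self.CHANNELS}

    def push(self, name: str, value: float) -> None:
        """Record the latest reading for a channel."""
        self.history[name].append(value)

    def _zscore(self, name: str) -> float:
        """How far the latest reading sits from that channel's recent baseline."""
        values = list(self.history[name])
        if len(values) < 2:
            return 0.0
        stdev = statistics.pstdev(values)
        return 0.0 if stdev == 0 else (values[-1] - statistics.fmean(values)) / stdev

    def arousal(self) -> float:
        """Weighted sum of per-channel z-scores; positive means rising arousal."""
        weights = {"eda": 0.5, "heart_rate": 0.3, "pupil_diameter": 0.2}
        return sum(w * self._zscore(name) for name, w in weights.items())

fuser = SignalFuser()
for eda, hr, pupil in [(4.1, 72.0, 3.2), (4.3, 75.0, 3.3), (7.9, 96.0, 4.1)]:
    fuser.push("eda", eda)
    fuser.push("heart_rate", hr)
    fuser.push("pupil_diameter", pupil)
print(fuser.arousal())  # positive here: the latest readings jumped above baseline
```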

This equips storytellers and developers with something truly profound in terms of audience interaction: connecting with users accurately and automatically, beyond what surveys or interviews can capture. By quantifying new and more in-depth sources of data from emotionally responsive AI equipment, the analysis of human responses, specifically within a media setting, can help mold the foundation of something that deeply resonates with a person.

Here’s where the magic happens for game development and simulations: by analyzing and interpreting this collection of physiological signals, we can deploy machine learning to build an affective computing model for a range of different players, enabling a virtual environment and its non-player characters to adjust themselves to the user in real time. The result is an adventure that feels immersive and real, where users feel present as the system dynamically responds, staging events with less regularity and a greater sense of surprise.
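
As a sketch of what that could look like, and only as a sketch, the Python below fits a tiny scikit-learn classifier on hypothetical, self-reported “tense vs. calm” labels and turns its prediction into a pacing decision. The feature layout, training values, and event names are all assumptions for illustration; they are not VU’s actual model or engine hooks:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical training data: rows of [eda_z, heart_rate_z, pupil_z, breath_z],
# labelled 1 when the play-tester reported feeling tense, 0 when they felt calm.
X_train = np.array([
    [ 1.8,  1.2,  0.9,  1.5],
    [ 0.1, -0.3,  0.0,  0.2],
    [ 2.2,  1.7,  1.1,  1.9],
    [-0.5, -0.8, -0.2, -0.4],
    [ 1.4,  0.9,  0.7,  1.1],
    [ 0.0,  0.1, -0.1,  0.3],
])
y_train = np.array([1, 0, 1, 0, 1, 0])

affect_model = LogisticRegression().fit(X_train, y_train)

def direct_scene(features: np.ndarray) -> str:
    """Turn the model's tension estimate into a pacing decision.

    The returned labels are placeholders for whatever spawning or scripting
    hooks the game engine actually exposes.
    """
    p_tense = affect_model.predict_proba(features.reshape(1, -1))[0, 1]
    if p_tense > 0.7:
        return "delay_next_scare"   # the player is already saturated
    if p_tense < 0.3:
        return "trigger_surprise"   # the player has settled; re-engage them
    return "continue_script"

print(direct_scene(np.array([1.6, 1.0, 0.8, 1.3])))
```

In practice such a loop would run continuously during play, with the model re-fitted or fine-tuned per player as more sessions are recorded.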

This is one of the first times in human history that the arc of a particular story or adventure is so organically malleable in real time. It’s absolutely an area of exploration for us in the creation of our Virtual Universe, VU, which sets out to be an experience that unfolds and molds itself based on the actions of the user. Each player unconsciously helps inform the system, through the automatic quantification of their experience, of any changes that would increase enjoyment and engagement for subsequent users. Through this process, VU will naturally evolve into a realistic and immersive simulation that connects its users in a virtual world with limitless capabilities.

Ciaran Foley is CEO of Ukledo and Immersive Entertainment, Inc., a Southern California virtual reality software company developing a new virtual engagement platform called Virtual Universe (VU).

___________________________________________________________________

Learn more about Virtual Universe and the VU token by visiting our website and signing up for email updates, visiting our Github, following us on Twitter, Facebook, Linkedin, and Instagram, or being part of the discussion on Telegram and Discord.


Virtual Universe (VU) is an epic, story-driven open world game in LivingVR™ powered by AI, VR, and blockchain. The VU Token powers the economy as a currency.