Using BCIs for emotions

Neha Adapala
11 min read · Oct 15, 2023


I’m so glad we came here for vacation!

Honestly, I wish we went to the Bahamas…

Thanks so much for organising this for us! We’re gonna have so much fun!

I could’ve done it better AND with a smaller budget. Never letting her plan anything again.

Do you know that feeling? When you have to act happy for someone even though you really aren’t? And then they do the same thing again, because they think you were happy with it. At times like this, it would probably be best to be able to read how the other person actually feels. Especially if you were the one who planned the holiday. But that’s impossible… or is it? Maybe, just maybe, you could record those emotions…

Emotion recognition goes beyond just knowing how happy or sad someone is. It’s a way for people to understand others in situations where there are barriers to that communication. Examples include people who find it hard to express their emotions, or those who choose not to out of fear.

According to Autistica, around half of autistic people have difficulty understanding and describing their own emotions. A tool like this could drastically change the way people on the autism spectrum and those around them understand each other. Additionally, not many people with mental illnesses are bold enough to say that they are struggling, and in serious situations this can result in conditions going undiagnosed, worsening over time and becoming harder to treat. The World Health Organization (WHO) estimates that close to a whopping two-thirds of mental illnesses go untreated. The biggest barrier here is likely the way humans try to hide their real emotions, making it unclear whether they have a mental illness at all. Using BCIs, we can reduce this barrier.

What are BCIs?

Brain-computer interfaces are computer-based communication systems that analyse signals from the brain. No- it is NOT a mind-reading machine that is going to take over your internal thoughts instantly, so take a breath.

You must be thinking with intention for the program to pick up on the most relevant signals. The basic goal of these BCIs is to identify and assess features of brain signals that suggest what the user is thinking about. After this, the key features, like the signals that suggest someone is feeling happy or sad, are transmitted to another device, which outputs whatever emotion the person is feeling (in this specific use case). Crazy, right?

However, to get from the thinking to the executing stage without moving an inch, there are 4 components in between.

LEVEL 1:

Signal acquisition

  • Receiving and recording signals from neural activity
  • Sending the data to the pre-processing component

LEVEL 2:

Pre-processing

  • Signal enhancement
  • Noise reduction

LEVEL 3:

Translation

  • The extracted signal features are translated into commands for an output device

LEVEL 4:

Feedback / device output

  • The desired action is executed

You can think of it like making a Subway sandwich- level 1 is putting all of the ingredients into the sandwich, level 2 is removing the vegetables that the client doesn’t like, level 3 is toasting the sandwich so the cheese is all melty (the only way that I would accept the sandwich) and level 4 is giving the client their Subway.
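If the sandwich analogy isn’t doing it for you, here’s a minimal sketch of those four levels in Python. Everything in it is a hypothetical placeholder (random numbers standing in for brain signals, a toy threshold standing in for a real classifier), not a real BCI library:

```python
import numpy as np

def acquire_signal(n_channels=8, n_samples=256):
    """Level 1: receive and record signals from neural activity.
    (Random data stands in for a real EEG amplifier here.)"""
    return np.random.randn(n_channels, n_samples)

def preprocess(raw):
    """Level 2: signal enhancement and noise reduction.
    (Here: just remove each channel's mean.)"""
    return raw - raw.mean(axis=1, keepdims=True)

def translate(clean):
    """Level 3: translate signal features into a label.
    (A made-up threshold, not a trained classifier.)"""
    return "happy" if clean.var() > 1.0 else "sad"

def output_feedback(label):
    """Level 4: execute the desired action, i.e. display the detected emotion."""
    print(f"Detected emotion: {label}")

output_feedback(translate(preprocess(acquire_signal())))
```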

We are going to be discussing non-invasive EEGs today. Why? Well, they are safe, inexpensive, easy to use, portable, and often maintain high temporal resolution (knowing when a signal happens). Although they do lack spatial resolution, making it hard to know where in the brain the signals came from, the advantages outweigh this.

What is emotion?

Most of us have emotions… It’s what makes a lot of us human. But what is emotion? As per a research paper published in 2022 by Essam H.H., Asmaa H. and Abdelmgeid A.A., “emotion is a complicated condition that expresses human awareness and is described as a reaction to environmental stimuli. Emotions are, in general, reactions to ideas, memories, or events that occur in our environment”.

A big takeaway from this is that emotions are essentially a response to some sort of stimuli.

That’s why stimuli are a big part of emotion elicitation experiments!

How to induce emotions

  1. You could create simulated scenarios to evoke emotions

Remember when you fell off your bike onto the rocky, muddy ground? You probably can’t remember what trousers you were wearing that day or what colour your helmet was. But I am 99% sure that you can remember how you felt. If we made you fall off your bike again, we could probably evoke those same emotions! (Although there’s a catch: you might not generate exactly the same emotion every time.)

2. You could evoke emotions by displaying photographs, videos, music and other similar materials that stimulate emotions

Like that video of you and your family going on that road trip… or the selfie you took with your best friends at a restaurant on a day out. All of these resources can evoke emotions in you, AND you could label them.

What do you feel when you see a picture of a puppy? Maybe you melt inside because of how cute it is. Or maybe you feel fearful because of that time a German Shepherd chased you for 10 minutes (or at least what felt like it) around a park while the owner did nothing.

3. Computer games

Yeah! Maybe not League of Legends, but computer games can definitely elicit emotions as you immerse yourself in the situation, rather than just passively watching or listening.

Some Models of Emotions

DISCRETE: Discrete models break emotion down into a small set of distinct, major emotions. One well-known discrete model lists nine basic emotions:

  • Interest- excitement
  • Surprise-startle
  • Enjoyment-joy
  • Distress-anguish
  • Dissmell
  • Fear-terror
  • Anger-rage
  • Contempt-disgust
  • Shame-humiliation

Which one are you feeling right now?

DIMENSIONAL: Dimensional models categorise emotions along continuous dimensions, such as valence (how pleasant a feeling is) and arousal (how intense it is).

RUSSELL’S CIRCUMPLEX 2D MODEL:

This model is the one used most often, with every emotion lying somewhere on a graph of arousal against valence. Personally, I find it easier to place more complex emotions, like guilt, on this model than to squeeze them into one of the larger basic emotions.

Can you place what you’re currently feeling on this model? Is it too basic for human emotion?
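As a toy illustration (the quadrant labels below are made up for the demo, not clinically validated), here’s how you might map a (valence, arousal) point to a rough region of Russell’s model in Python:

```python
def circumplex_quadrant(valence, arousal):
    """Map a point on Russell's valence-arousal plane to a rough quadrant.
    Both inputs run from -1 (low/negative) to +1 (high/positive)."""
    if valence >= 0:
        return "excited / happy" if arousal >= 0 else "calm / content"
    return "angry / stressed" if arousal >= 0 else "sad / bored"

print(circumplex_quadrant(0.7, -0.4))  # calm / content: you, post-Bahamas
```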

THE EKMAN MODEL:

Ekman states that basic emotions must:

(1) be instinctive and not forced

(2) be felt by many people when put in the same situation

(3) be expressed by different people in somewhat similar ways

(4) produce consistent physiological patterns across different people.

Ekman also identified 6 primary emotions as universally recognisable from facial expressions:

  • Sadness
  • Surprise
  • Happiness
  • Disgust
  • Fear
  • Anger

Other emotions can be generated from these broader emotions. Why don’t you test it out now? Think of 5 random emotions: could you fit each one into one of the emotions above?

WAVES

No, not a Mexican wave…

EEG signals are classified into five categories based on the variation in frequency bands:

WHERE IN THE BRAIN???

Delta waves, detected in the frontal cortex, have an amplitude of 20–200 µV. These are often detected when you’re unconscious, like when you’re sleeping!

Theta waves are found in the parietal and temporal lobes, with an amplitude of 100–150 µV. Theta waves rise when you feel positive emotions and are often associated with relaxation. Who isn’t happy when they’re relaxed?

You at the beach in the Bahamas with theta waves firing off!

Alpha waves are often found in the occipital and parietal lobes, with an amplitude of 20–100 µV. They can be detected in a resting state with eyes closed. They show really high oscillatory energy in both positive AND negative emotions.

Beta waves are mostly observed in the frontal lobe, but if you’re contemplating, they pop up in loads of locations, with an amplitude of 5–20 µV. They appear when a person’s mind is very active and focused.

Gamma waves often have an amplitude lower than 2 µV. They are associated with high-level cognitive tasks and functions, like information reception, processing, integration, transmission and feedback in the brainstem, as well as activities that require concentration.
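If you’d like to see those bands in code, here’s a sketch using SciPy: it estimates the power in each band from one channel of synthetic EEG via Welch’s method. The sampling rate and the 10 Hz “alpha rhythm” are assumptions for the demo; real data would come from your amplifier:

```python
import numpy as np
from scipy.signal import welch

fs = 256  # sampling rate in Hz (a typical consumer-headset value, assumed)
t = np.arange(0, 10, 1 / fs)
# Synthetic "EEG": a 10 Hz alpha rhythm buried in noise, in µV.
signal = 30 * np.sin(2 * np.pi * 10 * t) + 10 * np.random.randn(t.size)

bands = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)  # power spectral density
df = freqs[1] - freqs[0]
for name, (lo, hi) in bands.items():
    mask = (freqs >= lo) & (freqs < hi)
    power = psd[mask].sum() * df  # integrate the PSD over the band
    print(f"{name:>5}: {power:8.1f} µV²")
```

Run it and the alpha row should dominate, which is exactly what you’d hope for from a resting, eyes-closed recording.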

So… Why EEGs?

Since the user has no control over what their brain waves are doing when they feel certain emotions, using EEGs seems to be the best way to know what someone’s true emotions are. This can massively speed up the diagnosis of certain conditions and allow people to get help as soon as possible. It is far more reliable than voice tone, facial expression or words, which are what we usually rely on.

To acquire EEG signals non-invasively, you need:

  • a set of electrodes
  • a data storage unit
  • an amplifier
  • a display unit

You would then preprocess the EEG signals by cleaning and enhancing the recorded waves, because EEG signals are pretty weak and easily contaminated by noise. After this, you must extract the critical features that will be sent to the classifier using the BCI.
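As a sketch of that cleaning step (the 50 Hz notch and the 0.5–45 Hz band-pass below are common but illustrative choices, not the one true recipe), using SciPy:

```python
import numpy as np
from scipy.signal import butter, filtfilt, iirnotch

fs = 256  # sampling rate in Hz (assumed)

def preprocess_eeg(raw):
    """Clean one channel of raw EEG: remove mains hum, then band-pass it."""
    # Notch out 50 Hz mains interference (use 60 Hz in the US).
    b_notch, a_notch = iirnotch(w0=50, Q=30, fs=fs)
    x = filtfilt(b_notch, a_notch, raw)
    # Band-pass 0.5-45 Hz: keep delta through gamma, drop slow drift
    # and high-frequency muscle/electrical noise.
    b_bp, a_bp = butter(N=4, Wn=[0.5, 45], btype="bandpass", fs=fs)
    return filtfilt(b_bp, a_bp, x)

cleaned = preprocess_eeg(np.random.randn(fs * 10))  # 10 s of stand-in EEG
```

The band powers from the earlier snippet are one simple example of the “critical features” you could then extract and hand to a classifier.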

Machine learning with EEGs

In systems that recognise emotions, machine learning algorithms are often used to classify the different emotional states that can be identified from an EEG-based BCI.

There are two categories of machine learning models: supervised and unsupervised learning.

WARNING: Don’t read this if you’re hungry. Mentions of food approaching…

Supervised:

Supervised learning means that our training data is made of images and their corresponding class labels. For example, you have pictures of fried chicken, pizzas and salads, each labelled with its name. You can then train an image classifier that takes any image of food as an input. Hopefully, it will produce a label that is as close to the actual class label as possible. So if I put in a picture of a pizza, it should tell me that this is pizza!

And as you train the classifier, it tries to improve its accuracy. So maybe the first time I put the picture in, it will call it a salad, but after trying out loads of different pictures, it will call it a pizza!

Now if we imagine this in the context of EEG signals, we could input the signals and the machine learning program would tell us what emotion the waves represent! Crazy, right?

Supervised learning means that we have a target to aim for during the training process. Once the model has learned to be very accurate, we can let the program assign actual labels to new data it hasn’t even seen before!
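Here’s what that might look like as a sketch with scikit-learn. The features are random placeholders standing in for per-trial band powers, and the “happy”/“sad” labels are made up; with real data, both would come from an actual emotion elicitation experiment:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.random((200, 5))     # 200 trials x 5 band powers (placeholder)
y = rng.integers(0, 2, 200)  # 0 = "sad", 1 = "happy" (made-up labels)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

clf = SVC(kernel="rbf").fit(X_train, y_train)  # SVMs are a common choice for EEG features
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

On random placeholders the accuracy will hover around chance (about 0.5); the point here is the shape of the workflow, not the number.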

Unsupervised:

This time round, our training set is made of unlabelled images. So how does it work? The algorithm finds whatever natural groupings exist in the data. For example, it can be trained to separate images into clusters based on similarities in colour or geometric traits, like whether something is a circle or a square. BUT how do you know if these clusters are right or not? Well… you don’t, so there’s no direct measure of accuracy.

To know if you’re on the right track, you’d have to create a new metric for how well the algorithm separates the data into groups. An example could be the distance within and between groups of data. At the end of training, you end up with an image clustering algorithm, and the way images are clustered together shows us what patterns naturally emerge from a given dataset.
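That “distance within and between groups” idea exists off the shelf as the silhouette score. Here’s a sketch with scikit-learn, again on placeholder features; the choice of 3 clusters is arbitrary:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
X = rng.random((200, 5))  # placeholder EEG features, as before

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
# Silhouette near 1: tight, well-separated clusters. Near 0: clusters overlap.
print("silhouette:", silhouette_score(X, km.labels_))
```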

Using unsupervised learning would mean less time spent on pre-processing and feature extraction, making the program way quicker. It also means you don’t need labelled data to train your model, which is great because all of your data can be used for testing. It’s also less computationally complex than supervised learning, further increasing the speed. However, you would need noise-free samples of EEG signals, and you need way more samples to validate the accuracy of the results.

Applications

The applications of using BCIs for emotion recognition are boundless.

For example, more than 1 in 10 women experience postnatal depression, but they’re often forced to pretend they’re fine because they “should be happy for the new baby”. By keeping women in postnatal care and using BCIs to efficiently diagnose postnatal depression, we can make sure more mums get the help they need and can actually be happy after birth.

Also, some autistic people may have trouble detecting emotions when they are younger, so being able to communicate emotions easily using BCIs could be hugely valuable as they develop that skill.

Moreover, you know when you just have that gut feeling that something bad is going to happen, but it’s not enough to act on? Well, with BCIs we could quickly detect those feelings and take the necessary steps. In a situation where you feel you are in danger, you probably don’t want to make it obvious by taking out your phone and calling someone. A BCI could provide a much more discreet way of raising the alarm.

Next Steps

To allow BCIs to make a large enough impact, we need them to be accessible. Right now, they range from hundreds to thousands of pounds, which is out of budget for many, especially during a cost of living crisis. Hopefully, as research develops, the prices for these technologies will decrease and more people will be able to buy them! Another way to make these technologies more accessible would be centres reachable by public transport, with lots of hardware available, so people can go and use the technologies there for a few hours. Maybe there could also be some sort of rental system, where people could rent these technologies for a week or so at a much lower price.

Also, BCIs are quite clunky right now, which perhaps scares some people off from using them. And if we want to use them in daily life, we can’t be walking around with a cap of dry electrodes fastened around our heads. Some people are already producing BCIs in the form of glasses, for example. Innovations like these can really help BCIs make the biggest impact possible.

So, the next time you’re planning a holiday or maybe a trip out, think about what your friends are actually thinking about… and maybe use a bit of technology to find out!

Thanks for reading. If you are interested or have any questions, please email me at neha.adapala@gmail.com

Feel free to follow me on LinkedIn: https://www.linkedin.com/in/neha-adapala-7b2a56231/

And X: https://twitter.com/neha_adapala
