The present and the future of Brain-Computer-Interfaces

Linda Weber
Mindable Health
Jan 23, 2019

In March 2014 I came across an art piece that changed the direction of my life:

Eunoia

In this installation the artist Lisa Park stood at the center of five flat metal dishes, each containing a thin layer of water and mounted on a speaker. She wore a headset that measured her brain activity to identify her mental state: whether she was relaxed, focused or even tense. The more tense she became, the higher the frequency at which her brain waves oscillated. The headset was connected to the speakers via Bluetooth, and depending on her mental state the speakers produced higher- or lower-frequency sounds, which in turn created corresponding ripples in the water above.

Eunoia — art installation by Lisa Park

While this is a beautiful piece of art, there was something very specific about it that struck me: the device she was using to record her brain waves.

Scientific EEG

Until then, I had only known EEG from a scientific setting during my psychology studies. Using a scientific EEG usually meant applying over 50 electrodes, with some kind of gel, to a research participant’s skull. This process could easily take an hour and involved tons of cables. It is probably the least sexy head decoration one could find on the market.

So when I saw this headset, I was hooked. A consumer-friendly device that measured one’s electrical brain activity, a headset you could just put on and be ready to go. Clearly it wouldn’t pass scientific standards, but it opened up a whole different range of possibilities.

That day, I decided that this was the technology I wanted to work with in the future.

Consumer BCIs

Simply put, a brain-computer interface is a way to connect the brain to an external device in order to send and/or receive information directly from it.

There are a couple of consumer BCIs currently leading the market: Emotiv, Muse and NeuroSky. They range from 3 to 14 electrodes. Compared to a scientific EEG the resolution is, of course, very poor, but that’s not the point here. They are built to be cheap and are mainly used for training meditation or concentration.

Though they are great advancements, they all have two problems:

  1. They are mostly uncomfortable (the Muse headset is OK).
  2. They look weird.

If we want users to wear them outside of their own four walls, these headsets need to become more inconspicuous. Luckily, new devices are on the rise that might get us there.

The startup Mindset integrates EEG electrodes into normal headphones. The goal is to measure one’s concentration level over the course of the day.

Mindset

The eyewear company Smith is working on sunglasses with integrated EEG sensors.

Lowdown Focus

Application areas of Brain-Computer-Interfaces

Let’s look at what amazing things have already been accomplished with brain-computer interfaces up until today. Each of these applications interprets and processes the brain signal in its own distinct way.

Moving Objects

One of the first BCI implementations was to help paralysed people move a prosthesis with their thoughts. The motor cortex of a paralysed person usually works just fine; it is the spinal cord, which served as the middleman between the cortex and the body, that has stopped doing its job. So BCI researchers thought: “What if we use the motor cortex as a remote control?”

Neuroprosthetics

To identify what each brain signal means, they had the paralysed person think of various movements, like moving up, down, right and left, and recorded the resulting patterns. Whenever the person later thought of moving the arm to the right, the BCI would pick up on the signal and move the prosthetic.

Motor Cortex as remote control
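The calibrate-then-decode loop described above can be sketched as a toy pattern matcher. Everything here is synthetic and invented for illustration: the feature vectors, the class offsets and the command names merely stand in for the band-power features a real system would extract from motor-cortex recordings.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical calibration data: one feature vector per trial, labelled with
# the movement the person imagined while it was recorded.
n_trials, n_features = 200, 8
labels = rng.integers(0, 4, n_trials)             # 0=up, 1=down, 2=left, 3=right
X = rng.normal(size=(n_trials, n_features))
X[np.arange(n_trials), labels] += 3.0             # make the classes separable (toy)

# "Training": store the mean pattern recorded for each imagined movement.
centroids = np.stack([X[labels == k].mean(axis=0) for k in range(4)])

COMMANDS = {0: "move_up", 1: "move_down", 2: "move_left", 3: "move_right"}

def decode(features):
    """Match a new trial against the stored patterns (nearest centroid)."""
    k = np.argmin(np.linalg.norm(centroids - features, axis=1))
    return COMMANDS[int(k)]

new_trial = rng.normal(size=n_features)
new_trial[3] += 3.0                               # resembles the "right" pattern
print(decode(new_trial))
```

Real decoders use far richer features and classifiers, but the principle is the same: record labelled examples of imagined movements, then map each new brain signal to the closest known pattern and emit the corresponding command.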

This is also how a paralysed man was able to take the kick-off at the 2014 FIFA World Cup …

World Cup 2014

… and how IBM impressed the visitors at SXSW in 2016, where visitors were able to steer the Star Wars droid BB-8 with their thoughts alone. In this case, they used the BCI from Emotiv, one of the leading companies in the consumer BCI market.

BB-8

This has even worked between people.

Brain to brain communication

Two people, in separate buildings, worked together to play a video game. One could see the game; the other had the controller. The player who could see the game would, without moving his hand, think about pressing the “shoot” button on a controller. Because their brain devices were communicating with each other, the player holding the controller would then feel a twitch in his finger and press the shoot button.

Typing your thoughts

In 2017 a paralysed woman was able to type 8 words per minute using just her mind.

Paralysed woman typed with her thoughts

How did she do it? There is a specific brain signal, called P300 (or P3): a positive peak roughly 300 ms after a stimulus has been presented, which indicates that the brain has recognised something relevant. In a typical P300 speller, the letters of an on-screen grid flash in turn; when the letter the user is concentrating on flashes, this peak shows up in the EEG, revealing the intended letter.

Typing with your thoughts
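The detection trick behind a P300 speller is averaging: the peak is tiny in a single trial, but averaging many EEG epochs time-locked to each flash makes it stand out. Here is a minimal sketch with entirely simulated epochs; the sampling rate, the bump shape and the attended grid position are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)
fs = 100                                  # Hz; toy sampling rate
epoch = np.arange(0, 0.6, 1 / fs)         # 600 ms of signal after each flash

def simulate_epoch(attended):
    """One noisy EEG epoch; add a P300-like bump if the flash was attended."""
    sig = rng.normal(0.0, 1.0, epoch.size)
    if attended:
        sig += 3.0 * np.exp(-((epoch - 0.3) ** 2) / (2 * 0.05 ** 2))  # peak ~300 ms
    return sig

# A 6x6 speller grid: rows and columns flash repeatedly. Suppose the user
# attends the letter at row 2, column 4; average the epochs per row/column.
n_flashes = 20
row_avg = [np.mean([simulate_epoch(r == 2) for _ in range(n_flashes)], axis=0)
           for r in range(6)]
col_avg = [np.mean([simulate_epoch(c == 4) for _ in range(n_flashes)], axis=0)
           for c in range(6)]

# Score each row/column by its mean amplitude in the 250-350 ms window;
# the attended row and column intersect at the intended letter.
win = (epoch >= 0.25) & (epoch <= 0.35)
best_row = int(np.argmax([a[win].mean() for a in row_avg]))
best_col = int(np.argmax([a[win].mean() for a in col_avg]))
print(best_row, best_col)                 # recovers row 2, column 4
```

The need to average over many flashes per letter is exactly why this technique tops out at a handful of words per minute.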

Mental Health

Another way to use BCIs is by detecting mental states, such as relaxation, meditation, attention or even emotional states like fear. These kinds of applications aim at training the brain using one of the most fundamental principles of learning: feedback. It’s fairly simple, really. By receiving feedback about your mental state, you are able to steer it in one direction or another.

Muse
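That closed feedback loop can be sketched in a few lines. The relaxation scores and the volume mapping below are hypothetical; a real headset’s SDK defines its own metric, and the feedback could just as well be a light or a game element.

```python
# Toy neurofeedback loop: map a "relaxation" score reported by a headset
# (assumed here to range from 0.0 = tense to 1.0 = relaxed) to a feedback
# signal the user can perceive.

def feedback_volume(relaxation, floor=0.1):
    """More relaxation -> quieter feedback tone (never fully silent)."""
    relaxation = min(max(relaxation, 0.0), 1.0)   # clamp to [0, 1]
    return floor + (1.0 - floor) * (1.0 - relaxation)

# The loop closes through the user: they hear the tone get louder when
# tense, and gradually learn to steer their own state to quiet it down.
for score in [0.2, 0.5, 0.9]:                     # simulated readings
    print(f"relaxation={score:.1f} -> volume={feedback_volume(score):.2f}")
```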

How does it work?

At the root of all our thoughts, emotions and behaviour is the communication between neurons within our brains. Brainwaves are produced by synchronised electrical pulses from masses of neurons communicating with each other.

Brain waves

Your brain can oscillate at different frequencies. You can think of brainwaves like music — the low frequency waves are like a deeply penetrating drum beat, while the higher frequency brainwaves are more like a subtle high pitched flute.

Delta brainwaves are slow, loud brainwaves, like a drum beat. They are generated in deepest meditation and dreamless sleep.

Theta brainwaves occur most often in sleep but are also dominant in deep meditation. Theta is our gateway to learning, memory, and intuition. In theta we are in a dream state: vivid imagery, intuition and information beyond our normal conscious awareness. It’s where we hold our ‘stuff’, our fears, troubled history, and nightmares.

Alpha brainwaves are dominant during quietly flowing thoughts, and in some meditative states. Alpha is ‘the power of now’, being here, in the present, and can be recorded in the brain’s resting state. Alpha waves aid overall mental coordination, calmness, alertness, mind/body integration and learning.

Beta brainwaves dominate our normal waking state of consciousness when attention is directed towards cognitive tasks and the outside world. Beta is a ‘fast’ activity, present when we are alert, attentive, engaged in problem solving, judgment, decision making, or focused mental activity.
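These bands can be estimated from a raw EEG trace with simple spectral analysis. Below is a minimal sketch using a plain FFT periodogram on a synthetic signal; the band boundaries vary slightly between sources, and real pipelines add artifact rejection and better spectral estimators such as Welch’s method.

```python
import numpy as np

# Conventional EEG frequency bands in Hz (exact boundaries vary by source).
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(signal, fs):
    """Estimate power per EEG band from a simple FFT periodogram."""
    freqs = np.fft.rfftfreq(signal.size, d=1 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / signal.size
    return {name: psd[(freqs >= lo) & (freqs < hi)].sum()
            for name, (lo, hi) in BANDS.items()}

# Synthetic example: a strong 10 Hz "alpha" rhythm buried in noise.
fs = 256                                  # samples per second
t = np.arange(0, 10, 1 / fs)              # 10 seconds of signal
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

powers = band_powers(eeg, fs)
print(max(powers, key=powers.get))        # the alpha band dominates
```

A meditation trainer like Muse essentially watches ratios between such band powers change in real time and feeds them back to the user.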

Our brainwave profile and our daily experience of the world are inseparable. When our brainwaves are out of balance, there will be corresponding problems in our emotional or neurophysiological health. Research has identified brainwave patterns associated with all sorts of emotional and neurological conditions.

Several examples show how brain-computer-interfaces may be used to treat those conditions.

Therapeutic Neurogames

Isabela Granic used the concept of neurofeedback to develop a neurogame for kids suffering from anxiety. In this game, children need to navigate little Arthur through a scary house full of shadows. Whenever the child becomes anxious, the light on Arthur’s head dims and harmless furniture appears as monsters. When the player relaxes, he or she can see the true gestalt of the rooms. The intent of these games is to use the brain’s ability to rewire itself through experience and practice, a concept called neuroplasticity.

Mindlight- a game for anxious children

In my Master’s thesis I used a similar approach and designed a therapeutic neurogame for people suffering from obsessive-compulsive disorder. Read more here.

OCD — get out of me

So you see, these devices find applications in many areas: games, health care, sports, art and mental fitness or wellness.

And while some are working on improving the technology of today, others are working on the far future.

The future of BCIs

Which brings us to the final part of this article, in which I want to share how Facebook and Neuralink are planning to revolutionize the way we communicate today.

Two new players with a grand vision

Within a few weeks of each other in 2017, two big players announced their entry into the BCI market.

March 27, 2017. Elon Musk publicly announces Neuralink, a neurotechnology company set up to develop an implantable brain-computer interface. The company, Musk explained, will initially seek to treat people with disabilities, but eventually its goal will be to change the way we communicate. Instead of having to describe difficult-to-explain ideas, you could send them in a “raw” format: the abstract concept in our minds before we form it into words. Multi-dimensional concepts that would otherwise take thousands of words to explain could all of a sudden become intuitive to others in seconds.

Apr 19, 2017. Facebook revealed it has a team of 60 engineers working on building a brain-computer interface that will let you type around 100 words per minute with just your mind. The team plans to scan your brain to detect you silently speaking in your head and translate it into text.

Sounds crazy, right? One thing is sure: the P3 measurement technique described above won’t fly. They need to come up with something different.

Facebook’s announcement

When we look at communication speed, we quickly see that we think about 4x faster than we speak and 15x faster than we write. Especially between typing and thinking we are losing a lot of time.

Now if you were paralysed, but fortunate enough to live today and use a BCI to type, it would take you about an hour to write 500 words: at 8 words per minute, 500 words take over 60 minutes. This wouldn’t even fit on the timeline. So while current BCIs make you slower, Facebook is trying to cut the time for those 500 words from 12.5 minutes (ordinary typing at roughly 40 words per minute) down to 5 minutes (the targeted 100 words per minute).

Elon Musk, on the other hand, wants to cut language out of the equation altogether. He plans to build a whole-brain interface that allows people to communicate at the speed of thought.

He argues that language is itself a low-resolution medium. There are a bunch of concepts in your head that your brain has to compress into the incredibly low data rate of speech or typing. That’s what language is: your brain executing a compression algorithm on thought, on concept transfer. The listener then has to decompress what’s coming at them, which is lossy as well. And while decompressing, trying to understand, they are simultaneously modelling the other person’s mind state, reconstructing in their own head the concepts the speaker is trying to communicate.

If you had two brain interfaces, you could do uncompressed, direct conceptual communication with another person, which preserves all the meaning with none of the fuss.

From low resolution to HD

Currently we can measure around 500 neurons at once. The brain, however, has around 100 billion neurons. This is like a super-low-resolution image: when it comes to your thoughts, a few hundred electrodes won’t be enough to communicate more than the simplest message. We need higher bandwidth, way higher bandwidth, so that this picture becomes an HD image.

Articles I recommend:
https://medium.com/svilenk/bciguide-246a9ca76fcd
