BCI: A Complete Overview

What makes something a living thing?

Scientists and philosophers have been trying to answer this question since the beginning of time!

Well, you may have a different answer, but to me the only difference between a living and a non-living being is the ability to do something by thinking (including the act of thinking itself). If I decide to think about you, nothing can stop me, not even you. And if, in the next moment, I decide to move my legs and reach your home? YES, I CAN.

But, if a rock decides to think of you, there is definitely something stopping it!!

A rock simply just can’t do it!

But how can we do something by thinking?

In the early 1800s, Franz Joseph Gall, a phrenologist, indicated that the brain was the source of the self, and not the soul, the heart or anything else, by running his fingers over the scalps of strangers!

Phrenologists believed that a person's personality was linked to the morphology of his or her skull. Phrenology was later dismissed as a cult pseudoscience, but an important message originated during this time: our brain is the source of all functions and behaviour, and studying it can answer my previous question. How can we do something just by thinking?

So the next question you should be curious about is: what exactly happens inside our brain when we think?

SYNAPTIC CONNECTIONS aka Brain fireworks

Behaviour (learning, memory, emotions, cognitive functions) exists because we have a brain.

These behaviours arise from the specific electrochemical connections between neurons (~100 billion in the brain; ~20 billion of these in the frontal cortex).

The smallest definable unit in these connections is the synapse.

In a human, there are more than 125 trillion synapses in the cerebral cortex alone. That's roughly equal to the number of stars in 1,500 Milky Way galaxies.

A single neuron, much like a single human, is incapable of doing anything in isolation. We need a whole social system of plumbers, craftsmen, engineers, doctors and philosophers working together in the architecture of society.

In other words, we need to connect to conduct (any activity). Quite obviously, neurons are no exception!

At this point, If you feel like going down the rabbit hole of synapses- I gotchu

It’s in the postnatal period that synapses start to develop. Alterations in synapses or loss of synapses define and shape behaviour later in life.

This also means that studying your synapses can tell me everything I need to know about your behaviour and functions?!?!

Is there any way of doing that?

Neural activity produced from synaptic connections in our brain causes electric and magnetic fields and blood flow changes in our brain.

Changes in this brain activity can be measured and can provide information about which brain regions are involved. Since different brain regions have different functions, it also provides us with information about which cognitive and motor functions are implicated.

There are two types of neural activation:

Endogenous activation (voluntary): making a mental calculation, following a line of reasoning, imagining what to do next and deciding about a movement are examples of voluntarily evoked brain activity.

Exogenous activation (involuntary): we also experience multisensorial stimuli, sometimes beyond our control, and have little or no direct control over the resulting brain activity. Examples are seeing, hearing, feeling or memorizing, and mental states like fatigue, frustration and excitement. (Some regulation of this activity is possible by deciding what will have our attention; this is where meditation comes in.)

Research on capturing changes in brain activity and translating them into control commands for an application or device has come to be known as brain-computer interfacing. (The control commands can also drive a communication device.)

Recording of electrical activity in the human brain using electrodes attached to the human scalp started with experiments by Hans Berger in the early 1920s.

He recorded brain wave patterns, that is, rhythmic repetitive electrical brain activity with a certain amplitude. This repetition is measured in Hertz (Hz), where 1 Hz is one cycle per second.
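Berger's "cycles per second" idea can be sketched in a few lines of code: generate a toy rhythm and read its repetition rate back off the spectrum. The 10 Hz "alpha" frequency and the 256 Hz sampling rate below are illustrative values, not from any real recording.

```python
import numpy as np

# A toy "alpha wave": a 10 Hz sine sampled at 256 Hz for 2 seconds.
fs = 256                      # sampling rate in samples per second
t = np.arange(0, 2, 1 / fs)   # 2 seconds of timestamps
wave = np.sin(2 * np.pi * 10 * t)

# Estimate the dominant frequency: how often the rhythm repeats,
# in cycles per second (Hz).
spectrum = np.abs(np.fft.rfft(wave))
freqs = np.fft.rfftfreq(len(wave), 1 / fs)
dominant_hz = freqs[np.argmax(spectrum)]
print(dominant_hz)  # → 10.0
```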

Measuring neural activity changes

There are two ways of doing this:

A. Noninvasive (no surgery required)

  1. EEG
  2. fMRI
  3. fNIRS
  4. rTMS
  5. tES

B. Invasive (surgery required)

  1. ECoG
  2. DBS
  3. Implanted microelectrodes
  4. Optogenetics
Bonus — TCRE

Despite decades of research, the accuracy and reliability of BCI systems remain limited for several reasons. One of them: current EEG scalp electrodes (the most widespread BCI measuring technique) lack the necessary spatial and temporal precision.

Activity from a relatively large area of the cortex of the brain affects the signal that is recorded by one electrode, so differences in brain activity in neighboring regions of the brain cannot be detected.

To address this limitation, Dr. Walter Besio at the University of Rhode Island is developing and researching a new EEG sensor, the Tripolar Concentric Ring Electrode (TCRE), with a signal-to-noise ratio four times higher than that of conventional EEG electrodes.

Dr. Besio has demonstrated the ability to find high-frequency oscillations preceding seizures with TCRE electrodes; oscillations that are completely absent from recordings taken with conventional EEG electrodes. His research with TCRE electrodes has also discovered that movements of different fingers can result in different EEG signals, as opposed to recordings from conventional EEG electrodes that show no difference.

How does TCRE do it?

By lowering the problem of artifacts! But what is the problem of artifacts?

Whether EEG or fNIRS is being used, there is the problem of artifacts. We breathe, move and perceive; muscles are used for eye gaze and changes in facial expression; we are in a particular mood or experience a particular emotion. If we want to measure brain activity related to a specific task that we aim to perform, or that the environment nudges us to perform, then we have to get rid of such non-task-related activity, which disrupts the accuracy of our measurement. This is the problem of artifacts.

Detecting brain signals, analyzing them and extracting the relevant information is one of the main challenges of BCI research, and it requires advanced methods of signal analysis, machine learning and pattern recognition.
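One of the simplest signal-analysis steps is suppressing out-of-band activity, a stand-in for artifact removal. Here is a minimal sketch using a crude FFT band-pass filter; the 10 Hz "task rhythm", the 50 Hz "line-noise artifact" and all amplitudes are invented for illustration.

```python
import numpy as np

fs = 256
t = np.arange(0, 2, 1 / fs)
alpha = np.sin(2 * np.pi * 10 * t)              # task-related rhythm
line_noise = 0.8 * np.sin(2 * np.pi * 50 * t)   # non-task artifact
raw = alpha + line_noise

def bandpass(signal, fs, lo, hi):
    """Zero out every frequency component outside [lo, hi] Hz."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    spectrum[(freqs < lo) | (freqs > hi)] = 0
    return np.fft.irfft(spectrum, len(signal))

clean = bandpass(raw, fs, 8, 12)   # keep only the 8-12 Hz band
# The 50 Hz artifact is removed; the 10 Hz rhythm survives.
print(round(np.corrcoef(clean, alpha)[0, 1], 3))  # → 1.0
```

Real pipelines use better-behaved filters (and ICA for eye-blink or muscle artifacts), but the principle is the same: keep the band that carries the task-related signal.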

The TCRE is highly focused on local activity due to its concentric configuration, which sharply attenuates distant signals and artifacts such as muscle activity. It increases the signal-to-noise ratio (SNR) through high common-mode noise rejection, providing automatic artifact attenuation.
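The concentric trick can be sketched numerically. A tripolar electrode combines the disc and its two rings into a surface-Laplacian estimate; the weighting below follows the nine-point formula 16·(middle − disc) − (outer − disc) often associated with Besio's work, and should be treated as an illustrative assumption rather than the exact device equation.

```python
# Hedged sketch of a tripolar Laplacian estimate. A far-away
# (common-mode) source shifts all three contacts equally and
# cancels out; a local source does not.
def tcre_laplacian(disc, middle, outer):
    """Assumed nine-point weighting: 16*(m - d) - (o - d)."""
    return 16 * (middle - disc) - (outer - disc)

common_mode = tcre_laplacian(5.0, 5.0, 5.0)  # distant source
local = tcre_laplacian(1.0, 2.0, 4.0)        # local gradient
print(common_mode, local)  # → 0.0 13.0
```

This is why distant muscle activity, which reaches all three contacts almost equally, is attenuated automatically, while a steep local gradient under the electrode survives.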

Control and Communication with BCI

This measured brain activity can be translated into control commands for an application. Control can be explicit, where the user wants to issue a particular command to a brain-controlled device.

For example, switching off your room's light by thought alone using a BCI application.

However, it is also possible that a user’s brain activity is monitored and this information is used to determine aspects of a user’s mental state, which is then translated into changes in a user’s environment that better suits this particular mental state or that aims at changing this mental state.

For example, Spotify recommending or playing songs according to your mood via a BCI application.

These are some of the fancy new applications of BCI technology, but it was originally conceived as a clinical assistive communication technology by Edmond Dewan, who discovered it accidentally in 1964.

During an experiment in which he was measuring his own brain waves, Dewan was suddenly impressed with the fact that he could control their activity. The interesting thing about these waves is that they can be controlled without muscle movement: all a person has to do to turn them on is relax as if going to sleep, and to turn them off, all one has to do is concentrate on a scene or object. For this reason, Dr. Dewan believed these waves could conceivably be used as a communication device for persons who have lost their ability to move.

Once such a person has learned how to manipulate his alpha wave rhythm, it becomes possible for him to communicate through Morse code or some other simple response system.
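Dewan's idea can be sketched as a toy decoder: if the user can switch the alpha rhythm "on" (relax) and "off" (concentrate), the lengths of the on-bursts carry dots and dashes. The power values, threshold and window scheme below are all invented for illustration.

```python
# Toy Morse decoder driven by per-window alpha power.
MORSE = {".-": "A", "-...": "B", "...": "S", "---": "O"}

def decode(alpha_power_per_window, threshold=0.5):
    """Map each run of high-alpha windows to a dot (short) or dash (long)."""
    symbols, run = [], 0
    for p in alpha_power_per_window + [0.0]:   # sentinel to flush the last run
        if p > threshold:
            run += 1
        elif run:
            symbols.append("." if run < 3 else "-")
            run = 0
    return MORSE.get("".join(symbols), "?")

# Three short alpha bursts -> "..." -> the letter S.
print(decode([0.9, 0.1, 0.8, 0.1, 0.9, 0.1]))  # → S
```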

Some main BCI frameworks

In active BCI, we assume that the subject is active, or in other words able to manipulate his or her brain activity to issue commands to a brain-controlled device. Can we manipulate our brain activity? Certainly.

We can make the decision to relax. When successful, relaxation can be observed in our brain activity. We can act as if we are angry (emotion imagery). Again, when we are able to do this in a convincing way, it will show in our brain activity.

Similarly, we can also imagine that we want to move our body or body parts in a particular direction, for example, move our left hand to the right. Again, what we imagine can be detected by electrodes that pick up signals, in this case from our motor cortex and these signals can be used to, for example, change directions of our wheelchair in a physical environment, or direct our avatar in a videogame to a particular position.

In reactive BCI, it is usually the application that generates stimuli that we are supposed to focus on, which can then give rise to changes in a subject’s brain activity. Hence, the subject is asked to pay attention to and choose among artificially evocative stimuli.

Paying attention can be seen as a voluntary act. We are engaged in an act that requires us to pay attention to perceptual stimuli. We are asked to pay attention to these stimuli and while doing so our brain emits information about what we perceive. Usually, stimuli are presented visually on a computer screen. However, stimuli can also be presented auditorily or by touch, and in principle, by taste and smell.

This reactive viewpoint, where the user is explicitly asked to pay attention, can be complemented with a viewpoint in which external stimuli are present and their effect is measured and fed back to the BCI application, but the user does not have the explicit task of controlling the application.

In passive BCI, the subject has no intention to control or communicate using BCI. Brain activity is measured and used to make changes to the environment or the task the subject is supposed to perform. The subject's brain activity is measured without him or her being asked to voluntarily evoke a particular kind of brain activity or pay attention to external stimuli that will have an effect on brain activity. The user is simply monitored while performing a task.

Although there are more ways by which a user's brain waves can be translated into intended commands for control and communication, in clinical BCI research the main paradigms (or markers) are motor imagery (active BCI) and event-related and evoked potentials (reactive BCI).

Motor imagery (MI) is about movements. Intending to move or beginning to imagine a movement and ending the imagining of a movement leads to changes in the alpha (8–12 Hz) and beta frequency bands that can be measured in the motor cortex; these changes have been called event-related desynchronization (ERD) and event-related synchronization (ERS). This motor imagery BCI, as well as other cognitive imagery tasks, can be used to steer a wheelchair, an avatar in a videogame or a cursor on a screen without making limb movements. MI is an active BCI that requires spatial and spectral information.
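The ERD marker can be sketched numerically: motor imagery shows up as a drop in alpha-band (8-12 Hz) power relative to rest. Both epochs below are synthetic, and the amplitudes are invented; a real system would compare epochs recorded over the motor cortex.

```python
import numpy as np

fs = 256
t = np.arange(0, 1, 1 / fs)
rng = np.random.default_rng(0)
# Synthetic epochs: strong 10 Hz rhythm at rest, attenuated during imagery.
rest = 2.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.3, t.size)
imagery = 0.5 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.3, t.size)

def band_power(signal, fs, lo, hi):
    """Total spectral power within [lo, hi] Hz."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    return spectrum[(freqs >= lo) & (freqs <= hi)].sum()

rest_p = band_power(rest, fs, 8, 12)
imagery_p = band_power(imagery, fs, 8, 12)
erd_percent = 100 * (imagery_p - rest_p) / rest_p
print(erd_percent < 0)  # negative change: desynchronization (ERD)
```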

Another paradigm is the steady-state evoked potential. For example, a steady-state visual evoked potential (SSVEP) experiment can be designed such that the user has to pay attention to a screen on which various patterns of repetitive flickering stimuli are displayed. By focusing on one particular pattern, its flicker frequency can be observed in the occipital region of the brain, and the BCI system can interpret this as a preference or decision of the user, which can then allow an application to perform a certain task. This is a reactive BCI that requires spatial and spectral information.
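SSVEP decoding can be sketched as reading the attended flicker rate off the spectrum of an occipital channel. The candidate flicker rates (7, 12 and 15 Hz) and the noise level are invented for illustration.

```python
import numpy as np

fs, seconds = 256, 2
t = np.arange(0, seconds, 1 / fs)
attended_hz = 12   # the target the user is staring at
# Synthetic occipital signal: flicker-following response plus noise.
occipital = np.sin(2 * np.pi * attended_hz * t) \
    + 0.2 * np.random.default_rng(1).normal(size=t.size)

spectrum = np.abs(np.fft.rfft(occipital))
freqs = np.fft.rfftfreq(len(occipital), 1 / fs)
candidates = [7, 12, 15]   # flicker rates of the on-screen patterns
# Pick the candidate with the strongest spectral response.
detected = max(candidates, key=lambda f: spectrum[np.argmin(np.abs(freqs - f))])
print(detected)  # → 12
```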

BCI based on event-related potentials (ERPs) is another reactive paradigm. An example is the P300, a potential that can be elicited using the oddball approach: the user perceives a sequence of stimuli, but only one of them, the target, is relevant. When that stimulus is presented, there is a positive deflection in the voltage of the EEG signal with a latency of approximately 300 ms. It is best measured by electrodes over the parietal lobe; hence, it requires spatial and temporal information. P300 can be used to choose among stimuli. For example, when presented with stimuli that represent letters, the target letter, that is, the letter the subject wants to 'type', elicits a P300. Stimuli can also be auditory or tactile in nature.
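P300 detection by epoch averaging can be sketched as follows: the target bump is buried in noise in any single epoch, but emerges once many epochs are averaged. All waveform shapes, amplitudes and epoch counts here are synthetic.

```python
import numpy as np

fs = 256
t = np.arange(0, 0.6, 1 / fs)   # a 600 ms epoch after stimulus onset
# Synthetic P300: a positive bump centred at 300 ms.
p300_bump = 2.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.03 ** 2))
rng = np.random.default_rng(2)

def epochs(n, with_p300):
    """Simulate n noisy epochs, with or without the target bump."""
    base = p300_bump if with_p300 else np.zeros_like(t)
    return np.array([base + rng.normal(0, 1.0, t.size) for _ in range(n)])

target_avg = epochs(50, True).mean(axis=0)
nontarget_avg = epochs(50, False).mean(axis=0)

# Averaging suppresses noise; the target average peaks near 300 ms.
peak_ms = 1000 * t[np.argmax(target_avg)]
print(250 < peak_ms < 350, target_avg.max() > nontarget_avg.max())
```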

Scientists, designers, students and even ordinary people like you and me, who have just begun looking into the treasure-house of cool BCI applications, can use all these different BCI paradigms and viewpoints to design a BCI application of our own!

Integration of AI and BCI

Artificial intelligence (AI) techniques such as modeling common-sense knowledge, reasoning and pattern recognition can assist in interpreting BCI commands and embedding them in an environment where detailed low-level BCI commands are not necessary. As a simple example to show how useful it can be to integrate BCI and AI, consider a disabled user who controls a wheelchair using BCI. The AI can have knowledge about the environment in which the user has to navigate. A user can make errors and the BCI device can wrongly interpret the intentions of a user. Thus, if the result of an interpretation is to turn left with the wheelchair and the AI knows that instead of a doorway there are descending steps, it can prevent the wheelchair from taking that turn.
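The wheelchair example can be sketched as a simple rule-based veto layer. The map, hazard labels and command names below are all invented; a real system would use a learned environment model.

```python
# Hedged sketch: an AI layer with knowledge of the environment vetoes
# a decoded BCI command that would lead to a hazard.
HAZARDS = {("hallway", "left"): "descending steps"}

def execute(location, decoded_command):
    """Carry out a decoded command unless the map flags a hazard."""
    hazard = HAZARDS.get((location, decoded_command))
    if hazard:
        return f"blocked: {hazard} to the {decoded_command}"
    return f"turning {decoded_command}"

print(execute("hallway", "left"))   # → blocked: descending steps to the left
print(execute("hallway", "right"))  # → turning right
```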

As another related example, if a higher-level aim of a wheelchair user can be detected, for example, to go to the kitchen, then no detailed BCI instructions are necessary since the control can be taken over by the AI that knows about the route to go there and the wheelchair can have sensors that use computer vision (another AI technique) to avoid obstacles. As another example, when a BCI user starts a particular action, the AI can predict the most likely next actions and then lets the user make the choice. This can be considered an autocomplete function for actions, rather than for words or letters in a word processor.

Thanks for reading this far! I'm Neha, a 17-year-old psychology and philosophy enthusiast and BCI/AI innovator.

Got any thoughtful questions? DM me, I'd love to chat!



