The Data behind Brain-Computer Interface

Jeff Lin
Published in Voice Tech Podcast
5 min read · Apr 8, 2020

As an aspiring data scientist with a background in Biomedical Engineering and Neuroscience, and after reading recent news about investments in companies that let users control devices or computers with their brains, I thought I would write a little about the data behind brain-computer interfaces.

Brain-Computer Interfaces (BCI), also called Brain-Machine Interfaces (BMI), have been studied since the 1970s. Recent advances in machine learning, the development of small microchips, and a better understanding of brain signals have increased interest in controlling devices with our brains, like magic or like a Jedi. Elon Musk invested in an implantable device company called Neuralink, which published a research article describing a device with over 3,072 electrodes. Facebook also acquired a company, CTRL-Labs, in September 2019 that aims to let people control digital devices with their brains using armbands.

In a recently published WIRED article, Arielle Pardes describes a demonstration of NextMind, a headband developed by a neuroscientist entrepreneur. Its combs separate the hair so that "dry electrodes" can more easily detect brain waves. In one demo app, you play a game like Nintendo's "Duck Hunt": you focus on a duck and within a second it explodes. In another, you change a mock TV's channels by glancing at the corner of the screen.

Emotiv EPOC+ EEG Headset (left), NextMind Headband (middle), and Neuralink connector microchip (right).

The majority of research on BCI has focused on controlling prosthetics, wheelchairs, and cursors on a screen, though that may just reflect the bulk of the research I have read, given my interest in prosthetics. Most of us have seen or heard in the news how a quadriplegic person can simply think about moving an arm, and a robotic arm moves to do something, like grab a bottle of water or write a word on paper.

BCI would not be possible without the discovery of electrical signals in animal brains by Richard Caton in 1875, and Hans Berger's development of electroencephalography (EEG) and coining of the term "brain waves" in 1924. Jacques Vidal, a UCLA professor, coined the term "BCI" and demonstrated it with his 1977 experiment in which a cursor was steered through a maze using EEG signals. The first brain chip was tested on a monkey in 1969 and successfully implanted in a human patient in 1998.

Source: Tarik Al-ani and Dalila Trad (January 1, 2010). IntechOpen, DOI: 10.5772/7032

The bulk of the data is a continuous brain signal acquired either from implanted microelectrode chips like the one developed by Neuralink, or from scalp electrodes such as the Emotiv EPOC+ headset and NextMind. Both types generate a large amount of data and require a data pipeline to classify the raw brain signals into actionable categories, for example, moving a cursor left or right. Before the signals can be classified, they are preprocessed using signal processing techniques such as current source density (CSD), common average referencing (CAR), frequency normalization (Freq-Norm), principal component analysis (PCA), and independent component analysis (ICA) to clean up and separate the signals of interest.
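Of these, common average referencing is the simplest to illustrate: subtract the average of all channels from each channel, which cancels interference that every electrode picks up equally. The sketch below uses simulated data and is purely illustrative, not the actual pipeline of any of these devices:

```python
import numpy as np

def common_average_reference(signals):
    """Subtract the mean of all channels from each channel (CAR).

    signals: array of shape (n_channels, n_samples).
    Common-mode interference (e.g. 50 Hz mains hum) that every
    electrode picks up equally is removed.
    """
    return signals - signals.mean(axis=0, keepdims=True)

# Simulated 8-channel recording contaminated by shared 50 Hz line noise
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 256, endpoint=False)
line_noise = np.sin(2 * np.pi * 50 * t)
eeg = rng.normal(0.0, 1.0, size=(8, 256)) + line_noise
referenced = common_average_reference(eeg)
```

After re-referencing, the mean across channels is exactly zero at every sample, so any signal shared by all electrodes (like the simulated line noise) is gone.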

Spatial and Time scale difference from single neuron to population. Source: Tomas Ros

Before we even process the signal, let us explore what the brain waves (brain signals) are.


A single neuron processes and propagates electrical signals, also known as action potentials. A neuron spikes, or fires, when its membrane voltage crosses a certain threshold. The synchronization of these spikes, or the sum of these voltages over an area, creates what EEG can measure as the local field potential (a signal made up from a population of neurons).
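That all-or-nothing firing rule can be sketched as a simple threshold crossing. The voltage trace and threshold below are made-up illustrative values, not measurements:

```python
import numpy as np

def detect_spikes(trace, threshold=-50.0):
    """Return sample indices where the voltage first rises above threshold.

    trace: 1-D array of membrane voltages in mV (hypothetical values).
    A spike onset is counted each time the trace crosses the
    threshold from below, mimicking the all-or-nothing firing rule.
    """
    above = trace > threshold
    return np.flatnonzero(~above[:-1] & above[1:]) + 1

# Toy trace: resting near -70 mV with two brief action potentials
voltage = np.full(100, -70.0)
voltage[20:23] = [-40.0, 10.0, -40.0]
voltage[60:63] = [-45.0, 5.0, -45.0]
print(detect_spikes(voltage))  # → [20 60]
```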

These brain waves are all electrical signals measured in voltages (millivolts), and they are classified into changes in brain rhythm, movement-related potentials (MRPs), event-related potentials (ERPs), visual evoked potentials (VEPs), slow cortical potentials (SCPs), and several others, based on the brain region measured along with the stimuli presented or action performed. EEG signals are also commonly classified by sinusoidal frequency range: delta (1–4 Hz), theta (4–8 Hz), alpha (8–12 Hz), beta (13–30 Hz), and gamma (30–150 Hz).
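These frequency bands can be separated with a Fourier transform. The sketch below classifies a signal by whichever band holds the most spectral power, using the band edges listed above (a simplified illustration; real pipelines would use windowing and proper spectral estimation):

```python
import numpy as np

# Band edges (Hz) taken from the ranges above
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 12),
         "beta": (13, 30), "gamma": (30, 150)}

def dominant_band(signal, fs):
    """Return the name of the band containing the most FFT power."""
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    band_power = {name: power[(freqs >= lo) & (freqs < hi)].sum()
                  for name, (lo, hi) in BANDS.items()}
    return max(band_power, key=band_power.get)

fs = 256                                  # sampling rate in Hz
t = np.arange(fs) / fs                    # one second of samples
alpha_wave = np.sin(2 * np.pi * 10 * t)   # pure 10 Hz oscillation
print(dominant_band(alpha_wave, fs))      # → alpha
```

A pure 10 Hz sine lands squarely in the alpha range, so the classifier reports "alpha"; a real EEG trace would spread power across several bands at once.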

32 channels per electrode measurement (left) and spikes measured from 1,020 electrodes (right). Source: Neuralink. bioRxiv 703801. DOI: 10.1101/703801

As can be seen in the Spatial and Time Scale image for EEG and the Neuralink implant's electrode signals, a lot of data must go through preprocessing and filtering, feature selection, and dimensionality reduction before it can be fed into a machine learning algorithm such as EEG2Code, which uses a convolutional neural network to classify the brain signal. Training can be done on the raw brain signals for specified tasks and has proven highly effective. The computational demand is high due to the large volume of data and the strong interest in BCI development.
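As a sketch of the dimensionality-reduction step, here is PCA implemented via singular value decomposition on a made-up trial-by-feature matrix (the shapes and random data are hypothetical, not drawn from EEG2Code or any real recording):

```python
import numpy as np

def pca_reduce(X, n_components):
    """Project feature vectors onto their top principal components.

    X: (n_trials, n_features) matrix of preprocessed EEG features,
    e.g. per-channel band powers. Rows of Vt from the SVD of the
    centered data are the principal directions.
    """
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

# Hypothetical dataset: 20 trials x 64 features
rng = np.random.default_rng(1)
X = rng.normal(size=(20, 64))
Z = pca_reduce(X, n_components=3)
print(Z.shape)  # → (20, 3)
```

The reduced matrix keeps the directions of greatest variance first, so a downstream classifier sees 3 features per trial instead of 64.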

EEG2Code Convolutional Neural Network Architecture

Non-invasive BCI uses EEG to control the device, but it is more prone to noise because it has neither fine spatial nor fine temporal resolution; in my opinion, that also means finer controls are out of reach. As mentioned in the WIRED article, the app that the BCI controls takes only minutes to calibrate using machine learning classification. With ongoing research and development, implants are packing more small electrodes into a small area, EEG detection devices are getting smaller, and other types of non-invasive brain signal measurement are being developed. Ultimately, it will be up to consumers to decide whether BCI is useful, and whether to get a chip implanted or wear a headband.

Something else that’s interesting: what if you could control someone else’s arm, not just machines? See the TED talk below.

Control someone else’s arm with your brain waves.

If you’re interested in the Neuralink launch event, you can get a bit more history on BCI and more specifications on the implantable device itself by watching the video below or on YouTube.
