Intro to The Data Science Behind EEG-Based Neurobiofeedback

Chelsea Tanchip
Published in Nullastic
Sep 9, 2017 · 8 min read

This biomedical phenomenon’s magic works through complex but futuristic technology. Here’s how.


The Neurobiofeedback machine gained popularity for its non-invasive, quantitative approach to behavior regulation, but pediatricians, therapists, and other professionals still question its legitimacy. In academic-sounding terms, this machine (which I'll abbreviate as NBF from now on) is built on the concept of feedback therapy, which exploits our ability to exert and/or regain control over physiological processes in our bodies.

NBF is a type of Brain-Computer Interface (BCI) machine that senses your brain wave activity in different ways (usually involving hardware-software interaction) and rewards you with an auditory or visual stimulus when your brain wave’s frequency matches the desired frequency. This comes from the scientific notion that brain rhythms correspond to certain cognitive states.

Super technical process of NBF, which I’m trying to learn and understand. Source: https://www.nature.com/nrn/journal/v18/n2/fig_tab/nrn.2016.164_F1.html

tl;dr: it’s playing mind games to make you smarter. Don’t we already do that in romantic relationships?

By “mind games”, I mean that the ‘auditory or visual stimulus’ I mentioned in the last paragraph usually comes in the form of a game. Yes, think of it as playing Angry Birds. In the NBF context, though, you have to think about killing the bird (fixation), and if your brain waves indicate a sufficient level of focus, you kill the bird. While you're busy envisioning dead virtual birds, researchers on another screen are capturing your brain activity as data.

NBF uses operant conditioning (delivering those rewards via virtual games) to slowly change the way your brain works; your cognition gets stronger after a few sessions. This is a novel method of brain training, and not all specialists are into it. But instead of discussing the pros and cons of biofeedback, which have been covered widely, I'm focusing on the data-centric technology behind it. I went to Hong Kong Polytechnic University last August (2017) to learn the theory and get some hands-on experience with NBF.
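The reward loop at the heart of that operant conditioning is conceptually simple. Here is a minimal sketch in Python; the 0–100 attention scale is loosely modeled on NeuroSky's eSense scale, and the threshold value is an arbitrary assumption, not a clinical standard:

```python
def feedback(attention: int, threshold: int = 60) -> str:
    """Return a reward event when measured attention crosses the
    training threshold, otherwise leave the game state unchanged.

    `attention` is assumed to be a 0-100 score (as on NeuroSky's
    eSense scale); `threshold` is an arbitrary training target.
    """
    if not 0 <= attention <= 100:
        raise ValueError("attention score must be in 0-100")
    return "reward" if attention >= threshold else "no_reward"


# One simulated session: the virtual bird only "dies" on
# sufficiently focused readings.
session = [42, 55, 61, 73, 58]
events = [feedback(a) for a in session]
```

In a real NBF system this loop runs continuously, and the threshold is tuned per client so the reward stays reachable but not trivial.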

When I was there, I liked learning the neuroscience behind NBFs, but I was more fascinated by how it all worked: how both client and practitioner could see results in real time, as a game and as clean data visualization respectively.

The university hosting the magic.

After getting hands-on, putting my video documentation together, and reading the literature, I got a good grasp of how the process works (though I still want to keep learning).

defining our system

EEG stands for electroencephalogram: a measurement of the electrical activity in your brain arising from the synaptic excitations of your neurons' dendrites. I'll be talking about EEG-based Neurobiofeedback, which typically uses electrodes placed at different sites on the scalp. I used a device called the NeuroSky MindWave Mobile, which skips the multi-electrode machinery but offers a similar function of measuring electrical activity from your neurons.

The MindWave Mobile headset uses a passive dry sensor and is definitely non-invasive.

It can measure attention, meditation, eye blinking strength, and the various brain waves (delta, theta, low and high alpha, low and high beta, and gamma). Here’s a detailed explanation of the waves for those interested in neuro/physiology.
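For reference, here is a sketch of those bands as frequency ranges in Python. The exact cutoffs vary between sources (and between devices), so treat the numbers below as common textbook approximations, not NeuroSky's internal definitions:

```python
# Approximate EEG band boundaries in Hz. Conventions differ slightly
# across the literature; these are common textbook values.
EEG_BANDS = {
    "delta":      (0.5, 4.0),
    "theta":      (4.0, 8.0),
    "low alpha":  (8.0, 10.0),
    "high alpha": (10.0, 12.0),
    "low beta":   (12.0, 18.0),
    "high beta":  (18.0, 30.0),
    "gamma":      (30.0, 100.0),
}

def band_of(freq_hz: float) -> str:
    """Name the EEG band a given frequency falls into."""
    for name, (lo, hi) in EEG_BANDS.items():
        if lo <= freq_hz < hi:
            return name
    return "out of range"
```

Roughly speaking, slower bands (delta, theta) dominate in sleep and drowsiness, while faster bands (beta, gamma) track active concentration, which is why an attention-training game watches the beta range.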

The headset connects over Bluetooth to the NeuroSky MindWave Mobile app, which we downloaded onto the on-site iPad.

Took a quick pic of this device.

how systems communicate

Inside the MindWave headset, there's a special chip called the ThinkGear chip. With the help of the ThinkGear Communications Driver (TGCD) API, the chip can communicate with a computer application (also part of the MindWave ecosystem) used to collect and analyze the EEG data.
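To make that communication concrete: NeuroSky also publishes a ThinkGear Socket Protocol in which the headset's readings arrive as newline-delimited JSON records. I used the vendor app rather than the raw stream, so treat the field layout below as a sketch based on that published protocol rather than a verified dump from my setup:

```python
import json

def parse_thinkgear_line(line: str) -> dict:
    """Extract the values an NBF game cares about from one JSON
    record of a ThinkGear-style socket stream.

    Field names follow NeuroSky's ThinkGear Socket Protocol docs;
    records without eSense data (e.g. raw-sample packets) yield {}.
    """
    record = json.loads(line)
    esense = record.get("eSense")
    if esense is None:
        return {}
    return {
        "attention": esense.get("attention"),
        "meditation": esense.get("meditation"),
        "signal_quality": record.get("poorSignalLevel"),
    }


sample = ('{"eSense":{"attention":53,"meditation":61},'
          '"eegPower":{"delta":1000,"theta":800},"poorSignalLevel":0}')
reading = parse_thinkgear_line(sample)
```

A game client would read lines like this in a loop and feed the attention value straight into its reward logic.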

stage 1: processing

I looked at two different data-processing environments: LabVIEW and MATLAB.

LabVIEW is a graphical programming environment for code development, with a front panel (FP) that allows the user to interact with a virtual instrument. Its main benefit is a huge library of prebuilt signal processing and analysis functions. You can extract segments of the signal, known as epochs, and calculate an averaged fast Fourier transform (FFT), which converts the time-domain representation of a signal into a frequency-domain representation (“FFT Analysis”).

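The epoch-and-average-FFT idea is easy to sketch outside LabVIEW too. Here is a NumPy version on a synthetic signal (the 10 Hz component, sampling rate, and epoch length are all arbitrary choices for illustration):

```python
import numpy as np

fs = 256                      # sampling rate in Hz (typical for EEG)
t = np.arange(0, 8, 1 / fs)   # 8 seconds of signal
# Synthetic "EEG": a 10 Hz alpha-like oscillation plus noise.
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

# Split into 1-second epochs and average their FFT magnitudes.
epoch_len = fs
epochs = signal[: signal.size // epoch_len * epoch_len].reshape(-1, epoch_len)
spectra = np.abs(np.fft.rfft(epochs, axis=1))
avg_spectrum = spectra.mean(axis=0)
freqs = np.fft.rfftfreq(epoch_len, d=1 / fs)

peak_hz = freqs[np.argmax(avg_spectrum[1:]) + 1]  # skip the DC bin
```

Averaging spectra across epochs is what smooths out the noise: each epoch's spectrum is jittery, but the 10 Hz peak survives the averaging while random fluctuations cancel out.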

For MATLAB, I looked at EEGLAB, which processes continuous and event-related EEG data using different methods of analysis:

Independent Component Analysis (ICA)

A statistical/computational technique that reveals hidden factors underlying sets of random variables, measurements, or signals. In EEG, each electrode on your scalp records a linear mixture of signals from many underlying sources; ICA helps separate those recordings back into statistically independent source components.

Ungureanu et al. (2004) recounted that “if the weights were known, the potentials in the sources could be computed from a sufficient number of electrode signals”. ICA helps solve the problem by estimating those unknown mixing weights from the data itself.
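To see the unmixing in miniature, here is a tiny deflation-based FastICA written from scratch in NumPy. It is a pedagogical sketch, not what EEGLAB actually runs (EEGLAB's default is the runica/Infomax algorithm), and the two synthetic sources stand in for real neural and artifact signals:

```python
import numpy as np

def fastica(X, n_iter=300, seed=0):
    """Minimal deflation-based FastICA with a tanh nonlinearity.

    X: (n_sources, n_samples) linearly mixed signals.
    Returns estimated sources, up to sign/order/scale ambiguity.
    """
    X = X - X.mean(axis=1, keepdims=True)
    # Whiten: rotate/scale so channels are uncorrelated, unit variance.
    d, E = np.linalg.eigh(X @ X.T / X.shape[1])
    Xw = (E / np.sqrt(d)).T @ X
    rng = np.random.default_rng(seed)
    W = []
    for _ in range(X.shape[0]):
        w = rng.standard_normal(X.shape[0])
        w /= np.linalg.norm(w)
        for _ in range(n_iter):
            wx = w @ Xw
            g, g_prime = np.tanh(wx), 1 - np.tanh(wx) ** 2
            w_new = (Xw * g).mean(axis=1) - g_prime.mean() * w
            for wj in W:                      # deflation: stay orthogonal
                w_new -= (w_new @ wj) * wj
            w_new /= np.linalg.norm(w_new)
            converged = abs(abs(w_new @ w) - 1) < 1e-9
            w = w_new
            if converged:
                break
        W.append(w)
    return np.array(W) @ Xw


# Two independent "sources" (a sine and a sawtooth), mixed linearly
# the way scalp electrodes mix underlying signals.
t = np.linspace(0, 1, 2000)
s1 = np.sin(2 * np.pi * 7 * t)
s2 = 2 * (t * 5 % 1) - 1
A = np.array([[1.0, 0.6], [0.4, 1.0]])   # unknown mixing matrix
recovered = fastica(A @ np.vstack([s1, s2]))
```

The recovered components match the original sources up to sign and ordering, which is exactly the ambiguity ICA leaves you with on real EEG as well.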

Time and Frequency Analysis

Time-domain analysis looks at when the amplitudes happen (when the peaks and troughs in the waves occur), while frequency analysis calls back to the FFT (see LabVIEW above). These methods are necessary for analyzing pre-stimulus and spontaneous signals over longer time periods.

Also check out this cool GitHub project on MATLAB-based EEG processing to see raw coding in action.

stage 2: classification

Classification of data has a lot of different approaches, but I'll only be covering one (saving the rest for another piece). Classification converts the physiological signals from the sensors into feedback score values you can work with to monitor the client's progress; with these values, you can get to graphing. The approach I'll be discussing is the tree classification method.

An explanation of decision trees and nodes

In computer science, trees are data structures (abstract data types, or ADTs) that, well, organize data hierarchically. A node represents one data point in the tree. The root node is at the top, and nodes branch out below it; the leaf nodes are the nodes with no further branches.

A type of tree, the binary tree. The root node is 1 and the leaves are 8, 6, and 7. Source: http://flylib.com/books/en/3.421.1.138/1/
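In code, such a tree is just nodes holding references to child nodes. A minimal sketch (with its own illustrative numbering, not the figure's exact layout):

```python
class Node:
    """One unit of a binary tree: a value plus up to two children."""
    def __init__(self, value, left=None, right=None):
        self.value, self.left, self.right = value, left, right

    def is_leaf(self):
        return self.left is None and self.right is None


def leaves(node):
    """Collect the values of all leaf nodes, left to right."""
    if node.is_leaf():
        return [node.value]
    return sum((leaves(c) for c in (node.left, node.right) if c), [])


#        1            <- root node
#      /   \
#     2     3
#    / \     \
#   4   5     6       <- 4, 5, and 6 are the leaves
tree = Node(1, Node(2, Node(4), Node(5)), Node(3, right=Node(6)))
```

Walking from the root to a leaf by answering one question per node is exactly what a decision tree classifier does with a data point.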

I mentioned discussing a tree-based method for data classification: in this case, the sensors are the leaf nodes, and the score value nodes (which store the data for classification) are root nodes.

This decision tree is useful for eliminating noisy and redundant signal channels, increasing the accuracy of the NBF’s data.

Aydemir and Kayikcioglu tried this decision tree method, harnessing a training set from the subject and automatically generating a specific decision tree for each new subject by determining the most appropriate feature set and classifier for each node. They found that after training their algorithms, an existing node could easily be replaced with a new one without breaking the whole decision tree structure, making their method flexible.

Moreover, one of the most popular decision tree methods is the Classification And Regression Tree (CART), a technique that builds binary trees (wherein each non-leaf node has exactly two branches).

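To see the binary-split idea in miniature, here is a Gini-based stump (a depth-1 CART) in NumPy. It's a toy, not the Arvaneh et al. pipeline, but it shows how a non-leaf node picks the one threshold that best separates two classes, say "relaxed" versus "focused" feature values:

```python
import numpy as np

def gini(labels):
    """Gini impurity of an integer label array: 0 means a pure node."""
    if labels.size == 0:
        return 0.0
    p = np.bincount(labels) / labels.size
    return 1.0 - np.sum(p ** 2)

def best_split(x, y):
    """Scan candidate thresholds on one feature and return the one
    minimizing the weighted Gini impurity of the two child nodes."""
    best_t, best_score = None, np.inf
    for t in np.unique(x):
        left, right = y[x <= t], y[x > t]
        score = (left.size * gini(left) + right.size * gini(right)) / y.size
        if score < best_score:
            best_t, best_score = t, score
    return best_t, best_score


# Toy 1-D feature: low values belong to class 0 ("relaxed"),
# high values to class 1 ("focused").
x = np.array([1.0, 1.5, 2.0, 6.0, 7.0, 8.0])
y = np.array([0, 0, 0, 1, 1, 1])
threshold, impurity = best_split(x, y)
```

A full CART applies this split search recursively to each child node, and pruning then removes the splits that don't earn their keep.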

Quick definitions:

Feature selection: choosing a subset of relevant variables to use when building the classification model, making the data easier for researchers to interpret, reducing variance, and improving the model's ability to generalize.

Tree pruning: removing nodes from the tree (filtration) to make feature selection easier and reduce runtime complexity.

Now, check out this excerpt from Arvaneh et al. (2010):

The CART would reduce [signal] noise by identifying the subject-specific frequency range, and then the most discriminative subset of features is selected by the defined decision tree classifier. Finally the selected features are ranked according to a tree pruning method. Since the decision tree selects a feature according to the results of previous chosen features, selected features would be more relevant and less correlated to each other.

After classification, the EEG system will already know which variables are relevant for analysis (e.g. brainwaves indicating focus only, brainwaves indicating relaxation only) and convert the values into visual results.

stage 3: visualization

The Mindwave device actually has a built-in Brainwave Visualizer, which uses the already classified data to display a beautiful and interactive graph for the researchers to see and analyze.

While the Visualizer runs, this is (something like) what the client will see.

recap.

I discussed EEG-based Neurobiofeedback's ability to process, analyze, and visualize data in ways researchers can easily understand. To do so, I drew on personal/qualitative experience from my Hong Kong PolyU trip and filled the gaps in my technical knowledge of NBF with various literature reviews and topics in data science.

For processing, one can play with LabVIEW and/or MATLAB (EEGLAB, or independent GitHub projects; specifically check out Eric Blue's work) and use different analysis methods (FFT, ICA) to organize the data. Then, you would have to classify the data (with decision trees, for example) before visualizing it.

Playing mind games is fun…but the complex science behind them proves that they're not just magic. It also reinforces the increasing importance of [big] data as something for neuroscientists (and anyone in healthcare) to learn.

Hopefully, we can continue to uncover more data/machine learning algorithms that enhance the efficiency of our current methods and make learning the process more accessible to non-technologically inclined healthcare professionals.

Follow @nullastic on Instagram and check out my new website.

further reading.

Arvaneh, M., Guan, C., Ang, K. K., & Quek, H. C. (2010). EEG Channel Selection Using Decision Tree in Brain-Computer Interface. Proceedings of the Second APSIPA Annual Summit and Conference, 225–230. Retrieved August 25, 2017, from http://www.apsipa.org/proceedings_2010/pdf/APSIPA45.pdf

Aydemir, O., & Kayikcioglu, T. (2014). Decision tree structure based classification of EEG signals recorded during two dimensional cursor movement imagery. Journal of Neuroscience Methods, 229, 68–75. doi:10.1016/j.jneumeth.2014.04.007

Dhali, S. (2015). A Study of Brainwave eSensing Activity. Department of Computer Science, Malmo University, 1–4. Retrieved August 25, 2017, from https://www.overleaf.com/articles/bci/mcsvkjwhcffb/viewer.pdf.

Irtiza, N., & Farooq, H. (2016). Classification of Brain States using Subject-Specific Trained Classifiers. Technical Journal, 21(2), 1–11. Retrieved August 25, 2017, from http://web.uettaxila.edu.pk/techJournal/2016/AcceptedPapers%20No1/Classification%20of%20Brain%20States%20using%20subject-specific%20trained%20classifiers.pdf

Jillich, B. (2014). Acquisition, analysis and visualization of data from physiological sensors for biofeedback applications (Master’s thesis). University of Stuttgart, Germany.

Nicolas-Alonso, L. F., & Gomez-Gil, J. (2012). Brain computer interfaces, a review. Sensors (Basel), 12(2), 1211–1279. doi:10.3390/s120201211

Sałabun, W. (2014). Processing and spectral analysis of the raw EEG signal from the MindWave. Przegląd Elektrotechniczny, 90, 169–173. doi:10.12915/pe.2014.02.44

Ungureanu, M., Bigan, C., Strungaru, R., & Lazarescu, V. (2004). Measurement Science Review, 4(2), 1–8. Retrieved August 25, 2017, from http://www.measurement.sk/2004/S2/UNGUREANU.pdf

Varada, V. R., Moolchandani, D., & Rohit, A. (2013). Measuring and Processing the Brain's EEG Signals with Visual Feedback for Human Machine Interface. International Journal of Scientific & Engineering Research, 4(1), 1–4. Retrieved August 25, 2017, from https://www.ijser.org/researchpaper/Measuring-and-Processing-the-Brains-EEG-Signals-with-Visual-Feedback-for-Human-Machine-Interface.pdf.


Fishing connections between speech, language, and technology.