What if the computer could read your mind and knew what you like even better than you do? What if it could use this insight to compose original music that's specifically tailored to your taste and what you want to hear in this moment? What if you could take a quick 10-minute test at home to screen for all sorts of psychiatric disorders and help your doctor diagnose you? These are the sorts of possibilities that make me excited about the combination of affordable EEG devices and machine learning/AI.
It all started when I saw an advertisement for Muse, the “brain sensing headband.” It’s essentially a consumer-grade EEG device intended for neurofeedback during meditation. The idea is, you wear it while meditating, it detects your brainwaves, and it transforms the signal into sounds that give you real-time feedback about how you’re doing. If your mind is very distracted, you’ll hear thunder. If you manage to calm the mind, you’ll hear gentle rain, then it will go quiet, and eventually you’ll hear birds singing, which is a sign you’re doing really well. As a Vipassana meditator, I approve, but as a machine learning engineer I’m even more excited to get my hands on the raw EEG data and see what kind of insights can be extracted.
The first thing I had to find out is, does this thing actually work? Is it really sensing the brain, or is it some kind of elaborate placebo? I did my research, and the short answer is: yes, it senses brain activity with surprising accuracy. I found an excellent research paper from the Centre for Biomedical Research at the University of Victoria that compared Muse against a research-grade EEG system costing $75,000.
The results presented here clearly demonstrate that the MUSE EEG system can be used to conduct event-related brain potential (ERP) research from a single computer without the use of event-markers. […] Further, we note here that the time to complete both experimental tasks — including EEG setup — was done, on average, in less than 10 min. For comparison purposes, one must consider the task completion time with our large array ActiChamp system. […] the setup time and testing time with the MUSE was approximately one sixth of the setup and testing time with our large array system. Other points to consider here are the fact that large array systems typically require two (or more) research assistants vs. one with our MUSE setup and cost considerably more (approximately $75,000 vs. $250 for the MUSE system).
Wow! This is a game changer. Often, the most exciting technological progress comes when the price of an existing technology drops enough to be affordable to the masses. Think about personal computers and smartphones: it wasn’t until they became cheap that they really changed the way we live. Of course, the expensive setup is still more accurate and has more electrodes, but if you give me something that’s 95% as good, easier to use, and literally 300 times cheaper, I’ll take that one, thank you very much.
So, as a first step, I set out to replicate the above paper with my own brain scans, and I was surprised how easy it was. To understand this, you need to know what an event-related potential (ERP) is. An ERP is simply a specific pattern of voltages (hence “potential”) that can be measured in the brain immediately following some event. For example, reward positivity (RewP) is an ERP that occurs when you get positive feedback on an action (e.g. you won a game). Feedback negativity is another ERP, occurring when you get negative feedback on an action. Many other ERPs have been studied that could potentially be useful. Since the paper uses RewP, I decided to try to detect it myself.
I needed to design an experiment to perform on myself to trigger RewP. Previous studies have used monetary rewards to coax this out of subjects, but thankfully I found a study showing that RewP is present even without a monetary reward, as long as the subject emotionally cares about the outcome. So, I decided I would record my brain while solving chess puzzles (I’m mildly obsessed with chess, so I will care if I get it right), and I wrote a quick Python script to record my mouse clicks with timestamps, so I can sync up the timing of the EEG data. I also recorded my screen, so I can label the mouse-release timestamps with the outcome (whether I got it right or not). Here’s what RewP is supposed to look like (from the University of Victoria paper). Look at the darker line.
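The click-logging script can be very simple. Here’s a minimal sketch of the idea, assuming the pynput library (any mouse-hook library with press/release callbacks would work the same way; the function name is mine):

```python
import time

def make_click_logger(log):
    """Return a pynput-style on_click callback that appends
    (unix_timestamp, x, y, pressed) rows to `log`."""
    def on_click(x, y, button, pressed):
        # Record both press and release; the release time is what later
        # gets matched against the screen recording to label the outcome.
        log.append((time.time(), x, y, pressed))
    return on_click

# Usage (requires `pip install pynput`; Ctrl-C to stop, then dump `events` to CSV):
#   from pynput import mouse
#   events = []
#   with mouse.Listener(on_click=make_click_logger(events)) as listener:
#       listener.join()
```

Because both the EEG stream and this log use wall-clock timestamps from the same machine, lining them up later is just a matter of matching times.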
And here are some of my EEG recordings immediately after I got a chess puzzle right.
The graph is a little squished horizontally because my X axis shows 1,000 ms vs. 600 ms in the paper, but you can definitely see the resemblance! The signal dips down low, comes back up higher than it started, and then levels out again. I’ll note the ERP didn’t always appear (as in the second-to-last graph above), but it was present in 11 out of 14 of my recordings in the first session, and that’s without even pre-processing the data like the paper did. It’s pretty exciting to be able to replicate part of this paper in just one day! It goes to show how much Muse has lowered the barrier to entry for brain research, so that a tinkerer like me, without the support of a university, can participate!
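For anyone curious how the per-event plots above are produced, the core step is cutting fixed windows (“epochs”) out of the continuous recording around each event timestamp. A minimal sketch with NumPy, assuming one channel of EEG samples with matching timestamps (the function and parameter names are mine; Muse samples at 256 Hz):

```python
import numpy as np

def extract_epochs(eeg, ts, event_times, fs=256, tmin=-0.2, tmax=0.8):
    """Cut fixed windows around each event and baseline-correct them.

    eeg:          1-D array of samples from one channel
    ts:           1-D array of sample timestamps (seconds), same length
    event_times:  timestamps of mouse releases on correct puzzles
    fs:           sampling rate (Muse streams at 256 Hz)
    """
    n_pre, n_post = int(-tmin * fs), int(tmax * fs)
    epochs = []
    for t in event_times:
        i = np.searchsorted(ts, t)  # sample index closest to the event
        if i - n_pre < 0 or i + n_post > len(eeg):
            continue  # event too close to the edge of the recording
        ep = eeg[i - n_pre : i + n_post].astype(float)
        ep -= ep[:n_pre].mean()  # subtract the pre-event baseline
        epochs.append(ep)
    return np.array(epochs)

# Averaging epochs across many trials is what makes an ERP visible:
# activity that isn't time-locked to the event cancels out.
# erp = extract_epochs(eeg, ts, correct_times).mean(axis=0)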
I definitely need to collect more data, especially from different brains, not just my own, but these preliminary results are very encouraging. Once I have enough data, I want to train a machine learning model (probably an RNN) to automatically detect these ERPs. Since the EEG data can be streamed to the computer, detection could happen in real time. I could take something as complex as activations in the brain and simplify it into an event-based API that anyone could code against. From there, the possibilities are endless. If I could find an ERP that signals “I like this,” or some proxy for it, it could lead to the greatest recommender system ever, because it wouldn’t even need you to input your preferences. It would just scan your reactions to content and immediately know what you like the most. Another idea: different mental disorders such as bipolar disorder, depression, and OCD have well-studied signature variations in how ERPs present themselves, so I could train a model that detects these anomalies, as a type of screening you could do at home, and take the results to your doctor for further testing.
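Before reaching for an RNN, a simple baseline for per-trial ERP detection is worth having: score each epoch by how well it correlates with the average waveform of the other epochs, then threshold the score. A minimal sketch, assuming baseline-corrected epochs of equal length (the function name and leave-one-out scheme are my own choices, not from the paper):

```python
import numpy as np

def rewp_scores(epochs):
    """Score each epoch by its correlation with the mean of the others.

    A leave-one-out template is used so an epoch is never compared
    against itself. High scores suggest the ERP is present on that
    trial; a threshold on the score gives a crude per-trial detector
    to benchmark any future RNN against.
    """
    epochs = np.asarray(epochs, dtype=float)
    total = epochs.sum(axis=0)
    scores = []
    for ep in epochs:
        template = (total - ep) / (len(epochs) - 1)  # mean of the others
        scores.append(np.corrcoef(ep, template)[0, 1])
    return np.array(scores)
```

If a learned model can’t beat this correlation baseline, it isn’t learning anything beyond the average waveform, which makes it a useful sanity check for the real-time pipeline.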
Now that we have an easy, affordable way to directly interface the brain with computers, the possibilities are limited only by our imagination. The sensing technology will continue to improve, and so will our ability to decipher the signals and make sense of them. This is just the beginning.