Emotiv + Arduino: Developing with EEGs in engineering

Kevin JY Cui
11 min read · May 4, 2019


The prospect of directly using our brains to move devices outside of our bodies is riveting, but this ability has historically been reserved for mythological epics. Yet as recent trends have repeatedly shown, many impossibilities of the past have become everyday products of the present. The product in this case is the Brain-Computer Interface (BCI).

Shakuni, from the Sanskrit epic “Mahabharata”, uses telekinetic powers to manipulate dice in his favour. The ancient Hindu storytellers marvelled at the prospect of direct brain-to-device control.

A Brain-Computer Interface is a device that detects, reads, and analyzes the electric signals emitted by the brain during synaptic transmissions. To get a more in-depth understanding, see this introduction to BCIs.

Those who have been following my work recently may know that I have been developing software applications for EEG technology. To sum up my previous work, I have been building applications that use EEGs as an input device, such as a cursor navigator and an audio-playing interface. The main intent of these applications is ultimately to give patients suffering from paralysis a way to use computer software with limited or no motor capabilities. You can watch the demonstration of these applications here.

Neurotechnology software produces interesting results, but it can be argued that the technology’s greatest potential lies outside of the screen. In terms of application, BCIs have most frequently been used for prosthetic control, in which the prosthetic reacts to the subject’s corresponding muscle signals, just as an organic limb would. In this project, I decided to go beyond software and delve into the engineering-related potential of neurotechnology. As an introduction to the concept, I built an LED visualizer of a user’s physiological state, turning a series of LEDs on and off based on certain states.

A servant encounters a telekinetic power in this 1911 edition of the French magazine “La Vie Mysterieuse”. It may be difficult to imagine convincing the artists of this vintage cover that, just under a century later, direct brain-to-device control would be very much possible.

The idea

The first thing that needs to be done in order to develop with EEGs and engineering is setting up the method of communication. As in previous projects, I will be using Emotiv’s API and the INSIGHT EEG (their 5-channel mobile headset). I also have an Arduino Mega available to me, although other Arduino boards (such as the Arduino Uno) would work just as well. The idea is to set up a way to get data from the Emotiv (input) to the Arduino (output). Once that is in place, we can build whatever we want (mind-controlled robots, mind-controlled prosthetics, etc.), but for simplicity, I will be constructing a simple LED visualizer of the client’s synaptic transmissions.

Prerequisites

To follow along with this article, or if you want to build this yourself, there are a few devices and programs that you must have and be familiar with:

Hardware

- An Emotiv EEG headset (I am using the INSIGHT)
- An Arduino board (I am using the Mega; an Uno works just as well)
- A breadboard, 7 LEDs, and jumper wires
- A USB cable to connect the board to your computer

Software

- Emotiv’s Cortex API and the Cortex UI
- Python, with the websocket-client and pyserial packages
- The Arduino IDE

This list may seem complicated, but we will go over what each item does and how it works in the next few sections.

Getting started with Emotiv

Emotiv uses a type of BCI called electroencephalography, or EEG, to detect and record a user’s synaptic transmissions. In essence, a series of electrodes on the user’s scalp reads the electric signals emitted by the brain, passes the data through an amplifier, and then sends it to a computer.

An interior view of the Emotiv INSIGHT

The data is then processed on the computer however the developer wishes, but the program must communicate with the API via JSON-RPC. As for the syntax, the API allows the streaming of several different data types. In my cursor navigation program last time, I used the ‘command’ data type, which sent data over a WebSocket connection and associated certain mental states with certain commands. For this project, I will be using the ‘performance metric’ data type instead. This data type carries pre-set physiological states, including ‘excitement’, ‘focus’, and ‘engagement’, which it determines based on the user’s physiological arousal level (itself determined by activity in the ARAS, the ANS, and the endocrine system), wave morphology, and wave frequency (as well as time, for long-term states). To give an example, engagement is indicated by an increase in physiological arousal, attenuated alpha waves, and increased beta waves. The Emotiv API can be used with a multitude of languages, but I will be using Python exclusively for this project due to its simplicity, its straightforward JSON integration, and its strength in data handling.
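To make this concrete, here is roughly the shape of one parsed message from the met stream. The seven floats are the performance metric values; the exact envelope fields may vary by API version, so the values below are purely illustrative and the code later in this article reads only the met key:

# Illustrative values only: roughly the shape of one parsed message
# from the 'met' stream. The client code later reads just data['met'].
data = {
    "met": [0.61, 0.34, 0.55, 0.72, 0.48, 0.66, 0.59]
}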

Regarding Emotiv as a whole, it is important to note that other EEG devices are also available, such as those designed by Muse and NeuroSky. However, I find that Emotiv is much more developer-friendly (with an API and SDK) and that its headsets provide more accurate data. NeuroSky’s headgear is less accurate (with only 1 sensor), and active support for Muse’s SDK has been discontinued as of 2018.

Getting started with Arduino

An Arduino board is a microcontroller: a miniature computer that fits onto a single integrated circuit. Unlike a general-purpose computer, a microcontroller performs very specific tasks. However, we can see the resemblance to a computer on an Arduino board: included on the board are multiple input and output pins, a processor (CPU), and memory storage.

The parts of an Arduino Uno

Besides the Arduino board, we will also need the Arduino IDE, which can be downloaded here. The IDE uploads code to the storage of the Arduino board, where it is processed by the CPU. The code will read and drive the board’s input and output pins. LEDs (or other output devices) on a breadboard can be connected to the output pins using jumper wires, with the ground rail of the breadboard connected to the ground of the board, and the board itself must be connected via USB to the computer uploading the code.

Setting up

Let’s start with the Emotiv portion of the project. Emotiv provides a user interface called Cortex UI, which acts as an interface for logging in and connecting to the headgear. Alternatively, we could log in using Python, but for simplicity, we will use the UI. Once the headgear is ready (charged and powered on), we can connect it via the UI. As for the Arduino board, we can begin by plugging the board into the computer via USB. This connection will carry the program from the computer to the board, and will also act as the board’s power source. We can then connect the 7 LEDs to pins 7 to 13. This will make each LED correspond to a physiological trait once we execute the program. Specifically, the 7 traits from pins 7 to 13 will respectively be interest, stress, relaxation, excitement, engagement, long-term excitement, and focus.

Arduino setup. My setup only shows 5 LEDs, as I had run out (people keep eating them), but there should be 7 if we wish to visualize all 7 traits.
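For reference, here is the pin-to-trait mapping just described, written out as a Python dictionary. This is a hypothetical helper for readability, not part of the project code, and the labels follow this article’s wording rather than the Cortex API’s internal metric names:

# Pin-to-trait mapping as described above (illustrative only).
PIN_TO_TRAIT = {
    7: 'interest',
    8: 'stress',
    9: 'relaxation',
    10: 'excitement',
    11: 'engagement',
    12: 'long-term excitement',
    13: 'focus',
}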

All set? Time to look at the code

Now that we have all the hardware connected, it is time to get to the software. Note that, as always, my code is available as open source on GitHub. Do not hesitate to fork it or use it as a guide.

The first thing to do is set up communication between the Python program and Emotiv’s API. We will need to import json and, for the connection itself, create_connection from websocket-client. We must also import ssl, which will help us make a secure connection. We will have two files for this project: one for setting up the connection and one for running the program. I named them setup.py and client.py, respectively. We will need these imports in both of them.

import json
import ssl
from websocket import create_connection

Now that all our libraries are set, it’s time to authorize and transmit data with the EEG. The syntax for this can be found in Emotiv’s Cortex API documentation. The following code in setup.py will authorize the user, open a session, and subscribe to the performance metrics data stream. This is all standard Emotiv API usage and will allow data from the EEG to be read in Python. By subscribing to the met stream, we are effectively subscribing to the performance metric data.

import json
import ssl
import time
import requests
from websocket import create_connection

# Open a secure WebSocket connection to the Cortex service.
receivedData = create_connection("wss://emotivcortex.com:54321",
                                 sslopt={"cert_reqs": ssl.CERT_NONE})

def setup():
    # Authorize with the Cortex API and pull the token out of the response.
    receivedData.send(json.dumps({
        "jsonrpc": "2.0",
        "method": "authorize",
        "params": {},
        "id": 1
    }))
    token = receivedData.recv()[43:-3]

    # Check which headsets are connected.
    receivedData.send(json.dumps({
        "jsonrpc": "2.0",
        "method": "queryHeadsets",
        "params": {},
        "id": 1
    }))
    print(receivedData.recv())

    # Open a session with the headset.
    receivedData.send(json.dumps({
        "jsonrpc": "2.0",
        "method": "createSession",
        "params": {
            "_auth": token,
            "status": "open",
        },
        "id": 1
    }))
    print(receivedData.recv())

    # Subscribe to the performance metrics ('met') data stream.
    receivedData.send(json.dumps({
        "jsonrpc": "2.0",
        "method": "subscribe",
        "params": {
            "_auth": token,
            "streams": [
                "met"
            ]
        },
        "id": 1
    }))
    print(receivedData.recv())

To see a more thorough breakdown of this code segment, see the explanation for the code of my cursor navigation project.

Next, we have to communicate with the serial port of the Arduino board. Similar to how JSON-RPC communicates between the EEG and the computer, the serial port communicates between the computer and the Arduino. You can see how these communication methods create a bridge for the data: from the EEG, to the computer, to the Arduino.

To use Python to communicate with the Arduino, we will have to install the pyserial package for our other file, client.py. Note that although the package is called pyserial, it is imported as serial.

import serial

We should also import the setup function that was defined in setup.py, along with the receivedData connection it creates, since our data-reading function will use it directly.

from setup import setup, receivedData
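The handshake also needs to run once before we start streaming. The exact call site may differ in the repository, but a minimal approach is to call it near the top of client.py:

# Authorize, open a session, and subscribe to the 'met' stream.
setup()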

Next, we need to create a serial communicator. Find the port that your Arduino is connected to (COMX on Windows) and initialize a serial object with it. In my case, the Arduino was plugged in at COM5. You can figure this out by going to Tools and then Port in the Arduino IDE.

ser = serial.Serial('COM5', 9600)
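On Linux or macOS, the port will look something like /dev/ttyACM0 or /dev/cu.usbmodem... rather than COMX. If you are unsure, pyserial can also list the available ports for you. This is an optional convenience, not part of the original project:

import serial.tools.list_ports

# Print every serial port the OS currently exposes; the Arduino
# usually appears with 'Arduino' or 'USB' in its description.
for port in serial.tools.list_ports.comports():
    print(port.device, '-', port.description)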

Then, we can define a function for getting data from the EEG. This function is very simple: it checks whether the received message contains metric data, and if so, returns it.

def get_value():
    # Read one message from the WebSocket and parse the JSON.
    data = json.loads(receivedData.recv())
    if 'met' in data:
        # Return the list of seven performance metric values.
        return data['met']
    else:
        # This message carried no metric data.
        return -1

Serial communicates via bytes. It must be remembered that LEDs, being diodes, only have two outputs: on or off (1 or 0). However, Emotiv returns each metric as a float between 0.0 and 1.0, so we must reduce it to a 0 or a 1. The way I do this is by tracking the user’s average for each metric over the course of the session: if the next reading is above the average, we output a 1; otherwise, we output a 0. There are many other ways to binarize this data (such as checking whether it increased or decreased since the last reading, or checking whether it is above or below a constant), but the right choice depends on what we are trying to measure. We can feel free to explore the options at this stage.
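As a sketch of those two alternatives (hypothetical helpers, not part of the project code):

# Fixed-threshold: on whenever the metric exceeds a chosen constant.
def above_constant(value, threshold=0.5):
    return 1 if value > threshold else 0

# Delta-based: on whenever the metric rose since the previous reading.
def increased(value, last_value):
    return 1 if value > last_value else 0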

The following code checks only the focus level of the user (hence why it indexes a single element, val[2], of the metrics list).

exit_flag = False
sum1 = 0
total = 0
avg = 0.5
while not exit_flag:
    val = get_value()
    if val == -1:
        # Skip messages that carried no metric data.
        continue
    print(val[2], avg)
    if val[2] > avg:
        # Metric is above its running average: turn the LED on.
        ser.write('1'.encode())
        print(1)
    else:
        ser.write('0'.encode())
        print(0)
    # Update the running average with this reading.
    sum1 += val[2]
    total += 1
    avg = sum1 / total

On the receiving end, in the Arduino IDE, we can use the following sketch to read what the Python serial communicator has sent and set the output accordingly.

int LED = 13;

void setup()
{
  pinMode(LED, OUTPUT);
  digitalWrite(LED, LOW);
  Serial.begin(9600);  // Match the baud rate used on the Python side.
}

void loop()
{
  int data = Serial.read();
  if (data == '1')
  {
    digitalWrite(LED, HIGH);
    delay(10000);  // Hold the LED on for ten seconds.
  }
  else if (data == '0')  // Ignore the -1 returned when no byte is waiting.
  {
    digitalWrite(LED, LOW);
  }
}

Alternatively, this code can check all the metrics of the user.

exit_flag2 = False
sumList = [0] * 7
total = 0
avgList = [0.5] * 7
while not exit_flag2:
    val = get_value()
    if val == -1:
        # Skip messages that carried no metric data.
        continue
    print(val, avgList)
    result = [''] * 7
    total += 1  # Count this reading once, for all seven metrics.
    for i in range(len(val)):
        if val[i] > avgList[i]:
            result[i] = '1'
            print(1)
        else:
            result[i] = '0'
            print(0)
        # Update the running average for this metric.
        sumList[i] += val[i]
        avgList[i] = sumList[i] / total
    # Send the seven digits one byte at a time over serial.
    for data in result:
        ser.write(data.encode())

And here is the receiving end in the Arduino IDE. Note that the serial connection cannot send a list directly. Instead, the Python side iterates through the list and sends its bytes one at a time. On the receiving end, the Arduino reads them one at a time, in sync, so that it can display all the contents of the list on the LEDs.

int lights[] = {13, 12, 11, 10, 9, 8, 7};

void setup()
{
  // Configure all seven LED pins as outputs, starting off.
  for (int i = 0; i < 7; i++)
  {
    pinMode(lights[i], OUTPUT);
    digitalWrite(lights[i], LOW);
  }
  Serial.begin(9600);  // Match the baud rate used on the Python side.
}

void loop()
{
  // Read the seven bytes in the same order the Python side sent them.
  for (int i = 0; i < 7; i++)
  {
    int data = Serial.read();
    if (data == '1')
    {
      digitalWrite(lights[i], HIGH);
    }
    else if (data == '0')  // A -1 (no byte waiting) leaves the LED unchanged.
    {
      digitalWrite(lights[i], LOW);
    }
  }
  delay(10000);
}

The result

Arduino with the breadboard. As before, only 5 of the 7 LEDs are shown.
Python console

Once we run the final result, we can see that the LEDs light up according to our metric data. The above visual shows the LEDs at the second input (0, 1, 1, 1, 1, 1, 1). The outputs on the console are the current metrics, followed by the average metrics, and then the binary digit determining whether the corresponding LED is on or off. Note that there are no other programs regulating the LEDs; it is just our program reading my brain waves.

Wrapping it up

When the first myths of telekinesis were being formed, whether by Hindu storytellers or French magazine artists, the idea was based not on existing technology but on the vision of these individuals. And had this idea not been formed beforehand, it is probable that brain-to-machine communication would never have been regarded as a scientific possibility.

In truth, this particular brain-computer interface project is not all that impressive. There are many more applications for BCIs in engineering that span far beyond ‘light operation interfaces’. As I always say, brain-computer interfaces are just another form of technology: in the same category as touchscreens, Arduino boards, and toaster ovens. It’s not really the technology behind BCIs that is revolutionary, but rather the technology that can be developed from it, fueled by vision. This LED project is not a leap of discovery but rather a step in the right direction towards much grander innovations, ones still inconceivable to modern tech-enthusiasts.

I plan on expanding this idea into even greater developments, both in software and hardware, and will continue to build new projects to play around with and share here on this platform. Until then, I once again hope that this project has inspired you to go ahead and build your own applications in neurotechnology.
