EEG, Interface Design, And You

Derek Palmer · Published in RE: Write · Feb 18, 2020

Photo by Sebastián León Prado on Unsplash

Have you ever tried to move an object with your mind? Ever imagined you were a character in Star Wars as a child? Maybe you reached out towards an apple or an empty glass and tried to feel it through the air. Maybe you really did feel something. Whatever you did, it felt like something.

We’re getting closer to EEG tech that works as a mass-market interface. Emotiv’s products can adjust interface sensitivity based on your emotional state, and they can read mental commands for the cardinal directions, letting you steer the wheelchair you’re sitting in with your mind.
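
To make that concrete, here’s a minimal sketch of the kind of control loop involved. It isn’t Emotiv’s actual API; the command labels, the stress metric, and the scaling rule are all assumptions for illustration: a detected mental command becomes a direction, and an emotional-state reading damps how aggressively the chair responds.

```python
# Hypothetical sketch, not a real device API: map a detected mental
# command to a wheelchair velocity, with sensitivity damped by an
# EEG-derived "stress" metric (0.0 = calm, 1.0 = stressed).

DIRECTIONS = {
    "push": (1.0, 0.0),   # forward
    "pull": (-1.0, 0.0),  # backward
    "left": (0.0, -1.0),
    "right": (0.0, 1.0),
}

def drive_command(mental_command: str, stress: float, max_speed: float = 1.0):
    """Return (forward, turn) velocities, gentler when the user is stressed."""
    if mental_command not in DIRECTIONS:
        return (0.0, 0.0)  # unrecognized command: stop
    sensitivity = max_speed * (1.0 - 0.5 * min(max(stress, 0.0), 1.0))
    forward, turn = DIRECTIONS[mental_command]
    return (forward * sensitivity, turn * sensitivity)

# A calm user thinking "push" moves at nearly full speed;
# a stressed user gets a gentler response to the same thought.
print(drive_command("push", stress=0.1))  # ≈ (0.95, 0.0)
print(drive_command("push", stress=0.9))  # ≈ (0.55, 0.0)
```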

Pair that with VR/AR that lets a user manipulate a digital object through gestures, and we may have a digital third hand.

All of this is essentially based on the input and desires of the user. It’s responsive, but there’s no reason we can’t go in another direction.

People design experiences for specific brands all the time. Nike should feel a certain way, Apple another way, Google a third way. The dream is a brand experience distinct enough that you know what you’re interacting with without ever seeing the brand itself. Like recognizing a Coca-Cola while blindfolded.

It’s a fact that certain postures and motions can change our internal states. Maybe it’s time to start marrying the emotional content of gestures with the neurological feedback of EEG and the outcome-driven nature of gestural interfaces, getting granular enough to choose the right kind of finger pointing, for example.
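
As a thought experiment in code (every label and threshold below is invented, not drawn from any real system), the pairing might look like this: estimate the user’s arousal from EEG, compare it to the emotional tone the interaction is meant to produce, and surface the gesture variant that nudges them toward it.

```python
# Hypothetical sketch: pick a gesture variant based on the gap between
# the user's EEG-estimated arousal and the arousal the interaction targets.
# Gesture names and thresholds are invented for illustration.

GESTURE_VARIANTS = {
    "raise": "sharp index-finger point",  # energizing
    "hold": "neutral index-finger point",
    "lower": "soft open-palm point",      # calming
}

def choose_gesture(measured_arousal: float, target_arousal: float,
                   tolerance: float = 0.1) -> str:
    """Pick the gesture whose emotional direction closes the gap."""
    gap = target_arousal - measured_arousal
    if gap > tolerance:
        return GESTURE_VARIANTS["raise"]
    if gap < -tolerance:
        return GESTURE_VARIANTS["lower"]
    return GESTURE_VARIANTS["hold"]

# An over-stimulated user pointing at a target gets prompted toward
# the calmer variant of the same gesture.
print(choose_gesture(measured_arousal=0.8, target_arousal=0.4))
```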

At the very least, it’d be interesting to really know all the permutations of how it feels to move things with your mind.
