Hold That Thought: A Tangible Gateway to the Brain

Benjamin Roop
Published in wpihci · May 18, 2021

A group in France has developed a doll named Teegi whose head lights up based on a user’s brain activity.

For many students, or anyone interested in studying the brain, early lessons come with a potentially surprising reality: functional specialization, the fact that different brain regions perform different functions. For example, the back of the brain’s cortex (the occipital lobe) is responsible for vision, while the central portion of the cortex controls movement. Students generally learn about functional specialization through textbooks or other 2D visual media. A new tool, however, makes discovering the brain a tangible task.

Electroencephalography (EEG) has long been used to noninvasively measure and record brain activity. The technique involves placing recording electrodes on the surface of the scalp, where they detect the electrical activity of neurons in the brain’s cortex. Different areas appear active depending on what the user is doing, and different brain-wave frequencies correspond to different kinds of tasks. For example, sensorimotor activity generally appears at 16–24 Hz, whereas visual activity presents at 8–12 Hz. Traditionally, EEG output reaches the user as a series of signal traces that require substantial training to interpret, but the technique described in this paper makes interpreting EEG easier, tangible, and interactive with a tool known as Teegi.
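To make the idea of frequency bands concrete, here is a minimal sketch (in Python, using a synthetic signal and the band edges quoted above; this is not the authors’ processing pipeline) of how power in a given band can be estimated from a single EEG channel:

```python
# A minimal sketch (not the authors' pipeline) of estimating band power from a
# single EEG channel. The synthetic signal, sampling rate, and helper below are
# assumptions for illustration only.
import numpy as np
from scipy.signal import welch

def band_power(signal, fs, low, high):
    """Estimate the power of a 1-D signal in the [low, high] Hz band."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)   # power spectral density
    mask = (freqs >= low) & (freqs <= high)             # keep only the band of interest
    return np.sum(psd[mask]) * (freqs[1] - freqs[0])    # approximate integral over the band

# Example with synthetic data: 4 seconds of noise sampled at 256 Hz.
fs = 256
eeg_channel = np.random.randn(4 * fs)

alpha = band_power(eeg_channel, fs, 8, 12)    # 8-12 Hz, associated with visual activity
beta = band_power(eeg_channel, fs, 16, 24)    # 16-24 Hz, associated with sensorimotor activity
print(f"alpha power: {alpha:.3f}, beta power: {beta:.3f}")
```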

Instead of representing the neural signals as traces, the Tangible EEG Interface (Teegi) lets the user hold a doll-sized character whose “brain” lights up in real time based on the user’s own brain activity recorded with EEG, turning the signals into a spatial, 3D display. Teegi’s lights grow more intense and change color depending on how active a brain region is. For example, the back of Teegi’s head lights up intensely when the user is looking at something, because vision is processed in the occipital lobe; those same lights dim when the user’s eyes are closed and no visual stimuli are being processed. As a fun added feature, Teegi has illuminated eyes that open and close in sync with the user’s own eyes (thanks to specific EEG signatures that correspond to blinking). Teegi also lights up in response to movement. If the user shakes their hips, the region in the very middle of Teegi’s head lights up, because that is the part of the motor cortex that controls the hips; wiggling the tongue lights up the sides of Teegi’s head (approximately above where Teegi’s ears should be), because those lateral regions control the mouth and tongue. Students can thus discover through experience the specific spatial layout of the motor cortex, all with a simplicity and elegance Wilder Penfield would have marveled at.
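As a rough illustration of the kind of mapping such a display performs, here is a hypothetical sketch that turns a region’s band power into an LED color. The baseline values, region names, and set_led() callback are assumptions made for illustration, not Teegi’s actual implementation:

```python
# A hypothetical sketch of the mapping a Teegi-style display performs: turning a
# region's band power into an LED color. The baseline values, region names, and
# set_led() callback are illustrative assumptions, not Teegi's actual code.
import numpy as np

BASELINES = {"occipital": 4.0, "motor_midline": 2.5, "motor_lateral": 2.0}  # assumed resting-state power

def activity_to_color(power, baseline):
    """Map band power to an (R, G, B) color: dim blue at rest, bright red when active."""
    level = float(np.clip((power - baseline) / baseline, 0.0, 1.0))  # 0 = resting, 1 = strongly active
    return (int(255 * level), 0, int(255 * (1 - level)))

def update_region(region, power, set_led):
    """Send the color for one head region to the doll's LEDs via a caller-supplied set_led()."""
    set_led(region, activity_to_color(power, BASELINES[region]))

# Example: pretend occipital band power just doubled because the user opened their eyes.
update_region("occipital", 8.0, set_led=lambda region, rgb: print(region, rgb))
```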

A given type of brain activity rarely occurs in isolation. When a person reaches out to pick up a cup or write a note, the brain’s motor region activates to produce the motion, but the visual cortex activates too because the person watches what they are doing. For a student learning about functional specialization with Teegi, this could cause confusion over which illuminated brain region corresponds to which action. To address this, the inventors of Teegi included a feature that lets the user control what type of activity Teegi lights up for. Each of three mini-Teegis corresponds to one of three processes the user might perform: movement, vision, or meditation (because there are specific brain patterns associated with meditation too!). If the user places one of these mini-Teegis on a designated control region of the tabletop setup built for testing Teegi, the big Teegi will light up only in response to the process represented by the selected mini-Teegi. For example, if the user selects the vision mini-Teegi and then wiggles their fingers in front of their face, only the visual region on the back of Teegi’s head will light up as the user watches their fingers move; the motor regions controlling the fingers will stay dark, because signals for every process but vision have been filtered out. (A sketch of this selection idea follows the figure below.)

Different regions on Teegi’s head light up in response to specific brain activities. By selecting a mini-Teegi for a particular process (shown in the inset images), a user can set Teegi to illuminate only for activity corresponding to that process. Taken from Frey et al., 2014.
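The selection mechanism can be thought of as a simple filter on which process is allowed to drive the display. The sketch below is a hedged illustration of that idea, with process names and activity values invented for the example; it is not the paper’s code:

```python
# A hedged sketch of the mini-Teegi selection idea: only the process whose
# mini-Teegi sits on the control region is allowed to drive the lights.
# Process names and activity values are invented for this example.
def filter_activity(activity, selected=None):
    """Keep only the selected process; pass everything through if nothing is selected."""
    if selected is None:
        return activity
    return {name: value for name, value in activity.items() if name == selected}

# The user placed the vision mini-Teegi on the control region,
# then wiggled their fingers in front of their face.
activity = {"movement": 0.7, "vision": 0.9, "meditation": 0.1}
print(filter_activity(activity, selected="vision"))   # only the visual region lights up
```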

Teegi offers many other learning features beyond those highlighted here. Live EEG traces can be projected onto the tabletop to show the complex activity being summarized by Teegi, and users can hold a model of Teegi’s brain to observe the activity directly on the cortical surface. Users involved in the experimental testing of Teegi not only enjoyed the process but also reported having learned something about the brain and EEG. Future trials with Teegi are in the works with students and at science museums, so be on the lookout for Teegi coming to a neighborhood near you (and remember that the back of your brain is active while you’re looking out)!

For more information on Teegi, check out the full paper:

Frey, J., Gervais, R., Fleck, S., Lotte, F., & Hachet, M. (2014). Teegi: Tangible EEG interface. In Proceedings of the 27th Annual ACM Symposium on User Interface Software and Technology (UIST ’14), pp. 301–308.
