Students Design a Device to Allow Paralyzed Patients to Communicate With Their Minds

Unexplored Neuroscience
5 min read · Oct 5, 2020

Many neurological disorders impair an individual’s motor abilities, including speech. In Locked-In Syndrome, for example, the patient is left totally paralyzed except for their eyes, yet their cognitive abilities are preserved. Current technologies allow these patients to communicate using eye movements: they spell out words and sentences letter by letter by blinking or moving their eyes up or down. More advanced methods use eye-tracking technology.

While those are remarkable feats, eye-gaze communication has limitations. It requires focused attention, which causes fatigue over extended use, and it is not accessible to all patients: in the total form of Locked-In Syndrome, not even eye movement is retained.

This is similar to Akinetic Mutism, where the individual cannot move or talk despite being awake and aware. They are cognitively sound yet unable to communicate with others, trapped inside their minds.

How do we overcome this problem?

We have designed a communication device that would allow the user to communicate basic needs and emotions by thought alone, bypassing any need for intensive eye movement.

How Does the Product Work?

We plan to create a wearable device that records activity in Broca’s Area, the region of the brain involved in producing speech, and generates text from the patterns it picks up. Neurons process information through electrical pulses, and these specific patterns of electrical activity, if decoded, reveal what the neurons are doing and what instructions they are sending to the body. Even when paralysis prevents the body from carrying out those commands, the commands are still generated. Our product will learn to decode these messages with the aid of artificial intelligence and machine learning, trained first on control subjects. Once experimental trials show that the device can accurately decipher speech generated in Broca’s Area and present it to others, either as text on a computer screen or as machine-generated speech, we will use it with patients such as those described above so they can communicate and engage with the world as they wish.

Source: Illustration of Broca’s Area localization and functions by Gary Ferster
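
To make the decoding step more concrete, here is a minimal sketch of the idea, not the actual device software: feature vectors summarizing short windows of recorded activity (for example, band-power features) are mapped to a small vocabulary of intended words with an off-the-shelf classifier. The data, vocabulary, and feature dimensions below are all simulated and hypothetical.

```python
# Hypothetical sketch: classify simulated "brain activity" feature windows
# into a small vocabulary of intended words.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

VOCAB = ["water", "pain", "yes", "no"]    # toy vocabulary of intents
N_WINDOWS, N_FEATURES = 400, 32           # e.g. band-power features per recording window

# Simulated stand-in for preprocessed recordings: each intended word gets a
# slightly different mean feature vector, mimicking class-specific activity.
y = rng.integers(0, len(VOCAB), size=N_WINDOWS)
class_means = rng.normal(0, 1, size=(len(VOCAB), N_FEATURES))
X = class_means[y] + rng.normal(0, 1.5, size=(N_WINDOWS, N_FEATURES))

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
pred = clf.predict(X_test)
print(f"held-out accuracy: {accuracy_score(y_test, pred):.2f}")
print("example decoded word:", VOCAB[pred[0]])
```

A real system would replace the simulated features with signals recorded from the user and would likely need far richer models, but the training-then-decoding loop would follow this general shape.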

The Audience

The device is targeted at patients who are unable to communicate even though their cognitive abilities remain intact, leaving them trapped in their minds and isolated from the rest of the world. Many of these patients wake up from a coma into this condition and go undiagnosed for months or even years. They are also often misdiagnosed as being in a vegetative state, a diagnosis with serious legal and medical consequences in the United States and other countries. Those who are diagnosed and given access to eye-tracking communication are 80% more likely to be alive ten years later than those who are not, and they report having meaningful and fulfilling lives: they can engage with friends and family and take part in their own healthcare decisions. Yet this option is not accessible to all. We aim to bring that level of autonomy, grace, and dignity to those who currently have no alternatives.

The Team

This project will require a diverse team from an array of disciplines: neuroscientists; computer scientists with experience in machine learning and artificial intelligence, who will train the system to recognize, decode, and interpret speech patterns generated in the brain and to improve its own algorithms; and electrical and/or biomedical engineers to build the devices.

Tests

Tests will use training and testing data from patients with locked-in syndrome who are already familiar with eye-gaze communication, so we can assess the sensitivity and specificity of our classification algorithms by comparing their output with the patients’ own reports. Success will be estimated with both quantitative and qualitative measures. Quantitatively, we will compare the sensitivity and specificity of our model, as well as the time needed to convey simple ideas, against current alternatives in nonverbal communication. Qualitatively, we will survey participants on the interface’s navigability and on the perceived change in communication when using our device; a sketch of the quantitative comparison follows.
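
As a rough illustration of the quantitative evaluation, the sketch below computes per-word sensitivity and specificity from a confusion matrix, comparing what the device decoded against what the patient confirmed they meant. The labels and data here are hypothetical placeholders.

```python
# Hypothetical evaluation sketch: per-class sensitivity and specificity of
# decoded outputs against patient-confirmed reports.
import numpy as np
from sklearn.metrics import confusion_matrix

# Toy data: what the patient confirmed they meant vs. what the device decoded.
reported = np.array(["yes", "no", "pain", "yes", "no", "water", "pain", "yes"])
decoded  = np.array(["yes", "no", "pain", "no",  "no", "water", "yes",  "yes"])

labels = ["yes", "no", "pain", "water"]
cm = confusion_matrix(reported, decoded, labels=labels)

for i, label in enumerate(labels):
    tp = cm[i, i]                  # decoded as this word when it was meant
    fn = cm[i, :].sum() - tp       # meant but decoded as something else
    fp = cm[:, i].sum() - tp       # decoded as this word when it was not meant
    tn = cm.sum() - tp - fn - fp   # everything else
    sensitivity = tp / (tp + fn) if (tp + fn) else float("nan")
    specificity = tn / (tn + fp) if (tn + fp) else float("nan")
    print(f"{label:>6}: sensitivity={sensitivity:.2f}, specificity={specificity:.2f}")
```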

The Next Questions

While testing prototypes, we will reflect on the next steps to refine our product. For now, apart from success rates, we must consider the cost and time required to build the electrode sensors, as well as the invasiveness of the technology (wearable vs. implanted).

This series features Neuro-Innovation projects created by members of the Unexplored Neuroscience community, with the goal of exploring the intersection between Neuroscience & Innovation within different branches of Neuroscience.

This project was created by Zenab Louche and Amanda Portugal during our community Think Tank in our month of exploring Cognitive Neuroscience.

Zenab Louche is an undergraduate student studying brain science and pursuing a career in neuroscience research. She is interested in neurotechnology, systems neuroscience, neuroethics, and sensation and perception.

Amanda Portugal is a sophomore at Minerva Schools at KGI and a passionate lifelong learner. Her curiosity and fascination with the brain guide her investigations and will hopefully lead to a career in research.

You can contact Zenab at zenab.louche@gmail.com, and Amanda at amandaportugal@minerva.kgi.edu.

References

  1. Laureys, S., Pellas, F., Van Eeckhout, P., Ghorbel, S., Schnakers, C., Perrin, F., Berré, J., Faymonville, M. E., Pantke, K. H., Damas, F., Lamy, M., Moonen, G., & Goldman, S. (2005). The locked-in syndrome: what is it like to be conscious but paralyzed and voiceless?. Progress in brain research, 150, 495–511. https://doi.org/10.1016/S0079-6123(05)50034-7
  2. Locked-In Syndrome. (2018, March 12). Retrieved October 02, 2020, from https://rarediseases.org/rare-diseases/locked-in-syndrome/
  3. Silversmith, D. B., Abiri, R., Hardy, N. F., Natraj, N., Tu-Chan, A., Chang, E. F., & Ganguly, K. (2020). Plug-and-play control of a brain–computer interface through neural map stabilization. Nature Biotechnology.
  4. University of Helsinki. (2020, June 17). Brainsourcing automatically identifies human preferences. ScienceDaily. Retrieved October 2, 2020 from www.sciencedaily.com/releases/2020/06/200617150003.htm

Unexplored Neuroscience

A community dedicated to exploring the unexplored spaces in our understanding of Neuroscience.