Hey Drivers, the U.S. Army Wants to Read Your Brain Waves
Man-machine interface, for the win!
by JORDAN PEARSON
Imagine you’re a soldier driving for hours down a stretch of empty, dusty road. You’re wearing a helmet packed with sensors that record your brain’s electrical signals and send them out into the ether for processing at a remote facility.
Meanwhile, a team of military specialists is monitoring your cognitive abilities in real time, based on an algorithm’s interpretation of your brainwaves. What they want to know is this — are you too sleepy to be driving down that long, unrelentingly uniform highway?
The U.S. Army’s Human Research and Engineering Directorate, which has the mission of improving “soldier-machine interactions in mission contexts,” recently sponsored and co-authored research that could lead to the above scenario — if the technology ever gets there.
A paper posted to the arXiv preprint server on Friday described an HRED-sponsored study that looked into using brainwave-reading tech, also known as electroencephalography, to monitor drivers’ drowsiness and how it affects their performance on tasks like changing lanes.
It’s worth noting that other researchers have investigated this scenario before, so the technology shouldn’t be thought of as only having military applications. One could imagine a rather disturbing scenario where long-distance truckers are required to have their brainwaves monitored, for example.
The basics of EEG involve a headset that records the brain’s subtle electrical signals, and an algorithm to interpret them. Usually, an accurate interpretation of what brainwaves mean only comes after the algorithm is trained using a particular person’s brainwave data. Brainwave patterns can be so individual and specific that they’re like a kind of fingerprint.
Once an algorithm can reliably interpret one person’s brainwaves, it may not be able to do the same for another person without being re-calibrated. That takes time and effort, so it’s not ideal for a scenario where you need a single EEG system to monitor numerous individuals at once.
In the study, the researchers developed a series of algorithms to interpret brainwaves based on their previous work, and tried them out on 16 different people strapped into a car in a testing chamber. The test subjects were asked to drive down a long, uniform highway and periodically switch lanes in a simulation. Their brainwaves were recorded during the test, and a video camera kept tabs on them.
The algorithms made use of something called “transfer learning,” which involves taking things an algorithm’s already learned to do and applying them to a new scenario. The results were “outstanding,” the researchers wrote, but there’s a lot more to be done in order to make the algorithms more accurate and faster.
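The paper doesn’t publish its code, but the core idea — reuse what a model learned from other people’s brainwaves, then make a small subject-specific adjustment instead of retraining from scratch — can be sketched in a toy Python/NumPy example. Everything here is an assumption for illustration: the data is simulated, the two “EEG features” are made up, and a nearest-centroid classifier stands in for the study’s real algorithms purely for simplicity.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_subject_data(alert_mean, drowsy_mean, n=50):
    """Simulate hypothetical 2-D EEG features for one subject."""
    alert = rng.normal(alert_mean, 0.3, size=(n, 2))
    drowsy = rng.normal(drowsy_mean, 0.3, size=(n, 2))
    X = np.vstack([alert, drowsy])
    y = np.array([0] * n + [1] * n)  # 0 = alert, 1 = drowsy
    return X, y

def fit_centroids(X, y):
    """Base model: one mean feature vector per class."""
    return np.stack([X[y == c].mean(axis=0) for c in (0, 1)])

def predict(X, centroids):
    """Assign each trial to the nearest class centroid."""
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1)

# Source subjects: pooled data trains the base model.
Xs, ys = make_subject_data(alert_mean=[1.0, 0.0],
                           drowsy_mean=[0.0, 1.0], n=100)
base = fit_centroids(Xs, ys)

# New subject: same alert/drowsy geometry, but features shifted by
# that person's individual "fingerprint".
Xt, yt = make_subject_data(alert_mean=[0.0, 1.0],
                           drowsy_mean=[-1.0, 2.0], n=50)

# Without transfer, the base model misreads the shifted subject.
acc_base = (predict(Xt, base) == yt).mean()

# Transfer step: estimate the subject-specific shift from a handful of
# labeled calibration trials and move the centroids accordingly,
# rather than collecting a full training set for the new person.
calib_idx = np.r_[0:5, 50:55]  # 5 alert + 5 drowsy calibration trials
shift = Xt[calib_idx].mean(axis=0) - Xs.mean(axis=0)
adapted = base + shift
acc_adapted = (predict(Xt, adapted) == yt).mean()
```

In this toy setup, the base model scores near chance on the new subject while the adapted one recovers high accuracy — the same motivation the researchers cite for transfer learning: skipping the lengthy per-person recalibration that standard EEG systems require.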
EEG isn’t a new science, and occupies a strange liminal zone between hackneyed crap — for example, the world of EEG video games — and hardcore research.
The fact is that we don’t know everything about how the brain works, and with EEG it can feel like a guessing game, or trial and error, to figure out what electrical signals mean and how to use them. There are also questions of practicality around driving while wearing what looks like an old-school swimming cap on your head.
But with every study, it seems like researchers are getting a bit closer. And so is the military — if self-driving cars don’t make human drivers obsolete first.
Originally published at Vice Motherboard.