By Josh Wolfe
We forget that technology, even at its most useful, can be alien, unnatural, and awkward. Under the illusion that technology serves us, we overlook that what technology really does is train us: to tap awkwardly placed buttons, turn dials, pull levers, swipe, and flick switches. With each new interface, we learn a new language to tell our technology what to do. In the process, we lose another shred of our humanity.
But what if instead of us conforming to technology, technology conformed to us? What if, in the future, technology was truly intuitive?
That day is today. CTRL-labs, based in New York, has created a transformative neural interface technology. The company, led by Thomas Reardon, creator of Internet Explorer and a neuroscientist, announced today that it raised $28 million in a Series A financing. My firm, Lux Capital, co-led the investment round with GV, alongside top-tier investors Vulcan Capital, Founders Fund, and the Amazon Alexa Fund, with participation from existing investors Spark Capital, Matrix Partners, Breyer Capital, and Fuel Capital. The round is joined by notable angels Tim O’Reilly, Slack founder and CEO Stewart Butterfield, Warby Parker CEO Dave Gilboa, Cyan Banister, Mike Slade, and others. This brings the company’s total financing to $39 million.
CTRL-labs’ technology is creating a future in which machines are natural extensions of thought and movement. That sounds crazy, but it’s real.
When we imagined a future through George Jetson, we got a lot wrong. Remember his job? He sat in front of a giant computer and pushed a button or two. The reality is going to be a lot cooler. Even Minority Report's futuristic sci-fi vision had Tom Cruise's character wearing gloves for gesture control, but didn't imagine tech advanced enough to read neurons and intuit gestures directly. CTRL-labs will connect your nervous system to your technology and allow your brain to send commands through the muscles in your arm. It's so sensitive you don't even have to move your arm or hand to make it work. The controller just "gets the idea."
It sounds like fiction, but it’s very real. The first time I tried this in the lab, it felt like I was experiencing real-life magic.
Here's how it works. A small band worn around your arm detects the waves of neurons firing and, much as Shazam converts a song into a digital fingerprint, turns them into a series of 0s and 1s. From those signals, machine learning can predict that when a signal spikes you are moving one finger, two fingers, or your entire hand, and precisely how you are moving it. Without a glove, you can turn your hand into a universal controller, mapping controls and commands to simple gestures. Imagine typing in free space without a keyboard, tapping two fingers to turn on Sonos, flicking a pinky to switch songs, making a tiny motion with your index finger to raise or lower the volume. You don't need a screen, a keyboard, a mouse, a switch, just your body.
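To make the pipeline above concrete, here is a minimal sketch in Python: digitize a signal into 0s and 1s, classify the pattern as a gesture, and dispatch the mapped command. Everything here is invented for illustration, including the thresholds, gesture names, and command strings; CTRL-labs' actual models and interfaces are not public, and a real system would use a classifier trained on labeled EMG recordings rather than a simple activity threshold.

```python
# Hypothetical EMG-to-command pipeline: signal -> bits -> gesture -> command.
# All values, labels, and mappings are illustrative, not CTRL-labs' actual design.
from statistics import mean

def digitize(signal, threshold=0.5):
    """Turn an analog-style sample stream into a series of 0s and 1s."""
    return [1 if s >= threshold else 0 for s in signal]

def classify(bits):
    """Toy stand-in for a learned model: the fraction of active
    samples decides which gesture was made."""
    activity = mean(bits)
    if activity > 0.6:
        return "fist_clench"      # large, whole-hand movement
    elif activity > 0.2:
        return "two_finger_tap"   # smaller movement
    return "pinky_flick"          # tiny movement

# Map each recognized gesture to a device command.
COMMANDS = {
    "two_finger_tap": "speaker.power_toggle",  # e.g. turn on Sonos
    "pinky_flick": "player.next_track",        # switch songs
    "fist_clench": "volume.up",
}

def handle(signal):
    """Full pipeline: raw samples in, device command out."""
    return COMMANDS[classify(digitize(signal))]
```

The point is only the shape of the flow: a continuous biological signal becomes digital, a model infers intent, and the intent is routed to any device you choose to map it to.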
You will be like the sorcerer’s apprentice, conducting the simple internet of things around you, as well as complex things such as robots and industrial machines.
Forget what you’ve heard about brain implants — that’s too invasive and, frankly, pretty scary. The future of brain-machine interfaces is non-invasive. Instead of surgical implants, CTRL-labs uses state-of-the-art signal detection and machine learning to read your neurons from outside the body. The first step will be technology precisely picking up the signals from inside your body to control devices outside of it with little more than natural gestures. The next step — and we are already closer than most people realize — will be reading the intention directly from your brain.
This transition to intuitive control is a trend that’s been a long time coming. I call it “the half-life of technology intimacy.” With each advancing period of time, technology becomes less visible, less inorganic, less unnatural and less distant.
From one shorter time period to the next, tech disappears into more natural human interfaces. We see this already happening with the rise of Alexa and Siri using our voices and mostly natural human speech to control things. But speech is limited and linear. It’s why we gesture and gesticulate, throwing our hands around to communicate our intentions and manipulate the world around us. Speech is also limited as a way to input our thoughts into computers or control our technology — try editing a picture, highlighting text or sketching a picture with your voice.
CTRL-labs is pioneering something entirely new at the intersection of biology and code. Reardon did this by assembling a rare cadre of scientists and technologists: PhDs in computational neuroscience and biomechanics paired with hackers and coders; experts in signal processing, machine learning, and human-computer interaction; and industrial designers.
Reardon is a tech prodigy. He grew up with 18 siblings and studied graduate-level math and science at MIT when he was still in high school. Soon after, he made a fortune working closely with Bill Gates at Microsoft to invent Internet Explorer. He moved on to help start and run other public and private tech companies, and then went back to school. He spent the next decade studying neuroscience and earned a PhD.
The economic implications of this technology's arrival are huge. Sure, old devices, controllers, and accessories will be obsolete, and gamers will love this, but that's just table stakes. Hundreds of billions of dollars of disruption and opportunity lie in entirely new applications for consumers and enterprises: controlling factory, warehouse, and industrial robots; unleashing productivity by unshackling people from keyboards and devices to work with dense information in ways that make Minority Report look like child's play. Fortunes made with universal operating systems installed on every desktop may pale in comparison to intuitive, universal controllers.
Stay tuned for more big announcements from CTRL-labs in the months and years ahead.