The Neurotechnology Age

Matt Angle
12 min read · Nov 26, 2019


Brain-computer interfaces exist today, are getting exponentially more powerful, and have a capacity for good that rivals other historic breakthroughs in technology and medicine.

Credit: Midnight Measure Design

Today our brains can only interact with the outside world through our bodies. They are encased in bone like a wartime command center with no windows and one door. This constraint has kept the brain safe from injury and tampering, but it has also made the brain difficult to repair when things go awry. Now brain-computer interfaces (BCI) are allowing people’s minds to plug directly into technology. The invention of a broadband ‘brain modem’ is a historic turning point in medicine and society. To better understand BCI, and to place it in context, here I compare it to several other historic technological breakthroughs and their lasting effects.

Brain-computer interfaces are often touted as the technology of the future, but, in truth, they have already arrived. Earlier this year, a completely paralyzed man walked for the first time since suffering a spinal cord injury, thanks to an exoskeleton controlled by electrodes implanted in his brain. More recently, similar technology was used to decode and reproduce speech from recorded brain activity. In ongoing clinical trials, paralyzed patients are using brain implants to control computer cursors and send text messages with their thoughts.

As astonishing and important as these early BCI demonstrations are, they represent only the very beginning of what is possible. In each of the cases above, the basic technology providing the readout of brain activity was invented decades ago. Brain decoding from this older generation of devices requires a heroic effort — every bit of data is sacred. In contrast, today we have technology capable of ‘reading from’ and ‘writing to’ the brain that is an order of magnitude more powerful. As these new devices find their way from the lab to the clinic, the implications are enormous, not just for medicine, but for society at large.

In addition to technical advancement, the last decade has also seen important advances in societal acceptance and adoption of neurotechnology. Specifically, doctors and patients are becoming increasingly comfortable with the idea of surgically implanting devices in the human brain. More than 150,000 people in the United States already have some form of therapeutic brain implant, mostly for treating Parkinson’s disease. The surgery to implant advanced BCI technology is no more invasive than the surgery to implant these legacy devices, meaning the barrier for prescription and patient adoption is lower than some might think.

As BCI capabilities and adoption increase, one cannot shake the feeling that we are about to step into a new technological era. Luckily, this will not be the first time that humanity has stood on the threshold of monumental advancement and successfully embraced innovation. One way of thinking about the future for BCI is to look back at other disruptive technologies and how they unfolded.

Below, I focus on four historical examples — the vaccine, the modem, the lens and the integrated circuit. Each provides a vantage point for understanding a different facet of brain-computer interfaces.

The Vaccine

Medicine can be safe, effective, and evidence-based, even without a complete understanding of the underlying biology.

The first vaccine was developed by Edward Jenner in 1796, 13 years before Charles Darwin was born, and at a time when biologists were still uncertain about the cellular nature of life. Jenner, observing that milkmaids who were regularly exposed to cowpox acquired immunity to deadly smallpox, used material from cowpox sores to inoculate patients against the far more virulent disease. His experiment worked, and vaccines saved millions of lives — all before scientists had any advanced knowledge of molecular immunology, genetics, or even biological cells. Arguably, no one really knew how vaccines worked until the 1970s, when Nobel Prize-winning scientist Susumu Tonegawa and others discovered the genetic basis for antibody diversity. It could also be argued that even today we do not completely understand all of the mechanisms of acquired immunity. Yet, if we had waited until the 1970s or later to start vaccination programs, many millions of lives would have been lost to otherwise preventable diseases.

With high-profile entrepreneurs such as Elon Musk and Mark Zuckerberg getting involved in the field of brain-computer interfaces, it is fair to say the technology is receiving its share of hype. Many scientists whose life’s work has been creating meaningful and lasting advances in BCI are justifiably concerned about managing expectations to ensure sustained public support. It is in this context that several leaders in the field have expressed, in some form or another, the following reservation: We do not understand how the brain works; therefore, we are far away from advanced BCI applications. I believe this sentiment, though rooted in good intentions, is both reactionary and misguided.

First, though we don’t yet have an all-encompassing theory of mind and consciousness, neuroscientists know a lot about the brain, neurons, and neural coding. We know more today about how the brain works than Jenner knew in 1796 about how the immune system works. Critically, we know a lot about the localization of sensory and motor functions within the cerebral cortex and how brain activity in those areas reflects their respective functions.

More importantly, building useful BCI technology does not require a perfect understanding of the brain. Arguably, it doesn’t even require a good understanding of the brain. We know how to record brain signals to control prosthetic arms and computer cursors. We know how to feed tactile data into the brain for prosthetic feedback. We know that stimulating the visual cortex has enabled blind patients to “see” visual patterns. While it is true that BCI could be enhanced by additional neuroscience research, waiting for a complete understanding of the brain before building BCI-based therapies would be as much a folly as if our predecessors had withheld the polio vaccine until the human genome had been sequenced.

The history of medical science is woven from two intertwined and mutually supporting threads. The first is the pursuit of a deeper understanding of biology, grounded in causal, mechanistic explanation. The second is a pragmatic, patient-centered practice of care, which is fundamentally outcome-oriented. Basic science drives better patient outcomes, of course, but many successful therapies and breakthroughs work by treating the underlying biology as a black box and concentrating on outcomes rather than mechanisms.

To be clear, I am not advocating for BCI innovators to run roughshod over decades of established norms with respect to scientific rigor and medical practice. BCI technology is powerful, and it must, therefore, be developed within a safe and ethical framework with buy-in from many stakeholders and experts. What I am advocating is an outlook on BCI that is outcome-oriented. We have an opportunity and a responsibility to help people today while also enabling the fundamental research that will better people’s lives tomorrow.

The Broadband Computer Modem

The growth of applications for a technology platform often responds to the platform’s capabilities in a highly non-linear way.

The history of the internet is also the history of the modem and other networking hardware. Although this communication infrastructure is seldom celebrated, it was essential in enabling the modern internet. To understand how internet usage evolved, we need to understand the foundations upon which the web was built.

One of the best examples of how insufficient hardware can impede market development is Netflix. Today, Netflix has 148 million users and accounts for 15 percent of all internet usage, which helps explain why the company is worth billions of dollars. And yet, when Netflix was founded in 1997, and for 10 years afterward, it used approximately zero percent of the internet.

Back then, Netflix was a company that mailed DVDs to people’s homes. Downloading a movie in its entirety was an hours-long endeavor, and streaming was unthinkable. Readers old enough to remember early Netflix will recall the anticipation of opening each envelope — not for the movie itself, but for the anxious inspection of the DVD for scratches and thumbprints. That was Netflix for 10 years.

Netflix didn’t begin its streaming service until 2007. The timing wasn’t arbitrary: it corresponded to a market threshold, the point at which enough households had enough download speed to support a streaming video model. In other words, the underlying infrastructure development enabled Netflix to create billions of dollars in value.
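The threshold is easy to see in back-of-the-envelope numbers. The short Python sketch below uses ballpark figures I have assumed for illustration (a 1.5 GB standard-definition film, a 2 Mbps playback bitrate, and typical link speeds for each era); the point is that download time falls smoothly with bandwidth, but streaming viability flips all at once when the link finally outpaces playback.

```python
# Back-of-the-envelope illustration of a bandwidth threshold.
# All file sizes, bitrates, and link speeds are assumed, ballpark figures.

def transfer_hours(file_gb: float, link_mbps: float) -> float:
    """Hours to download a file_gb-gigabyte file over a link_mbps link."""
    megabits = file_gb * 8 * 1000        # gigabytes -> megabits (decimal units)
    return megabits / link_mbps / 3600   # seconds -> hours

def can_stream(playback_mbps: float, link_mbps: float) -> bool:
    """Streaming is viable only when the link rate exceeds the playback bitrate."""
    return link_mbps >= playback_mbps

FILM_GB = 1.5         # assumed standard-definition film size
SD_STREAM_MBPS = 2.0  # assumed SD playback bitrate

for label, mbps in [("56k dial-up", 0.056), ("early DSL", 1.5), ("cable/fiber", 25.0)]:
    hours = transfer_hours(FILM_GB, mbps)
    print(f"{label:12s} {hours:8.2f} h to download, "
          f"streamable: {can_stream(SD_STREAM_MBPS, mbps)}")
```

On these assumed numbers, the dial-up era measures download time in days, early DSL in hours, and only the broadband era crosses the streaming threshold — a smooth capability curve producing a discontinuous jump in what applications are possible.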

Similarly, the evolution of smartphone applications is tied to the development of wireless data networks. The broad migration of internet applications from PC to mobile was largely driven by the rollouts of 2G, 3G, and 4G connectivity. Many valuable smartphone applications were conceptually obvious but gated by non-trivial hardware and data connectivity requirements.

The parallel between a brain-computer interface and a modem is easy to draw: BCIs are literally modems for the brain. This makes home and mobile internet usage useful case studies for how BCI applications might follow BCI capabilities. In both cases, new applications arose in bursts, driven by two non-linear phenomena. First, exponential growth in technical infrastructure arrived in well-punctuated, discrete steps. Second, a stable of known applications-in-waiting was gated by threshold performance criteria.

The conditions are ripe for a burst in BCI applications. We already see enhanced capabilities as an important leading indicator. The devices and systems being developed in academia, and at specialized companies such as Paradromics and Neuralink, are orders of magnitude more powerful than the BCI technology seen in the news today. We can also identify an existing, unmet market need: clear BCI use cases in prosthetics and bioelectronic medicine that are blocked by insufficient technological capability. Soon, next-generation BCI technologies will begin appearing in clinical trials, and we will see an explosion of new applications as significant and valuable as any internet application of the last decade.

The Optical Lens

Powerful enabling technologies lead to emergent markets that can dwarf their original application.

The most valuable technological innovations commonly appear in hindsight to have been purpose-built for their final applications. Often, this notion is the product of retrospective bias. Sometimes “necessity is the mother of invention”, but in other cases, problems and solutions arise independently and co-exist in plain sight before coming together in non-obvious ways. One of the most interesting examples is the lens, because it shows how an enabling technology can spread through different fields and have an impact far beyond its original use case.

The optical lens is one of the most pervasive and essential tools in modern society. It fundamentally changed the way that humans understand their place in the universe, and yet, the first lenses were essentially novelty items with no apparent practical value. The ancient Egyptians and Mesopotamians began polishing crystals because the result was beautiful and mesmerizing.

Gradually, people began to study the properties of lenses and discovered that they actually provided solutions to several long-standing human problems. The proper lens could make far away things look closer, make small things look bigger, and correct blurry vision. In Silicon Valley parlance, lenses were the ultimate “solution looking for a problem.” Spectacles for vision correction started appearing in the 13th century. The first modern telescopes and microscopes began appearing around the end of the 16th and the beginning of the 17th century.

By the 18th century, engineered optics were powerful, but it was still not obvious what the major impact of the lens would be. Telescopic lenses were being turned toward the stars and planets, expanding our view of the universe, increasing our sense of smallness and debunking assumptions about humanity’s own centrality. Microscopic lenses were peering inward, revealing the cellular structure of life and setting the stage for a revolution in biological and medical science. Spectacle lenses, meanwhile, simply helped people live their everyday lives with better clarity. In retrospect, all of these areas were important, and all were immensely valuable.

We are approaching an age of seamless data exchange between brains and computers. This capability allows us to reframe many long-standing conditions in solvable terms. Before corrective lenses, visual impairment was a permanent and irreversible problem; after the development of eyeglasses, vision correction became a solvable optics problem. Similarly, blindness and paralysis are currently considered untreatable and irreversible, but advanced BCI will recast them as tractable data problems. BCI will transport visual data to the brain, enabling blind patients to see. It will transport data from the motor cortex of paralyzed patients to computers and an entire ecosystem of computer-connected devices. Increasingly, data science and machine learning will become indispensable tools for neuropsychiatry, because the primary mode of understanding and interacting with the brain will be computational. These are just a few of the obvious medical applications of BCI, and speculation is already rampant about what else BCI could be used for, even outside of medicine. If past innovations such as the lens are any indication, perhaps the most fascinating uses of BCI have yet to be proposed.

The parallels between the lens and brain-computer interfaces are interesting to examine, but a key difference between the two technologies is timing. Both have applications in healthcare and basic science, and both challenge us to think differently about ourselves and the world we live in. The field of optics, however, unfolded over a span of centuries, in the pre-modern era. BCI is developing in an era of advanced technological infrastructure and exponential technological development. We will see emergent and expansive uses of BCI in the span of a decade.

The Integrated Circuit

New industries arise when a convergence of disparate ideas and people creates a whole much greater than the sum of its parts.

Today, the semiconductor industry is well established. In the early days of Silicon Valley, however, there wasn’t a cohesive sense of the field. There were no integrated circuit engineers in 1950. The pioneers of Fairchild, Texas Instruments, and later, Intel, were a motley crew of applied physicists, chemists, analog electrical engineers, industrial process engineers, and mathematicians. The integrated circuit revolution was based on contributions from all these disciplines. More importantly, it arose from interactions between these disciplines. The early social and technical ecosystem was thus a critical component in the formation of a new industry.

Much like the early integrated circuit field, neurotechnology is a nexus of disparate areas of expertise, where major gains occur at the systems level, fueled by cross-disciplinary interactions. The most impressive gains have not been made by lone scientists but by diverse groups. Many collaborative, multi-lab efforts are supported by government funding agencies such as DARPA, the Department of Defense’s advanced research agency, and the National Institutes of Health (NIH). One notable international collaboration, the Neuropixels team, has contributors from the Howard Hughes Medical Institute, the Allen Institute for Brain Science, University College London, and the international R&D innovation hub imec. Another, the BrainGate2 clinical trial program, involves scientists and clinicians from Brown University, the Department of Veterans Affairs, Massachusetts General Hospital, Stanford University, and Case Western Reserve University. What all these efforts have in common is a breadth of expertise that spans traditional disciplinary boundaries. The private companies currently leading the next wave of BCI development are also academically and experientially diverse.

My own team at Paradromics includes scientists, engineers, and managers from the fields of applied physics, materials science, integrated circuit design, digital design, machine learning, high-speed data acquisition, and medical device design. In my experience, innovation occurs most profoundly at the interface between working groups: Our ability to function as a team has been more predictive of systems-level improvements than any one individual contributor’s raw genius.

In other words, BCI is a team sport. I would go one step further, however, and say that BCI is a league sport, and if we want to benefit from the positive example set by the early semiconductor industry, we should recognize that an open, thriving ecosystem with many successful players is the best way to accelerate progress.

BCI in the Broader Context

To better understand BCI, I have highlighted four transformative technologies that span generations, have saved hundreds of millions of lives, and have created trillions of dollars in value. The brain-computer interface will inevitably take its place among these technological innovations, not because of the merits of any particular device or product, but because BCI will transform the brain from an inaccessible and irreparable organ into an addressable and serviceable system. It will allow us to pit our most powerful technologies against our longest-standing enemies — injury and illness.

Throughout recorded history, humans have struggled with untreatable conditions. Our species has looked forward to a better time, when “the eyes of the blind [will] be opened and the ears of the deaf unstopped.” Those words were first recorded in the Hebrew Bible in the 8th century BCE, but they speak to an aspiration that is universal and still relevant. Now, at this moment, medical technology is close to providing profound new device-based therapies that will fulfill these long-standing aspirations.

BCI has recently captured the popular imagination as a futuristic gateway to the unknown, but the future of BCI is more knowable, more grounded, and more exciting than many would speculate. By making the brain accessible to technical solutions, BCI will tackle unmet human needs that predate civilization. This strikes me as the rawest and most potent form of technical innovation, and it is exactly the type of endeavor worth investing in.

Matt Angle is the CEO of Paradromics. If you are interested in learning more about brain-computer interfaces and Paradromics technology, make sure to follow us on Medium, Twitter (@Paradromics) and Facebook.