Header image by Gordon Freeman on Nightcafe

Quantum Computing & the Future of Neural Interfaces

Understanding the complexity of some 86 billion neurons takes some finesse. ⚡️

Tanvi Reddy

Neurons are not binary creatures. In fact, they operate far, far beyond any black-and-white or simplified interpretation. It’s never just 0 or 1 — it’s more like everything everywhere all at once: an intricate network of enigmatic electrochemical signals spread across billions of nodes that we can only hope to fully understand someday. The brain is so unpredictable an organ that it’ll demand decades of patience, high-resolution imaging & signal collection, and powerful algorithms before we could possibly decode and encode every neuronal correlate in a single brain, without even accounting for individual biological differences — and we may never get there.

However, a problem that seems this intractable often doesn’t stay that way. I found myself wondering: could the way we approach any of the three parts of a brain-computer interface system (the brain, the computer, and the feedback loop between them) hold untapped potential that, if harnessed, could level up the entire field?

As always, there was more to be explored. And I found something pretty cool. There’s a synergy we can create between this field and another one at the bleeding edge of innovation; an intersection that could make it far more feasible to decode the brain’s mysteries, a dream pursued by every neurotechnologist out there.

Quantum computing.

Which requires a lot of explanation (as do BCIs; if you’d like a primer on those, check out a previous article here). I’ll offer brief overviews of each, but if you’d like to learn the basics of quantum before reading on, check out this video by Kurzgesagt, and come right back.

This article explores the limitations of BCIs, how quantum computing could be applied to solve them, and the implications of this intersection for neurotech as a whole.

Navigate:

The Barrier to the Brain

Say Hello to QC

A Hypothetical Architecture

Limitations

Conclusion

Let’s go!!

The Barrier to the Brain

As a brief overview, brain-computer interfaces (BCIs) connect the activity and signals in the central nervous system (made up of the brain and spinal cord) to computing devices able to interpret and transform them into a machine’s output, which can then be used to restore, replace, or even enhance natural neurological function. Intelligent BCI technology utilizes this functional pathway: 1] physical hardware that receives signals from the brain → 2] algorithms that process and interpret the data into correlates and commands → 3] generation of output/action in a device or machine.

My favorite schematic of how brain signals turn into commands and outputs. Image: IntechOpen
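
To make that three-stage pathway a little more concrete, here’s a bare-bones Python sketch of the loop. Every function name, threshold, and “command” here is a hypothetical placeholder for illustration, not a real BCI API.

```python
import numpy as np

# Stage 1: hardware acquires raw neural signals (stubbed with random "EEG")
def acquire_signals(n_channels: int = 8, n_samples: int = 256) -> np.ndarray:
    return np.random.randn(n_channels, n_samples)

# Stage 2: algorithms interpret the data into a correlate/command (toy rule)
def decode(signals: np.ndarray) -> str:
    return "clench" if (signals ** 2).mean() > 1.0 else "relax"

# Stage 3: the command drives an output device (here, just a print statement)
def actuate(command: str) -> None:
    print(f"device received command: {command}")

actuate(decode(acquire_signals()))
```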

While revolutionary, most current BCIs are still only capable of simplistic conclusions — determining if a person is calm or surprised, picturing blue or red, clenching or relaxing their jaw, etc. They lack nuance, and understandably so. It’s extremely difficult to instantaneously categorize the subtleties of billions of different neuronal signals and their hidden meanings, simply because a nearly infinite number of possibilities, many of them seemingly equally likely, exist all at the same time. On top of that, some other specific obstacles are preventing the field from collectively advancing:

  1. Struggle for Signal Quality
    Environmental factors (like electromagnetic interference or movement artifacts) create lots of unnecessary noise that degrades the quality of acquired signals, meaning more advanced algorithms are needed to filter out the noise and home in on the signals that actually matter.
  2. Variability in Brain Signals Across Individuals
    Brain signals vary significantly from person to person, and even within one person as they age, putting a huge damper on reliability and consistency until BCIs can scale to accurately interpret these highly variable signals across a large number of users and sessions.
  3. Inaccurate Artifact Removal
    To effectively filter data, BCIs commonly rely on something called ICA, or Independent Component Analysis, to remove useless artifacts (i.e. unwanted signals & noise in the data) — but this method isn’t perfect. The technique can miss temporal or spatial structure across the brain’s neuronal network and mistakenly filter out important information (see the sketch after this list).
  4. Slow Data Transfer Rate
    Latency, biocompatibility, connectivity, lengthy artifact filtering, and human factors all limit the ITR (information transfer rate) of BCIs, an issue that plagues BCIs of every level of invasiveness (i.e. an invasive BCI with eroding electrodes degrades in efficacy just as a noninvasive headset with limited pure signal acquisition does).
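
On point 3 above: here’s a minimal sketch of what classical ICA-based artifact removal looks like, using scikit-learn’s FastICA on synthetic data. Real pipelines use dedicated EEG tooling and much smarter component selection than the naive correlation check below; treat this as an illustration of the idea, not a recipe.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n_samples, n_channels = 2560, 8
eeg = rng.standard_normal((n_samples, n_channels))        # stand-in for recorded EEG
eeg[:, 0] += 5 * np.sin(np.linspace(0, 60, n_samples))    # inject a blink-like artifact

# Unmix the recording into statistically independent components
ica = FastICA(n_components=n_channels, random_state=0)
sources = ica.fit_transform(eeg)

# Flag the component most correlated with the artifact-carrying frontal channel,
# zero it out, and rebuild the channels without it
corr = np.abs([np.corrcoef(sources[:, k], eeg[:, 0])[0, 1] for k in range(n_channels)])
sources[:, int(np.argmax(corr))] = 0
cleaned = ica.inverse_transform(sources)
print("cleaned EEG shape:", cleaned.shape)                # (2560, 8)
```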

Why does this matter?

The acquisition and processing of brain signals in BCIs involve multiple complex stages that we’re navigating with simple AI algorithms at the moment. The truth is, though, that the most profound use cases of current and future BCI innovation will demand better systems to rigorously analyze neural data with high functional dimensionality. When it comes to such weighty applications (think cortical visual prosthetics, or a leg brace aiming for maximum naturalization), a BCI’s computing speed and accuracy can’t fall short if we’re aiming for the best patient outcomes.

The Opportunity here, explained in an analogy

Look at the traditional BCI as an extraordinarily powerful microphone in the middle of a football stadium. It can hear basically every single noise made in the whole stadium, from people cheering to munching on hotdogs to laughing. So it hears the overall noise of the masses really well — but you can’t use it to pick out exactly what the person in seat 66B is saying to their friend.

What you need is to be able to record and decode what specific noise each and every person in that stadium is making, and do it as fast as the noises are being made. In the same way, a higher-performing BCI would need the ability to rapidly & accurately listen to and understand all the possible meanings of each neuronal signal as it comes, allowing for the most accurate real-time control of any device.

That’s where a quantum computer would come in. It would have the capacity to understand each and every neuronal impulse out of billions and decode them as they occur to allow for the most naturalized use of any system that can’t afford to lag behind the brain, be it a bionic eye or a walking exoskeleton.

Say hello to QC!

Quantum computing, in short, leverages the principles of quantum mechanics to perform certain computations far more efficiently than any classical computer we’ve produced to date. Unlike traditional computers that use bits to process information (where each bit represents a binary value of either 0 or 1), quantum computers use…you guessed it, quantum bits (qubits)!

While classical bits operate in binary (either 0 or 1), qubits are allowed to have the best of both worlds.

Qubits can exist in a state of superposition, meaning they can represent both 0 and 1 simultaneously. This allows quantum computers to perform certain complex calculations far faster than classical computers, letting them tackle more difficult, nuanced problems thanks to their ability to explore multiple paths at the same time.
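
If you like seeing the math, here’s a tiny numpy sketch of what superposition means for a single qubit. It just simulates the state-vector arithmetic on a classical machine; nothing here runs on real quantum hardware.

```python
import numpy as np

# A classical bit is 0 or 1. A qubit is a 2-component vector of amplitudes;
# the squared amplitudes give the probabilities of measuring 0 or 1.
ket0 = np.array([1.0, 0.0])                      # the state |0>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)     # the Hadamard (H) gate

qubit = H @ ket0                                 # put the qubit into superposition
print("amplitudes:   ", qubit)                   # [0.707 0.707]
print("probabilities:", qubit ** 2)              # [0.5 0.5] -- both outcomes at once
```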

A little analogy: think of a classical computer solving a problem as a rat running through a maze, trying every path until it finds the solution. On the other hand, a quantum computer ‘rat’ would explore multiple paths at the same time, greatly speeding up the process of finding the solution.

Now use that same analogy in the context of the brain. There exist billions of possibilities for which brainwave signal corresponds to which motor action, spoken word, emotion, disorder, or cognitive state — the list of possible neuronal correlates never ends, to the extent that a traditional computer can’t really go beyond drawing broad, binary conclusions, and therefore doesn’t even attempt to interpret a lot of untapped information.

So you can imagine, in that huge labyrinth of possibilities, that having the parallel processing ability to explore multiple paths at the same time (like our quantum rat) would grant our algorithms exponentially higher speed, efficiency, and predictive power than ever before.

Which sounds awesome, but let’s explore a proposed way it could actually work!

A Hypothetical Architecture

At a high level, a quantum BCI’s process would look like this:
Electrodes → raw EEG data → conversion into logical expressions → processing through a quantum circuit → predicting the neuronal correlates → generating commands/outputs.

Here’s an explanation of what would underlie this process, constructed from mini-hypotheticals based on a simulation on a virtual quantum machine at the University of Plymouth (linked in sources).

“The Logic of the Mind”

In 1998, Hellmuth Petsche and Susan Etlinger argued that states of mind could not be accurately inferred from a single snapshot of the averaged EEG taken across a whole set of electrodes. The idea is that mental states aren’t static or fixed; they evolve and change over time, which demands analysis of how different brainwave frequencies (or spectral components) interact with each other at different locations on the scalp. To research this, they cooked up the concept of the EEG-based logic of the mind, proposing that those nuanced interrelationships could be expressed as logical expressions.

Take a system like the one above that acquires EEG signals (let’s say beta and alpha waves) from multiple electrodes at specific time intervals, like every 500 ms. This system tracks the behavior and power of these two brainwaves over different electrode points across time, identifying the locations where each brainwave rhythm projects most powerfully. Then, at time tₙ, it encodes the info extracted from the EEG data into logical expressions, with different variables representing the activity of specific electrodes.

Think of it as keeping things nice and tidy; a bunch of messy, raw EEG data being transformed into uniform expressions and logical inputs.
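
Here’s a rough numpy/scipy sketch of what that encoding step might look like. The sampling rate, frequency bands, threshold rule, and synthetic data are all assumptions of mine for illustration, not the criteria used in the actual study.

```python
import numpy as np
from scipy.signal import welch

fs = 256                                   # assumed sampling rate (Hz)
window = np.random.randn(4, fs // 2)       # 4 electrodes x one 500 ms window (synthetic)

def band_power(signal, lo, hi):
    """Average spectral power of one channel between lo and hi Hz."""
    freqs, psd = welch(signal, fs=fs, nperseg=len(signal))
    return psd[(freqs >= lo) & (freqs < hi)].mean()

# Alpha (8-12 Hz) and beta (13-30 Hz) power at each electrode for this time step
alpha = np.array([band_power(ch, 8, 12) for ch in window])
beta = np.array([band_power(ch, 13, 30) for ch in window])

# Encode into logical variables: True where beta dominates alpha at that electrode.
# These Booleans become the A, B, C, ... terms of the logical expression.
logical_vars = beta > alpha
print(dict(zip("ABCD", logical_vars)))
```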

But what’s the catch? A little something called Boolean satisfiability problems, which are essentially mathematical tasks requiring us to determine the values of logical variables that’ll make our logical expression true. The roadblock here is that due to the data’s complexity, some logical expressions could contain a lot of variables that we’d be tasked with determining. I’m talking upwards of 50, and 50 binary variables alone means guessing and checking a whopping 2⁵⁰ = 1,125,899,906,842,624 different combinations. That’s insane.

Think about the laptop you’re probably reading this on: a personal computer capable of performing ~2 billion operations per second. Now, that sounds like a lot. But when you compare it to how many operations it’d actually need to perform to satisfy the logical expressions described above, a rate of 2B/s would mean roughly 563,000 seconds to complete the task, or almost a whole week. If you’re thinking what I’m thinking, it’s worth considering that a quantum computer would have the potential to speed up this entire process…
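
Here’s a quick sanity check of that arithmetic, plus a brute-force pass over a tiny three-variable, three-clause expression (an illustrative stand-in I made up, not the expression from the study):

```python
import itertools

# The numbers from above: 2^50 assignments at ~2 billion checks per second
combinations = 2 ** 50
seconds = combinations / 2e9
print(f"{combinations:,} assignments -> {seconds:,.0f} s (~{seconds / 86400:.1f} days)")

# Brute-forcing a small 3-variable expression is trivial...
expr = lambda A, B, C: (A or C) and (not A or not B) and C
sat = [bits for bits in itertools.product([False, True], repeat=3) if expr(*bits)]
print("satisfying assignments (A, B, C):", sat)
# ...but this guess-and-check approach blows up exponentially as variables are added.
```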

Quantum Circuit for Logical Satisfiability

Take a different hypothetical BCI system that uses EEG beta readings to activate a device and issue commands to it.

At any point in time, the system identifies the two electrodes with the highest EEG power and checks whether their readings qualify as beta rhythms. Based on the truth values of that evaluation, the system creates logical clauses (like A ∨ C) as part of a larger logical expression that will form system commands.

Let’s say the system’s analysis generates the logical expression below, with the goal of determining whether specific conditions are met for activating commands based on beta rhythms. The symbols between the letters signify logical operators that represent OR, AND, and NOT (respectively).

And we can now check the satisfiability of this expression using a quantum circuit!

Constructing the Circuit & State of Superposition
A circuit can be constructed to implement the logical expressions, made up of quantum logic gates (such as X & Z gates) designed to check the satisfiability of three-clause expressions like the one above. And remember qubits, described way above? This circuit contains six of them: three representing the logical variables A, B, C, and three serving as ancillary qubits (essentially temporary stores of info as the process is carried out). In the circuit’s initial state, all its qubits are set to |0⟩, which is basically just an assignment of the value ‘0.’ The logical-variable qubits then undergo a transformation using an H gate, which plunges them into a state of superposition where they exist simultaneously in states of |0⟩ and |1⟩. This superposition is essential for a quantum system to perform parallel computations on all the different possible implications of our hypothetical EEG data.
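
Here’s a small numpy simulation of just that initialization step, with the three ancillary qubits left out for readability. Applying an H gate to each logical-variable qubit spreads the register evenly over all eight possible truth assignments of A, B, and C.

```python
import numpy as np

ket0 = np.array([1.0, 0.0])                     # each qubit starts in |0>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard gate

q = H @ ket0                                    # one qubit in superposition
register = np.kron(np.kron(q, q), q)            # joint 8-amplitude state |ABC>

# Every truth assignment of A, B, C is now present with equal probability 1/8
for idx, amp in enumerate(register):
    print(f"|{idx:03b}>  probability {amp ** 2:.3f}")
```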

Creating & Testing Possibilities
The quantum circuit then undergoes a series of steps (defining OR clauses, storing their outcomes in the ancillary qubits, using a 3-qubit controlled Z gate for the conjunction, and marking the quantum states |001⟩, |011⟩, and |101⟩ with a phase flip) so that it favors and outputs those three states much more frequently than any other possible values. Why? Because the situations those three quantum states represent satisfy the conditions of our logical expression! So essentially, the circuit is programmed to amplify the probability of those ‘correct’ states and therefore the desired outcomes.
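
Below is a Grover-style amplitude-amplification sketch in plain numpy that captures the spirit of that favoring step. The sign-flipping “oracle” and the “diffusion” step stand in for the clause-checking gates and controlled-Z marking described above; it’s a conceptual illustration, not a reproduction of the Plymouth circuit.

```python
import numpy as np

n_states = 8                                   # all truth assignments of A, B, C
marked = [0b001, 0b011, 0b101]                 # states that satisfy the expression

# Start from the uniform superposition the H gates created
amps = np.full(n_states, 1 / np.sqrt(n_states))

# "Oracle": flip the sign (phase) of the satisfying states -- the marking step
amps[marked] *= -1

# "Diffusion" (inversion about the mean): boosts the marked amplitudes
amps = 2 * amps.mean() - amps

probs = amps ** 2
for idx in range(n_states):
    flag = "  <-- satisfies the expression" if idx in marked else ""
    print(f"|{idx:03b}>  probability {probs[idx]:.3f}{flag}")
print("chance of measuring a satisfying state:", round(float(probs[marked].sum()), 3))
```

One round is enough in this toy case: the chance of measuring a satisfying state jumps from 3/8 to roughly 84%.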

Here’s a visualization of what this looks like in theory (I studied this diagram until I understood it—I suggest you do too, just for fun!)

The quantum circuit created by the University of Plymouth team to solve this specific hypothetical

Mission accomplished!
Our 3-clause logical expression is satisfied in the specific cases corresponding to those 3 favored quantum states, pinning down satisfying values for A, B, and C and triggering the corresponding BCI command based on detected beta rhythms at specific electrodes. In the University of Plymouth study that built on this type of simulation, the outputs were then synthesized to map mental states to sound and successfully control a music player using good old EEG and a quantum algorithm.

What this allll boils down to (+ why it matters):

To review, a quantum brain-computer interface would achieve its goal of faster, more accurate signal processing by encoding raw brain signals into clean logical expressions, passing them through a series of quantum logic gates controlled by super-positioned qubits, testing & determining the correct neuronal correlates, and outputting them as commands for system control. Processing large Boolean expressions (those ones with trillions of combinations) to accomplish even a fraction of this higher-level computational goal demands the use of a quantum algorithm, with an invaluable knack for parallel processing & making messy problems look easy.

Limitations

Hardware Availability & Scalability

As you can probably imagine, the intersection of these two fields is still largely theoretical and doesn’t yet have universally proven advantages over classical BCIs. Beyond that: while quantum computers are no longer just a thing of fantasy, they’re still nowhere near available for commercial use and experimentation. While Google, IBM, and Microsoft have come a long way in building quantum processors, and researchers have developed and implemented numerous algorithms on those embryonic machines, it’s estimated to take another 10–20 years before the tech breaks into the market. So, it’ll probably take a while for these aspirational conclusions to be studied enough to be proven, and to be matched in promise by robust hardware. I believe this might actually be a positive, though — the general inaccessibility of quantum computers may work in our long-term best interest, allowing painstaking development and theory validation before we make any premature attempts at integration with the brain.

Ethical Considerations

As always, when dealing with any sort of biological data, privacy and regulation come to mind almost immediately. Quantum computers have advanced processing capabilities, and it’s largely unpredictable as of now how much control a subject would have over the collection and interpretation of their own brainwaves by a quantum BCI — but in my personal opinion, the data security measures already being deployed alongside the growing consumer-grade BCI market are laying a sufficient base for whatever advanced signal processing capabilities may come about.

Regardless of limitations like these, thorough yet scalable simulations like the one described above on virtual quantum machines offer great opportunities for advanced algorithm development until the safest and most powerful hardware of the future is available. Where there’s a will, there’s a way. ☺

In conclusion…

What I described in this article was just one new computational procedure previously simulated and validated on a virtual quantum machine. Other methods are certainly out there waiting to be further developed, like RQNNs (recurrent quantum neural networks) for better EEG filtering, and quantum soft computing techniques that also deserve some deep exploration. Leveling up the computing speed and accuracy of neural interfaces will do wonders for making any user, especially those in need of sensory or motor restoration, feel more at home with their device, and therefore far more positively impacted by it. In my mind, that’s the North Star of neurotech—improving the lives of people who often turn to this technology as their only shot at normalcy.

With that goal in mind, new ideas are emerging every day on how we can level up current BCI capabilities and push the limits of what we know to be possible — which doesn’t always require peeling back further layers of the brain. While it’s only one part of a bigger puzzle, the development of more powerful algorithms is one of the safer and more accessible areas of BCI innovation we can tap into starting now, two qualities that will ultimately fast-track our progress toward impacting billions. 🧠

Sources:

If you want to get to know me, head over to LinkedIn! Catch you later! 💫
