How to Import and Export Information into the Brain 🧠

Giselle Chan
14 min read · Oct 17, 2023


Picture this:

You’re a spy trying to reclaim your team’s jewels, but the blueprints of your enemy’s castle can only be discovered after you set foot in the building together with your team (and they reach the files room).

just like in The Matrix!

There’s no way you’ll be able to learn the entire blueprint in the 10 seconds before you’re forced to make a move: the clock is ticking, and with every second, the chance of being discovered grows!

In the zone? Okay, let me feed you a thesis:

Information can be uploaded (fed/imported — whatever you want to call it) into the brain.

shocked? don’t worry — Gina from B99 is also shocked!

(All these problems I’ve given you — they’ve all seemed to disappear, right? This is what dreams are made of!!)

But that’s a bold statement to start off of, so let’s delve into the science, technology, and proven experiments in this realm.

You may know that brain-computer interfaces (BCIs) are capable of measuring brainwaves and electrical signals, and thus extracting your thoughts through machine learning and precise predictive guesswork.

But as innovators, let’s reverse this logic. Instead of extracting information out of the brain, we could upload information into the brain. We could import knowledge by translating it, via our machine learning data, into electrical signals, then feeding those signals into the brain through electrical stimulation, using the very machinery that detects our signals.

Theoretically, we would instantaneously understand new knowledge.

In this article, I will go through…

  1. The background behind BCIs and how they measure brain waves
  2. How information is interpreted from these brain waves
  3. How information could be uploaded to the brain (combining the two and reversing the logic)
  4. The future of this industry

What are BCIs?

Brain-computer interfaces (BCI) are systems that allow communication between the brain and various machines.

They work in three main steps: collecting brain signals, interpreting them, and outputting the corresponding commands to a connected machine.

They measure brain activity in the time and frequency domains to determine whether an event is taking place: the strength of the waves (how synchronized the neurons are, i.e., how many fire in a similar pattern) shows how strong the response is, and which groups of neurons are active shows roughly what the event could be.

There are three types of BCIs: invasive (implanted into the brain during surgery), which record single or multi-unit activity; semi-invasive (implanted on the exposed surface of the brain), which record from the cerebral cortex; and non-invasive EEGs (electroencephalograms), placed on the scalp. EEGs are what we’ll be focusing on today.
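The three-step loop described above (collect, interpret, output) can be sketched in a few lines of Python. Everything here is invented for illustration: the fake voltage samples, the toy threshold rule, and the command names don’t come from any real BCI system.

```python
# A minimal sketch of the three BCI steps: acquire -> decode -> command.
# All values and rules below are made up purely for illustration.

def acquire_signal():
    """Step 1: collect raw brain signals (here, fake voltage samples in microvolts)."""
    return [2.1, -1.3, 4.0, 3.8, -0.5, 4.2]

def decode(samples, threshold=3.0):
    """Step 2: interpret the signal -- a toy rule: strong activity means 'move'."""
    strong = [s for s in samples if s > threshold]
    return "move" if len(strong) >= 2 else "rest"

def output_command(intent):
    """Step 3: send the decoded command on to a connected machine."""
    return {"move": "MOTOR_ON", "rest": "MOTOR_OFF"}[intent]

command = output_command(decode(acquire_signal()))
print(command)  # MOTOR_ON (three samples exceed the 3.0 uV threshold)
```

In a real system, step 2 would be a trained machine-learning decoder rather than a fixed threshold, but the shape of the pipeline is the same.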

How do EEGs work?

Electrodes (small metal discs) are placed on the scalp to pick up the electrical current generated by the brain.

The electrodes detect tiny electrical charges that result from the activity of your brain cells. The charges are amplified and commonly appear as a graph on a computer screen.

What causes these tiny electrical charges?

When our brain undergoes action (e.g., when we think of moving our arm), our neurons communicate with each other. They do this by firing signals across synapses (the gaps between neurons). When firing, a neuron forms a dipole: if the transmission is excitatory, there is a lower voltage at the synapse and a higher voltage at the axon of the next neuron; if inhibitory, a higher voltage at the synapse and a lower voltage at the axon.

The voltage shift is caused by ion channels. For example, in excitatory transmission, Na+ channels open, letting positively charged sodium ions flood in; this depolarization opens more sodium channels further along, carrying an electrical charge down the axon until it discharges at the synapse, releasing neurotransmitters along with it.

These changes are minuscule — and we can only detect them from the scalp when a GROUP of neurons fires together!

The recorded data is sent to amplifiers and then on to a computer or the cloud for processing. The amplified signals, which resemble wavy lines, can be recorded on a computer, a mobile device, or in a cloud database.

an example of what these brain waves look like!

How can information be derived from these brainwaves?

1. Superficially — by frequency of waves

We can derive the level of consciousness or state of our brain by the frequency of brainwaves.

As explained, raw EEG waves are fluctuating electrical voltages in the brain, distinguished by their differing frequencies. Frequency refers to the speed of the electrical oscillations, measured in Hertz (cycles/second).

(To put things into perspective, one voltage fluctuation measures about a millionth of a volt, while a typical lightbulb requires 110 volts to light up!)

Back to frequencies, there are 5 main types of brain waves with distinct meanings.

  1. Delta waves (slowest). Generated in deep meditation and dreamless sleep.
  2. Theta waves. Generated in relaxation and vivid dreamy sleep — inner focus.
  3. Alpha waves. Generated during quiet, thoughtful times — resting.
  4. Beta waves. Most common and generated in waking state — alert and problem solving.
  5. Gamma waves (fastest). Generated in highest levels of consciousness.

By correlating each brain wave frequency group to what it’s associated with, we can get a pretty clear idea of the state of our brain at any given time.
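That band-to-state mapping is easy to demonstrate. The Python sketch below builds a synthetic two-second signal containing a 10 Hz oscillation (standing in for an alpha rhythm) and labels the brain state by whichever band holds the most spectral power. The band edges are the commonly cited ones; the signal itself is entirely artificial.

```python
import numpy as np

# Classify a (synthetic) EEG trace by its dominant frequency band.
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 100)}
STATE = {"delta": "deep sleep", "theta": "relaxation", "alpha": "resting",
         "beta": "alert/problem solving", "gamma": "highest consciousness"}

def dominant_band(signal, fs):
    """Return the frequency band holding the most spectral power."""
    freqs = np.fft.rfftfreq(len(signal), d=1/fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    band_power = {name: power[(freqs >= lo) & (freqs < hi)].sum()
                  for name, (lo, hi) in BANDS.items()}
    return max(band_power, key=band_power.get)

fs = 256                           # sampling rate in Hz
t = np.arange(0, 2, 1/fs)          # two seconds of data
eeg = np.sin(2 * np.pi * 10 * t)   # synthetic 10 Hz "alpha" wave
band = dominant_band(eeg, fs)
print(band, "->", STATE[band])     # alpha -> resting
```

Real EEG is far noisier, so practical pipelines average power over many windows rather than trusting a single FFT.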

2. In depth — by machine learning & spotting patterns in wave characteristics

But what if we want to go further than that? What if we want to know the specific thought or command our brain is executing at the time?

Enter: machine learning.

In trial tests, countless subjects wearing EEG headsets are asked to think a specific thought, and scientists track their brain waves (which are highly variable but have distinct characteristics) repeatedly until a pattern is found between the brain activity and what they were asked to think of.

patterns could be noticed (correlation between activity and brain wave) from large data sets!

Brain activity is measured using 4 characteristics:

  1. Frequency domain — how fast the brain wave oscillations are
  2. Time domain — the brain activity with respect to time
  3. Strength of wave — how synchronized the neurons are / how many neurons fired together in a similar pattern
  4. Area of the brain the activity came from

From this, as you can imagine, a very, very large database can be derived, showing a clear correlation between brain activity and thought.

Therefore, in experimental cases, programs can compare the collected brain activity to all the brain waves in the database and conclude with the most probable thought match.
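As a toy illustration of that database matching, the sketch below stores a few invented feature vectors (one per “thought”, using the four characteristics above) and returns the nearest match to a new measurement. Real decoders are trained classifiers over far richer features; the vectors and thought labels here are made up.

```python
import numpy as np

# Toy "database" match: each entry is a feature vector
# [frequency (Hz), time (s), strength, brain-area index] -- all invented.
database = {
    "move left hand":  np.array([22.0, 0.3, 0.8, 1.0]),
    "move right hand": np.array([21.0, 0.3, 0.9, 2.0]),
    "rest":            np.array([10.0, 1.0, 0.2, 3.0]),
}

def most_probable_thought(activity):
    """Compare the measurement to every entry; return the closest match."""
    return min(database,
               key=lambda thought: np.linalg.norm(activity - database[thought]))

measurement = np.array([21.5, 0.35, 0.85, 2.0])
print(most_probable_thought(measurement))  # move right hand
```

Note that the output is only ever the *most probable* entry, which is exactly why extraction accuracy tops out as an educated guess.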

Let’s look at some of this technology in action!

Neuralink monkey Pager plays Pong with his mind

You know you’ve spotted the right field to delve into when Elon Musk has his own money in it too. His venture, Neuralink, has grown in popularity as Musk drives the technology further than it’s ever been before.

One of Neuralink’s biggest accomplishments came back in 2021: Pager, a monkey, could control a game of Pong with his mind!

Pager the monkey playing Pong!

How does that work?

Neuralink scientists implanted 2,000 electrodes, each about a tenth the width of a human hair, into Pager’s brain. Pager went through countless trials wearing the electrodes, operating a joystick to play Pong (his incentive to move the joystick correctly was a banana smoothie).

Links — which amplify and digitize the brain signals — were placed bilaterally, one in the left motor cortex (controls right side of body’s movements) and another in the right motor cortex (controls left side of body). The motor cortices were shown to be active when Pager was thinking about moving his hand to control the joystick.

this is where the motor cortex is!

And then, with the brain wave data from the trials, in the words of Neuralink,

By modelling the relationship between different patterns of neural activity and intended movement directions, we can build a model via machine learning (i.e., “calibrate a decoder”) that can predict the direction and speed of an upcoming or intended movement given the brain wave.

For instance, the neurons with upward preferred directions clearly increase their firing rates as the monkey moves his MindPong paddle upward, and the ones with downward preferred directions increase their firing rates as Pager moves his paddle downward.

neurons with upwards preferred directions in blue, downwards preferred directions in red

Thus, from these data points and the patterns observed, researchers can derive what Pager’s brain is trying to do based on the brain waves.
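The decoding idea described above can be illustrated with a classic population-vector sketch: give each neuron a preferred direction and decode the intended movement as the firing-rate-weighted average of those directions. The neurons, rates, and directions below are invented, and Neuralink’s actual decoder is considerably more sophisticated.

```python
import numpy as np

# Population-vector decoding: each row is a neuron's preferred direction.
preferred = np.array([[0.0, 1.0],    # neuron 0 prefers "up"
                      [0.0, -1.0],   # neuron 1 prefers "down"
                      [1.0, 0.0],    # neuron 2 prefers "right"
                      [-1.0, 0.0]])  # neuron 3 prefers "left"

def decode_direction(rates):
    """Weight each preferred direction by its neuron's firing rate and sum."""
    vec = rates @ preferred          # rate-weighted sum of directions
    return vec / np.linalg.norm(vec) # unit vector of intended movement

# The "up" neuron fires fastest, as when Pager moves his paddle upward.
rates = np.array([40.0, 5.0, 10.0, 10.0])  # spikes/second (made up)
print(decode_direction(rates))  # [0. 1.] -- straight up
```

This is the same logic as the blue/red preferred-direction plot: whichever group of neurons speeds up pulls the decoded vector toward its direction.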

Perri Karyal plays Elden Ring with her mind

Elden Ring!

We saw what a monkey can do, but can a human top that? This is definitely a question for Perri Karyal!

Elden Ring is a trending role-playing game with a focus on combat and exploration, and Perri, an avid gamer and BCI enthusiast, has mastered the art of controlling her character with her mind through EEG and machine learning.

How does that work?

Perri used a technique known as motor imagery: machine learning software was trained to recognize the average brain wave activity she generated while imagining herself moving her body in distinct, repeatable patterns. Because the patterns were distinct, their EEG signals could be translated into inputs every time she imagined something.

I was visualizing pushing something forward in my head [to move my character forward]. The pattern of activity my brain happened to consistently make during that task is what the software learned to recognize.

Perri said, describing this process.

As an example, here is Perri’s brain wave data (expressed visually) when she imagines herself pushing something forward (moving character forward). Brain signals are more frequent (brighter in the diagram) as she is focusing hard. They are also focused in her temporal lobes.

brain signals when character moving forward

When Perri relaxes and imagines herself releasing the object (character shoots gun), her brain signals are less frequent (dimmer in diagram) and focused in her frontal lobes.

brain signals when character shooting gun

As you can see, these are two very distinct brain wave patterns, which the computer was able to compare to its database to derive the intended character action.
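Here is a toy Python version of that calibration step: average a simple feature (mean signal power) over training epochs for each imagined action, then classify a new epoch by the nearer average. The data is synthetic, and the single-feature template is far cruder than what real motor-imagery software uses.

```python
import numpy as np

# Toy motor-imagery calibration: "pushing forward" epochs are simulated as
# high-power signals, "release" epochs as low-power signals. All synthetic.
rng = np.random.default_rng(0)
forward_epochs = [rng.normal(0, 2.0, 256) for _ in range(20)]
release_epochs = [rng.normal(0, 0.5, 256) for _ in range(20)]

def power(epoch):
    """Mean signal power of one epoch -- our single, crude feature."""
    return np.mean(epoch ** 2)

# Average the feature over each action's training epochs.
templates = {
    "move forward": np.mean([power(e) for e in forward_epochs]),
    "shoot":        np.mean([power(e) for e in release_epochs]),
}

def classify(epoch):
    """Assign a new epoch to whichever template its power lands closer to."""
    p = power(epoch)
    return min(templates, key=lambda action: abs(p - templates[action]))

new_epoch = rng.normal(0, 2.0, 256)  # another "pushing forward" imagination
print(classify(new_epoch))
```

Real pipelines use spatial filters and per-channel band power rather than one global feature, but the train-templates-then-match loop is the same.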

How can information be uploaded to the brain?

This is the exciting bit.

This is what we’ve all been waiting for.

What if we could take all this information about how to extract information from the brain, and flip it around? What if we could upload information into the brain by reverse engineering the electrical signals of knowledge, and feeding them back into the brain?

Going too quick? Here is the theory behind how it would work:

Via the same process of machine learning, we could use what we know about the relationship between thoughts and brain wave patterns, and feed the electrical signals that correspond to the information we want to upload back into the brain as electrical stimulation, through the same EEG electrodes. Then, in theory, our neuronal network would adapt and our brain would instantaneously understand this new piece of knowledge!
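To make the reversal concrete, here is a hypothetical sketch: look up the brain-wave pattern previously recorded for a “thought” and replay it as a safety-capped stimulation waveform. The pattern library, frequencies, and current levels below are entirely invented; no real system works from a simple table like this.

```python
import numpy as np

# Hypothetical library mapping a thought to the (frequency, relative amplitude)
# of the brain-wave pattern recorded for it during extraction. Invented values.
pattern_library = {
    "bank left":  (12.0, 0.8),
    "bank right": (14.0, 0.6),
}

def stimulation_waveform(thought, fs=256, seconds=1.0, max_current_ma=1.0):
    """Turn a stored pattern into a safety-capped stimulation current trace."""
    freq, amp = pattern_library[thought]
    t = np.arange(0, seconds, 1/fs)
    current = amp * max_current_ma * np.sin(2 * np.pi * freq * t)
    return np.clip(current, -max_current_ma, max_current_ma)  # hard safety cap

wave = stimulation_waveform("bank left")
print(len(wave), wave.max().round(2))  # 256 0.8  (one second, peak ~0.8 mA)
```

The hard part, as discussed later, is filling that library: we can record patterns from other brains, but we can’t yet synthesize a pattern for arbitrary knowledge.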

To put this into context, let’s explore some current developments in this field!

Newbie pilots become instant pros in California

This revelation started at the HRL Information and System Sciences Laboratory in California. Researchers hoped to capture the electrical signals and brain state of trained, experienced pilots while they were flying, and use them to modulate the brains of novice pilots who had never done the task before.

pilots x BCI

How does that work?

where the EEGs were set up on the pilots

Using EEG electrodes, researchers measured and extracted the brain waves of the experienced pilots while they performed their tasks. They then fed that exact information into a group of novice pilots, using a head-cap with conductive gel and EEG electrodes to apply current and stimulation through the skin, before the novices attempted the same task, in hopes that the knowledge would be transferred into their brains.

the brainwaves captured from experienced pilots!

Guess what?

This group of pilots performed, on average, 33% better than the placebo group of novices who went straight into the task! The experiment’s effectiveness also showed in the stimulated group’s higher landing consistency and lower variance in skill compared to the placebo group.

This experiment suggests that it is possible to feed information back into the brain via electrical stimulation, and that the brain uses its neuroplastic properties to welcome the new information — opening up a realm of possibilities for us!

Duke University rats communicate with each other with their minds

This groundbreaking experiment was completed all the way back in 2013, when researchers used brain-to-brain interfaces to send information from rat to rat.

How does that work?

In this experiment, two rats were separated and placed in identical rooms, each containing a lever and an LED. They were required to complete a puzzle: pushing the lever directly after a flash of light.

the experimental set-up! (encoder = first rat, decoder = second rat)

The researchers recorded the real-time activity of the first rat’s hippocampus as it went through the puzzle. Then, they fed that brain wave activity directly into the second rat before it entered the room, in hopes that the first rat’s knowledge and brain state would be transcribed into the second.

Guess what?

Repeating this experiment, the researchers found that the second group of rats performed significantly better than the first! The first group had a response latency of around 20.56 seconds, while the second group responded in around 13.59 seconds.
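Those latency numbers work out to roughly a one-third speed-up, a quick check:

```python
# Relative improvement implied by the two response latencies quoted above.
first_group_latency = 20.56    # seconds
second_group_latency = 13.59   # seconds

improvement = (first_group_latency - second_group_latency) / first_group_latency
print(f"{improvement:.1%}")  # 33.9%
```

Interestingly, that is about the same margin as the 33% improvement seen in the pilot experiment.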

This experiment, too, suggests that information (or experiences!) can be fed into the brain via electrical stimulation.

What’s even cooler is that the researchers were able to create a functional neural link as an extension of this experiment! One of the researchers summarized:

The behavior of the decoders [second group of rats] after receiving this novel information resulted in the first rat getting a feedback signal which influenced its brain activity.

[The first rat could get the feedback signal because] We were able to create artificial sensations using electrical stimulation of the spinal cord in the first rat.

experiments on animals like rats are what’s advancing our BCI field today!

This very much foreshadows the future of BCIs: not only importing and exporting information to and from the brain, but also bi-directional communication systems! (more on this later)

Reaching a plateau…

You might be wondering, with a technology so exciting and with so many possibilities, why BCIs aren’t more of a hot topic these days!

That’s because the development of information transfer into the brain has reached a plateau since Duke University’s brain-to-brain rat communication experiment was published in Scientific Reports.

Why might that be? These are some of my thoughts…

1. We do not yet know enough about extracting information from the brain.

To progress further in this field (uploading information), we need to advance the basic foundation this technology is derived from. At the moment, even the best strategies for extracting information from the brain are 70–80% accurate at best, and they amount to “educated guesses” based on machine learning. Each piece of extracted information is therefore not exact (i.e., our conclusion is the best guess given our database and current knowledge, not necessarily the right one). How are we supposed to import information when the methods of extraction aren’t perfected yet?

That leads us to the second point…

2. Extraction is action-based. Importing needs it to be knowledge/information-based.

Have you noticed something about all the case studies listed? In both scenarios where we’re extracting information from the brain, the “thought” is an action — whether it’s Pager the monkey moving the joystick in a particular direction, or Perri thinking of moving her character forward or shooting a gun. But when we’re trying to upload information into the brain, what we need to feed in is not an action we’re thinking of doing, like in extraction (that would be pointless). We want actual knowledge and information to be fed in.

That’s why, in both scenarios we’ve delved deeper into, researchers simply took the brainwaves from one subject and fed them into another. We do not yet know how to convert the knowledge or information we want to feed in into electrical signals or brain waves.

And to know how to do so, we would need an enormously large database converting between thoughts and brain waves. It’s easy to build one for a small selection of actions, but scaling that to, for instance, every combination of possible English words, or knowledge of everything in the world, is going to take A LOT more work.

So what? Why does this technology matter?

Need I say more?

round of applause — hahah!!

With technology to upload information into our brains at will, we as humans would have unlimited knowledge and communication power within our reach.

We would be able to instantaneously understand or experience any knowledge.

We would also be able to effectively communicate with others or computers by absorbing the information directly into our brain.

Our brains would become supercomputers. We would become superhumans.

To conclude…

As we break out of the bottleneck stage in this industry of BCI information uploading, there is going to be a lot of competition to advance this technology. (We can almost count on Elon Musk and his legacy being involved in it too!)

This is the stuff our future is made of: the concepts sci-fi movies rely on all the time. And it’s up to us to pave the way to this future.

Next steps: some of my thoughts

There is still some accuracy and reliability work to be done in terms of uploading information into the brain. As seen from our case studies, the impact of transmitting brain waves to the second group of test subjects is noticeable, but not huge.

I wonder if feeding in the information multiple times via electrical stimulation would cause the brain to welcome it more, and have a bigger impact on the brain’s neurons via neuroplasticity?

EEG brain data is just the beginning: EEG is neither as accurate nor as efficient as (semi-)invasive techniques.

As this technology progresses, it’s almost certain that brain information uploading will shift to more invasive techniques.

Some hugely exciting possibilities come with this technology, including bi-directional information transfer: once we perfect feeding information into the brain, we will be able to communicate with computers or other humans back and forth seamlessly! This could decrease the need for spoken or written language, which might be a cause of controversy.

say woohoo with the B99 crew!

Our takeaways from today:

We can flip the very foundation BCIs are set upon (extracting information from the brain) to achieve sci-fi-level abilities and further push humanity’s productivity (uploading information to the brain).

Examples of current information uploading experiments include California pilots and Duke University’s rats.

The idea of feeding information into our brain is still in its infancy now — but is almost certain to bloom once we break out of the shell.

Along with the industry leaders, we’ll do all we can to progress this technology, as it holds the power to change our entire future!

References

Boi, Fabio, et al. “A Bidirectional Brain-Machine Interface Featuring a Neuromorphic Hardware Decoder.” Frontiers, Frontiers, 22 Nov. 2016, www.frontiersin.org/articles/10.3389/fnins.2016.00563/full.

Crew, Bec. “Scientists Claim They’ve Invented a Matrix-Style Device That Instantly Uploads Skills to Your Brain.” ScienceAlert, 2 Mar. 2016, www.sciencealert.com/sorry-guys-scientists-haven-t-invented-a-matrix-style-device-that-instantly-uploads-data-to-your-brain.

“Electroencephalogram (EEG).” Johns Hopkins Medicine, 8 Aug. 2021, www.hopkinsmedicine.org/health/treatment-tests-and-therapies/electroencephalogram-eeg#:~:text=An%20EEG%20is%20a%20test,activity%20of%20your%20brain%20cells.

“How I Play Elden Ring with My Mind (EEG).” YouTube, YouTube, 1 Feb. 2023, www.youtube.com/watch?v=rIbfNUA5pWk.

“Pager Plays MindPong.” Neuralink Blog, neuralink.com/blog/pager-plays-mindpong/. Accessed 23 Oct. 2023.


Giselle Chan

Neuroscience x Computer Science 🧪 🧠 💻 || Innovator @ TKS