Mind-Reading Technology: What Would That Mean For Humanity?

If we had an interface with artificial intelligence that could read our minds, we might even open a path to a kind of immortality. How would that change humanity, and what would being human mean?

Merging the human brain with a computer would truly change our species forever. Researchers are developing technology that can transfer data between computers and our brains, and even read people’s minds. For now, the toolkit is limited to things like detecting brain waves and tracking the electrical pulses that travel through our neurons. Researchers are using all of that information, however vague, to aid disabled people, help anyone in need, and make life easier.

Sixteen-year-old Alex Pinkerton, the co-founder and CEO of Brane Interface, is one of the people leading this work. “Really, all that we’re working on is brain-computer interface that utilizes graphene and hopefully if our math is correct, it’ll be sensitive enough to read the magnetic fields of human thought.”

The Department of Defense first started funding brain-computer interface research in the 1970s. The market is expected to reach a value of $1.72 billion by 2022. Elon Musk and Facebook have teased their involvement in the market, while other companies, like CTRL Labs, are showing more concrete results.

CTRL Labs created a wristband that measures the electrical pulses sent from the brain to the neurons in the arm, giving the user control over a computer. MIT and the University of California, San Francisco are two universities at the forefront of this research. So what exactly is a brain-computer interface?

It is a way for a computer to take information directly from the brain, without you having to type or speak it, and translate it into some kind of action.
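To make that idea concrete, here is a toy sketch in Python of the loop a BCI runs: read a window of signal, extract a simple feature, and map it to an action. Everything here, from the sample rate to the “play music” threshold, is an invented placeholder for illustration, not how Brane Interface or any real system works.

```python
import numpy as np

SAMPLE_RATE = 250        # samples per second; a plausible, made-up rate
WINDOW_SECONDS = 1.0

def read_window(rng):
    """Stand-in for hardware: returns one second of noisy signal."""
    return rng.normal(0.0, 1.0, int(SAMPLE_RATE * WINDOW_SECONDS))

def band_power(window):
    """Power in the 8-12 Hz (alpha) band, via a discrete Fourier transform."""
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / SAMPLE_RATE)
    return spectrum[(freqs >= 8) & (freqs <= 12)].sum()

def decide(power, threshold=50.0):
    """Translate the feature into an action -- the 'intent' step."""
    return "play_music" if power > threshold else "do_nothing"

rng = np.random.default_rng(0)
window = read_window(rng)
print(decide(band_power(window)))
```

The three steps, sense, extract, act, are the skeleton; the hard part, as the rest of this story shows, is making the middle step reliable.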

That kind of connection between the brain and a device, like your phone or a prosthesis, is what Pinkerton is working on. He was first inspired by his dad, who works on clean energy.

“My dad came to talk to our class when I was in third grade about graphene, just to give a little presentation. I’m not sure why, but that sort of sparked my interest and I had just been thinking of like, why isn’t this being used everywhere. This is the perfect material. And so I started thinking of applications, and at first it was mainly like for the military or something. Now, it’s sort of focused away from that and to, well, the brain-computer interface and VR.”

For the past few years, he has spent his holiday breaks and the occasional weekend in his dad’s lab, working on his graphene brain-computer interface. Graphene is an almost impossibly thin sheet of carbon, only a single atom thick.

The goal is a brain-computer interface small enough to fit in an earbud or the lining of a hat, letting users control physical devices with their thoughts, whether that means playing music on a phone or moving a prosthesis.

Elon Musk’s version of this technology might be one of the skull-opening options. Neuralink, a company Musk co-founded, is working to add a digital third layer above the cortex that would work symbiotically with you.

“The purpose of Neuralink is to create a high-bandwidth interface to the brain such that we can be symbiotic with A.I.,” Musk said on the Joe Rogan Experience podcast.

The Neuralink website has been little more than a list of job openings for a while, and an update has been teased as “coming soon” for months. This technology, though, would supposedly require invasive surgery.

And the envelope keeps getting pushed further. In a recent breakthrough at the University of California, San Francisco, researchers read the brain’s signals to the larynx, jaw, lips, and tongue, and translated them through a computer to synthesize speech.

In 2018, MIT revealed its AlterEgo device, which measures neuromuscular signals in the jaw to let humans converse with machines in natural language simply by articulating words internally. But how can a computer recognize specific brainwaves? How can it pick out a “play music” command from the constant noise of thoughts and brainwaves?

In both the UCSF and MIT examples, the studies focused on computers working out a person’s intentions by matching brain signals to the physical movements those signals would usually produce in the vocal tract or jaw. They use the signals that would normally trigger muscles to simulate what the body would do. The deep, internal thoughts and processes within our brains remain far more elusive.
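A toy version of that matching idea, again with invented templates and thresholds rather than anything taken from the UCSF or MIT studies, might compare incoming signal windows against patterns recorded during real movements and refuse to act when nothing matches well:

```python
import numpy as np

def correlate_score(window, template):
    """Normalized correlation between a signal window and a stored template."""
    w = (window - window.mean()) / (window.std() + 1e-9)
    t = (template - template.mean()) / (template.std() + 1e-9)
    return float(np.dot(w, t) / len(w))

def classify(window, templates):
    """Return the command whose movement template best matches the window."""
    scores = {cmd: correlate_score(window, t) for cmd, t in templates.items()}
    best = max(scores, key=scores.get)
    # Refuse to act when nothing matches well -- the 'background noise' case.
    return best if scores[best] > 0.5 else None

rng = np.random.default_rng(1)
templates = {                                   # hypothetical recorded patterns
    "play_music": np.sin(np.linspace(0, 8 * np.pi, 250)),
    "next_track": np.sign(np.sin(np.linspace(0, 4 * np.pi, 250))),
}
window = templates["play_music"] + 0.3 * rng.normal(size=250)
print(classify(window, templates))              # likely "play_music"
```

The key design choice mirrors what the researchers did: the system never tries to read free-floating thoughts, only signals tied to a rehearsed physical action.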

Reading those deeper signals is exactly Pinkerton’s intent; the idea is that you can train the brain to repeat specific, recognizable patterns. None of this is entirely new. We have been hooking brains up to machines to read their electrical activity since the 1920s. And brain-computer interfaces?

People have been working on these tools since the 1970s, and there are still plenty of hurdles to making them commercially available. Many of them require the user to sit very still, which rules out most real-world applications. The major problem engineers face is distinguishing signal from noise while the person is moving.
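A crude first defense, sketched below with made-up numbers, is simply to throw away windows whose amplitude swings far beyond what neural signals plausibly produce. Real systems need far more sophisticated filtering, but the sketch shows why movement is such a headache:

```python
import numpy as np

def is_motion_artifact(window, max_amplitude=100.0):
    """Movement tends to produce far larger swings than brain activity does."""
    return np.ptp(window) > max_amplitude   # peak-to-peak amplitude

def clean_windows(windows):
    """Keep only the windows that look like genuine signal."""
    return [w for w in windows if not is_motion_artifact(w)]

rng = np.random.default_rng(2)
quiet = rng.normal(0, 5, 250)                             # plausible resting signal
moving = rng.normal(0, 5, 250) + 500 * np.hanning(250)    # big movement swing
print(len(clean_windows([quiet, moving])))                # 1 -- moving window dropped
```

The catch is obvious: throwing away every window recorded during movement means the device stops working exactly when you are walking, talking, or gesturing, which is why this remains the field’s central engineering problem.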

However, interest from commercial companies has been rising every year, and the community has long been waiting for a product that makes all of this possible. Virtual reality is one clear example of where a BCI might be helpful.

Brane Interface said it has been approached by several technology and investment companies, but plans to finish its prototype later this year before pursuing those opportunities.
