You can now think to scroll

Alex Castillo · Published in Neurosity · 4 min read · Jul 10, 2019


Every thought, feeling, and movement starts in your brain.

Scrolling on a screen is how we navigate, how we consume content, and how we learn. But how much time do we actually spend doing it? According to recent studies by Nielsen, the average person spends over three hours a day on their device.

What do all of these hand gestures have in common? They originate in the same part of the brain: the motor cortex. It’s the hub for all motor output. Every scroll, click, drag, and press starts there.

The video you are about to watch may look like it was taken straight out of a sci-fi movie. But believe me when I say this: it is not.

Think to Scroll

To understand how it is possible to use mind control, we need to appreciate the beautiful complexities of the human brain and the state of today’s technology. Allow me to walk you through this video.

She walks into the room. It seems quiet, but billions and billions of neurons inside her brain are firing. These neurons orchestrate all movement: her steps, her breaths, the lift of her hands. The conductor of this orchestra is the motor cortex.

“The motor cortex is the region of the cerebral cortex involved in the planning, control, and execution of voluntary movements.” — https://u.nu/motorcortex

Four seconds in, she reaches for Notion, an observable movement. But what we don’t see is that, a second before, her brain had already decided to make that movement.

Six seconds in, Notion is worn. This changes everything. As its sensors make contact with her scalp, the electrical activity produced by her neurons is captured and instantly received by the N1 chipset inside Notion. That activity is measured in microvolts, and each sensor is sampled 250 times per second. Across all of the sensors, that adds up to more than 2,000 numbers every second representing her brain activity.
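To put that arithmetic in one place, here is a back-of-the-envelope sketch. The eight-sensor count is my assumption, inferred from the 250 samples per second and the “over 2,000 numbers” figure; it isn’t stated explicitly in this post.

```typescript
// Back-of-the-envelope: values produced per second by the headset.
// Assumption: 8 sensors (inferred from the numbers above, not stated explicitly).
const samplingRateHz = 250; // samples per second, per sensor
const sensorCount = 8;      // assumed channel count

const valuesPerSecond = samplingRateHz * sensorCount;
console.log(`${valuesPerSecond} microvolt readings per second`); // 2000
```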

View From the N1 Chipset

At first glance, these numbers may appear to be random. They are anything but.

The N1 chipset performs several steps nearly instantaneously: it takes in raw brainwaves, applies a series of filters, extracts features, and predicts the probability of a thought using a classifier. The probability is all that remains: the metadata. Raw brainwaves are never saved.
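To make that pipeline a little more concrete, here is a minimal sketch in TypeScript. The moving-average filter, the power-based feature, and the toy logistic classifier are stand-ins I chose for illustration; the actual processing on the N1 chipset is not public and is certainly more sophisticated.

```typescript
// Conceptual sketch of the pipeline above: raw brainwaves -> filters -> features -> probability.
// All names and signal-processing choices are illustrative stand-ins, not the N1 firmware.

type Epoch = number[][]; // [sensor][sample], microvolt readings

// 1. Filter: a crude moving-average high-pass to remove slow drift.
function filterChannel(samples: number[], window = 25): number[] {
  return samples.map((value, i) => {
    const start = Math.max(0, i - window);
    const slice = samples.slice(start, i + 1);
    const mean = slice.reduce((a, b) => a + b, 0) / slice.length;
    return value - mean;
  });
}

// 2. Feature extraction: average signal power per sensor.
function channelPower(samples: number[]): number {
  return samples.reduce((sum, v) => sum + v * v, 0) / samples.length;
}

// 3. Classification: a toy logistic model mapping features to the probability of a thought.
//    In practice this model is learned per user during training.
function predictProbability(features: number[], weights: number[], bias: number): number {
  const z = features.reduce((acc, f, i) => acc + f * (weights[i] ?? 0), bias);
  return 1 / (1 + Math.exp(-z));
}

// The only thing that leaves this pipeline is the probability: the metadata.
function processEpoch(raw: Epoch, weights: number[], bias: number): number {
  const features = raw.map(sensor => channelPower(filterChannel(sensor)));
  return predictProbability(features, weights, bias);
}
```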

22 seconds into the video, she engages her motor cortex, this time in a different way. We usually engage our motor cortex by planning, controlling, and executing movement. Mind control skips the execution part. This is an instance where less really is more.

At this moment, she is recalling a previously trained hand gesture: a pinch with her fingers. It happens without her hands ever leaving the bowl to touch the screen. Just one thought.

That is exactly what happens 22 seconds in: Notion detects her intent and creates metadata that captures it.

A split second later, thought-based intent turns into reality. As the mobile device receives the intent command created by Notion, the imagined hand gesture pattern is associated with a computer command. In this case, a scroll down event is executed. This happens repeatedly. She scrolls the page, allowing her to continue reading without lifting a finger.
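For developers who want to try something similar, a minimal sketch using the Neurosity JavaScript SDK might look like the snippet below. The device ID, credentials, and the trained label “pinch” are placeholders, and I’m assuming the SDK’s kinesis subscription and a browser context; treat it as an illustration of the intent-to-command mapping, not the exact code behind the video.

```typescript
import { Notion } from "@neurosity/notion";

// Placeholders: use your own device ID and account credentials.
const notion = new Notion({ deviceId: "YOUR_DEVICE_ID" });

async function main() {
  await notion.login({ email: "you@example.com", password: "your-password" });

  // "pinch" stands in for a previously trained hand-gesture label.
  // Each time the classifier detects that intent, translate it into a scroll command.
  notion.kinesis("pinch").subscribe(() => {
    // Browser context assumed: scroll the page down a bit on every detected intent.
    window.scrollBy({ top: 200, behavior: "smooth" });
  });
}

main().catch(console.error);
```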

33 seconds in, she stops imagining the hand gesture associated with scrolling and quickly realizes the power of her mind.

This is the machine learning behind Notion. Using software to imitate how the brain works, without depending on the body, is exactly what empowering the mind is all about.

When brain activity in the motor cortex is measured, motor-based intentions can be detected, like the intent to move your hand. As it turns out, even without moving your hand, just thinking about moving it makes the motor cortex plan that movement and produce enough electrical activity to be detected by Notion.

Every brain is different; that’s the beauty of it. Detecting your interpretation of a given thought requires training. This is something we are already familiar with. We do it with our smartphones when we enable Touch ID or Face ID.
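Conceptually, that training step works a lot like enrolling a fingerprint: you repeat the thought several times, the device records labeled examples, and a model is fit to your brain. The sketch below is purely illustrative; the function names and the nearest-template classifier are mine, not the Neurosity API.

```typescript
// Illustrative enrollment sketch: the device learns what *your* version of the thought
// looks like by comparing repeated examples against a resting baseline.
// Everything here is hypothetical; it is not the Neurosity API.

type FeatureVector = number[];

function mean(vectors: FeatureVector[]): FeatureVector {
  const dims = vectors[0].length;
  return Array.from({ length: dims }, (_, i) =>
    vectors.reduce((sum, v) => sum + v[i], 0) / vectors.length
  );
}

function distance(a: FeatureVector, b: FeatureVector): number {
  return Math.sqrt(a.reduce((sum, v, i) => sum + (v - b[i]) ** 2, 0));
}

// Enrollment: record several repetitions of the imagined pinch and several of rest,
// then keep a per-user "template" of each.
function buildTemplates(pinchExamples: FeatureVector[], restExamples: FeatureVector[]) {
  return { pinch: mean(pinchExamples), rest: mean(restExamples) };
}

// Later, an incoming epoch is labeled by whichever template it is closer to,
// much like matching a fingerprint you enrolled.
function classify(
  features: FeatureVector,
  templates: { pinch: FeatureVector; rest: FeatureVector }
): "pinch" | "rest" {
  return distance(features, templates.pinch) < distance(features, templates.rest)
    ? "pinch"
    : "rest";
}
```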

With advances in artificial intelligence, portable brain imaging, and powerful microchips, we are able to derive more quantitative information about our minds than ever before. Think to scroll is just one example. As computers and wearables become more personal, I can’t help but wonder: What could we accomplish if our bodies didn’t stand between us and our ideas? What’s the next generation of art, music, and architecture going to be like? What could we build if we could create what we see and feel in our minds?

At Neurosity, we believe in empowering the mind. One of the ways we do that is by creating thought-powered devices that allow you to communicate and that understand you. We are releasing a Developer Kit and can’t wait to see what you’ll build with it.

Join the waitlist
