Sensing through the whole spectrum

Alexandre Winter
Norbert Health
May 19, 2021

Perceptive AI relies on vital sensor data — cameras, microphones, and more — and there are brand new types of sensors available now that are accessible, affordable, and that take perception to a new level. At Norbert we’ve built one of the first devices that combines many of these sensors in one small piece of hardware to detect how the body is functioning, and more.

The future of AI is perceptive AI. Human intelligence is deeply related to perception: 50% of our neurons are used for it. It is a critical ability for humans to survive, to thrive, and to become self-conscious intelligent beings. So mastering perceptive AI is a critical path to general AI. Researchers know this. 60% of AI papers over the past three years are dedicated to it, largely focused on vision, speech recognition, and haptics.

There is a revolution building in perceptive AI. And it is fueled by sensors and the data coming out of them.

The full spectrum. Three of our five senses are based on vibrations and frequencies: air and pressure vibrations for hearing, electromagnetic vibrations for sight, and even vibrations of your skin on material for touch. Classic sensors focus on light and sound — the good old 0.400–0.750 micron visible light spectrum or the good old 300–3,000 Hz voice band.
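As a quick sanity check on those band edges, wavelength and frequency are two views of the same electromagnetic wave, related by f = c / λ. A minimal sketch:

```python
# Convert the wavelength band edges above into frequencies via f = c / lambda.
C = 299_792_458  # speed of light in vacuum, m/s

def wavelength_to_freq_hz(wavelength_m: float) -> float:
    """Frequency of an electromagnetic wave of the given wavelength."""
    return C / wavelength_m

# Visible light: 0.400-0.750 microns is roughly the 400-750 THz band.
f_violet = wavelength_to_freq_hz(0.400e-6)
f_red = wavelength_to_freq_hz(0.750e-6)
print(f"visible light: {f_red / 1e12:.0f}-{f_violet / 1e12:.0f} THz")
```

Sound, by contrast, is a pressure wave, so the 300–3,000 Hz figures are already frequencies and need no conversion.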

Today, several types of new sensors are available and affordable, and span a larger part of the electromagnetic spectrum. This means more data and learning. It means new AI models. It means exciting new algorithms and results. It means better business for AI. It opens a sea of possibilities and fulfilled promises.

How does this work? If we start at the top — the visible light spectrum — and then move down one notch, we arrive at the long-wave infrared (LWIR) band: the 8–14 micron spectrum reveals vital information about the level of heat a body is emitting. This is useful not only for measuring surface emission temperature, but for seeing blood vessel patterns, and even identifying people by them.
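Why is this particular band the one that matters for body heat? Wien's displacement law says a warm body radiates most strongly at a wavelength inversely proportional to its temperature, and for skin temperature that peak lands squarely inside 8–14 microns. A back-of-the-envelope check:

```python
# Wien's displacement law: a blackbody at temperature T emits most strongly
# at wavelength lambda_max = b / T, where b is Wien's displacement constant.
WIEN_B = 2.898e-3  # Wien's displacement constant, m*K

def peak_emission_wavelength_m(temp_kelvin: float) -> float:
    """Wavelength of peak thermal emission for a blackbody at temp_kelvin."""
    return WIEN_B / temp_kelvin

# Skin surface sits around 33 C (~306 K); its emission peaks near 9.5 microns,
# comfortably inside the 8-14 micron LWIR band.
peak_um = peak_emission_wavelength_m(306.0) * 1e6
print(f"peak emission: {peak_um:.1f} microns")
```

This is why LWIR sensors see a glowing body against a cooler background with no illumination at all.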

Homing in on millimeter wavelengths, we find the exciting low-energy microwave radar band. Recent progress in autonomous cars and industrial applications has boosted the availability of FMCW radar imagers, magical devices that can build a 3D image of anything that reflects radar waves, see through some objects (like clothing at an airport), and indicate the speed of those objects. It’s truly 4D imaging: x, y, z and speed.
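The "4D" part comes from two simple relationships in FMCW radar: the beat frequency between the transmitted and received chirps encodes range, and the Doppler shift across chirps encodes radial speed. A sketch with hypothetical (but typical automotive-style) parameters:

```python
# FMCW radar basics, illustrative numbers only (77 GHz carrier, 1 GHz chirp
# bandwidth over 50 microseconds are assumptions, not a specific product).
C = 299_792_458  # speed of light, m/s

def range_from_beat(f_beat_hz: float, bandwidth_hz: float, chirp_s: float) -> float:
    """The beat frequency is proportional to round-trip delay:
    R = c * f_b * T_c / (2 * B)."""
    return C * f_beat_hz * chirp_s / (2 * bandwidth_hz)

def doppler_velocity(f_doppler_hz: float, carrier_hz: float) -> float:
    """Radial speed from the Doppler shift: v = f_d * c / (2 * f_c)."""
    return f_doppler_hz * C / (2 * carrier_hz)

r = range_from_beat(667e3, 1e9, 50e-6)  # ~5 m away
v = doppler_velocity(513, 77e9)          # ~1 m/s radial motion
print(f"range ~{r:.1f} m, speed ~{v:.1f} m/s")
```

Per pixel of the imager, that gives x, y, z from range plus beam direction, and speed from Doppler — the four dimensions above.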

Microwave imaging in the millimeter wave band

Further down still we find the centimeter wavelengths: radar that provides 4D imaging and can see through walls. A 50 cm wide concrete wall becomes a clear window. A device can detect the movement of people (like falls) within 10 meters, even in completely different rooms, and easily see whether they are moving, and how. Even the IEEE is preparing for this advancement, with new WiFi sensing standards launching in 2024.
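Why does motion stand out so clearly at these wavelengths, even through a wall? A moving body shifts the reflected signal's frequency by an amount proportional to its radial speed, and a fall produces a brief burst of much higher speed than walking. A rough illustration (the 6 GHz carrier is an assumption for a cm-wave signal, not a specific device):

```python
# Doppler shift of a reflection off a moving body: f_d = 2 * v * f_c / c.
C = 299_792_458  # speed of light, m/s

def doppler_shift_hz(radial_speed_mps: float, carrier_hz: float) -> float:
    """Frequency shift of a signal reflected off a body moving at the
    given radial speed toward (or away from) the sensor."""
    return 2 * radial_speed_mps * carrier_hz / C

fd_walk = doppler_shift_hz(1.0, 6e9)  # walking, ~1 m/s
fd_fall = doppler_shift_hz(3.0, 6e9)  # a fall, briefly ~3 m/s
print(f"walk ~{fd_walk:.0f} Hz, fall ~{fd_fall:.0f} Hz")
```

The shift survives passing through the wall; only the signal's amplitude is attenuated, which is why movement remains readable on the far side.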

A man seen by WiFi
Through wall imaging with dedicated cm wave radar

The challenge is that there haven’t been enough of these types of sensors in the field to collect data and ultimately train and test neural nets to process it. Even further out of reach was the possibility of fusing them all together, so that they supercharge each other in sync.

Until now. Norbert Health has combined all of these sensors, plus serious compute, in one device. It spans the entire spectrum: sound, centimeter wave, millimeter wave, LWIR and visible light.

Sound, Radar, LWIR, NIR, visible light — Norbert has it all.

We can do it all, in one clean, simple and small box. We’re also already gathering massive amounts of important data to train our systems to process and better combine these sources.

And we are accelerating this by automatically annotating our data, creating massive training sets in no time. Our camera can create a data set that teaches our radar to recognize people or their posture. If we add a wall, the camera will no longer work, but the radar will see through it.
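The idea above — one sensor labeling another — can be sketched as a small cross-modal annotation loop. Everything here is illustrative: the detector and featurizer are stand-ins for whatever vision model and radar pipeline are actually used.

```python
# Cross-modal auto-annotation sketch: a camera-based detector (the teacher)
# labels synchronized radar frames, producing training data for a model
# that will only ever see radar (the student).
from dataclasses import dataclass
from typing import Callable, Iterable, List, Tuple

@dataclass
class TrainingExample:
    radar_features: List[float]  # e.g. a flattened range-Doppler map
    label: int                   # 1 if the synchronized camera saw a person

def build_dataset(
    synced_frames: Iterable[Tuple[object, object]],
    detect_people: Callable[[object], bool],          # hypothetical vision model
    radar_features: Callable[[object], List[float]],  # hypothetical featurizer
) -> List[TrainingExample]:
    """Label each radar frame with what the co-located camera saw at the
    same instant -- no human annotation needed."""
    return [
        TrainingExample(radar_features(radar), int(detect_people(cam)))
        for cam, radar in synced_frames
    ]
```

Once trained on such pairs, the radar model keeps working when the camera's view is blocked — which is exactly the through-the-wall scenario described above.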

Layer that with a myriad of features: a device that can measure body temperature, cardiac activity, breathing rate and depth, blood oxygenation level, blood pressure and more. It does it all with zero contact, up to 6 feet away. Imagine all of the ways this can transform how we understand and take care of our bodies. It’s an exciting time.

We’re just getting started. We’d love to tell you more, or have you join us in building the future of perceptive AI and healthcare.
