Introducing Google’s Project Soli


Meet Project Soli, a sensing technology that uses miniature radar to detect touchless gesture interactions. The project is being developed by Google’s Advanced Technology and Projects (ATAP) group.

The Soli chip incorporates the entire sensor and antenna array into an ultra-compact 8 mm × 10 mm package. This purpose-built interaction sensor uses radar to track sub-millimeter motion of the human hand at high speed and with great accuracy.

The device emits a broad radio beam and analyzes the reflected signal to understand the position and movement of objects in the field. Properties of that reflection, such as energy, time delay, and frequency shift, capture rich information about an object’s characteristics and dynamics, including its size, shape, orientation, material, distance, and velocity.
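The arithmetic behind two of those properties is standard radar physics: time delay gives distance, and frequency (Doppler) shift gives radial velocity. A minimal sketch, with illustrative values rather than anything published about Soli:

```python
# Standard radar relationships: distance from round-trip delay,
# radial velocity from Doppler shift. Values below are examples only.
C = 3.0e8          # speed of light, m/s
F_CARRIER = 60e9   # Soli operates in the 60 GHz ISM band

def range_from_delay(round_trip_s):
    """Distance: the signal travels to the target and back, so divide by 2."""
    return C * round_trip_s / 2

def velocity_from_doppler(doppler_hz):
    """Radial velocity: v = f_d * c / (2 * f_carrier)."""
    return doppler_hz * C / (2 * F_CARRIER)

# A hand 30 cm away returns the signal after about 2 nanoseconds:
print(range_from_delay(2e-9))        # -> 0.3 (meters)
# At 60 GHz, a 400 Hz Doppler shift corresponds to ~1 m/s radial motion:
print(velocity_from_doppler(400.0))  # -> 1.0 (m/s)
```

The short 5 mm wavelength at 60 GHz is what makes such fine motion visible: even slow finger movement produces a measurable frequency shift.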

Your hands and fingers are the only interface you’ll ever need

The goal is a ubiquitous gesture interaction language: a simple, universal set of gestures that people can use to control devices.

Interactions the team is developing and imagining, from left: (1) an invisible button between your thumb and index finger that you press by tapping the fingers together; (2) a virtual dial that you turn by rubbing thumb against index finger; (3) a virtual slider in thin air that you grab and pull.

Feedback is generated by the haptic sensation of fingers touching each other. Without the constraints of physical controls, these virtual tools can take on the fluidity and precision of our natural human hand motion.

Even though these controls are virtual, the interactions feel physical and responsive.

Soli gesture recognition

The Soli software architecture consists of a generalized gesture recognition pipeline that is hardware-agnostic and can work with different types of radar. The pipeline implements several stages of signal abstraction: from the raw radar data to signal transformations, core and abstract machine learning features, detection and tracking, gesture probabilities, and finally UI tools to interpret gesture controls.
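The staged pipeline above can be pictured as a chain of functions, each consuming only the previous stage's output. The stage names, data shapes, and probabilities below are invented for illustration; this is not the Soli API:

```python
# Sketch of a hardware-agnostic gesture pipeline: each stage depends only
# on the previous stage's output, so swapping the radar front end leaves
# the later stages unchanged. All names/values here are illustrative.

def transform(raw_frame):
    # Stage 1: signal transformations (e.g. a range-Doppler representation).
    return {"range_doppler": raw_frame}

def extract_features(transformed):
    # Stage 2: core/abstract features fed to the machine learning models.
    return {"energy": sum(transformed["range_doppler"])}

def classify(features):
    # Stage 3: detection and tracking produce per-gesture probabilities.
    return {"tap": 0.90, "dial": 0.05, "slide": 0.05}

def recognize(raw_frame):
    # Full pipeline: raw radar data -> transforms -> features -> gesture.
    probs = classify(extract_features(transform(raw_frame)))
    return max(probs, key=probs.get)

print(recognize([0.1, 0.4, 0.2]))  # -> tap
```

The payoff of this layering is the hardware-agnostic property the article mentions: only `transform` needs to know which radar chip produced the frame.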

The fully integrated, low-power Soli radar system operates in the 60 GHz ISM band, with sensors that use two modulation architectures: Frequency Modulated Continuous Wave (FMCW) radar and Direct-Sequence Spread Spectrum (DSSS) radar. Both chips integrate the entire radar system into the package, including multiple beamforming antennas that enable 3D tracking and imaging with no moving parts.

Soli has no moving parts; it fits onto a chip and consumes little energy. It is unaffected by light conditions and works through most materials.

The Soli chip can be embedded in wearables, phones, computers, cars and IoT devices in our environment.

The Soli SDK enables developers to easily access and build upon our gesture recognition pipeline. The Soli libraries extract real-time signals from radar hardware, outputting signal transformations, high-precision position and motion data, and gesture labels and parameters at frame rates from 100 to 10,000 frames per second.
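The SDK itself is not shown in this article, so purely as a sketch: an application consuming per-frame output like the above might filter the stream for gesture labels. Every name below is hypothetical, standing in for whatever the real libraries expose:

```python
# Hypothetical consumer of per-frame pipeline output. The frame fields
# ("gesture", "position_mm") are invented for illustration only.
def frames():
    # Stand-in for the radar stream (100-10,000 fps in the real system).
    yield {"gesture": None,  "position_mm": (0.0, 0.0, 120.0)}
    yield {"gesture": "tap", "position_mm": (1.2, -0.4, 118.9)}
    yield {"gesture": None,  "position_mm": (1.1, -0.3, 119.2)}

# Keep only frames where the pipeline emitted a gesture label.
events = [f["gesture"] for f in frames() if f["gesture"]]
print(events)  # -> ['tap']
```

At thousands of frames per second, most frames carry position updates rather than gesture events, so this label-filtering pattern is the natural consumption model.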

Alpha Developers Showcase

See what developers have built with Soli Alpha Dev Kits!

Clearly, this is a remarkable feat of cutting-edge engineering, and although the fruits are yet to be seen, the good news is that Infineon is slated to bring the sensors to market sometime this year. Planned early applications include a smartwatch and a speaker that both respond to gestures using the technology.

Like what you read? Give Ngesa Marvin a round of applause.

From a quick cheer to a standing ovation, clap to show how much you enjoyed this story.