Airy3D: A Straightforward Depth-Sensing Platform

Asgardia.space
Asgardia Space Nation
3 min read · Nov 2, 2018

Airy3D is a company that specializes in 3D sensing. Its product, DepthIQ, is a versatile and straightforward depth-sensing platform that is far more computationally efficient than other methods while also significantly reducing costs. What’s more, DepthIQ is “sensor agnostic,” meaning it can be customized to any given CMOS sensor specification.

Asgardia aims to keep an eye on all startups and companies that could prove useful as we work towards our goal of setting up habitable platforms in low-Earth orbit.

As the company explained on its website, active solutions that use structured light or time-of-flight measure depth but do not produce 2D images, while passive multi-camera solutions, such as stereo and depth-from-defocus, depend on light-intensity data: they can only create 2D images and do not directly measure depth.

Moreover, these approaches all require multiple components and heavy computation, which results in significant compromises. However, by using a single sensor, DepthIQ can deliver both depth and a 2D image in one device.

The advantages of this product begin with light itself. To craft its 3D sensing solution, Airy3D uses a single-sensor system that directly measures the entire light field without excessive computational or manufacturing complexity.

Airy3D’s DepthIQ platform employs a Transmissive Diffraction Mask (TDM) to directly produce a single dataset in which the 2D image and the depth information are inherently integrated. The company’s proprietary computational imaging algorithms bring this uniquely complete and compressed dataset to life, eliminating the computational complexity involved in image reconstruction and depth mapping.

Unlike infrared or multi-camera solutions, DepthIQ is a single-sensor 3D solution. It is built around a transmissive diffraction mask (TDM) made with standard semiconductor technology.

TDMs take advantage of diffraction, which inherently reveals the phase and direction of light, to measure depth directly. What’s more, diffraction is conservative, with no light loss. This solution, patented worldwide, can transform any CMOS image sensor into a 3D sensor for cameras used in various cutting-edge applications such as AR/VR, ADAS, drones and other UAVs, robots, and IoT devices, in addition to next-generation smartphones.
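To make the idea of a co-registered image and depth dataset more concrete, the minimal Python sketch below models a TDM capture in a deliberately simplified way. Airy3D’s actual mask pattern and decoding algorithms are proprietary, so the function decode_tdm_capture, the alternating-column modulation model, and the demodulation arithmetic are all assumptions made purely for illustration.

```python
import numpy as np

# Purely illustrative: a toy model in which the diffraction mask imprints
# an alternating column-wise modulation on the raw frame. The real TDM
# pattern and Airy3D's decoding algorithms are proprietary.

def decode_tdm_capture(raw, period=2):
    """Split a mask-modulated raw frame into a 2D image and a depth cue."""
    even = raw[:, 0::period].astype(float)   # pixels under one mask phase
    odd = raw[:, 1::period].astype(float)    # pixels under the other phase

    image = (even + odd) / 2.0               # recover the 2D intensity image
    # Use the normalized contrast between the two phases as a stand-in
    # for the depth cue encoded by the mask.
    depth_cue = (even - odd) / (even + odd + 1e-6)
    return image, depth_cue

# Example: a synthetic 8x8 raw frame with a mild column-wise modulation.
raw_frame = np.tile(np.array([110.0, 90.0]), (8, 4))
image, depth_cue = decode_tdm_capture(raw_frame)
print(image.shape, depth_cue.shape)  # (8, 4) and (8, 4): image and depth are co-registered
```

The point of the sketch is structural rather than physical: one raw frame passes through one lightweight decoding step and yields an image and a depth map that are aligned by construction.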

DepthIQ also offers much lower hardware and computational costs. The product is a drop-in solution for existing CMOS sensors, and no change to other hardware or assembly is necessary. Computational processing is fast, simple, and uses minimal power. Both the image and the depth information are captured simultaneously, without any comparative analysis of multiple images or complex sensor-fusion algorithms. In comparison, competing solutions need several components (e.g., infrared emitters and receivers, or multiple cameras) that increase the cost of materials, the manufacturing complexity, and the computational demands.
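For contrast, here is a rough sketch of the comparative multi-image analysis that a classic stereo pipeline performs: a brute-force per-pixel disparity search using sum-of-absolute-differences (SAD) block matching. The toy scanlines, window size, and disparity range are made up for illustration; only the general block-matching technique is standard, and it is shown solely to make the per-pixel search cost that a single-sensor approach avoids visible.

```python
import numpy as np

def sad_disparity_row(left_row, right_row, max_disp=4, window=3):
    """Brute-force block matching along one scanline.

    For every pixel in the left row, compare a small window against
    shifted windows in the right row and keep the shift (disparity)
    with the lowest sum of absolute differences (SAD).
    """
    half = window // 2
    width = len(left_row)
    disparities = np.zeros(width, dtype=int)
    for x in range(half, width - half):
        patch = left_row[x - half:x + half + 1]
        best_cost, best_d = np.inf, 0
        for d in range(0, min(max_disp, x - half) + 1):
            candidate = right_row[x - d - half:x - d + half + 1]
            cost = np.abs(patch - candidate).sum()
            if cost < best_cost:
                best_cost, best_d = cost, d
        disparities[x] = best_d
    return disparities

# Toy scanlines: the right view is the left view shifted by two pixels.
left = np.array([0, 0, 10, 80, 80, 10, 0, 0, 0, 0], dtype=float)
right = np.roll(left, -2)
print(sad_disparity_row(left, right))
```

Every output pixel requires a search over candidate shifts in a second image, and a full-resolution depth map repeats that search for every row; this is the computational load that grows with resolution and disparity range in multi-camera systems.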

Lastly, in today’s market customers expect their smartphones to boast the latest features while still being small enough to fit in their pockets. This places a heavy burden on manufacturers and their suppliers to deliver the highest quality and functionality with the fewest parts. DepthIQ addresses this by reducing the number of components and the manufacturing complexity, while remaining customizable to any given sensor specification and adding no height to the camera sensor stack.

For more information, follow Airy3D on Twitter and LinkedIn.

If you’re passionate about technology, space, and business, then join Asgardia today and network with like-minded people.

When preparing this news item, materials from the following sources were used:

http://www.airy3d.com/

http://www.airy3d.com/wp-content/uploads/2018/05/Airy3D-DepthIQ-Technology-Brief.pdf


Join us and learn more: Asgardia Space News
