Multipath Interference in Indirect Time-of-Flight Depth Sensors

Refael Whyte
Chronoptics Time-of-Flight
10 min read · Mar 22, 2021


Indirect Time-of-Flight (iToF) depth sensors measure the distance from the camera to the scene at every pixel simultaneously. This is done by illuminating the full scene with a flash of light and measuring the phase difference between the emitted light and its reflection.

Multipath Interference (MPI) is caused by multiple paths from the illumination source to the same pixel, very similar to multipath propagation in radio communication. It can cause significant measurement errors in iToF cameras. Below we explore the causes of MPI, how it affects the depth measurement, and methods to minimize and resolve it.

In this article we cover

  • What is multipath interference
  • The mathematics of multipath interference
  • Ways to reduce multipath interference
  • Methods to solve multipath interference

About Chronoptics

Chronoptics are experts in designing indirect Time-of-Flight depth sensing modules, focusing on the processing from the image sensor to the point cloud and tailoring each module to output the best point cloud for a given application. For advice on correcting multipath interference in your Time-of-Flight camera, contact us at hello@chronoptics.com.

What is Multipath Interference

Multipath interference is caused when there are multiple return paths from the laser source to a single pixel. There are numerous scenarios where this can occur, and the figure below shows five different causes.

In the figures below, the blue ray indicates the direct path, the distance we want to measure, and the pink rays indicate the interfering paths.

Case a) is the ideal scenario: a single path from the laser to the pixel. Case b) shows specular inter-reflections between objects in the scene, for example from a mirror-like object such as a disco ball. Case c) occurs with translucent objects, where there is a return from the translucent object and from whatever is behind it. Case d) is diffuse inter-reflections in a corner, where many paths lead back to the same pixel. Case e) is lens flare (ghosting), caused by scattering inside the lens, both between the lens elements and between the image sensor and the last optical element, normally a notch filter. Case f) is from bulk diffuse scattering, such as fog, caused by Rayleigh scattering.

The causes of multipath interference. The dark blue line is the desired return to measure, and the pink lines are the interference sources. a) The ideal case of no MPI. b) MPI caused by inter-reflections between objects in the scene. c) MPI caused by translucent objects. d) Diffuse inter-reflections, such as in corners. e) MPI caused by lens flare, inter-reflections inside the lens cavity. f) MPI caused by bulk diffuse scattering, such as fog (water vapor) or smoke particles.

Multipath interference can be represented as a plot of time versus intensity, where time is the delay between the light being emitted and received by the pixel, and intensity is the number of reflected photons. In the ideal case there is a single return, which in mathematical terms is a Dirac delta function. The figure below plots the time versus intensity profiles of the above causes of multipath interference, with blue being the direct return and pink being the multipath interference.

The time versus intensity profiles of multipath interference. a) The ideal case: no MPI, only a single return. b) The second return arrives after the first. c) The translucent sheet creates a brighter, closer return. d) The corner creates diffuse returns spread over time. e) Lens flare creates returns that depend on the scattering inside the lens. f) Bulk diffuse scattering arrives before and after the desired return, hiding the true return in the scattering.

Multipath interference can arise in many different ways, and it matters because it causes measurement errors in time-of-flight depth sensors.

Multipath Interference Mathematics

Indirect Time-of-Flight depth sensors measure the phase and amplitude of the reflected signal at each pixel. This can be represented as a vector on the complex plane, with the magnitude of the vector encoding how much light was reflected and the phase encoding the distance the light has travelled. The complex measurement $\zeta$ at modulation frequency $f$ can be written as

$$\zeta = A e^{j\phi}, \qquad \phi = \frac{4\pi f d}{c}$$

where each pixel measures a complex value with amplitude $A$, and the phase $\phi$ is proportional to the distance $d$. Multipath interference can be represented as the sum of each path,

$$\zeta = \sum_{k=1}^{N} A_k e^{j\phi_k}$$

where there are N discrete paths. This model works well when there are a limited number of paths, such as the translucent sheet in c) and the specular inter-reflections in b). However, for diffuse inter-reflections in a corner d) or bulk scattering f), an integral over a continuum of returns is a better representation. We categorize these as sparse multipath interference and diffuse multipath interference respectively.
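As a minimal sketch of this forward model (the helper names and example values below are my own, not from any specific library), the complex pixel measurement and the distance it implies can be simulated as:

```python
import numpy as np

C = 3e8  # speed of light, m/s


def itof_measurement(amplitudes, distances, mod_freq):
    """Complex pixel measurement as a sum of N discrete return paths.

    Each path at one-way distance d contributes a vector of length A at
    phase 4*pi*f*d/c (the round-trip delay at modulation frequency f).
    """
    phases = 4 * np.pi * mod_freq * np.asarray(distances) / C
    return np.sum(np.asarray(amplitudes) * np.exp(1j * phases))


def measured_distance(zeta, mod_freq):
    """Distance implied by the phase of the measurement (no phase unwrapping)."""
    return (np.angle(zeta) % (2 * np.pi)) * C / (4 * np.pi * mod_freq)


# Direct return at 2.0 m plus a weaker interfering path at 3.0 m, at 50 MHz
zeta = itof_measurement([1.0, 0.3], [2.0, 3.0], mod_freq=50e6)
print(measured_distance(zeta, mod_freq=50e6))  # biased away from the true 2.0 m
```

With no interference (a single return) the same code recovers exactly 2.0 m; the weaker second path is what biases the phase.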

With the mathematics of multipath interference explained, we can simulate the error caused by a constant offset vector on the complex plane, as animated below.

Animation demonstrating the error caused by a constant, smaller multipath interference component, represented by the green vector. The blue vector is the direct return, whose distance changes from 0mm to 3000mm, so its phase rotates from 0 to 2π. The pink dot is the sum of the two vectors, and therefore the measured distance. Even a small amount of multipath interference can cause significant measurement errors.
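The same behaviour can be reproduced numerically. The short sketch below (my own example values: a 10% interference vector at a fixed 45 degree phase) sweeps the true distance over the 3 m unambiguous range of a 50MHz measurement and reports the peak depth error:

```python
import numpy as np

C = 3e8
F_MOD = 50e6                                    # 50 MHz -> 3 m unambiguous range

d_true = np.linspace(0.0, 3.0, 301)             # true distance sweep, metres
phase_true = 4 * np.pi * F_MOD * d_true / C

# Direct return (amplitude 1) plus a constant interference vector (amplitude 0.1)
zeta = np.exp(1j * phase_true) + 0.1 * np.exp(1j * np.deg2rad(45))

# Phase error wrapped to (-pi, pi], converted back to millimetres
phase_err = np.angle(zeta * np.exp(-1j * phase_true))
error_mm = 1000 * phase_err * C / (4 * np.pi * F_MOD)
print(f"peak depth error: {np.max(np.abs(error_mm)):.0f} mm")
```

Even though the interference vector is only 10% of the direct return, the peak error is on the order of tens of millimetres.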

Solving Multipath Interference

We have described how multipath interference is formed and the mathematical forward model that describes it. Multipath interference is difficult to solve because, for a single measurement, there are infinite combinations of vectors that sum to the same result, as demonstrated in the animation below. The sum of vectors on the complex plane is a many-to-one mapping: easy to compute, but impossible to invert uniquely from one measurement.

Why multipath interference is so difficult to solve. The red and blue vectors always sum to the same value, the green vector. There are an infinite number of vector combinations that produce the same final measured value.
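A quick numerical illustration of this ambiguity (the component values are arbitrary): two very different pairs of return vectors produce exactly the same measured value, so a single measurement cannot distinguish between them.

```python
import numpy as np

measured = 0.8 * np.exp(1j * 1.2)   # the single complex value the pixel reports

# Decomposition 1: a strong direct return plus a weak interference vector
direct_1 = 0.7 * np.exp(1j * 1.0)
interference_1 = measured - direct_1

# Decomposition 2: two returns of comparable strength
direct_2 = 0.45 * np.exp(1j * 2.0)
interference_2 = measured - direct_2

# Both decompositions explain the measurement equally well
print(np.isclose(direct_1 + interference_1, direct_2 + interference_2))  # True
```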

Minimize Multipath Interference

The first method of dealing with multipath interference is to set up the scene to minimize the amount of multipath interference created in the first place.

When positioning the camera in the scene, do the following:

  • Remove bright, close objects: use a tripod to mount the camera above any bright surfaces, like table tops.
  • Position the camera away from any corners or translucent objects.

A common cause of multipath interference is lens flare, where a highly reflective (very bright) object scatters light onto neighboring pixels because of inter-reflections in the lens. Anti-reflective coatings on the optical elements and coatings inside the lens barrel can be used to minimize these reflections. The coatings depend on the wavelength of light being used, which is why lens vendors offer ToF-specific lenses.

Time-of-Flight specific lenses from various manufacturers. The green or blue hue is from the specific anti-reflective lens coating to reduce lens flare and therefore multipath interference.

Rayleigh scattering and sub-surface scattering are highly dependent on the wavelength of light. Short-wave infrared (SWIR), due to its longer wavelength, is less affected by fog, which is a common reason why many automotive imaging applications use wavelengths between 1000nm and 1600nm. Artilux is an indirect Time-of-Flight sensor developer using germanium-silicon (Ge-Si) pixels, which have high quantum efficiency at SWIR wavelengths. An example of SWIR imaging “seeing through” fog is below.

Example of a visible camera image versus a SWIR image, as demonstrated by Trieye. Trieye develops CMOS-based SWIR image sensors and is targeting the automotive market.

Subsurface scattering occurs when light penetrates the surface of an object, bounces around inside, and exits again. This is common in biological objects, such as apples and tomatoes in agriculture and human tissue in medical imaging. The wavelength of light directly affects the amount of subsurface scattering, and a wavelength should be selected to minimize it for more accurate distance measurements.

The time-of-flight module design can also contribute to multipath interference. Avoid creating any indirect paths from the illumination to the image sensor. For example, the OPT8241 sensor from Texas Instruments is not backside illuminated and requires vias on the PCB; if these vias are not capped or filled, photons can travel from behind the PCB into the image sensor! The same applies to front coverings on camera modules: if a single acrylic plate is used, it can act as a waveguide and form an indirect path from the laser to the image sensor.

Example of how the front plate of a camera module can cause multipath interference by acting as a waveguide. For best design practice, use different sheets of acrylic and physically separate them so no indirect path is created.

Chronoptics MixedPixel

Chronoptics has developed a method called MixedPixel to resolve multipath interference. It works by taking measurements at two modulation frequencies and using the ratio of the amplitudes and the difference in phase as indices into a 2D look-up table. The values stored in the look-up table are used to correct the multipath interference.

Measurements at two modulation frequencies, 50MHz and 100MHz. In the left animation the two vectors always sum to the same value, while in the right animation the 100MHz measurement does change.
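The contents of the MixedPixel look-up table are Chronoptics' own and are not public; the sketch below only illustrates the indexing step described above, mapping the amplitude ratio and phase difference of a hypothetical 50MHz/100MHz measurement pair to coordinates in a 2D table. All names, ranges, and bin counts are illustrative assumptions.

```python
import numpy as np


def lut_indices(zeta_50, zeta_100, num_bins=256, max_ratio=2.0):
    """Map a pair of complex measurements to hypothetical 2D look-up table indices.

    Assumes the amplitude ratio stays below max_ratio; the real MixedPixel
    table layout and stored correction values are not public.
    """
    amp_ratio = np.abs(zeta_100) / np.abs(zeta_50)
    phase_diff = np.angle(zeta_100 * np.conj(zeta_50)) + np.pi   # shifted to [0, 2*pi]
    i = int(np.clip(amp_ratio / max_ratio * num_bins, 0, num_bins - 1))
    j = int(np.clip(phase_diff / (2 * np.pi) * num_bins, 0, num_bins - 1))
    return i, j


# correction = lut[i, j] would then be applied to the raw measurement
i, j = lut_indices(0.9 * np.exp(1j * 1.1), 0.7 * np.exp(1j * 2.4))
```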

The video below demonstrates MixedPixel in action in three different use cases: resolving lens flare caused by retro-reflective objects, such as high-vis safety gear, and multipath interference caused by glass windows.

This video demonstrates MixedPixel in three different settings, resolving multipath interference caused by lens flare and by a translucent sheet. The hi-vis jacket causes lens flare because of its extremely bright reflection, and the glass windows cause multipath. MixedPixel improves the measurement accuracy, enabling time-of-flight depth sensing in a variety of applications.

MixedPixel works well for sparse multipath interference, cases b), c), and e): specular inter-reflections, translucent objects, and lens flare.

Resolving lens flare caused by retro-reflectors enables time-of-flight depth sensing in places where retro-reflectors are common, such as environments where personal protective equipment (PPE) is worn, including:

  • Warehouses: ToF for robotic navigation and volumetric scanning of goods.
  • Factories: ToF for tracking people and for emergency shutoff of equipment if people get too close.
  • Construction and industrial sites: ToF for Face ID and contactless terminals.

MixedPixel also works well at measuring the reflection off a translucent surface and the reflection from behind it. This enables robotic navigation where glass sheets are present.

MixedPixel has the following constraints:

  • Requires two modulation frequencies with a frequency ratio of 1:2 or 1:3; for example, 50MHz and 100MHz, or 50MHz and 150MHz.
  • Assumes sparse multipath interference with two returns.

Melexis has licensed MixedPixel for use in automotive applications.

Multi-Frequency Methods

Indirect Time-of-Flight is also called Amplitude Modulated Continuous Wave (AMCW), as this is the modulation scheme used by indirect ToF sensing. Another sampling scheme is Stepped Frequency Continuous Wave (SFCW), in which the modulation frequency is changed between each raw sample.

The frequency of the raw samples encodes the distance to the object: the higher the frequency (the more cycles), the further away the object is.

The main issue with SFCW is how to measure the frequency of the signal, which is a spectrum estimation problem. When multipath interference is present in SFCW signals, another sinusoidal component appears with a different frequency. Whyte et al. [1] used the MUSIC spectrum estimation method to measure distance from SFCW measurements and resolve multipath interference.

Stepped frequency continuous wave (SFCW) sampling can be extended to complex numbers, where each frequency sample is a complex value because the phase is measured as well. This is the problem of estimating exponentials in a signal, and Prony’s method was developed in 1795 to solve it.

Prony’s method has been expanded upon since 1795, and papers as recent as 2015 have further developed solutions to the problem [4]. The Matrix Pencil method has been used to estimate the parameters of a sum of complex exponentials [2], and was applied by Bhandari et al. [3] to resolve multipath interference in ToF depth cameras. Kirmani et al. [5] used Cadzow denoising to solve the same equation and resolve multipath interference in ToF sensors as well.
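As a rough, noiseless sketch of the Matrix Pencil idea applied to SFCW data (the helper function and example values are mine; practical implementations add SVD-based rank truncation to handle noise):

```python
import numpy as np

C = 3e8  # speed of light, m/s


def matrix_pencil(z, num_returns, delta_f):
    """Estimate path distances from complex SFCW samples.

    Models z[n] = sum_k a_k * exp(j*2*pi*n*delta_f*tau_k), a sum of complex
    exponentials, and recovers the poles via the Matrix Pencil method [2].
    """
    n = len(z)
    L = n // 2                                        # pencil parameter
    # Hankel data matrix, (n - L) rows by (L + 1) columns
    Y = np.array([z[i:i + L + 1] for i in range(n - L)])
    Y0, Y1 = Y[:, :-1], Y[:, 1:]
    # Poles are the dominant eigenvalues of pinv(Y0) @ Y1
    eigvals = np.linalg.eigvals(np.linalg.pinv(Y0) @ Y1)
    poles = eigvals[np.argsort(-np.abs(eigvals))[:num_returns]]
    taus = np.angle(poles) / (2 * np.pi * delta_f)    # round-trip delays, seconds
    return np.sort(C * taus / 2)                      # one-way distances, metres


# Two returns: a wall at 1.5 m and a window reflection at 0.8 m
delta_f = 10e6                                        # 10 MHz frequency step
freqs = 50e6 + delta_f * np.arange(16)                # 16 stepped frequencies
d_true = np.array([0.8, 1.5])
amps = np.array([0.3, 1.0])
z = (amps * np.exp(1j * 4 * np.pi * freqs[:, None] * d_true / C)).sum(axis=1)

print(matrix_pencil(z, num_returns=2, delta_f=delta_f))   # ~[0.8, 1.5]
```

With two well-separated returns and sixteen stepped frequencies, the recovered distances match the simulated 0.8 m and 1.5 m paths.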

Multi-frequency methods have been extended to the measurement of light transport, known as transient imaging. Heide et al. [9] used over 100 frequencies to resolve non-sparse multipath interference and visualize it.

Heide et al. [9] demonstration of transient imaging using indirect time-of-flight depth sensors.

Other multi-frequency techniques not directly based on spectrum estimation include Godbaz et al. [7], who derived a closed-form solution for two returns from four frequency measurements, and Freedman et al. [6], who used three modulation frequencies and a 4D look-up table (LUT).

There are three issues with multi-frequency approaches to solving multipath interference.

  • The assumption of sparse multipath interference, a few discrete return paths.
  • The number of raw image sensor frames required.
  • The computational power required.

A single frequency measurement requires at least 3 raw frames. Chronoptics’ MixedPixel requires two frequencies, so a minimum of 6 raw frames. Freedman et al. [6] requires 3 frequencies, so a minimum of 9 raw frames. Matrix Pencil and Cadzow denoising require a minimum of 5 frequencies, therefore 15 raw frames. This limits the depth frame rate of the camera and introduces issues with motion blur.
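As a back-of-the-envelope check of this trade-off (the 120 fps raw frame rate below is an assumed example figure, not a specific sensor specification):

```python
RAW_FRAMES_PER_FREQ = 3          # minimum raw frames per modulation frequency
SENSOR_RAW_FPS = 120             # assumed raw frame rate of the image sensor

for name, num_freqs in [("single frequency", 1),
                        ("MixedPixel", 2),
                        ("Freedman et al. [6]", 3),
                        ("Matrix Pencil / Cadzow", 5)]:
    raw_frames = RAW_FRAMES_PER_FREQ * num_freqs
    print(f"{name}: {raw_frames} raw frames -> "
          f"{SENSOR_RAW_FPS / raw_frames:.0f} depth fps")
```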

Most indirect Time-of-Flight image sensors range from QVGA (240x320) to VGA (480x640) pixels, and the multipath removal method runs independently for every pixel. The computational cost of iterative methods like the Matrix Pencil is therefore prohibitive for many indirect Time-of-Flight depth cameras.

Raytracing

Raytracing can be used to simulate light transport in a scene to understand what is causing multipath interference. Microsoft has published a paper on ray tracing for time-of-flight, and Meister et al. [8] have published in the same field.

Conclusion

Multipath interference causes measurement errors in indirect time-of-flight depth sensors. It is caused by multiple reflection paths between the light source and a given pixel, and there are many ways this can occur. We have explored various multi-frequency approaches, in particular Chronoptics’ MixedPixel method. MixedPixel resolves multipath interference with a balanced tradeoff between performance, computational cost, and frame rate, and is therefore a suitable production-ready solution for embedded ToF camera systems.

To discuss the best method for your application, reach out to the Chronoptics team at hello@chronoptics.com

References

[1] Whyte, R., Streeter, L., Cree, M. J., & Dorrington, A. A. (2015). Application of lidar techniques to time-of-flight range imaging. Applied Optics, 54(33), 9654–9664.

[2] Sarkar, T. K., & Pereira, O. (1995). Using the matrix pencil method to estimate the parameters of a sum of complex exponentials. IEEE Antennas and Propagation Magazine, 37(1), 48–55.

[3] Bhandari, A., Kadambi, A., Whyte, R., Barsi, C., Feigin, M., Dorrington, A., & Raskar, R. (2014). Resolving multipath interference in time-of-flight imaging via modulation frequency diversity and sparse regularization. Optics Letters, 39(6), 1705–1708.

[4] Condat, L., & Hirabayashi, A. (2015). Cadzow denoising upgraded: A new projection method for the recovery of Dirac pulses from noisy linear measurements.

[5] Kirmani, A., Benedetti, A., & Chou, P. A. (2013, July). Spumic: Simultaneous phase unwrapping and multipath interference cancellation in time-of-flight cameras using spectral methods. In 2013 IEEE International Conference on Multimedia and Expo (ICME) (pp. 1–6). IEEE.

[6] Freedman, D., Smolin, Y., Krupka, E., Leichter, I., & Schmidt, M. (2014, September). SRA: Fast removal of general multipath for ToF sensors. In European Conference on Computer Vision (pp. 234–249). Springer, Cham.

[7] Godbaz, J. P., Cree, M. J., & Dorrington, A. A. (2012, February). Closed-form inverses for the mixed pixel/multipath interference problem in AMCW lidar. In Computational Imaging X (Vol. 8296, p. 829618). International Society for Optics and Photonics.

[8] Meister, S., Nair, R., & Kondermann, D. (2013). Simulation of Time-of-Flight Sensors using Global Illumination. In VMV (pp. 33–40).

[9] Heide, F., et al. (2013). Low-budget transient imaging using photonic mixer devices. ACM Transactions on Graphics (ToG), 32(4), 1–10.

