The Chronoptics PixelScope

Refael Whyte
Chronoptics Time-of-Flight
Dec 2, 2020

The Chronoptics PixelScope is a tool, analogous to an oscilloscope, for measuring the true optical response of indirect Time-of-Flight (ToF) sensor pixels. It provides insight into the internal workings of time-of-flight pixels, allowing the user to “see inside” the silicon and explore the pixel array, its associated systems, and its configuration, enabling advanced characterisation and calibration.

About Chronoptics

Chronoptics designs indirect Time-of-Flight depth sensing modules, focusing on producing the best point cloud for a given application. For advice on sensor characterisation and calibration, reach out to hello@chronoptics.com.

Introduction

Time-of-flight cameras measure distance for each pixel indirectly by measuring the phase change between outgoing and incoming (reflected) modulated light signals. The phase measurement is achieved by correlating the reflected light with a reference modulation waveform using a specialised pixel. The pixel’s optical response to the applied electrical reference signal is critical to the performance of the correlation function and ultimately the distance measurement quality.

A typical indirect Time-of-Flight pixel design. The electrons collect in Tap A or Tap B depending on the electromagnetic field set up between DMIX0 and DMIX1 by the sensor modulation signal. The PixelScope can measure where the electrons are being collected.

Given that a cross-correlation in the time domain corresponds to a multiplication in the frequency domain, it may be tempting to reverse engineer the pixel’s optical response from standard operating mode measurements of the pixel’s correlation signal. However, this approach is limited: the modulation waveform (normally square) contains zeros in its frequency spectrum, so while the light source response can be measured using a high-speed photodetector, the pixel optical response cannot be accurately inferred from the cross-correlation and light source measurements.
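To illustrate why this inversion is ill-posed, the short sketch below (plain NumPy, not Chronoptics code) computes the spectrum of an ideal 50% duty-cycle square wave: the even harmonics are exactly zero, so dividing a measured correlation spectrum by it cannot recover the pixel response at those frequencies.

    import numpy as np

    # Ideal 50% duty-cycle square modulation waveform, one period, N samples.
    N = 1024
    t = np.arange(N) / N
    square = np.where(t < 0.5, 1.0, -1.0)

    # Magnitude spectrum of the modulation waveform.
    mag = np.abs(np.fft.rfft(square)) / N
    print("harmonics 1-6:", np.round(mag[1:7], 4))
    # -> roughly [0.6366, 0.0, 0.2122, 0.0, 0.1273, 0.0]
    # The even harmonics are zero, so deconvolving the correlation measurement
    # by this waveform is ill-posed at those frequencies: the pixel response
    # there is simply unobservable from standard-mode measurements.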

The pixel response is measured by sampling the same part of the waveform with a pico-second laser pulse. This is equivalent to sampling with a Dirac delta function.

The PixelScope measures the optical response of a time-of-flight pixel by sampling the pixel modulation signal using an approximation to a delta function: a pico-second pulsed laser signal. The pulsed laser source is triggered in sync with the sensor modulation signal (at the same, or a sub-harmonic, frequency) and illuminates the pixel array. The incident illumination mixes with the pixel modulation signal to produce one sample of the optical response waveform. The entire optical response waveform is compiled by sweeping the relative time delay (or phase) between the laser pulse and pixel modulation signals, and sampling at each point. A simple illustration of the sampling process is shown in the figure below.

PixelScope operation overview. The modulation signal is fed into a programmable delay line, and the output of the delay line triggers a picosecond laser. The laser pulse illuminates the time-of-flight sensor array and the raw pixel values are read out after the integration time is over. The delay line value is then incremented to sample the next point on the pixel response function.

The PixelScope measures the temporal pixel modulation optical response by generating short pulses of laser light synchronised with a modulation reference signal (typically the illumination modulation drive). The laser pulses are aligned at the same phase value for each modulation clock cycle. When these light pulses are directed onto a Time-of-Flight image sensor, the raw intensity pixel values represent a temporal sample of each pixel’s optical modulation response at a point in the modulation cycle corresponding to the timing of the laser pulse. With each successive frame acquired, the PixelScope sweeps the light pulse delay (td), taking samples at successive temporal locations and building up the full waveform of the pixels’ optical modulation response versus time.
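The sampling principle can be sketched in a few lines. The toy simulation below uses assumed parameters (50 MHz modulation, 128 delay steps, a made-up band-limited pixel response) and is not the PixelScope’s firmware; it only illustrates how sweeping the pulse delay reconstructs the response one sample per frame.

    import numpy as np

    f_mod = 50e6                 # assumed modulation frequency
    T = 1.0 / f_mod              # one modulation period

    def pixel_response(t):
        """Toy pixel response: fundamental plus a weak third harmonic,
        mimicking a band-limited square wave."""
        w = 2.0 * np.pi * f_mod
        return np.sin(w * t) + np.sin(3.0 * w * t) / 3.0

    n_steps = 128                                      # delay-line steps per period
    delays = np.linspace(0.0, T, n_steps, endpoint=False)
    n_pulses = 10_000                                  # pulses per raw frame

    # The pulse approximates a delta function, so each frame's raw value is
    # proportional to the response evaluated at the programmed delay, summed
    # over all pulses in the integration period.
    waveform = n_pulses * pixel_response(delays)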

An example of the output data of the image sensor using the PixelScope. The waveforms of the pixel at four different frequencies are compared. At the lowest frequency the response is close to a square wave, but the shape degrades at higher frequencies.

The PixelScope measures the optical pixel modulation of Time-of-Flight sensor pixels, allowing you to quantify, visualise and compare the optical response and important characteristics of your pixels.

  • Quickly, easily and accurately measure the optical modulation response of your pixels.
  • Compare optical modulation waveform shape across all pixels simultaneously.
  • Quantify the modulation waveform, perform advanced waveform and harmonic analysis, and measure modulation changes as the drive signal propagates through the sensor.
  • Compare modulation performance and characteristics, such as modulation depth, rise/fall time, duty cycle, relative delays, and analyse how these parameters vary across the sensor.
  • Visualise the effective modulation response, including the impact of intended or unintended modulation signal variations during the integration period.
  • Characterise and compare the response of standalone test pixels and experimental pixel designs without the need for full time-of-flight system integration.

PixelScope Specifications

The key PixelScope specifications are in the table below

Example Measurements

The following sections describe in detail example measurements that are possible with the PixelScope.

Pixel Response Waveform

A series of PixelScope measurements were acquired using an OPT8241 Time-of-Flight sensor and a camera developed by Chronoptics. The PixelScope was set up in a typical optical configuration: the lens was removed from the camera and the sensor was directly illuminated via a fibre optic cable through a series of diffusers to achieve a spatially homogeneous intensity profile. This means all pixels are illuminated evenly, allowing comparison between pixels in different parts of the sensor array. For each PixelScope measurement, dark reference samples (with the pulsed laser switched off) were recorded and subtracted from the waveforms to remove per-pixel fixed-pattern offsets.

Optical setup for the PixelScope fibre optics. The single mode fibre is passed through two diffusers to homogeneously illuminate the pixel array, allowing for comparisons between pixels on the array.
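The dark-reference correction mentioned above is a straightforward per-pixel subtraction. The sketch below assumes particular array shapes and is not Chronoptics code.

    import numpy as np

    def subtract_dark(frames: np.ndarray, dark_frames: np.ndarray) -> np.ndarray:
        """Remove per-pixel fixed-pattern offsets from a PixelScope sweep.

        frames:      (n_delays, rows, cols) raw values with the pulsed laser on
        dark_frames: (n_dark, rows, cols) raw values with the pulsed laser off
        """
        dark = dark_frames.mean(axis=0)        # average dark reference per pixel
        return frames - dark[np.newaxis, :, :]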

The Texas Instruments OPT8241 sensor uses a differential time-of-flight pixel structure (two channels/taps, A & B) to measure distance. This sensor has a feature where either channel A or B can be grounded, allowing inspection of the individual pixel taps. In the figure below, the optical response for a single pixel at 30 MHz is plotted for the normal operating mode (differential, A-B), channel A only (A-0), and channel B only (0-B). In this case, the sum of (A-0) and (0-B) is approximately equal to the standard A-B measurement.

Example pixel response function (and its Fourier transform) at 30 MHz using the PixelScope. A-B is the normal operating mode of the pixel.

The corresponding frequency spectrum of the A-B and A-0 waveforms is plotted above. Examining this spectrum helps to highlight the value of the PixelScope tool. From a time-of-flight systems point of view, the amplitude of the 3rd harmonic is of interest as reducing its amplitude can improve performance.
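For this kind of harmonic analysis, a ratio such as the one sketched below can be computed from each pixel’s sampled waveform. This is a minimal sketch, assuming the delay sweep spans exactly one modulation period (so the fundamental falls in FFT bin 1); it is not the PixelScope’s own analysis code.

    import numpy as np

    def harmonic_ratio(waveform: np.ndarray, harmonic: int = 3) -> float:
        """Amplitude of the given harmonic relative to the fundamental.

        waveform: samples of one pixel's optical response over exactly one
                  modulation period (e.g. one delay sweep from the PixelScope).
        """
        spectrum = np.abs(np.fft.rfft(waveform - waveform.mean()))
        return spectrum[harmonic] / spectrum[1]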

Amplitude Linearity

The response of time-of-flight pixels is assumed to be linear as a function of the number of collected electrons. Wiedemann [1], when characterising a time-of-flight sensor, observed a relationship between integration period and measured phase, but no conclusive explanation was provided for it; possible causes include an internal sensor characteristic or an external effect such as multi-path interference. Using the PixelScope, these previously noted relationships, which were hard to pinpoint, can be investigated further.

Initial testing was conducted on an OPT8241 sensor with a modulation frequency of 50 MHz. To vary the illumination amplitude, the number of pulses for a given raw frame (sometimes referred to as a quad frame) was varied from 17740 down to 4838, and the pixel response for one raw frame (quad) is plotted below. The fundamental amplitude and DC component of the depth measurement (from four raw frames, or quads) are also plotted. This indicates that the pixel response is linear; however, while the amplitude gain is linear, there is a slight increase in the DC offset, visible on the blue curve. This is partly due to the duty cycle settings discussed below. The pixel response close to saturation requires further investigation (with any saturation compensation features disabled).

Measuring the amplitude linearity of the pixel by varying the number of laser pulses, using the pulse skipping feature of the delay line. In this way, the phase shifts observed with changing integration time can be investigated further.
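As a rough sketch of how these quantities might be extracted, the functions below (assuming single-period waveforms; not Chronoptics code) pull the fundamental amplitude and DC component out of each measured waveform and fit a straight line against the pulse count.

    import numpy as np

    def fundamental_and_dc(waveform: np.ndarray) -> tuple[float, float]:
        """Fundamental amplitude and DC level of a single-period waveform."""
        n = len(waveform)
        spectrum = np.fft.rfft(waveform)
        return 2.0 * np.abs(spectrum[1]) / n, spectrum[0].real / n

    def fit_linearity(pulse_counts, amplitudes):
        """Linear fit of fundamental amplitude vs. number of laser pulses.

        Returns (gain per pulse, offset); a large offset or curvature in the
        residuals would indicate a departure from linearity."""
        gain, offset = np.polyfit(np.asarray(pulse_counts),
                                  np.asarray(amplitudes), 1)
        return gain, offset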

Frequency Response

The response of a time-of-flight sensor changes with modulation frequency. The raw PixelScope optical response for a single pixel at a range of frequencies is plotted below. The PixelScope generates a pulse at a given time delay from the modulation edge, therefore the number of pulses generated is dependent on modulation frequency (and PixelScope pulse settings). To give a meaningful comparison of pixel amplitude across different frequency measurements, the PixelScope pulse settings are adjusted to ensure a consistent number of pulses (and hence illumination energy) in each integration period.
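One way to keep the pulse count constant is to trigger the laser only on every k-th modulation cycle. The helper below is a minimal sketch of choosing such a skip factor; it assumes this behaviour of the pulse settings and is not the actual PixelScope API.

    def pulse_skip_factor(f_mod_hz: float, t_int_s: float, target_pulses: int) -> int:
        """Trigger the laser on every k-th modulation cycle so that roughly
        target_pulses pulses land in one integration period."""
        cycles = f_mod_hz * t_int_s          # modulation cycles per integration
        return max(1, round(cycles / target_pulses))

    # Example: 40 MHz modulation, 1 ms integration, ~10 000 pulses wanted
    # -> skip factor of 4 (one pulse every fourth cycle).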

Both the phase shift of the pixel’s optical response and the change in amplitude with modulation frequency are visible in the figure below. The higher the modulation frequency, the further the waveform shifts to the right and the more its peak-to-peak amplitude reduces. As the number of pulses is kept consistent across all modulation frequencies, the number of arriving photons should be the same, so the common mode value (A+B) should be the same for all frequencies. This implies that the observed relative change in fundamental amplitude has an impact on the effective modulation depth of the pixel (as discussed below).

Not only is the amplitude changing with frequency, but so is the waveform shape. By taking the Fourier transform (FFT) of each waveform, the relative change of the harmonics can be observed, as shown below. The third harmonic is of interest as it can introduce distortion in the measured phase (distance), when four phase steps are used. As expected, when the modulation frequency is increased, the amplitude of the harmonics decreases — at 40 MHz the 5th harmonic is visible, while at 60 MHz it is within the noise floor.

The pixel response function changing with modulation frequency. The waveform reduces in amplitude (becoming less effective at moving electrons into the correct tap) and changes shape, losing power in the higher harmonics, as shown in the FFT plot.

Effective Modulation Depth

The effective modulation depth is a measure of how effective the pixel is at moving electrons into the correct pixel tap. If an electron is moved into the wrong tap it counts towards the background light level and is subtracted out because of the differential nature of the measurement. In the case of square wave modulation, the first harmonic counts towards the signal amplitude, while the higher-order harmonics are either dropped or aliased, which adds interference.

The effective modulation depth of a pixel is the amplitude of the first harmonic divided by the common mode measurement. The common mode is A+B, which measures the total number of photons; ideally, every photon that is converted into an electron-hole pair is used in the measurement.

The effective modulation depth at 30 MHz was measured for an OPT8241 sensor and is plotted below. The common mode is measured as the mean (over time) of (A-0) - (0-B) = A+B. The measurements shown below indicate that pixels become more efficient the closer they are to the edge of the array.

The spatial pattern of the effective modulation depth (AC contrast) over the pixel array. Pixels in the center of the array respond differently to pixels around the edge of the array. This information can drive design changes in the next generation of sensor development.
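A per-pixel map such as the one above can be computed from two PixelScope sweeps. The sketch below assumes a particular array layout and is not Chronoptics code; it divides the fundamental amplitude of the differential waveform by the mean common-mode level, following the definition given above.

    import numpy as np

    def modulation_depth_map(diff_sweep: np.ndarray,
                             common_sweep: np.ndarray) -> np.ndarray:
        """Effective modulation depth for every pixel.

        diff_sweep:   (n_delays, rows, cols) sampled A-B response, one period
        common_sweep: (n_delays, rows, cols) sampled (A-0) - (0-B) = A+B response
        """
        n = diff_sweep.shape[0]
        fundamental = 2.0 * np.abs(np.fft.rfft(diff_sweep, axis=0)[1]) / n
        common_mode = common_sweep.mean(axis=0)   # mean over time = A+B level
        return fundamental / common_mode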

Duty Cycle Offset

Changing the integration duty cycle of the sensor can introduce deviations from the desired waveform. Ideally the pixel duty cycle is 50%, and the PixelScope is a tool to measure the actual pixel duty cycle. The figure below was obtained using the PixelScope with a time-of-flight camera operating at 50 MHz while adjusting the pixel duty cycle. The DC (constant) offset of the A-B tap caused by a non-50% duty cycle is noticeable.

The measured pixel response with different sensor duty cycle settings. The PixelScope can verify the correct 50% duty cycle is being used.
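The duty cycle itself can be estimated from the measured waveform. The sketch below assumes the delay sweep spans exactly one period of the A-B response; it simply counts the fraction of delay steps above the waveform’s mid-level and reports the DC offset.

    import numpy as np

    def duty_cycle_and_offset(waveform: np.ndarray) -> tuple[float, float]:
        """Estimate duty cycle and DC offset of a single-period A-B waveform."""
        mid = 0.5 * (waveform.max() + waveform.min())
        duty_cycle = float(np.mean(waveform > mid))   # fraction of period "high"
        dc_offset = float(waveform.mean())            # non-zero if duty != 50%
        return duty_cycle, dc_offset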

Pixel Phase Offset and Temperature

For an accurate time-of-flight calibration a fixed phase offset per pixel is required. This phase offset can be broken into two values: a global value and a per-pixel value. The global offset comes from the propagation delay between the sensor and the light source, and from other light source properties. The per-pixel offset comes from internal propagation delays in the sensor and other pixel-to-pixel variations. The PixelScope can measure the per-pixel phase offset.

The PixelScope can also measure how the pixel phase measurement changes with temperature: the camera under test is placed in a temperature-controlled oven, with the PixelScope outside and the fibre optic cable fed into the oven. This way the laser waveform stays the same while the sensor temperature changes, decoupling the illumination from the sensor.

Applying this calibration method at wafer level is under development.

An example of a phase measurement using the PixelScope for an OPT8241 sensor operating at 40 MHz is shown below.

The phase offset differences between pixels at 40 MHz. This measurement setup can be used to measure temperature effects or validate calibration methods.
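A per-pixel phase map such as the one above can be derived from the phase of the fundamental FFT bin of each pixel’s sampled waveform. The sketch below assumes a particular array layout and an arbitrary reference pixel; it is not the PixelScope’s own calibration code.

    import numpy as np

    def phase_offset_map(sweep: np.ndarray, ref_pixel=(0, 0)) -> np.ndarray:
        """Per-pixel phase offset from a PixelScope sweep.

        sweep: (n_delays, rows, cols) sampled responses over one modulation
               period. Returns a (rows, cols) map of phase offsets relative to
               ref_pixel, in radians, wrapped to [-pi, pi).
        """
        fundamental = np.fft.rfft(sweep, axis=0)[1]        # complex, per pixel
        phase = np.angle(fundamental)
        offset = phase - phase[ref_pixel]
        return (offset + np.pi) % (2.0 * np.pi) - np.pi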

Other Measurements

Other areas not explored in this white paper include measuring rise and fall times of the pixel response, investigating cross-talk issues on the pixel clocking tree, and investigating sensor noise characteristics.

Conclusion

In this blog we have described the output data from the Chronoptics PixelScope and provided examples of how this data can be used to characterise and calibrate an indirect time-of-flight pixel array. The modulation contrast, amplitude linearity and frequency response can be characterised, and the per-pixel phase delay calibrated.

The Chronoptics PixelScope is a powerful tool for gaining a deeper understanding of the inner workings of time-of-flight pixel arrays. To learn more, contact us at hello@chronoptics.com.

References

[1] Wiedemann, M., Sauer, M., Driewer, F., & Schilling, K. (2008). Analysis and characterization of the PMD camera for application in mobile robotics. IFAC Proceedings Volumes, 41(2), 13689–13694.
