Less than Picture Perfect

William L. Weaver
Published in TL;DR Innovation
5 min read · Feb 2, 2018

Digital Image Capture

As an experimentalist, I always experience a certain amount of envy toward the breathtakingly beautiful color images produced by the computational fluid dynamics (CFD) folks. As a programmer, I can appreciate the level of talent, effort, and sheer volume of code behind each frame of CFD animation. In a healthy symbiotic environment, experimental data is modeled numerically and initial CFD relationships are formulated. The CFD code is used to forecast the system state at different conditions and additional experimental measurements at these conditions are obtained in an effort to validate the code. The resulting expert system evolves the ability to test new systems virtually, thereby obviating the need to construct expensive prototypes and subject them to actual (often destructive) conditions. The code can also predict experimental pitfalls at these new conditions and experimentalists can use this information in the development of measurement systems capable of dealing with the potential interferences. It is a win-win situation.

Photo by Skitterphoto on Pixabay

The source of my envy is not any fear of experimental obsolescence. Unlike the CFD folks, we experimentalists will always have to be aware of how images are perceived by both the lay and scientific communities. “Seeing is believing” and “…worth a thousand words” are so ingrained in our collective psyche that quantitative data presented as an image is often perceived to be completely accurate and free of uncertainty. As all practitioners of data acquisition know, any reported quantitative value is unacceptable unless it is presented along with its associated level of uncertainty. As the development of quantitative data visualization continues, increasing numbers of quantitative images are appearing in the literature. Most often, the author chooses not to mention the level of uncertainty associated with the values, or there is a blanket statement of uncertainty in the text, such as “image values are known to ±5%”. The first case suggests the author does not know the level of uncertainty in the values. In the second case, the mention of uncertainty suggests the experimental visualization technique is inferior to the CFD images, which are perceived to be “error free”. As analytical scientists, we must apply the same uncertainty analysis to images as we do to all other point measurements.

The workhorse of quantitative data visualization is the charge-coupled device (CCD). Like its one-dimensional cousin, the photodiode, each element of the CCD (a pixel) converts the incident light intensity into an electrical signal that can be fed into a data acquisition system. Both photodiode and CCD pixels have responses that vary as a function of intensity (linearity) and color of the incident light (wavelength response, also known as quantum efficiency or QE). Photodiode output is proportional to the intensity of the incident light and tracks that intensity continuously. A CCD pixel differs in that it is an integrating device: it is emptied, fills with electronic charge at a rate proportional to the intensity of the incident light, and is emptied again after the amount of charge collected is measured and reported.
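To make that distinction concrete, here is a minimal sketch, assuming a steady light source and a responsivity and quantum efficiency chosen purely for illustration, that models a photodiode as a continuous readout and a CCD pixel as a charge bucket read out once per exposure.

```python
def photodiode_output(intensity, responsivity=0.5):
    """Instantaneous signal proportional to the incident light intensity."""
    return responsivity * intensity

def ccd_pixel_counts(intensity, exposure_s, quantum_efficiency=0.6):
    """Charge (in electrons) integrated over one exposure, reported at readout."""
    photons = intensity * exposure_s      # photons arriving during the exposure
    return quantum_efficiency * photons   # fraction converted to stored charge

# A steady source delivering 1,000 photons per second:
print(photodiode_output(1000.0))      # signal available at any instant
print(ccd_pixel_counts(1000.0, 0.1))  # one integrated value per 0.1 s frame
```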

As with all data acquisition systems, both photodiodes and CCDs must have their responses calibrated before they are useful. Since the CCD is most often advertised as a single device, the fact that it is a dense, two-dimensional array of discrete optical transducers can be overlooked during calibration. Each of the pixels in the CCD array has a slightly different response to the intensity and color of incident light. The Eastman Kodak Company, manufacturer of a very popular and varied line of CCD sensors, refers to this pixel-by-pixel variation as CCD “photoresponse nonuniformity” or PRNU. Values of PRNU can vary from 1–3% for the highest-grade CCDs and be as high as 6–15% for mid-grade sensors. The PRNU is a measure of variation in pixel QE that arises during the manufacturing process. Additional CCD parameters such as dark current and readout noise vary among CCD-system integrators, but are often much smaller than the PRNU.
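As a rough illustration of what a PRNU figure means, the sketch below, which is my own back-of-the-envelope estimate and not Kodak's specification procedure, estimates PRNU from a stack of frames taken under uniform illumination: the temporal noise is averaged away, and the remaining pixel-to-pixel spread is expressed as a fraction of the mean signal. The frame count, signal level, and noise values are invented for the example.

```python
import numpy as np

def estimate_prnu(frames):
    """Estimate photoresponse nonuniformity from a stack of flat-field frames.

    frames: array of shape (n_frames, rows, cols) taken under uniform light.
    Returns PRNU as a fraction (e.g. 0.02 for 2%).
    """
    per_pixel_mean = frames.mean(axis=0)        # averages away temporal noise
    spatial_std = per_pixel_mean.std()          # pixel-to-pixel spread that remains
    return spatial_std / per_pixel_mean.mean()  # normalized to the mean signal

# Synthetic example: a 100 x 100 sensor with a 2% fixed-pattern response,
# exposed at 5,000 counts with 20 counts of temporal noise per frame.
rng = np.random.default_rng(0)
true_response = 1.0 + 0.02 * rng.standard_normal((100, 100))
stack = 5000.0 * true_response + 20.0 * rng.standard_normal((64, 100, 100))
print(f"Estimated PRNU: {estimate_prnu(stack):.1%}")
```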

The paramount step in CCD calibration, used to minimize the effect of PRNU, is known as flat-field normalization. When “flat-fielding” a CCD, a well-controlled, featureless (flat) field of light is placed in front of the CCD. The flat-field source is commonly an integrating sphere, a device that provides the same intensity of light to each pixel in the CCD array. A flat-field image is acquired and is normalized by dividing each pixel intensity value by the average pixel intensity in the image. The resulting pixel-by-pixel values in the normalized image represent the individual photoresponse differences of each pixel. During normal CCD image acquisition, the pixel intensity values in every image should be divided by the normalized flat-field image to correct for PRNU. This correction lowers the uncertainty to the level of the readout-noise contribution. As CCD-system integrators continue to lower dark current and readout noise, CCD uncertainty will ultimately bottom out at the level of unavoidable shot noise.
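The flat-fielding recipe above maps directly onto a few lines of array arithmetic. The following sketch assumes dark-frame subtraction has already been done and uses synthetic data with made-up numbers; it shows only the normalization and division steps, not a complete calibration pipeline.

```python
import numpy as np

def flat_field_correct(raw_image, flat_field_image):
    """Correct a raw image for PRNU using a flat-field exposure.

    Both inputs are 2-D arrays of the same shape, assumed already dark-subtracted.
    """
    # Normalize the flat field by its average pixel intensity, so a
    # perfectly uniform pixel ends up with a value of 1.0.
    normalized_flat = flat_field_image / flat_field_image.mean()

    # Divide every science image, pixel by pixel, by the normalized flat.
    return raw_image / normalized_flat

# Synthetic example: a 1k x 1k sensor with 3% photoresponse nonuniformity.
rng = np.random.default_rng(1)
pixel_response = 1.0 + 0.03 * rng.standard_normal((1024, 1024))
flat = 10000.0 * pixel_response                   # integrating-sphere exposure
scene = rng.uniform(100.0, 4000.0, (1024, 1024))  # "true" scene intensities
raw = scene * pixel_response                      # what the CCD actually records

corrected = flat_field_correct(raw, flat)
residual = np.abs(corrected - scene) / scene      # tiny, set by the flat's average
print(f"Max relative residual after correction: {residual.max():.2e}")
```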

There is definitely a need for a universal standard for representing uncertainty in images. The standard “error bar” does not lend itself readily to images. Perhaps a version of error bars can be incorporated into the intensity scale or legend that accompanies each image. Or maybe uncertainty values in images could be communicated visually by adding varying amounts of blur. Blurry, indistinct regions in an image would effectively impart the idea of uncertainty to the most casual observers. In any case, even with uncertainty, quantitative visualization is extremely valuable. With a standard 1k x 1k CCD, those 1,048,576 pixels are worth much more than a “thousand” words.
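As a sketch of the blur idea, and nothing more than a sketch, one could blend each pixel with a heavily blurred copy of the image, weighting the blend by a per-pixel uncertainty map so that uncertain regions go soft while well-determined regions stay sharp. The uncertainty map, the maximum blur width, and the test pattern below are all invented for illustration.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def blur_by_uncertainty(image, uncertainty, max_sigma=5.0):
    """Render per-pixel uncertainty as local blur.

    image:       2-D array of measured values
    uncertainty: 2-D array of relative uncertainties, scaled to [0, 1]
    Uncertainty 0 stays sharp; uncertainty 1 is fully blurred.
    """
    blurred = gaussian_filter(image, sigma=max_sigma)  # heavily smoothed copy
    weight = np.clip(uncertainty, 0.0, 1.0)
    return (1.0 - weight) * image + weight * blurred   # per-pixel blend

# Synthetic example: a test pattern whose right half is far less certain.
y, x = np.mgrid[0:256, 0:256]
image = np.sin(x / 8.0) * np.cos(y / 8.0)
uncertainty = np.where(x > 128, 0.8, 0.05)
display = blur_by_uncertainty(image, uncertainty)  # ready to show with a plot library
```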

This material originally appeared as a Contributed Editorial in Scientific Computing and Instrumentation 17:9 August 2000, pg. 14.

William L. Weaver is an Associate Professor in the Department of Integrated Science, Business, and Technology at La Salle University in Philadelphia, PA, USA. He holds a B.S. degree with double majors in Chemistry and Physics and earned his Ph.D. in Analytical Chemistry with expertise in ultrafast laser spectroscopy. He teaches, writes, and speaks on the application of Systems Thinking to the development of new products and innovation.

