What’s False About True Color

Robert Simmon
Published in Planet Stories · Jun 7, 2016 · 6 min read

I’ll start with a quiz: is this picture real—is it what an astronaut would see from space?

How about this picture, seemingly from a similar vantage point, but including the far side of the Moon?

Or this view of the Pacific Ocean?

In a sense, they’re all real, even though they were collected in different ways. The first is a photograph taken by the crew of Apollo 11 on their way to the Moon. The second is from the Deep Space Climate Observatory (DSCOVR), at its vantage point 1,000,000 miles from Earth. The third is a day’s worth of data from NASA’s MODIS instrument, wrapped around a sphere and rendered with a simulated atmosphere.

But what makes a realistic picture? After all, our eyes and brain are reconstructing a sharp, full-color, three-dimensional environment from a tiny amount of information.

This illustration simulates the information sent to your brain by the retina. Visual acuity is highest in the fovea, which covers a patch of your field of view roughly the size of your thumbnail held at arm’s length. Away from that tiny spot, sharpness and color perception fall off.

Our brains infer the appearance of surfaces based on assumptions about the relative positions of lights, objects, and the shadows they cast. For example, look at the checkered pattern above. We can tell square A is black, even though it is brightly lit, and square B is white, even though it is in deep shadow—right?

Perhaps not.

Both squares are exactly the same color. The eyes and brain adjust perceived color and light based on the surroundings—they are not making precise measurements.

In addition to localized adjustments in perception, the visual system also adapts to global differences. For example, a white piece of paper will look white if we view it under orange candlelight or bluish light from an LCD screen. Photographs and other images need to be corrected accordingly—a process called white balancing. The photo above, of the International Space Station and southern South America, was processed to correct for 5 different color temperatures, from 2800K to 9800K. The center strip matches the color temperature of the Sun: 5800K.
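In code terms, white balancing amounts to a per-channel gain chosen so that something known to be neutral comes out gray. Here’s a minimal sketch in Python with numpy; the function name, the neutral-reference approach, and the example values are illustrative assumptions, not the exact correction applied to the photo above.

```python
import numpy as np

def white_balance(rgb, neutral_ref):
    """Scale each channel so a known neutral surface averages out to gray.

    rgb         -- float array, shape (H, W, 3), values in [0, 1]
    neutral_ref -- (r, g, b) sampled from something that should be neutral,
                   e.g. a sheet of white paper (an assumed reference; the
                   article doesn't specify one)
    """
    neutral = np.asarray(neutral_ref, dtype=float)
    gains = neutral.mean() / neutral          # per-channel gain, von Kries-style
    return np.clip(rgb * gains, 0.0, 1.0)

# Under warm candlelight, white paper might read (0.9, 0.6, 0.3);
# the gains pull it back toward a neutral gray:
# balanced = white_balance(img, neutral_ref=(0.9, 0.6, 0.3))
```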

Satellite images need to be processed to account for these and other features (quirks?) of our vision. This is the raw version of a Planet Labs image, showing fields in the Egyptian desert.

I adjusted the white balance of this version of the image to account for the color imparted by the atmosphere, and the brightness to correct for the non-linear response of our eyes—both global corrections. (If you’re curious, I’ve written a description of my workflow.) It’s improved, but lacks sharpness in the fields, and bright areas are washed out.
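The brightness half of that correction can be approximated with a gamma curve: raw sensor values are roughly linear in light, while our perception of brightness is not. This is a hedged sketch assuming a simple power law; the exact curve used for this image isn’t stated.

```python
import numpy as np

def gamma_correct(linear, gamma=2.2):
    """Map linear sensor values to a perceptually spaced image.

    Satellite counts are (roughly) proportional to radiance, but human
    brightness perception is non-linear, so a power curve lifts the
    midtones. gamma=2.2 approximates the sRGB display convention; it is
    an assumed value, not the one used for the Egypt image.
    """
    linear = np.clip(np.asarray(linear, dtype=float), 0.0, 1.0)
    return linear ** (1.0 / gamma)
```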

An additional set of local adjustments brings out details and emphasizes structure in the desert sands. These processed images are more true to how we see, and convey more information to a viewer, than the raw quantitative data provided by scientific instruments.
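The specific local technique isn’t named here, so as one plausible stand-in, here’s an unsharp mask: split the image into a blurred base and local detail, then add a fraction of the detail back. A sketch in Python, assuming scipy’s gaussian_filter and made-up parameter values.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def local_contrast(rgb, radius=25, amount=0.5):
    """Boost local detail with an unsharp mask.

    rgb    -- float array, shape (H, W, 3), values in [0, 1]
    radius -- blur radius in pixels; sets the scale of 'local' structure
    amount -- how strongly the recovered detail is added back
    """
    # Blur spatially only (sigma of 0 on the channel axis).
    blurred = gaussian_filter(rgb, sigma=(radius, radius, 0))
    detail = rgb - blurred                    # edges of fields, dune ridges, etc.
    return np.clip(rgb + amount * detail, 0.0, 1.0)
```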

Similar processing techniques can help with the interpretation of otherwise abstract data. Compare this nighttime view of Italy taken from the ISS…

… with 9 months of city lights data, merged with a map of the Earth’s surface tinted blue to simulate night—a technique that dates back to the silent film era. It’s data visualization that reads like a photograph.
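One plausible way to build that kind of composite is to darken and blue-tint the daytime basemap, then screen the city-lights layer over it. The tint and blend values below are guesses at the “day for night” look, not the published recipe.

```python
import numpy as np

def simulated_night(daytime_rgb, lights):
    """Composite city-lights data over a blue-tinted daytime basemap.

    daytime_rgb -- (H, W, 3) daytime surface map, values in [0, 1]
    lights      -- (H, W) city-lights intensity, values in [0, 1]
    """
    night_tint = np.array([0.05, 0.15, 0.35])             # darken, shift toward blue
    base = daytime_rgb * night_tint                        # "day for night" base layer
    glow = lights[..., None] * np.array([1.0, 0.9, 0.7])   # slightly warm lights
    # Screen blend: the lights brighten the base without hard clipping.
    return 1.0 - (1.0 - base) * (1.0 - np.clip(glow, 0.0, 1.0))
```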

Sometimes, a realistic image is too detailed, like this Planet Labs snapshot of Utah’s Canyonlands. Complex topography, a silt-filled river, and unfamiliar lighting (sunlight is coming from the lower right) make the landscape hard to interpret.

This National Park Service map uses abstraction to enhance readability. It’s less realistic—shadows are idealized, texture and color are computer-generated, and river water is a flat blue—but it’s easier to see the relationships between high mesas and deep canyons.

True color views can be limited. This red, green, and blue Landsat 8 image of Alaska shows green and brown boreal vegetation, silt-filled rivers, and spreading smoke from a wildfire.

The same area, shown with shortwave infrared, near infrared, and green light, reveals subtleties in the vegetation and clearly differentiates land from water. The infrared light even penetrates smoke, showing the advancing flames beneath.
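On Landsat 8’s OLI instrument, that combination maps shortwave infrared, near infrared, and green (bands 6, 5, and 3) to the red, green, and blue channels of the display. A minimal sketch with rasterio and numpy; the file names are hypothetical placeholders and the percentile stretch is just one reasonable choice.

```python
import numpy as np
import rasterio

# Hypothetical single-band GeoTIFFs for one Landsat 8 scene:
# band 6 = shortwave infrared 1, band 5 = near infrared, band 3 = green.
BAND_FILES = ["LC08_B6.TIF", "LC08_B5.TIF", "LC08_B3.TIF"]

def false_color_composite(paths=BAND_FILES):
    """Stack SWIR, NIR, and green into the red, green, and blue channels."""
    channels = []
    for path in paths:
        with rasterio.open(path) as src:
            band = src.read(1).astype(float)
        lo, hi = np.percentile(band, (2, 98))      # simple contrast stretch
        channels.append(np.clip((band - lo) / (hi - lo), 0.0, 1.0))
    return np.dstack(channels)                      # (H, W, 3) display image
```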

We can also look outward into space, instead of inward towards Earth. This is what our eyes would see looking through a telescope at the constellation of Orion, without further enhancement: dozens of stars against a black background.

But with a long exposure and special filters, the dark dust and glowing gases of the Horsehead Nebula appear.

Scientific imagery and data visualizations rarely match what we would see with the naked eye, which is limited by our physiology. The best visualizations—even visualizations of the invisible—work within those constraints to reveal hidden truths.

Further Reading

Credits

  1. NASA Astronaut Photograph AS11-36-5339
  2. NASA/NOAA DSCOVR
  3. NASA • Robert Simmon & Marit Jentoft-Nilsen
  4. ibid
  5. Ben Bogart & Stig Nygaard, CC BY-SA
  6. Edward H. Adelson
  7. ibid
  8. NASA Astronaut Photograph ISS047-E-131811
  9. Planet Labs CC BY-SA
  10. ibid
  11. ibid
  12. NASA Astronaut Photograph ISS047-E-125942
  13. NASA/NOAA/DoD • Imhoff, Elvidge, Mayhew, and Simmon
  14. Planet Labs CC BY-SA
  15. National Park Service • Tom Patterson
  16. NASA/USGS Landsat 8 • Robert Simmon
  17. NASA/USGS Landsat 8 • Robert Simmon
  18. National Optical Astronomy Observatory • Travis Rector
  19. ibid
