Exploring optical imagery — a spectral encounter

Valerie Pasquarella
Geospatial Processing at Scale
Jul 30, 2021

“Light brings us the news of the Universe. Coming to us from the sun and the stars it tells us of their existence, their positions, their movements, their constitutions and many other matters of interest. Coming to us from objects that surround us more nearly it enables us to see our way about the world: we enjoy the forms and colours that it reveals to us, we use it in the exchange of information and thought.”

— Sir William Bragg, The Universe of Light (1933)

Measurements of reflected light are a fundamental part of our ability to obtain information about the world around us. Our vision is colored by natural and artificial light sources and their interactions with materials. Reflection, absorption and scattering of light precede a chain of visual processing that makes the sky appear blue, the grass green, and the deepest ocean unfathomably dark.

“Hypothesis III: The Sensation of different Colours depends on the different frequency of Vibrations, excited by Light in the Retina”

— Thomas Young, On the Theory of Light and Colours (1801)

While human eyes are only able to perceive light with wavelengths between about 380 and 750 nanometers, cameras and other optical instruments can be equipped with sensors that allow us to measure reflectance over a much broader range of the electromagnetic spectrum. With the assistance of special lenses and detectors, we can unveil the ultraviolet wavelengths that are naturally visible to some insects, mammals, and birds, as well as longer infrared wavelengths that give us “night vision” and thermal imaging.

Source: https://commons.wikimedia.org/wiki/File:Electromagnetic_spectrum_-eng.svg

In the case of many common Earth-observing satellite systems, reflectance measurements across a range of wavelength intervals, or “bands”, are recorded as multispectral images that provide critical information on the status and dynamics of Earth’s land surface, biosphere, atmosphere, and oceans. For example, the Landsat series of satellites, operated by NASA and the USGS, and the European Space Agency’s Sentinel-2 satellites both collect multispectral imagery with bands spanning the visible, near-infrared, and shortwave infrared regions of the electromagnetic spectrum.

Source: https://www.usgs.gov/media/images/comparison-landsat-7-and-8-bands-sentinel-2
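
If you’d like to poke at these collections programmatically, here is a minimal sketch using the Earth Engine Python API (the `ee` package) that lists the band names of a Landsat 8 and a Sentinel-2 TOA image. The collection IDs match the ones noted in the update notes at the end of this post; everything else is illustrative, not the apps’ actual code:

```python
# Minimal sketch: list the bands of the two TOA collections discussed above.
# Assumes the Earth Engine Python API is installed and authenticated
# (`pip install earthengine-api`, then `earthengine authenticate`).
import ee

ee.Initialize()

landsat8 = ee.ImageCollection('LANDSAT/LC08/C02/T1_TOA').first()
sentinel2 = ee.ImageCollection('COPERNICUS/S2_HARMONIZED').first()

print('Landsat 8 bands: ', landsat8.bandNames().getInfo())
print('Sentinel-2 bands:', sentinel2.bandNames().getInfo())
```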

Satellite images like those collected by Landsat and Sentinel-2 are typically displayed using an additive three-channel RGB color model, in which reflectance values are assigned to red, green, and blue channels and added together to produce a broader array of colors. When an RGB image is created from bands peaking at approximately 650 nm (red), 550 nm (green), and 450 nm (blue), the resulting “true color” image will look very familiar and resemble a standard color photo. If you’ve ever explored Google Earth or used a satellite basemap, you’ve encountered true color satellite imagery.

Example of a True Color RGB composite for a Sentinel-2 image of Boston, MA, USA
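
As a rough sketch of the band-to-channel mapping described above (again, not the apps’ internals), the Earth Engine Python API can build the same kind of true color composite. The Boston coordinates, date window, and 0.3-reflectance stretch are arbitrary example choices:

```python
# Sketch: assemble a true color (red/green/blue -> R/G/B) Sentinel-2 composite.
import ee

ee.Initialize()

boston = ee.Geometry.Point(-71.06, 42.36)  # example location
image = (ee.ImageCollection('COPERNICUS/S2_HARMONIZED')
         .filterBounds(boston)
         .filterDate('2021-07-01', '2021-08-01')
         .first())

# B4/B3/B2 are Sentinel-2's red/green/blue bands. Values in this collection
# are reflectance x 10000, so max=3000 stretches 0-0.3 reflectance to 0-255.
true_color = image.visualize(bands=['B4', 'B3', 'B2'], min=0, max=3000)
print(true_color.getThumbURL({'region': boston.buffer(5000), 'dimensions': 512}))
```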

While true color imagery best mimics human perception of color, assigning measurements from outside the visible spectrum to the standard RGB channels creates new color combinations that allow us to see beyond the limits of human vision. So-called “false-color” images can be used to emphasize particular surface properties. For example, photosynthetic vegetation tends to be very reflective in near-infrared (NIR) wavelengths due to the structural properties of leaves. If we create an image where the NIR band is assigned to the Red channel, the red band is assigned to the Green channel, and the green band is assigned to the Blue channel, we get a whole new view of the world in which vegetated areas appear in bright red tones.

Example of a False Color RGB composite for a Sentinel-2 image of Boston, MA, USA
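
The false-color version only requires swapping the band order in the same sketch: NIR (B8) into the Red channel, red (B4) into Green, and green (B3) into Blue. The location and stretch values are, as before, example assumptions:

```python
# Sketch: NIR -> Red, red -> Green, green -> Blue, so vegetation renders bright red.
import ee

ee.Initialize()

boston = ee.Geometry.Point(-71.06, 42.36)  # example location
image = (ee.ImageCollection('COPERNICUS/S2_HARMONIZED')
         .filterBounds(boston)
         .filterDate('2021-07-01', '2021-08-01')
         .first())

false_color = image.visualize(bands=['B8', 'B4', 'B3'], min=0, max=3000)
print(false_color.getThumbURL({'region': boston.buffer(5000), 'dimensions': 512}))
```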

Looking at reflectance as a function of wavelength, it becomes even more apparent that different types of surfaces and materials have distinct spectral profiles, or “signatures”. These signatures can be used to group and label, or “classify”, spectrally similar observations, or to relate spectral properties to other biophysical attributes.

Image credit: SEOS Project https://seos-project.eu/classification/classification-c01-p05.html

The Spectral Encounter apps are a set of interactive tools developed to help remote sensing students, researchers, and other spectrally curious individuals explore all the spectral dimension has to offer. These apps were built using the Google Earth Engine platform, which facilitates unprecedented ease of interaction with Earth observation image archives. There are currently two Spectral Encounter apps, one for Landsat 8 and one for Sentinel-2, though the general interface could easily be extended to other image collections.

To begin your encounter, just open the link to the Landsat 8 or Sentinel-2 version of the app…

Sentinel-2 Spectral Encounter: https://valeriepasquarella.users.earthengine.app/view/spectral-encounter-s2

The map interface will automatically begin to load one revisit cycle of imagery: sixteen days for Landsat 8 and five days for the pair of Sentinel-2 satellites. Both apps display Top-of-Atmosphere (TOA) image collections. In other words, these images represent measurements of at-sensor reflectance and may appear hazy or bluish due to the combined signals from both the atmosphere and underlying surfaces. You’ll also notice the images are acquired in long strips (orbits or “paths”) and may be occluded by clouds.
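
Filtering a collection to one revisit cycle is a one-liner in Earth Engine. Here is a hedged sketch in the Python API; the end date is arbitrary, and the apps’ actual compositing logic may differ:

```python
# Sketch: pull one five-day Sentinel-2 revisit cycle and mosaic the strips.
import ee

ee.Initialize()

end = ee.Date('2021-07-30')     # example date
start = end.advance(-5, 'day')  # one revisit cycle for the Sentinel-2 pair

cycle = ee.ImageCollection('COPERNICUS/S2_HARMONIZED').filterDate(start, end)
mosaic = cycle.mosaic()         # clouds, orbit seams, and all

print('Images in this cycle:', cycle.size().getInfo())
```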

The date slider can be used to adjust the dates of imagery being displayed. Click the boxes along the slider to select nearby dates, use the arrows to go forward or back a week, or use the Jump to Date option to pull up a monthly calendar. Landsat 8 TOA imagery dates back to April 2013, while the Sentinel-2 Level-1C TOA collection began in June 2015.

Zoom in to take a closer look, and click to generate spectral signature charts. Each time you click, a point will be added to the map and the corresponding reflectance values will be charted using the same color, allowing you to interactively explore differences in spectral reflectance for different cover types or even pull reflectance signatures from different dates of imagery. If you want to start over, the Clear Chart and Samples button removes the chart and associated points from the map.

Displaying a True Color image and spectral signature chart for clicked points.
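
Clicking a point in the app is roughly equivalent to reducing an image at a geometry. Here is a sketch of the same operation in the Earth Engine Python API; the point, date window, and scale are example values, not the app’s internals:

```python
# Sketch: sample every band at one location -- the data behind a signature chart.
import ee

ee.Initialize()

point = ee.Geometry.Point(-71.06, 42.36)  # stand-in for a clicked location
image = (ee.ImageCollection('COPERNICUS/S2_HARMONIZED')
         .filterBounds(point)
         .filterDate('2021-07-01', '2021-08-01')
         .first())

# Band values at the point; divide by 10000 for reflectance on a 0-1 scale.
signature = image.reduceRegion(
    reducer=ee.Reducer.first(), geometry=point, scale=10).getInfo()
print(signature)
```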

To round out your spectral encounter and put reflectance values for individual pixels in a spatial context, you can use the pull-down menus to change the bands used in the RGB composite and adjust their stretch parameters. A true-color mapping is displayed by default, but there are many other combinations to be explored. It also helps to note differences in the magnitude of reflectance values in the spectral signature charts, since you may want to rescale or “stretch” the maximum differently for different bands.

Displaying a False Color image and spectral signature charts for clicked points.
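
Per-band stretching is easy to experiment with outside the app, too: visualize() accepts a min and max per band, so a bright NIR band can get a wider stretch than a dimmer visible band. Another hedged sketch, with arbitrary band choices and stretch values:

```python
# Sketch: a custom composite (NIR -> R, SWIR -> G, red -> B) with per-band stretches.
import ee

ee.Initialize()

image = (ee.ImageCollection('COPERNICUS/S2_HARMONIZED')
         .filterBounds(ee.Geometry.Point(-71.06, 42.36))  # example location
         .filterDate('2021-07-01', '2021-08-01')
         .first())

custom = image.visualize(bands=['B8', 'B11', 'B4'],
                         min=[0, 0, 0], max=[5000, 4000, 3000])
print(custom.getThumbURL({'region': image.geometry(), 'dimensions': 512}))
```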

Questions you might ask yourself while investigating images and charts:

  • Which surfaces appear most different and in which bands do they most diverge?
  • How do the reflectance values plotted by wavelength connect with the RGB visualization?
  • Why is the reflectance scale measured from 0 to 1.0?
  • Do bright targets like clouds, snow, buildings, deserts and salt flats (check out Utah or Bolivia!) have similar spectral signatures?
  • Which targets are the darkest? Why might that be?
  • How do the available bands differ between Landsat 8 and Sentinel-2? Which instrument has more bands? Which bands correspond to the same wavelengths?
  • What is the most interesting RGB combination you can come up with? What does this combination emphasize in the imagery?

Whether you’re using imagery as input to a state-of-the-art machine learning model or simply looking to explore remote sensing datasets and create beautiful visualizations of our changing planet, building an intuitive understanding of the spectral bands available for a given sensor and the reflectance properties of different surface materials is an essential first step towards mastering the art of working with spectral measurements. 👀 👻 🌈

2022–01–23: Updated Landsat 8 app to use Collection 2 Tier 1 and Real-Time TOA imagery.

2022–05–07: Updated Sentinel-2 app to use the “harmonized” L1C collection that accounts for the change to the 4.00 processing baseline in January 2022.
