Going beyond the visible spectrum at SatSummit III

Published in SatSummit · Sep 5, 2018

By Charlie Loyd

At this year’s SatSummit, we’re tackling topics beyond what the human eye can see. As we cover familiar global challenges like disaster management, environmental monitoring, and large-scale food security, we’ll also draw on some of the lesser-known but highly useful datasets out there. This means going past the human-visible part of the electromagnetic spectrum.

The Mekong Delta seen in a composite of SAR observations from the Sentinel-1 satellites.

Seeing More than Red, Green, and Blue is a panel on synthetic aperture radar, hyperspectral imagery, and other unusual — but increasingly important — information beyond what our eyes can see. We’ll be discussing the state of the art, but also how people can start using this strange data in practical ways.

Synthetic aperture radar (SAR) provides research and operational data to users in dozens of fields: seismologists, arms control analysts, glaciologists, quants, disaster mappers, ecologists, and fisheries officials. Space-based SAR started with a shock: when Seasat launched in 1978, the military was alarmed to discover that the satellite could see the incredibly faint wakes of submerged submarines. A few years later, civilian survey SAR also kicked off with a moment of astonishment: the SIR-A instrument (flown on the second shuttle launch) unexpectedly saw dry river channels buried under the Saharan sand, showing that water had flowed through the desert only a few thousand years ago. Radar has kept its ability to surprise, with new applications being developed all the time, from flood mapping to oil slick detection to structural engineering. And yet, because the data has been hard to get in useful quantities until recently, and takes training to interpret, SAR is missing from the toolkits of many remote sensing experts.

Hyperspectral imaging covers a more familiar part of the electromagnetic spectrum — visible and infrared light — but slices it much thinner. Instead of breaking visible light into red, green, and blue, for example, a hyperspectral sensor might split it into 10 or more bands. Hyperspectral has always been interesting to scientists, who have used it for purposes like fine-grained geological and ecological mapping, and even archaeology, but it’s rarely been put to practical use. For one thing, hyperspectral sensors have typically been borderline experimental devices, producing noisy and unwieldy data. Plus, working with high-dimensional information is always a challenge; conventional remote sensing and computer science still have a lot to learn from statistics. Truly practical hyperspectral data, and the algorithms to make it manageable, have only just begun to appear.
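To make the dimensionality problem concrete, here is a minimal sketch of one standard first step: compressing many spectral bands into a few summary components with PCA. This is plain NumPy on a synthetic cube; the image size, band count, and random data are all made up for illustration, not from any real sensor.

```python
import numpy as np

# Synthetic "hyperspectral cube": 64x64 pixels, 50 spectral bands
# (real scenes can carry dozens to hundreds of bands).
rng = np.random.default_rng(0)
height, width, bands = 64, 64, 50
cube = rng.normal(size=(height, width, bands))

# Flatten the spatial dimensions so each pixel is one 50-dimensional sample.
pixels = cube.reshape(-1, bands)

# PCA via SVD: center the samples, then project onto the top components.
centered = pixels - pixels.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
n_components = 3
reduced = centered @ vt[:n_components].T

# Back to an image shape: 3 "channels" summarizing 50 bands.
summary = reduced.reshape(height, width, n_components)
print(summary.shape)  # (64, 64, 3)
```

A three-component projection like this is a common way to get a first visualizable, RGB-like view of a hyperspectral scene before any task-specific analysis.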

To put it bluntly, SAR and hyperspectral datasets are weird. Most of us never see them. They have undeniably been hard to source, hard to understand, and hard to work with. Yet they are now far more available — from both open and commercial sources — than ever before. From a technical perspective, these kinds of data are completely ready to contribute much more to nonprofit, business, governance, and scientific applications. What was missing was a broad-based community of users and value-adding businesses able to make them as easy to understand and use as “normal” RGB data already is.

That motivated, creative, and methodologically diverse community is now appearing. This panel is about the potential they are unlocking in this “weird” data. We’ve called on a cast of stars to talk about what’s most exciting and interesting to them in the field. Alphabetically:

  • Yotam Ariel, CEO of Bluefield, a startup at the state of the art in detecting methane leaks with hyperspectral data,
  • Julie Baker, co-founder and VP of operations of Ursa, an advanced SAR analytics startup with a focus on energy, and
  • Dr. Lola Fatoyinbo, at NASA Goddard, whose cutting-edge research on carbon stores and forest structure uses SAR heavily.

I’ll be moderating the session — please come to the panel on Wednesday, September 19 at 4:00pm in Hemisphere A! If you still need a ticket for the event, head over to our registration page.

(I work on imagery R&D at Mapbox and am a big fan of weird data.)
