Watching the world burn
Machine Learning and Satellite imagery for wildfire detection
Wildfires are a serious problem across the US West, threatening homes, businesses, and the natural landscape that attracts so many people to live in and visit the region. Back in 2000, a catastrophic wildfire almost destroyed my home and place of work in Los Alamos, NM. As a young post-doc at Los Alamos National Laboratory, I became a refugee for what felt like a very long and nerve-wracking week, an experience that sparked my career-long interest in developing machine learning tools to map the world's fires. As this year's fire season develops, new satellite, cloud-computing, and machine-learning technologies are revolutionizing our ability to understand fire at global scale.
We didn’t start the fire (except when we did)
Each year, millions of acres burn, and the trend over the last 30 years is toward larger and more intense fires. The US Forest Service estimates that 90% of these fires are caused by humans. While here in the US we have the resources to fight most wildland fires, fires in remote locations might not be discovered for days and are sometimes allowed to just burn out naturally, based on firefighter safety and cost factors. Climate studies show that wildland fires are a significant contributor to greenhouse gas emissions, estimated at 50% of human industrial emissions, and scientists are working to better understand the contribution of wildland fire to global climate change.
How do we map all the fires around the globe, both in real time and historically? And how do we forecast the likelihood of future fires? The technology of fire detection has evolved from deploying people in strategically located watch towers (1900–1950), to manned spotter aircraft (1950–present), to the use of satellites (1970–present). Satellites are the key to consistent global-scale fire detection, though current sensors only detect fires that have reached a substantial size, and are therefore less useful than ground and aircraft systems for early detection.
US and European Union satellites (NASA/USGS Landsat, NASA Terra/Aqua, ESA Sentinel-3) carry thermal sensors that can detect the heat of wildfires from orbit. This enables daily mapping of wildfires globally, except where clouds mask the surface. The smoke plumes produced by fires are also distinctive from above, as wildland fires loft tons of ash into the atmosphere.
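The core idea behind orbital fire detection is simple: actively burning fires are far hotter than their surroundings, and emit strongly in the mid-infrared (near 4 µm). Operational products such as the MODIS active-fire algorithm use contextual tests over brightness temperatures; as a much-simplified, hedged sketch (the thresholds below are illustrative, not the operational values), the heart of such a detector looks like this:

```python
import numpy as np

def detect_thermal_anomalies(bt_4um, bt_11um,
                             abs_threshold=330.0, diff_threshold=25.0):
    """Flag pixels whose 4 um brightness temperature looks anomalously hot.

    bt_4um, bt_11um : 2-D arrays of brightness temperature in Kelvin
                      (mid-IR and thermal-IR channels, respectively).
    Returns a boolean mask of candidate fire pixels.
    """
    # A pixel is a fire candidate if it is absolutely hot in the mid-IR,
    # or much hotter at 4 um than at 11 um -- fires emit disproportionately
    # near 4 um, while sun-warmed ground heats both channels together.
    absolute = bt_4um > abs_threshold
    relative = (bt_4um - bt_11um) > diff_threshold
    return absolute | relative
```

Real detectors add contextual tests against neighboring pixels, sun-glint rejection, and cloud masking, which is why cloudy scenes remain a gap in daily coverage.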
The scars left by wildfires are also clearly visible from space, and provide a way to map the slow recovery of an ecosystem following a severe fire. Determining what was lost in a fire is critical, as many forests are not well or recently mapped. Drought, insect infestations, and wildfire are all linked, and detecting one type of disaster can help forecast other disasters to come. Mapping not just tree species but tree health is also important for understanding the risks to nearby residential zones and other human infrastructure (e.g., power lines, oil and gas pipelines, or roads/rail-lines).
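Burn scars stand out from space because fire darkens the near-infrared reflectance of vegetation while brightening the shortwave infrared. A standard way to exploit this is the Normalized Burn Ratio (NBR) and its pre-/post-fire difference (dNBR), which is widely used to map burn extent and severity. A minimal sketch, assuming the NIR and SWIR bands are already calibrated to reflectance:

```python
import numpy as np

def nbr(nir, swir):
    """Normalized Burn Ratio: high over healthy vegetation, low over burns."""
    # Small epsilon guards against divide-by-zero on dark pixels.
    return (nir - swir) / (nir + swir + 1e-12)

def dnbr(nir_pre, swir_pre, nir_post, swir_post):
    """Differenced NBR: larger values indicate more severe burning."""
    return nbr(nir_pre, swir_pre) - nbr(nir_post, swir_post)
```

Applied to imagery before and after a fire, dNBR maps not just where the fire burned but how severely, which feeds directly into the recovery monitoring described above.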
Next Generation Global Wildland Fire Monitoring
Back in 2000, I was a young post-doc in the Space and Remote Sensing Sciences Group at Los Alamos National Laboratory when that human-caused catastrophic wildfire hit and I suddenly became a refugee. Volunteering to join the fire analysis team in the immediate aftermath of the fire, I helped develop a genetic-algorithm-based machine learning approach to teaching computers to map wildland fire burn scars (e.g., Brumby et al., SPIE 2002). Back then, simply getting satellite imagery from my colleagues at NASA was an ordeal, requiring somebody at the satellite data center to cut a ~400MB Landsat data file to a CD-ROM and physically mail it, or else endure internet bandwidth 1000 times slower than today's (we did both).
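The published approach evolved whole image-processing pipelines; as a much-simplified toy illustration of the genetic-algorithm idea (not the actual Los Alamos algorithm), one can evolve a linear spectral classifier against analyst-labeled burn-scar pixels, keeping the fittest candidates each generation and mutating them:

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(weights, pixels, labels):
    """Fraction of labeled pixels a candidate classifier gets right.

    pixels : (n_pixels, n_bands) spectral values
    labels : boolean array, True where an analyst marked "burn scar"
    weights: per-band weights plus a trailing bias term
    """
    preds = pixels @ weights[:-1] + weights[-1] > 0
    return np.mean(preds == labels)

def evolve(pixels, labels, pop_size=40, generations=60, mut=0.1):
    """Toy GA: keep the best half each generation, mutate to refill."""
    n_bands = pixels.shape[1]
    pop = rng.normal(size=(pop_size, n_bands + 1))
    for _ in range(generations):
        scores = np.array([fitness(w, pixels, labels) for w in pop])
        parents = pop[np.argsort(scores)[-pop_size // 2:]]   # survival of the fittest
        children = parents + rng.normal(scale=mut, size=parents.shape)  # mutation
        pop = np.vstack([parents, children])
    scores = np.array([fitness(w, pixels, labels) for w in pop])
    return pop[np.argmax(scores)]
```

The appeal of this family of methods in 2000 was that an analyst only had to paint a few example regions, and the algorithm searched for a classifier that reproduced the labels, with no hand-tuned spectral thresholds.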
A lot has changed in the last decade. At Descartes Labs we've built a platform for analyzing all the world's public and commercial satellite imagery, using machine learning in the commercial cloud. Thanks to Google and Amazon hosting the NASA/USGS, NOAA, and ESA satellite imagery archives, we can process orders of magnitude more data in parallel on virtual machines that live "near" the data, and the visualization results can be shared near-instantly over the web using social media channels like YouTube (founded 2005). This enables scientists to analyze all the available imagery, across satellite constellations, in a way that we only dreamed about in 2000. This increase in analysis capacity is critical, as the volume of new satellite imagery is growing dramatically thanks to new commercial sources like the Planet cubesat constellation, whose imagery can also be used to detect wildfire burn scars.
For the results shown below, we processed more than a hundred times the data used in our weeks-long original work on the Cerro Grande fire (Brumby et al., SPIE 2002), and finished this new analysis in approximately an hour using the Google Cloud Platform. The video shows over a decade of catastrophic wildland fires, seen as holes opening up in the green forest. We can also see the annual snowpack (pink regions) and agriculture (yellow).
With new machine learning algorithms and cheap, on-demand computing at a scale that previously required a national laboratory, we are on the edge of a new era in global monitoring, for historical analysis, real-time alerting, and forecasting the future. This new capability will have a great impact on our understanding of fire as a component of the Earth system, and help others avoid becoming refugees from wildland fire.