Satellite images help detect areas in need after a disaster and identify slums
The mission of the FDL Program Europe is to enable AI research for space exploration & all humankind.
We gathered at the British Interplanetary Society, housed in a simple, handsome three-storey Victorian town house. The relatively small event room was packed.
It started with Belina Raffy, one of the organizers, asking all of us to call out, at the same time, any 1) colour, 2) number, or 3) emotion that came to mind. Afterwards, she asked whether we could discern any pattern in the answers, and we all said no. That, she explained, is why we need data science: to help us find patterns and rules in the huge mass (or mess) of data coming from satellites and from space in general.
The director of the FDL program, James Parr, joined online and introduced the program in more detail. After the original FDL, which started in Silicon Valley, this one in London marked the first year of the FDL Europe program. He also discussed how creativity takes courage.
Dr Emily Shuckburgh (British Antarctic Survey and Univ. of Cambridge) showed a number of graphs depicting global changes on the planet over the years: from the growth of population and the economy to the rise in average global temperatures, extreme weather events and the decrease in sea-ice extent. Finally, she talked about a number of scientific studies tackling the question of whether what we are seeing is the result of anthropogenic climate change, and a large majority of those studies found a positive answer to it.
The main message, as depicted in the figure, was that we now have large amounts of data from space, significant advances in data science and artificial intelligence (AI), and growing demand from businesses and policymakers. Together, these create the opportunity to track and monitor the environment, the people and the ecosystems of the planet. Today this can be done by governments, space agencies, and new commercial entrants (startups) in the field.
At the end of the presentation came the question: given such a big goal and challenge, creating a mission control for planet Earth, what are the priorities? The answer was a bit surprising to me: because it is such a complex set of challenges, we need AI to solve them, and we can start by tackling any one of the particular challenges, as the resulting AI tools will likely be transferable and applicable to the other challenges and domains as well.
Then we heard from Iarla Kilbane-Dawe (the director of Φ-lab at ESA). He introduced the newly founded ESA lab as an answer to the challenge of the huge amounts of data from space that we now constantly receive; Φ-lab aims to help make sense of them. He talked about what a 21st-century space agency should look like. Moreover, he showed how the number of newly launched Earth Observation satellites keeps growing thanks to the many startups entering the field. Among others, he mentioned ICEYE, the startup where I worked prior to joining Bell Labs.
This presentation was also a good introduction to the upcoming team presentations, as Iarla described the open satellite data available from ESA's Copernicus programme. The satellites of the Sentinel family image the entire Earth surface roughly every five days, using both optical and radar sensors. All of these data are publicly available for anyone to use, making this ESA programme unique and the first of its kind in the world.
Finally, the two FDL teams, who had spent the last eight weeks working on their challenges, presented the outcomes of their projects. The teams consisted of PhD students and postdocs.
Team Disaster Relief
This team focused on three particular disaster events for which satellite data were also available:
- Haiti affected by the 2010 earthquake
- Houston, where hurricane Harvey made landfall in 2017
- Ecuador affected by the 2016 earthquake
Satellite data can be used mainly to support first responders during disaster events. Two questions are crucial: where are people located, and what is the level of damage? Responders often lack timely data to answer them, and the team looked at how satellite data can help provide it.
The UN international disaster charter mechanism directs satellite acquisitions towards the affected areas, enabling more timely satellite images. In addition to the open Sentinel-1 (radar) and Sentinel-2 (optical) data, the team also used very high-resolution (VHR) imagery provided by DigitalGlobe. Sentinel images have a resolution of between 10 m and 20 m, while the DigitalGlobe images have a resolution of 0.5 m.
The team had a different task to tackle for each disaster. For example, the 2010 Haiti earthquake caused a deficit of adequate and affordable housing that persists to this day, so it is relevant to discover and map existing settlements of displaced people. For Houston, after the landfall, the task is to detect flooded areas and damaged buildings. In the case of Ecuador, after the earthquake, what is needed is a mapping of collapsed and damaged buildings.
For each of these cases, the team decided to train a deep learning model to detect relevant objects and changes in the images. Standard neural networks for classification would not suffice for this task, given that a prediction is needed not per image, but per pixel. The computer vision term for this task is semantic segmentation. In recent years, there have been significant advances in deep learning models for semantic segmentation based on deep convolutional neural networks (CNNs). The team, in particular, used a PSPNet encoder-decoder architecture for all of their cases.
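The difference between whole-image classification and semantic segmentation is easiest to see in the shapes of the outputs. The following minimal numpy sketch (with random numbers standing in for a real network's scores, and arbitrary toy dimensions) shows that a classifier yields one class per image, while a segmentation model yields one class per pixel:

```python
import numpy as np

rng = np.random.default_rng(0)
num_classes, H, W = 3, 4, 4  # toy sizes, e.g. classes: building / flood / background

# Whole-image classification: one score vector per image.
image_logits = rng.normal(size=(num_classes,))
image_label = image_logits.argmax()       # a single class index for the image

# Semantic segmentation: one score vector per pixel.
pixel_logits = rng.normal(size=(num_classes, H, W))
label_map = pixel_logits.argmax(axis=0)   # one class index per pixel

print(label_map.shape)  # (4, 4): a full label map, not a single label
```

An encoder-decoder architecture such as PSPNet produces exactly this kind of per-pixel score volume, downsampling the image in the encoder and upsampling back to full resolution in the decoder.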
Such a supervised model requires labelled (ground truth) data. For mapping buildings, the team used OpenStreetMap building footprints. For damage, they used mappings derived from social network data, as reported by the people. Interestingly, the model pre-trained on building footprints in Ecuador could be successfully applied to detect settlements in Haiti. This showcases the power of transfer learning in AI.
The models are assessed using the intersection over union (IoU) score, which ranged between 0.69 and 0.78 across the different datasets.
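For binary masks, IoU is simply the number of pixels marked positive by both the prediction and the ground truth, divided by the number of pixels marked positive by either. A minimal sketch with two toy masks:

```python
import numpy as np

def iou_score(pred, target):
    """Intersection over union (Jaccard index) for binary segmentation masks."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    return intersection / union if union > 0 else 1.0  # empty masks agree trivially

pred   = np.array([[1, 1, 0],
                   [0, 1, 0]])
target = np.array([[1, 0, 0],
                   [0, 1, 1]])
print(iou_score(pred, target))  # 2 pixels in common / 4 pixels in union = 0.5
```

A score of 1.0 means the predicted mask matches the ground truth exactly, so the 0.69-0.78 range above indicates substantial, though not perfect, overlap.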
Team Informal Settlement Detection
Informal settlement is the name used for slums, poor neighbourhoods and any areas lacking adequate housing and facilities. Nowadays, over half of the world's population lives in such informal settlements.
This team focused on several places in the world for which they had ground truth data about the informal settlements present there, such as Nairobi, Kenya, and Mumbai, India. They, too, used both Sentinel and VHR data from DigitalGlobe.
Once again, the task at hand is semantic segmentation: classify the pixels in each image according to whether or not they correspond to informal settlements.
The ground truth data for this project came from Afrometer, from labels annotated by an FDL partner, Catapult, and from governments, such as India's, which try to understand and improve the conditions in their slums.
Interestingly, this team achieved very good results with a "traditional" machine learning approach, the spectral angle mapper; however, once more training data became available, a deep learning model (DeepLabv3+) outperformed it.
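The spectral angle mapper compares the multi-band spectrum of each pixel to a reference spectrum of the target class: the smaller the angle between the two vectors, the closer the match, regardless of overall brightness. A minimal numpy sketch, with entirely hypothetical 4-band reflectance values:

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Angle in radians between a pixel spectrum and a reference spectrum."""
    cos = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos, -1.0, 1.0))  # clip guards against float round-off

# Hypothetical reference spectrum for the target class (4 spectral bands).
reference = np.array([0.30, 0.25, 0.20, 0.40])

pixel_a = np.array([0.60, 0.50, 0.40, 0.80])  # same spectral shape, just brighter
pixel_b = np.array([0.05, 0.40, 0.60, 0.10])  # different spectral shape

print(spectral_angle(pixel_a, reference))  # ~0: a match despite the brightness change
print(spectral_angle(pixel_b, reference))  # noticeably larger: not a match
```

Classification then amounts to thresholding the angle per pixel, which needs only one reference spectrum per class rather than the large labelled datasets a deep model requires; this is why the approach was competitive while training data were scarce.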
The event finished with drinks and some informal time. Those of us who were interested could also join a small tour of the British Interplanetary Society building, given by its former president Alistair Scott. Besides the large space library, we could see on one of the walls a large collection of photos of all the previous presidents, including Arthur C. Clarke.