Digital Twin of the News — AI-enhanced news and EO data just a few clicks away
Automated fusion of different types of data to provide up-to-date information on extreme natural events
A digital twin, a virtual representation of a physical entity, aims to mimic the behavior of its real-world counterpart in a digital environment. In the context of the Digital Twin of the News (DToN), the entity is the Earth, and the focus is on extreme natural events such as wildfires, volcanic eruptions, floods, droughts, or air pollution events.
As such events are likely to become more frequent and intense in the coming years due to climate change, there is a high demand for a quick and easy way to explore them. The DToN provides such a way by combining Earth Observation (EO) data with event-related news articles. This combination provides a great entry point for journalists, policymakers, analysts, conservation organizations, and the interested public to quickly access high-quality background material.
Considering that a few years ago finding and processing event-related satellite imagery meant at least a couple of hours of work and required a good understanding of remote sensing data, offering this out of the box in an automated way is a big step forward.
Finding the needle in the haystack of news
To begin, a natural event (disaster) occurs somewhere in the world and several news outlets publish articles about it. Many of these articles are scattered across the internet, but how do you find them in a never-ending stream of news and link them to the same event? That’s where Event Registry, the world’s leading news intelligence platform, comes in. Every day they crawl hundreds of thousands of news pages and extract “events” using natural language processing (NLP) models. Each event is tagged with a summary, metadata, and a list of related articles. This data can either be used directly by the DToN (summary, related articles, news sources) or as input for further processing steps (rough location, date, keywords) to find the most relevant satellite imagery.
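To make this first step more concrete, below is a minimal sketch of how such events could be pulled from Event Registry using its Python SDK. The API key, concept lookup, date range, and accessed fields are illustrative assumptions, not the actual DToN query.

```python
# Minimal sketch: querying Event Registry for wildfire events.
# Assumptions: your own API key, an illustrative concept and date range.
from eventregistry import EventRegistry, QueryEventsIter

er = EventRegistry(apiKey="YOUR_API_KEY")

query = QueryEventsIter(
    conceptUri=er.getConceptUri("wildfire"),
    dateStart="2024-07-01",
    dateEnd="2024-07-31",
)

for event in query.execQuery(er, sortBy="size", maxItems=5):
    # A pipeline like DToN would extract the summary, rough location,
    # date, and keywords from fields such as these for further processing.
    print(event["uri"], event.get("eventDate"), event.get("title", {}).get("eng"))
```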
What we need is the exact location and a date
In order to represent the event in the application, we need to link it to the satellite data. For this, two pieces of information are essential — the (exact) location and the date. Since latitude and longitude are not a common part of a news article, we need to include some behind-the-scenes steps in the process, which vary depending on the event type — where an event takes place means something different for a volcanic eruption than for a flood or a drought.
In today’s world, it is almost mandatory that fancy machine learning buzzwords are used to solve the above problem — Deep Learning or even generative adversarial networks (GANs) and the like. However, in our team we always try to find the simplest solution to a challenge, not necessarily the fanciest, and we only bring the heavy AI machinery into play when necessary.
Get the best wildfire visualizations related to an event from a rough location and event date
When a news article talks about “2 separate wind-driven brush fires erupt near Ventura”, a human immediately understands that the fire did not happen in Ventura, but only close to it. An algorithm, however, might return Ventura as the location, since it is mentioned in every article. Even if the fire is not in Ventura, it is safe to assume that it is within a few hundred kilometers of the location mentioned. This means that we can take Ventura and the event date retrieved from the news articles as input to pinpoint a more accurate location and date of the event using satellite imagery.
1. Finding the exact event date
To find the correct event date for a wildfire event we use the European Forest Fire Information System (EFFIS). EFFIS is part of the Copernicus Emergency Management Service and provides, among other things, a daily updated active fire layer with a resolution of 375 m from VIIRS (more about VIIRS active fire detection here). To detect the exact event date, we count the active fire pixels for each day in the 30 days before the date mentioned in the news article and set the day with the most recent largest increase as the event date.
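The date-selection logic itself is simple. The sketch below assumes the daily active fire pixel counts for the 30 days before the reported date have already been extracted (e.g. from the EFFIS VIIRS layer); it then picks the latest day with the largest day-over-day jump.

```python
# Minimal sketch of the event date detection, assuming daily active fire
# pixel counts are already available for the days before the reported date.
from datetime import date, timedelta

def estimate_event_date(reported: date, daily_fire_pixels: list[int]) -> date:
    """daily_fire_pixels[i] is the active fire pixel count (n - i) days before `reported`."""
    n = len(daily_fire_pixels)
    days = [reported - timedelta(days=n - i) for i in range(n)]
    # Day-over-day increases in active fire pixel counts.
    increases = [daily_fire_pixels[i] - daily_fire_pixels[i - 1] for i in range(1, n)]
    if not increases or max(increases) <= 0:
        return reported  # no clear jump detected, fall back to the reported date
    best = max(increases)
    # "Most recent largest increase": the latest day reaching the maximum jump.
    idx = max(i for i, inc in enumerate(increases) if inc == best)
    return days[idx + 1]

# Example with synthetic counts: one clear jump 11 days before the article date.
counts = [0] * 18 + [5, 140, 160, 150] + [90] * 8
print(estimate_event_date(date(2024, 7, 31), counts))  # -> 2024-07-20
```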
2. Pinpointing the exact location and creating a bounding box
To find the exact location of the fire, the processing pipeline accumulates all fire pixels in the ten days following the exact date (left image below). Small isolated pixels are then removed and only the largest cluster is kept (middle image below). Finally, the bounding box is created to cover the detected cluster (right image below).
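A minimal sketch of this clustering step is shown below. It assumes the accumulated fire pixels from the ten-day window are available as a boolean raster; scipy is used here as one possible tool for connected-component labeling, and small isolated pixels are discarded implicitly by keeping only the largest cluster.

```python
# Minimal sketch: keep the largest cluster of fire pixels and compute its bounding box.
import numpy as np
from scipy import ndimage

def largest_cluster_bbox(fire_mask: np.ndarray):
    """Return the bounding box of the largest connected fire-pixel cluster
    as (row_min, row_max, col_min, col_max) in pixel coordinates."""
    labeled, n_clusters = ndimage.label(fire_mask)
    if n_clusters == 0:
        return None
    sizes = np.bincount(labeled.ravel())[1:]  # pixel count per cluster label
    largest = int(np.argmax(sizes)) + 1       # labels start at 1, 0 is background
    rows, cols = np.where(labeled == largest)
    return rows.min(), rows.max(), cols.min(), cols.max()

mask = np.zeros((8, 8), dtype=bool)
mask[1, 6] = True                  # an isolated pixel that gets ignored
mask[3:6, 2:5] = True              # the main fire cluster
print(largest_cluster_bbox(mask))  # -> (3, 5, 2, 4)
```

In the real pipeline the pixel bounding box would still have to be mapped back to geographic coordinates via the raster's geotransform before it can be used to request imagery.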
3. Finding the best tile(s) for the visualizations
For each event, we determine the best visualization dates before and during/after the event based on a combination of tile coverage in the area, cloud coverage, and fire pixels. The visualizations selected to represent wildfires are all based on Sentinel-2 bands and are retrieved via the Sentinel Hub Processing API.
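Once a date and bounding box are fixed, retrieving the imagery boils down to a Processing API request. The sketch below uses the sentinelhub Python package with an illustrative SWIR false-colour evalscript, bounding box, and time interval; the scoring of tile coverage, cloud coverage, and fire pixels, as well as the exact band combinations used by DToN, are not shown here.

```python
# Minimal sketch of a Sentinel Hub Processing API request for a wildfire scene.
# The bounding box, time interval, and evalscript are illustrative assumptions.
from sentinelhub import (SHConfig, SentinelHubRequest, DataCollection,
                         MimeType, BBox, CRS, bbox_to_dimensions)

config = SHConfig()  # expects Sentinel Hub OAuth credentials to be configured

# Bounding box from the clustering step (WGS84 lon/lat), illustrative values.
bbox = BBox([-119.35, 34.25, -119.10, 34.45], crs=CRS.WGS84)
size = bbox_to_dimensions(bbox, resolution=20)

# A short-wave infrared false-colour composite (B12, B8A, B04) that makes
# active fires and burn scars stand out.
evalscript = """
//VERSION=3
function setup() {
  return { input: ["B12", "B8A", "B04"], output: { bands: 3 } };
}
function evaluatePixel(s) {
  return [2.5 * s.B12, 2.5 * s.B8A, 2.5 * s.B04];
}
"""

request = SentinelHubRequest(
    evalscript=evalscript,
    input_data=[
        SentinelHubRequest.input_data(
            data_collection=DataCollection.SENTINEL2_L2A,
            time_interval=("2024-07-18", "2024-07-25"),
            mosaicking_order="leastCC",  # prefer the least cloudy acquisition
        )
    ],
    responses=[SentinelHubRequest.output_response("default", MimeType.PNG)],
    bbox=bbox,
    size=size,
    config=config,
)
image = request.get_data()[0]
```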
Once a new event has passed through the detection algorithm described above (which varies depending on the event type), it is stored with all the newly derived information and is ready to be displayed in the app. For the user, the DToN app is the centerpiece of the entire process. It provides easy and direct access to all event-related news articles as well as the relevant satellite imagery.
The DToN is a great example of how data from multiple sources (e.g. the background of the globe is a 120 m global mosaic created from 18 months of Sentinel-2 data, see the blog post below), analyzed with artificial intelligence and automatically linked, can provide added value to journalists, policymakers, analysts, conservation organizations, and the interested public.
Explore the Digital Twin of News for yourself!
The DToN is currently still in beta, but available for you to explore. Go to our Digital Twin of the News app and give it a try. Use the event type filter to view only a specific type of event (currently available: wildfires, volcanoes, floods, droughts and air pollution) or adjust the time range to explore events from the past. Once you find an interesting event, select it directly on the globe or in the list view to learn more about it and get the full picture.
The DToN project is funded by the European Space Agency (ESA) and is part of their Digital Twin Earth concept.
If you are an Earth Observation enthusiast and want to help us develop applications that raise awareness of events affecting our planet, contact us; we are hiring!