Managing Relevant Information in the Aftermath of Natural Disasters: Launching PLJ’s Latest Data Analytics Platform
Responding to natural disasters effectively is vital for saving lives and limiting their impact. While each natural disaster comes with its unique conditions, two challenges generally exist: critical information about the situation on the ground tends to be scattered among responders, and valuable time is often lost in the immediacy of these events as authorities prepare response strategies. Pulse Lab Jakarta recently launched its latest research prototype — an automated, open source platform that integrates multiple non-traditional data sets to aid logistics planning and information management following natural disasters. Ms. Ursula Mueller, Assistant Secretary-General for Humanitarian Affairs and Deputy Emergency Relief Coordinator in the United Nations Office for the Coordination of Humanitarian Affairs (UN OCHA), was the keynote speaker at the event, which was attended by a wide audience across the development and humanitarian sectors.
During its early development and prototyping, the platform was called DisasterMon (a portmanteau of the words disaster and monitoring). However, as the prototype was refined and new features were added, its focus evolved toward managing information to better inform response following natural disasters. Inspired by how the human mind processes information, the platform was rebranded and given the name MIND (which stands for managing information for natural disasters, namely cyclones, earthquakes, tsunamis, floods, volcanic eruptions and wildfires).
The Information Gaps
The platform is built on an automated data pipeline, allowing it to stream and analyse several non-traditional data sets all in one place. This data pipeline is triggered based on disaster alerts received from the Global Disaster Alert and Coordination System (GDACS), which is a global system aimed at closing information and coordination gaps.
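To make the trigger step concrete, here is a minimal sketch of how an alert-driven pipeline might decide to start streaming data. The feed snippet, field names and the set of triggering alert levels below are simplified, hypothetical stand-ins, not the actual GDACS schema or the platform's configuration.

```python
# Minimal sketch: parse a GDACS-style alert feed and decide whether to
# start the downstream data pipeline. The XML below is an invented,
# simplified stand-in for a real alert entry.
import xml.etree.ElementTree as ET

SAMPLE_FEED = """
<rss><channel>
  <item>
    <title>Earthquake alert</title>
    <eventtype>EQ</eventtype>
    <alertlevel>Orange</alertlevel>
    <country>Indonesia</country>
  </item>
</channel></rss>
"""

# Alert levels that should start the automated pipeline (an assumption).
TRIGGER_LEVELS = {"Orange", "Red"}

def parse_alerts(feed_xml):
    """Extract the event type, alert level and country from each feed item."""
    root = ET.fromstring(feed_xml)
    alerts = []
    for item in root.iter("item"):
        alerts.append({
            "event_type": item.findtext("eventtype"),
            "alert_level": item.findtext("alertlevel"),
            "country": item.findtext("country"),
        })
    return alerts

def should_trigger(alert):
    """Trigger the data pipeline only for higher alert levels."""
    return alert["alert_level"] in TRIGGER_LEVELS

alerts = parse_alerts(SAMPLE_FEED)
triggered = [a for a in alerts if should_trigger(a)]
print(triggered)
```

In a deployed pipeline the same check would run on a schedule against the live feed, with each triggering alert kicking off the data collection described in the sections that follow.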
While speaking at the launch, Ms. Mueller underscored that “Governments, disaster response authorities, NGOs as well as national and international agencies need useful, accurate and up to date data that can be accessed easily and fast. Yet getting accurate data on who are the people that need humanitarian assistance; how many there are; what kind of assistance they need is still one of the biggest challenges faced by the humanitarian system.”
Though information that can inform disaster authorities and citizens generally begins to emerge right after a disaster hits, it tends to be managed independently by different responders, and accessing it comes with its own challenges. As an open source platform, MIND is designed to address these challenges by publicly providing stakeholders with timely insights on affected areas and the needs of communities, among other things.
Twitter: near real-time updates
The platform analyses Twitter data, in particular by examining public geotagged tweets that fall within a disaster’s geographical parameters, known as a bounding box. The focus is on the location and words of each tweet. Users are able to explore the content of the tweets and get a sense of the number of related tweets and popular keywords. This information is neatly visualised within the platform to enable further analysis, for instance what messages can be inferred from popular keywords, and what the aggregate timestamps and locations of the tweets suggest.
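The bounding-box filter and keyword tally described above can be sketched as follows. The tweet records, field names and coordinates here are invented for illustration and do not reflect the actual Twitter API schema used by the platform.

```python
# Sketch: keep only geotagged tweets inside a disaster bounding box,
# then tally keywords across them. All records below are made up.
from collections import Counter

# (min_lon, min_lat, max_lon, max_lat) around a hypothetical disaster area
BBOX = (104.0, -9.0, 115.0, -5.0)

tweets = [
    {"text": "flood waters rising near the river", "lon": 106.8, "lat": -6.2},
    {"text": "roads blocked need rescue boats", "lon": 107.6, "lat": -6.9},
    {"text": "sunny day at the beach", "lon": 151.2, "lat": -33.9},  # outside bbox
]

def in_bbox(tweet, bbox):
    """True if the tweet's coordinates fall within the bounding box."""
    min_lon, min_lat, max_lon, max_lat = bbox
    return (min_lon <= tweet["lon"] <= max_lon
            and min_lat <= tweet["lat"] <= max_lat)

relevant = [t for t in tweets if in_bbox(t, BBOX)]

# Tally words across the tweets inside the bounding box.
counts = Counter(word for t in relevant for word in t["text"].lower().split())
print(counts.most_common(3))
```

In practice a stop-word list and language handling would sit on top of the raw tally before keywords are surfaced to users.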
OpenStreetMap and OpenRouteService: logistics planning
The platform integrates a pair of APIs from OpenStreetMap and OpenRouteService (an open source route planner developed by the Heidelberg Institute for Geoinformation Technology at Heidelberg University) to help identify suitable routes for the transportation of aid and resources. This feature provides information on strategic Points of Interest (PoI) such as schools, hospitals, government buildings, airports and ports, which are important for exploring options for origin and destination points. With this information, users (especially responders who are unfamiliar with the topography of an area) can plan their journey, including determining the type of transportation to use based on the road surface and the amount of fuel needed based on estimated travel distance. The feature also details the duration of travel, road type and elevation level.
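As a back-of-the-envelope illustration of the distance and fuel estimates mentioned above: in the platform these figures come from the OpenRouteService routing API, but the sketch below approximates them with a straight-line haversine distance, and the fuel consumption rate is an assumed figure, not a value from the platform.

```python
# Sketch: estimate travel distance between two points and the fuel needed.
# Real routing distances come from OpenRouteService; the haversine formula
# here is only a straight-line approximation for illustration.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two coordinates, in kilometres."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def fuel_needed_litres(distance_km, km_per_litre=8.0):
    """Estimate fuel for a trip; km_per_litre is an assumed vehicle figure."""
    return distance_km / km_per_litre

# Hypothetical origin (an airport) and destination (a hospital).
dist = haversine_km(-6.13, 106.66, -6.90, 107.62)
print(round(dist), "km,", round(fuel_needed_litres(dist), 1), "litres")
```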
Wikipedia: basic factsheet
Humanitarian fact sheets normally exist as tables and forms, and tend to reflect a specific organisation’s themes and sectoral focus. They are a useful tool for informing responders about basics such as population size, population density, age groups, religion and infrastructure within the affected locale. Recognising that Wikipedia is a popular go-to site for many internet users seeking general information in a simple summary format, the platform adopts its style. Beyond its purpose as a fact sheet, this feature demonstrates the possibility of gathering useful background information from a variety of sources through an API in a fully automated process.
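One way such a summary can be gathered automatically is through Wikipedia’s public REST summary endpoint. The sketch below builds the request URL and reduces a trimmed, invented sample payload to factsheet fields; the real feature may use different sources and fields.

```python
# Sketch: build a Wikipedia REST summary request and reduce its JSON
# payload to factsheet fields. The sample payload is an invented,
# trimmed-down example of what the endpoint returns.
import json
from urllib.parse import quote

def summary_url(title):
    """Build the Wikipedia REST API summary URL for a page title."""
    return "https://en.wikipedia.org/api/rest_v1/page/summary/" + quote(title)

SAMPLE_RESPONSE = json.dumps({
    "title": "Palu",
    "description": "City in Central Sulawesi, Indonesia",
    "extract": "Palu is the capital city of the Indonesian province "
               "of Central Sulawesi.",
})

def factsheet(payload_json):
    """Keep only the summary fields a basic factsheet would display."""
    data = json.loads(payload_json)
    return {k: data.get(k) for k in ("title", "description", "extract")}

print(summary_url("Palu"))
print(factsheet(SAMPLE_RESPONSE))
```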
News API: casualty count
In reports on catastrophic events, we observed that news sources are inclined to describe severity in terms of the number of casualties. This feature of the platform incorporates a text processing algorithm that extracts news articles related to a given disaster from a credible news API; at a minimum, an article must match the type of disaster that has occurred and the country affected. From the set of articles identified, the algorithm then extracts figures that estimate the number of casualties (words such as victims, fatalities and deaths are used as proxies). Articles are analysed on a daily basis and the number of casualties is plotted on a graph to monitor changes over time.
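The proxy-word extraction step can be sketched with a simple pattern match: scan article text for numbers that appear next to words such as “victims”, “fatalities” and “deaths”. The platform’s actual algorithm is more involved; the headlines below are made up for illustration.

```python
# Sketch: pull casualty figures out of article text by matching numbers
# that sit next to proxy words. The articles here are invented examples.
import re

PROXIES = r"(?:victims?|fatalit(?:y|ies)|deaths?|dead|killed)"
# Match phrasings such as "132 deaths" or "47 victims".
PATTERN = re.compile(r"(\d[\d,]*)\s+" + PROXIES, re.IGNORECASE)

articles = [
    "Officials confirmed 132 deaths after the earthquake struck the coast.",
    "Rescue teams report 47 victims pulled from the rubble so far.",
    "The storm caused widespread damage to crops and roads.",
]

def casualty_figures(text):
    """Return all casualty counts found in a piece of text."""
    return [int(m.replace(",", "")) for m in PATTERN.findall(text)]

daily_counts = [casualty_figures(a) for a in articles]
print(daily_counts)  # one list of figures per article
```

Aggregating the daily maxima of such figures over time yields the kind of trend line the platform plots.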
Google Trends: search results
Google searches from within impacted communities can help authorities understand the concerns and needs of affected citizens. The Google Trends feature integrated in the platform gives a sense of what topics people are talking about and the type of content they are searching for. This is an important communication tool that can help government disaster authorities better contextualise the information they share with the public throughout the disaster response phase. As seen in the screen capture below, the feature shows overall interest in a particular search topic over time, rising topics and queries, as well as statistics on related topics.
Open Invitation for User Testing
We’re grateful to our colleagues from the development and humanitarian sectors who attended the launch, including representatives from UNOCHA, AHA Centre and the Australian Department of Foreign Affairs and Trade.
The platform is intended to be used by various stakeholders and has a customised layer that allows users to export and visualise their own data set (acceptable in GeoJSON and Marker formats). We were also delighted to collaborate with Humanitarian Data Exchange, whose data set (similar to UN OCHA’s 3W products) was used to illustrate the benefits of the customised layer. As seen in the example below, the layer visualises data related to the International Aid Transparency Initiative, showing details about past and ongoing projects within affected areas and the implementing partners involved.
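For users preparing their own data set for the customised layer, flat records can be converted into the standard GeoJSON FeatureCollection structure. The project records below are invented; only the GeoJSON envelope itself follows the specification.

```python
# Sketch: convert flat project records into a GeoJSON FeatureCollection
# of Point features, the format the customised layer accepts. The records
# below are made-up examples.
import json

projects = [
    {"name": "Water point rehabilitation", "partner": "NGO A",
     "lon": 119.87, "lat": -0.90},
    {"name": "Temporary shelter programme", "partner": "NGO B",
     "lon": 119.85, "lat": -0.95},
]

def to_feature_collection(records):
    """Wrap each record as a GeoJSON Point feature with its attributes."""
    features = [
        {
            "type": "Feature",
            "geometry": {"type": "Point",
                         "coordinates": [r["lon"], r["lat"]]},
            "properties": {"name": r["name"], "partner": r["partner"]},
        }
        for r in records
    ]
    return {"type": "FeatureCollection", "features": features}

layer = to_feature_collection(projects)
print(json.dumps(layer, indent=2)[:120])
```

Note that GeoJSON coordinates are ordered longitude first, latitude second, which is a common source of plotting mistakes.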
The platform is designed to complement existing disaster response tools and can be modified to meet the specific needs of an organisation. We’re putting the final touches to the platform, after which it’ll be publicly available. We encourage stakeholders across different sectors to participate in its user testing and share their feedback with us at email@example.com.
Pulse Lab Jakarta is grateful for the generous support from the Government of Australia.