Published in Geek Culture

Using Machine Learning to Expedite Humanitarian Action

Creating a Humanitarian Data Ecosystem

Today, more data is produced in a single year than in all of prior human history combined. The humanitarian sector, which once struggled with data scarcity, is now overwhelmed by the amount of data collected for even a single project.

Primary and secondary data, together with media-based sources, generate information on a scale that organizations and analysts cannot handle on their own. Used well, this wealth of data offers clear advantages: analysts gain access to real-time information and can make better-informed decisions.

Secondary Data Review (SDR)

Read: What is secondary data review?

There are quite a few tools available to process humanitarian data today. For instance, quantitative data from survey results is processed and often shared through HDX, a UN OCHA initiative. Most qualitative data, however, is unstructured, so processing and analyzing it requires a significant investment of time and resources, leaving a large share of the data unprocessed or underutilized.

Data Entry and Exploration Platform (DEEP)

Read: What is DEEP?

Following the 2015 Nepal earthquake, Ewan Oglethorpe (Executive Director at Data Friendly Space) landed in Kathmandu to lend a helping hand. Ewan, a data scientist with roots in Silicon Valley, joined the team of crisis responders at the tented camp in the UN country office compound. He realized that the UN’s system for analyzing the available data was rudimentary and could benefit from more advanced technology. During the crisis, Ewan and a small tech team, working in consultation with humanitarian experts and analysts, created the first version of the Data Entry & Exploration Platform (DEEP).

“There are several software solutions available to manage and process qualitative data, including Envivo, Mxeg and DEEP. Both ACAPS and UNHCR are piloting a project in DEEP, a platform specifically developed by and for humanitarian actors to process substantial amounts of unstructured data. Users can upload a variety of sources (news articles, PDFs, Word documents etc.) and tag/categorize them using custom analytical frameworks. Catalogued information can then be exported into Excel or Word for further analysis.”

Source: Pilot — Joint Processing of Qualitative Data on Rohingya Crisis, Humanitarian Response, an OCHA Service (May 2018)
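The workflow described in the quote above — cataloguing excerpts against an analytical framework and exporting them for further analysis — can be sketched very roughly as follows. The field names and example entries are hypothetical, and DEEP itself exports to Excel or Word rather than this simplified CSV; the snippet only illustrates the tag-then-export idea.

```python
import csv
import io

# Hypothetical catalogued entries: each excerpt from an uploaded source is
# tagged against an analytical framework. Field names are illustrative, not
# DEEP's actual schema.
entries = [
    {"source": "news article", "sector": "Health",
     "excerpt": "Clinics report a rise in cholera cases."},
    {"source": "PDF report", "sector": "Shelter",
     "excerpt": "Thousands remain in temporary tents."},
]

# Export the catalogued information for further analysis (CSV stands in for
# DEEP's Excel/Word export here).
buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=["source", "sector", "excerpt"])
writer.writeheader()
writer.writerows(entries)
print(buffer.getvalue())
```

In practice the value of this step is that downstream analysts receive structured rows instead of raw documents, which is what makes joint analysis across many sources tractable.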

Since its inception, DEEP has been used for Secondary Data Review (SDR) and tagging large datasets in more than 1,200 projects, supporting humanitarian responses across all sectors worldwide. To name a few:

  • DEEP was crucial to UNHCR’s response to the Venezuela migration crisis
  • DEEP supported ACAPS and IFRC’s response to the Rohingya crisis
  • Funded by USAID and implemented by Data Friendly Space in partnership with iMMAP, DEEP is coordinating COVID-19 responses in 14 countries across Central and Eastern Africa, the Middle East, Southeast Asia and Latin America

The DEEP Platform

Data Friendly Space and DEEP

Data Friendly Space (DFS) is currently the technical supervisor and host of DEEP, and has implemented DEEP projects in collaboration with several major humanitarian organizations. DFS supports the DEEP governing board, which includes UNICEF, UNHCR, UN OCHA, OHCHR, the International Federation of the Red Cross, ACAPS, IDMC, Okular-Analytics, JIPS and iMMAP. Since its first project in 2018, DFS has continued to build its own capacity and that of its partners to make a lasting impact.


Machine Learning to Expedite Humanitarian Action

Since its inception, DFS has been focused on creating data-centric applications to support humanitarian organizations in extracting actionable insights from their data and fulfilling their missions.

Today, with more than 85,000 annotated humanitarian response documents hosted on the platform, DEEP is in a unique position to leverage NLP models to fuel much faster responses to humanitarian crises. With these new NLP models, DFS aims to automate the lengthy secondary data reviews currently performed by content-tagging teams, enabling humanitarian stakeholders to respond rapidly to any crisis by focusing on analysis rather than data acquisition alone.
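To give a concrete sense of what automated content tagging means, the sketch below assigns sector labels to a text excerpt using simple keyword matching. The sector names and keyword lists are illustrative assumptions, not DEEP’s actual analytical frameworks, and a real deployment would use NLP classifiers trained on the annotated documents rather than keywords.

```python
# Minimal sketch of automated sector tagging for humanitarian text.
# Sectors and keywords below are illustrative stand-ins for trained models.
SECTOR_KEYWORDS = {
    "Food Security": ["food", "hunger", "malnutrition", "harvest"],
    "Health": ["clinic", "disease", "vaccination", "hospital"],
    "Shelter": ["shelter", "housing", "tent", "displacement"],
}

def tag_excerpt(text: str) -> list[str]:
    """Return every sector whose keywords appear in the excerpt."""
    lowered = text.lower()
    return [
        sector
        for sector, keywords in SECTOR_KEYWORDS.items()
        if any(word in lowered for word in keywords)
    ]

excerpt = "Flooding destroyed the harvest and forced families into tent camps."
print(tag_excerpt(excerpt))  # → ['Food Security', 'Shelter']
```

Even this toy version shows why automation helps: a model that pre-assigns candidate tags lets human analysts review suggestions instead of reading every document from scratch.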

DFS concentrates on the intersection of AI-powered data automation and human knowledge, where each can strengthen the other in carrying out analysis. DFS’ NLP innovation team has grown to five specialized engineers spread across Europe.

In addition to the existing NLP team, iMMAP, a DEEP board member, has initiated a research partnership with the ISI Foundation, a prestigious private research institution based in Turin, Italy, specializing in Complex Systems Science. The ISI Foundation has appointed Nicolò Tamagnone to work exclusively on implementing NLP features in DEEP.

Open-source Technology

As a leading global non-profit, DFS is dedicated to keeping its technologies accessible for emergency response and development organizations. To reach its goal and maximize its impact:

  • DFS collaborates with top academic institutions on research and development of machine learning-based features for DEEP and other humanitarian partners
  • DFS organized hackathons with CERN and EPFL (one of Europe’s most prominent technical universities) at the Applied Machine Learning Days
  • DFS shares data openly with partner universities and supports students in their master’s thesis projects, internships and class-related projects
  • Through these collaborations, DFS invites outstanding students to join the innovation team whenever possible; the results of their research are integrated into DEEP and other services
  • These collaborations also attract top talent and offer real opportunities to students who want to pursue a career in the humanitarian technology sector, during or after their studies
  • Currently, three groups of students at the Johannes Kepler Universität Linz in Austria are working on sentence extraction and classification of humanitarian data extracted from DEEP, under the guidance of Navid Rekab-saz (Assistant Professor at JKU and member of the DFS NLP innovation team)
  • This summer, DFS will hire two interns from EPFL

To learn more about DFS, its projects and DEEP, sign up for our newsletter at www.datafriendlyspace.com

Or write to us at hello@datafriendlyspace.com

Written by: Rishi Jha — Communications & Partnerships — Data Friendly Space
