Collecting Mobile Data Responsibly: webinar recording & takeaway messages

Thank you to everyone who tuned in to our live webinar on Responsible Mobile Data Collection last week! Five panelists on three continents discussed the challenges of remote data collection projects and shared best practices, tools, and tips for adhering to privacy and protection guidelines — from the field level to the WFP context and across the broader humanitarian and development sphere.

But don’t worry if you missed the event — a recording is available above and here, and we have summarised the key takeaway messages for a quicker read below. At the end of this page, we’re also sharing additional resources and answers to the questions that we were not able to get to during the webinar due to time constraints.

Thank you very much to our five great panelists:

  • Asif Niazi and Raul Cumba, Vulnerability Analysis and Mapping Officers, WFP Iraq Country Office
  • Angie Lee, Food Security Analyst, WFP mVAM
  • Jos Berens, Data Policy Officer, UN Office for the Coordination of Humanitarian Affairs (UN OCHA)
  • Michela Bonsignorio, Global Advisor on Protection and Accountability to Affected Populations, WFP
  • And moderator Maribeth Black, Food Security Analyst, WFP mVAM

What you should know about Responsible Mobile Data Collection:

Greater risks and challenges: In a pre-digital era, there was more direct control over data. Now, as data collection for humanitarian action relies increasingly on digital tools and automated processes, there is a real need to raise awareness of the risks and harms that can occur at every stage of the humanitarian life cycle, as well as the methods for reducing these risks. Challenges include: data falling into the wrong hands, risks related to the IT infrastructure and outsourcing, bias and discrimination, and risks to the rights of data subjects.

The example from Iraq highlights some of the challenges with remote data collection: in ISIL-controlled Mosul, people were afraid to answer calls, as it was illegal for the public to use mobile phones. When people responded, the length of the questionnaire and the short time available to ask the questions affected the data quality and response rate.

Step One: an even better understanding of the potential risks. Despite the recognition of data protection risks and the development of ways to mitigate them, we humanitarian actors need to develop a better understanding of where these dangers lie. Sometimes we just don’t know how sensitive a dataset can be; in those cases, it is better to err on the side of caution. UN OCHA’s Centre for Humanitarian Data works on data policy with the aim of supporting the responsible growth of the use and impact of data in the humanitarian sector.

Step Two: mitigating the risks. Responsible mobile data collection has humanitarian principles, such as “do no harm,” at its core, and these principles must be adhered to at every stage of the data collection and analysis lifecycle. First, understanding the local context and engaging in sensitisation campaigns and digital literacy trainings are important. The survey design should strive to minimise bias and ensure that no information is collected beyond what is really needed. Just as important is the transmission and storage of data using state-of-the-art security means. Finally, the publication of results ought to be anonymised and protection-sensitive, and there must be functioning and safe mechanisms for participants and others to report problems.

In the Iraq case, the survey questionnaire was shortened, and the food security and market components were combined so as to minimise the time respondents had to use their phones.

The WFP Data Responsibility Field Book both offers guidelines for the daily work of staff involved in mobile data collection and forms a basis for WFP’s internal dialogue on data responsibility. WFP’s corporate data privacy and security policies are contained within the Guide to Privacy and Personal Data Protection.

Several international collaborations already exist to address the issue of data privacy in humanitarian response. Examples include the Harvard Humanitarian Initiative; the framework developed by UNHCR and the Danish Refugee Council for data-sharing in practice, which introduces a common language among humanitarian actors as well as a set of principles and shared processes; the ICRC’s work in collaboration with the Brussels Privacy Hub; and the International Data Responsibility Group, made up of research institutes, think tanks, and the international public sector.

Q & A

1. How do you ensure the authenticity of the interviewee? Do you monitor the location of the mobile phones?

Firstly, the interviewer has the name and phone number of the respondent and checks with the person answering the phone that they are indeed speaking to that respondent.

Secondly, the service provider has a network of towers and knows where a particular mobile phone is calling or answering from. Service providers can therefore programme their systems to call only numbers in particular areas, and we can ensure that interviewees are actually located in the area of interest.
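As a rough illustration of that filtering step, here is a minimal sketch, assuming the provider shares a list of opted-in subscribers together with the area each number is registered in; the field names (“msisdn”, “registered_area”, “opted_in”) and the target areas are hypothetical, not an actual provider interface.

```python
# Hypothetical sketch: restrict dialing to respondents whose provider-reported
# location (derived from cell-tower data) falls inside the survey area.

TARGET_AREAS = {"Mosul", "Erbil"}  # areas covered by this survey round (illustrative)

def filter_call_list(subscribers):
    """Keep only opted-in numbers that the provider places in a target area."""
    return [
        s["msisdn"]
        for s in subscribers
        if s.get("opted_in") and s.get("registered_area") in TARGET_AREAS
    ]

# Example input as it might arrive from the telecom provider:
subscribers = [
    {"msisdn": "+964xxxxxxxxx1", "registered_area": "Mosul", "opted_in": True},
    {"msisdn": "+964xxxxxxxxx2", "registered_area": "Basra", "opted_in": True},
]
print(filter_call_list(subscribers))  # -> ['+964xxxxxxxxx1']
```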

Thirdly, we closely scrutinise the output of our data and analyses and make sure that, where the data does not seem to make sense, we investigate all possible sources of bias and error.

More generally, mVAM identifies respondents in three ways:

  • by asking respondents of traditional face-to-face surveys to agree to a follow-up phone survey;
  • by randomly calling people drawn from rolls of mobile phone users who have volunteered to take phone surveys. Telecom companies maintain a list of phone numbers of subscribers who agree to participate in surveys; randomly selected mobile phone users in the areas of interest to WFP are then contacted, as per our sampling instructions;
  • by calling numbers generated through random digit dialing (see the sketch after this list). Respondents are always given the choice to opt in to the survey or to decline. Whether it is WFP or third-party providers that conduct phone surveys, the list of contacts (names, phone numbers, locations) is stored and managed in a safe and secure environment; only processed and aggregate data are shared for monitoring purposes — no individual’s statistics or geographic coordinates are released.
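As a rough sketch of how random digit dialing can work, the snippet below generates candidate numbers from a fixed set of operator prefixes. The country code, prefixes and subscriber-number length are invented for illustration and are not mVAM’s actual configuration.

```python
# Illustrative random digit dialing (RDD) sketch under assumed numbering rules.
import random

PREFIXES = ["75", "77", "78"]   # hypothetical mobile operator prefixes
SUBSCRIBER_DIGITS = 8           # hypothetical length of the subscriber part

def generate_rdd_sample(n, seed=None):
    """Generate n distinct candidate phone numbers by random digit dialing."""
    rng = random.Random(seed)
    numbers = set()
    while len(numbers) < n:
        prefix = rng.choice(PREFIXES)
        subscriber = "".join(str(rng.randint(0, 9)) for _ in range(SUBSCRIBER_DIGITS))
        numbers.add(f"+964{prefix}{subscriber}")
    return sorted(numbers)

if __name__ == "__main__":
    # Each generated number is only surveyed if the respondent opts in.
    print(generate_rdd_sample(5, seed=42))
```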

2. How do you ensure accuracy and validity of the information through phone call interviews?

In order to ensure the reliability of the data, mVAM phone surveys are designed on the basis of representative sampling, using stratification techniques where possible. Results are reported by drawing inferences from sufficiently large samples, complemented by the careful identification of key informants. The quality of data collected through phone surveys is also evaluated against data from concurrent face-to-face surveys and/or secondary baseline data whenever feasible. For more information on representativeness and how to account for bias, please refer to the methodology section of the mVAM blog: http://mvam.org/info/methodology/
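To make the stratification idea concrete, here is a minimal sketch of a stratum-weighted estimate; the strata, population shares and survey counts are invented for illustration and do not come from any actual mVAM round.

```python
# Combine stratum-level prevalences into one estimate using population weights.

population_share = {"north": 0.30, "centre": 0.45, "south": 0.25}

# Per stratum: (number of respondents, respondents with poor food consumption)
survey = {"north": (400, 120), "centre": (600, 90), "south": (350, 70)}

def weighted_prevalence(survey, population_share):
    """Weight each stratum's prevalence by its share of the population."""
    estimate = 0.0
    for stratum, (n, cases) in survey.items():
        estimate += population_share[stratum] * (cases / n)
    return estimate

print(f"Weighted prevalence: {weighted_prevalence(survey, population_share):.1%}")
```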

3. What mechanisms are put in place to ensure the reliability of crowd-sourced data?

Data is triangulated with existing secondary sources of information including face-to-face assessments, field monitoring and key informant reports.

4. How can we integrate information security considerations during the early phases of a survey (especially during survey design and data collection)?

The advice given in the Data Responsibility Field Book is to:

Understand and engage with local context — Engage with the community about major risks related to the proposed data collection. This can be done by interviewing members of the community and through a quick literature review on the mobile phone landscape (e.g. mobile phone ownership and usage rate, social and gender norms) in the country. Work with a community-based organization (CBO) or NGO in the community that can sensitize people about the activity. It is vital to engage with the community before collecting data. If there are protection risks, these need to be communicated. Explore opportunities with ‘self-organizing’ groups, whereby respondents set up management committees themselves.

Choose the right provider — When outsourcing phone surveys to commercial call centers, ensure providers are scrutinized and vetted. Undertake due diligence on candidate companies and assess their compliance with best practices in terms of data security and privacy.

Conduct a Privacy Impact Assessment — Prior to any intervention, it is important that WFP conducts a Privacy Impact Assessment (PIA). The purpose of a PIA is to identify, evaluate and address the risks arising from the processing of personal data within an activity, project, programme or other initiative. It is important to note that such risks are not only related to IT aspects; they span social, political, protection and legal considerations. A PIA framework is available to guide country offices in conducting a PIA. Please contact michela.bonsignorio@wfp.org for further assistance.

Data minimization: collect data on a need-to-know basis only — Collected data must be limited to the minimum necessary to achieve the objective in order to avoid unnecessary and potentially harmful intrusion into people’s private lives. In particular, information about people’s ethnicity, political opinions, religious beliefs, health, or sexual orientation/choices should be strictly avoided unless absolutely necessary to the purpose of the survey. This information is not usually collected in WFP’s food security surveys.
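As a simple illustration of data minimisation applied at questionnaire-design time, the sketch below keeps only the fields justified by the survey’s stated purpose; the field names and purpose are hypothetical, not an actual WFP questionnaire.

```python
# Every proposed field must be justified by the survey purpose,
# otherwise it is left out of the form altogether.

PURPOSE = "household food security monitoring"

# Proposed fields and whether they are needed for the stated purpose.
proposed_fields = {
    "respondent_id": True,
    "governorate": True,
    "food_consumption_score": True,
    "religion": False,           # sensitive and not needed: excluded
    "political_opinion": False,  # sensitive and not needed: excluded
}

def minimised_questionnaire(fields):
    """Return only the fields justified by the survey purpose."""
    return [name for name, needed in fields.items() if needed]

print(f"Fields collected for '{PURPOSE}':", minimised_questionnaire(proposed_fields))
```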

Ensure your data collection has a specified purpose — Given the sensitivities and risks of collecting, storing and sharing data, personal and demographic data should never be collected indiscriminately. The purpose of data collection and processing must be clear and unambiguous and must be defined prior to data collection.

Review existing domestic legislation — Local legislation may pose challenges when collecting sensitive data. For example, applicable domestic laws may contain provisions that could force your local partners to disclose personal data in their possession to the government. Under such circumstances, you should only collect data if you are comfortable with the data being shared with the government.

Furthermore, it is advisable to conduct an assessment of the data landscape, including a check of whether the desired data is already being collected by other organizations and whether it would be possible to gain access to or use that data.

5. With the rapid increase in datasets shared through the HDX platform, is there any mechanism established to check the data quality and authenticity of these datasets?

Organizations joining the HDX platform are vetted by our team. Every dataset uploaded to the platform is subjected to a quality assurance process, including a data-sensitivity check. The HDX team is not in a position to verify the authenticity of all datasets; this is the responsibility of the contributing organizations.

6. What about the level of dropout of respondents in mobile surveys?

Non-response and attrition rates vary across countries and can be attributable to different reasons (e.g. insecurity, displacement, survey fatigue). Since the inception of the project, mVAM has been following the best practice of providing a modest amount of airtime credits to survey respondents as an incentive for continued engagement. However, more than material incentives, we found that altruism is the biggest driver of response. The respondents must, however, feel that their identity will be protected and they have no need to worry about any negative repercussions. Additionally, we are exploring ways to leverage mobile technology to empower vulnerable communities by increasing their access to information on food prices, nutrition and feedback mechanisms.

7. Is the information from the service provider input into your organization’s database?

Yes. Raw data is sent to WFP in a CSV file at the end of each data collection round and is then stored in a dedicated database for cleaning and processing prior to analysis. Phone numbers and any other personally identifiable information are anonymized to ensure that sensitive data remains confidential.
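As an illustration of that anonymisation step, here is a minimal sketch that replaces the phone number in an incoming CSV with a salted hash before storage; the column names, salt handling and hashing choice are assumptions for the example, not WFP’s actual pipeline.

```python
# Pseudonymise phone numbers in a CSV so that rounds can still be linked
# per respondent without storing the number itself.
import csv, hashlib, io

SALT = "replace-with-a-secret-salt"  # in practice, a secret kept out of the code

def pseudonymise(phone_number):
    """Return a stable pseudonym derived from the phone number."""
    return hashlib.sha256((SALT + phone_number).encode()).hexdigest()[:16]

def anonymise_csv(raw_csv_text):
    reader = csv.DictReader(io.StringIO(raw_csv_text))
    rows = []
    for row in reader:
        row["respondent_id"] = pseudonymise(row.pop("phone_number"))
        rows.append(row)
    return rows

raw = "phone_number,fcs_score\n+964xxxxxxxxx1,41\n+964xxxxxxxxx2,28\n"
print(anonymise_csv(raw))
```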

8. Thanks to all panellists! Michela mentioned the best practices of having mechanisms for research participants or survey respondents to access findings and/or have a say in how their data is used. Do you have any examples of good mechanisms that have been established for that, especially in areas where access is an issue or with hard to reach populations?

Consulting the affected population prior to designing an intervention is commonly considered good practice. Where the population is accessible, it is recommended to hold focus groups and interviews with key informants to gather a representative picture of the reality on the ground. This can also be part of a PIA (see above). In the case of mVAM, such consultations are aimed at understanding issues like effective access to mobile phones and technology, digital literacy among the population, possible social and cultural obstacles affecting individuals’ free participation in surveys, perception issues, and security threats. The mVAM team is particularly committed to engaging with the local population at all stages of its interventions. Feedback from the people is regularly gathered by field monitors and through ad hoc field missions.

It is equally important to ensure that people participating in mVAM two-way communications can contact us at any time to request clarifications and/or express concerns about the use of their personal data. This might be built as an ad hoc option into the same mVAM communication system or achieved through dedicated communication channels (e.g. a hotline or email). Existing complaint and feedback mechanisms previously set up for programmatic purposes can also be used to that end. For example, in Lebanon efforts are underway to set up an interagency common call center as a mechanism to address queries specifically related to the assistance channeled through a Common Card. The call center will also be used to receive concerns and requests related to personal data protection.

When the population resides in a hard-to-reach area, WFP should still conduct proxy consultations with humanitarian partners who have access to the population, if possible. A soft preliminary survey could also be launched via mobile phones, aimed at reaching people and understanding any possible risks affecting the roll-out of the mVAM initiative. The survey itself may be sensitive and potentially dangerous, so it is recommended to avoid highly sensitive topics and to use neutral, soft scripts. The assistance of protection officers/advisors to that end is recommended.

Resources referenced during the webinar:


Originally published at MVAM: THE BLOG.
