Answering ‘the million dollar question’: How do we ensure evidence on community needs and perceptions is used in humanitarian response?

Elrha · Jun 15, 2022

Cordelia Lonsdale, Senior Research Impact Advisor for the Health in Humanitarian Crises (R2HC) programme, shares some takeaways from our latest event on how evidence on community needs can inform outbreak response in humanitarian settings.

Image: A queue of women waiting to receive cash assistance, long but orderly and fast-moving. Credit: World Vision

Last week, we convened a rich discussion to share the learning from R2HC’s grantees. The panel gathered research-practice partners who have been collecting data on the impacts of COVID-19 on the needs, perceptions and priorities of people affected by crises in different humanitarian settings.

All our speakers were concerned with the critical but tough next step: using that data and evidence to drive real change. How to do this effectively is — as one of our speakers, Fatima Osman, Head of Learning and Change from MESH, put it — ‘The million dollar question.’

Humanitarian needs, exacerbated by the pandemic, have increased significantly in many settings, while the rising cost of basic goods and services has knock-on effects not only for communities but for humanitarian organisations themselves.

Word cloud: how COVID-19 has impacted humanitarian needs

Evidence coming directly from communities, if applied effectively, could help overcome some of these barriers and make better use of humanitarian resources, including by enabling programme adaptations.

Here are some practical, insightful suggestions from our speakers:

Incorporating evidence on community needs can strengthen humanitarian response

Gillian McKay, Research Fellow and DrPH Candidate at the London School of Hygiene and Tropical Medicine, noted that decision makers who engaged with the International Federation of Red Cross and Red Crescent Societies (IFRC) community feedback data during the Ebola response in the Democratic Republic of the Congo gradually came to realise how critical that data, once acted on, was for building trust between responders and communities. This trust was essential to effective delivery of the response, and particularly important in conflict settings, where mistrust can provoke aggression or violence towards healthcare providers or responders.

In Lebanon, the Community Perceptions Tracker (CPT) helped Oxfam identify refugees’ emerging concerns about vaccinations. The CPT data enabled Oxfam staff to respond through flexible adaptations: covering transportation costs to get refugees to vaccine centres, and supporting them to enrol on the government vaccination website. This in turn successfully increased vaccine enrolment and, as the evidence showed, reduced refugees’ concerns.

Knowledge translation chain — getting the evidence used and applied

All speakers emphasised the importance of ensuring data is presented and formatted for the needs of decision makers. During outbreak responses, decision makers are besieged by data: presenting evidence clearly and accessibly is critical.

While the Oxfam CPT data was presented in the form of reports summarising trends in easy-to-digest graphs, the IFRC Community Feedback data team used dashboards and other knowledge products. Importantly, teams should triangulate and synthesise evidence from different sources where available, to give key top-line trends to busy policy makers.

Image: A motorcyclist and a Red Cross member of staff speaking.
Chance Evariste is the vice president of “le motard”, the motorcycle association in Komanda, DR Congo. The Red Cross began working with motorcycle drivers in communities like Komanda to help change their ideas about Ebola. Credit: IFRC

Community feedback data tends to be qualitative, so care should be taken to present it meaningfully alongside quantitative data, given that decision makers may be more familiar with the latter. Panellists also suggested that training in qualitative data could be valuable for decision-makers, as could building capacity for data analysis and knowledge brokering through a ‘social sciences cell’ during outbreaks.

Finally, it is important to agree with all stakeholders in advance how data will be used, so that decision makers are sensitised to its potential value and make ‘room’ to incorporate it into decision making. For partnerships collecting the data, mapping relevant stakeholders and agreeing who will present the data to them, and how, is critical to keep the knowledge chain moving.

Reflections for the humanitarian and research sector

For me, there are two key takeaways from our discussion:

  1. Generating evidence on the needs, perspectives, and priorities of communities is critical to inform humanitarian response — not just a ‘nice to have’

Seeing how the data informed adaptations that made programmes more acceptable and effective, such as improving health messaging and supplies, demonstrates this. Listening and responding to communities is not only in line with core humanitarian principles; it also clearly bolsters the motivation of humanitarian programme staff when they can see the specific ways adapted programmes and interventions are helping people. And when problems are highlighted, humanitarian actors have solid evidence with which to advocate or lobby for solutions.

That said, sudden shocks tend to exacerbate pre-existing vulnerabilities and, in some cases, confirm what humanitarian actors already know or suspect. Actors undertaking data collection should use the contextual evidence and expertise already available and build on it. Katie Rickard, Director of REACH at IMPACT Initiatives, suggested investing in early-warning systems and real-time monitoring efforts that can respond quickly during outbreaks or other shocks, noting that existing research processes were slow to respond to the pandemic, creating a ‘black hole’ of evidence. As our Global Prioritisation Exercise is uncovering, funders of research need to identify evidence needs more systematically, and we should respond more effectively as a system to meet those needs.

  2. We shouldn’t collect the data if we aren’t going to act

I’d argue (and have argued) that this is true for any expenditure of public money on research, particularly ODA. But it is critical here because of the potential to harm communities, or to erode trust in humanitarian services, if we engage people in a data-collection process but then never respond to what we hear. We need to centre communities throughout, re-framing them as the clients, or beneficiaries, of the research process. For too long we have collectively treated writing papers and reports on the needs of communities as a valid end-goal. This isn’t a tenable position anymore (if it ever was), since we now know fairly well how to do better on uptake of evidence. But we do need to invest time and thought in the systems, capacities and processes that enable this.

One useful strategy our panellists suggested for driving action was to involve policymakers directly in interpreting the data and proposing policy responses, as decision-makers are best placed to identify actionable next steps. Following up on actions to ensure they are implemented is also critical. ‘Restitution of findings’ meetings with communities should include reporting back not only on what was found, but what has been done to share their data with decision-makers and the results of that process.

For funders, the lesson is that humanitarian programmes need the flexibility to adapt to the needs and priorities of communities. Consider how to incentivise delivery organisations to adapt to community needs or feedback: currently, capacity pressures and funding constraints tend to encourage sticking to pre-made plans. Investing in learning time is also clearly important: humanitarian staff valued the research partnership for helping them stand back and reflect on the implications of the data with input from external experts.

But this isn’t typical. Despite great initiatives like the Social Sciences in Humanitarian Action Platform, it’s clear from our panellists that translating evidence and applying it at the level of local and national responses is still challenging and requires greater support. As funders of research, whether of community feedback data collection or other forms of evidence generation, we consider investments in knowledge translation, uptake and application of evidence to be critical components of research projects. We are now thinking about how to target resources so that they benefit researchers and humanitarian partners who are directly engaging with both communities and local decision-makers, for greater impact.

ENDS

Interested in useful tools and approaches for collecting data on community needs and perceptions? Explore these resources:


We are Elrha. We are a global charity that finds solutions to complex humanitarian problems through research and innovation.