Doing Evidence and Learning differently

Helen Guyatt
Published in Start Network
Jun 27, 2022 · 4 min read


Banner displayed in Sibi about the symptoms of, and preventative measures for, heatstroke (GLOW Consultants, Pakistan)

As an Evidence and Learning team working in a system change organisation, we are always looking for ways of doing things differently. One year ago, we published three commitments to the people who help us evaluate emergency funding released through Start Network: (1) clarity around purpose, (2) sharing our findings, and (3) measuring success differently.

We would like to reflect on our progress against these commitments through our work with local consultants in evaluating Start Network-funded humanitarian responses.

CLARITY AROUND PURPOSE

We wanted to carve out more time in the consultation process to explain the purpose of the interview and how the information would be used. This had proved difficult during Covid-19, when many interviews were held remotely. Now that we can undertake face-to-face interviews, however, the researcher can take more time to explain why we are doing the interview and how it will be used. We have noticed that the information we have received over the past year has been very honest, and people have shared really insightful stories of their lived experience. They have also told us how much they prefer interviews conducted this way, face-to-face, with an opportunity to tell their stories.

Researcher from GLOW Consultants conducting an in-depth interview with a community member in Gwadar (GLOW Consultants, Pakistan)

SHARING OUR FINDINGS

We wanted to explore ways of sharing the findings of our evaluations and research with the people who shared their experiences with us. Implementing agencies see their feedback in the evaluation report, as this is shared with them, but we wanted to explore how we could share findings with other stakeholders, especially communities. In emergency responses, providing feedback to communities can prove challenging, as an evaluation is very often a one-off interaction with the community. This makes it difficult to feed findings back informally through the sort of regular meetings that a development programme may have with the community. In the case of emergency assistance, if individuals had been targeted with support, then it may be possible to use the same channels that were used to contact them in the first place. If the assistance provided was more generally available, such as behavioural messaging through mass media and posters, or the provision of a public service like a water pump, then it may be more appropriate to share findings publicly.

Banners in English and Urdu providing feedback on the learning and recommendations for behaviour messaging against heatstroke (GLOW Consultants, Pakistan)

Last month we shared feedback we had received from communities through banners posted in public spaces, the same public spaces in which the assistance had been provided one year previously. The assistance consisted of cooling facilities and behaviour messaging, established to help mitigate the adverse effects of a heatwave in Sibi, Pakistan in June 2021. In collaboration with GLOW Consultants, we had spoken with communities about their experience of using the facilities and accessing the behaviour messaging, and asked them how both of these could be improved in the future. Their feedback went into a learning report and helped shape the implementing agencies' contingency plans for this year. We summarised some of the findings and recommendations into short messages in both English and Urdu, which were then put onto banners and hung in those prime positions. We timed the sharing of the findings with the onset of the heatwave season in the hope of reinforcing some of the behavioural messaging that was shared last season.

EVALUATING LOCALLY LED EARLY ACTION AGAINST HEATWAVES IN SIBI, PAKISTAN

MEASURING SUCCESS DIFFERENTLY

We wanted to move away from Western ideas and views around accountability and explore with communities how they would like us to measure the success of a project providing assistance. We received some very powerful and practical suggestions that we have now been able to incorporate into our evaluations. We heard from communities both about how we should ask questions and about what types of questions should be covered. The main changes we have made in response to their feedback are to conduct interviews face-to-face rather than remotely wherever possible, to include an on-site observation tool in all our assessments (communities said it was important to observe what people do with the assistance), and to provide more time and space for in-depth interviews (people wanted us to "take time to explore"). Communities also recommended specific questions we could ask around success. These focused more on how people were treated, with dignity and respect coming up as important ways of interacting with them, and on the impartiality of the organisation and the fairness in deciding who received what. They also suggested we continue to ask for their suggestions on HOW to gather information, in addition to WHAT information is important to them.

HOW DO CRISIS-AFFECTED COMMUNITIES DEFINE A ‘SUCCESSFUL’ HUMANITARIAN INTERVENTION?

The Start Network has recently published its decolonisation framework, and one of the commitments from the Evidence and Learning team is to continue to reflect on how we can decolonise our approaches to gathering, sharing and learning from information.
