Breaking the mould: alternative approaches to M&E for humanitarian action

Published in GLAM-blog (Global Learning for Adaptive Management)
Aug 21, 2019
Photo credit: OSOCC

Flexibility and change are central features of humanitarian action. As outlined in a forthcoming paper by Alice Obrecht from ALNAP, humanitarian crises are invariably dynamic situations, characterised by fast-changing contexts and complex interactions between many actors.

In contrast, most monitoring and evaluation (M&E) systems that are used in the humanitarian sector are based around standard logframe approaches, which have been criticised for their rigidity and inability to innovate in response to change. Why the mismatch between humanitarian context and M&E approach?

We know that the humanitarian sector has struggled with the uptake and use of traditional M&E systems, particularly at the project level. ALNAP’s work has shown that a great deal of information is produced about individual projects carried out by humanitarian agencies. But much of it is either shelved or passed straight up the chain to the donor.

ALNAP is a global network of NGOs, UN agencies, members of the Red Cross/Crescent Movement, donors, academics, networks and consultants dedicated to learning how to improve response to humanitarian crises.

One explanation for the difficulties humanitarians experience with such systems lies in the nature of learning and decision-making within the sector, particularly at the project level. Tacit knowledge — the hard-won lessons of experience, filtered through the beliefs, instincts and value structures of individual aid workers — and informal learning are prominent features of humanitarian decision-making. The formality of traditional M&E systems could, in theory, limit their real-time use for ongoing decision-making and iterative learning.

For this reason, ALNAP has reviewed a range of M&E approaches that are not typically used in the humanitarian space, but which have the potential to respond to the dynamic nature of humanitarian crises and support learning and decision-making at the project level. Much of this work draws from the ‘M&E for adaptiveness’ agenda, as well as the wider evaluation community beyond the international development and humanitarian sectors. In some senses, we’re not talking about anything new. Our hope is that by bringing together a range of different approaches we can encourage humanitarians to intentionally design M&E systems that tackle the problems of flexibility outlined above.

New report from ALNAP by Neil Dillon

Why do we need to innovate?

ALNAP’s new paper identifies three reasons for changing M&E systems: timing, flexibility and perspectives.

Timing:
We identify the need for M&E systems that run with, but go beyond, the recent calls for more real-time reviews. Forthcoming research by ALNAP’s Paul Knox-Clarke and Leah Campbell highlights the ongoing, continuous nature of decision-making within humanitarian agencies. Much like the thinking behind M&E for adaptive management, this point speaks to the need for M&E systems that enable a continuous flow of information on project performance — and evaluation of its relevance and effectiveness — throughout the entire implementation cycle.

Flexibility:
Our paper makes three arguments for greater flexibility in M&E systems. First, M&E systems need to respond better to adaptation: they need to be comfortable with significant and continuous project change, which means adapting monitoring and assessment frameworks when project goals depart from the original intervention logic. Second, M&E systems need to support adaptation itself. True flexibility requires project teams to embrace innovation and experimentation throughout their management processes. To be fully relevant to such an approach, M&E systems need to be comfortable challenging initial assumptions, resolving early uncertainties and moving towards a process of co-creation between project and M&E teams. And third, we need to evaluate the quality and success of adaptive processes. M&E systems need to be able to support meaningful value judgements about the adaptive process itself. If course-corrections and strategic changes in programming are an integral part of humanitarian action, then they must also be the subject of M&E work.

Perspectives:
M&E systems need to be more comfortable integrating multiple perspectives on a project’s relevance, performance and impacts. More than simply including affected-population feedback data, this means actively exploring and investigating the often-competing perspectives of different project stakeholders on what a project is, what it does, and how it interacts within certain contexts.

Photo credit: Isabel Coello/ECHO

What would change entail?

Our paper outlines a range of tools responding to needs in each of these three areas. The tools reviewed include: embedded evaluation approaches, developmental evaluation, systems mapping, system dynamics, critical systems heuristics, social network analysis and agent-based modelling. In each case, we introduce examples from practice, discuss application challenges and potential solutions, and provide links to further resources. But much of the potential for change comes not from off-the-shelf toolkits, but from embedding evaluative thinking into pre-existing monitoring structures and creating space for informal learning and reflection during implementation. Our aim is to provide humanitarian organisations with starting points for developing new approaches to M&E that better suit how decision-making and learning already happen within their organisations, and that enable a more adaptive and flexible approach to project management, where it is valuable.

For sure, there are challenges to overcome. Many of the tools we discuss require significant resourcing and technical skills. Some will be very difficult to implement without separating M&E funding from the short-term contract cycles typical within the sector. And all will require a degree of letting go: whether by increasing the space for informal feedback and learning or by changing quality assurance processes to allow for evaluations to respond to adaptation and change.

But the costs of not changing — including using ineffective systems and generating great quantities of unused project-level data — are equally high. The question we, as M&E specialists, should ask ourselves is: how long until we break from the mould?

Neil Dillon is Research Fellow at ALNAP, and leads the Evaluation, Accountability and Learning workstream. Follow him on Twitter.
