Evaluation in complexity: two approaches and how they helped us learn

Keira Lowther
Centre for Public Impact
5 min read · Mar 7, 2024


Monitoring, evaluation, and learning practices in complex environments are currently undergoing a revolution. People are rethinking the implications of complex adaptive systems for how we might monitor, evaluate, and learn about the changes we want to create.

Complex systems theory suggests that causal change happens differently than traditional evaluation presumes. If change happens differently, then how we monitor and evaluate these changes also needs to be different.

At the Centre for Public Impact Australia and Aotearoa New Zealand (CPI ANZ), we’re exploring the relationship between social imagination, belonging, and climate action in dense urban environments. But how are we evaluating our work, and what processes are we using?

Complexity — big if true (and it’s true) © 2022 Kelly Abeln

Our approach to evaluating the Imagination in Southbank project has drawn heavily on insights from this revolution. One principle from Blue Marble Evaluation is that evaluation is not an inert activity: it can be an intervention in itself, catalysing change. Rather than ignore or minimise these effects, we embraced them, choosing methods that create data for both evaluation and meaningful change.

Journaling

Using reflective prompts and journaling is an established practice to consolidate and enhance learning. With participants’ permission, we used the data they shared in their journals for our evaluation.

After each session, participants ran through a series of reflective prompts created by Kate McKegg. These questions draw on a blend of TetraMap (shared by Kataraina Pipi) and the ORID framework. The process and the prompts themselves helped the group to reflect regularly on their experience in a thinking, feeling, and action-orientated way:

  • One thing I’m feeling
  • One thing I’ve learnt
  • One thing that puzzles me or I’d like to know more of
  • One thing I must do.

The prompts were printed on a bookmark and placed in a travel journal for participants to document their experience of social imagination and capture feelings, thoughts, ideas, and actions as they arose.

Our reflective prompt bookmark and a sample of participant reflective writing

At the end of each day, there was protected time for participants to respond to these prompts in their journals. We photographed the reflections of those who were comfortable with this to form the backbone of our dataset for evaluation.

We analysed this data using a pattern-spotting template based on work by the Human Systems Dynamics Institute and adapted by the Kinnect Group and the Developmental Evaluation Institute. The template, which explores generalisations, exceptions, contradictions, surprises, and puzzles within a qualitative dataset, makes it easy to engage in a rigorous and systematic process. Team members reviewed the data separately and then combined their insights and reflections in response to the data and the questions in our evaluation framework.

By collecting these images (and matching handwriting to track anonymous individuals), we could follow change over time and understand what was meaningful to participants in each workshop.

H forms

As we moved into post-workshop support, we used H forms as a monitoring and evaluation exercise. The H form uses the horizontal bar of a large H as a spectrum that runs from 0% to 100% agreement with a statement written above the line.

Participants were invited to place themselves on a spectrum of agreement with the statement and consider the reasons for their choice. On one sticky note, they wrote why they were not in 0% agreement with the statement and placed it to the left of the H. On a separate sticky note, they wrote why they were not in 100% agreement and placed it to the right of the H.

An example of an H form we used in a workshop, asking “How prepared are you to take action on something meaningful to you in our community?”

We used the following questions:

  1. How prepared are you to take action on something meaningful in your community?
  2. How confident do you feel that there are people to step in and support you if you did?

Because we were interested in change over time, participants were asked to place a yellow dot on the scale where they felt they were now and a blue dot where they felt they had been before the workshop, then respond to the questions from their current position.

This brief exercise and the conversations that followed helped us understand some of the nuance of participants’ experiences, whether we had created change through the workshops and our initial work afterwards, and what that change was contingent upon. It also strengthened the network’s social capital: it became clear how many participants were keen to be active and to support each other, reinforcing the work we had begun in the workshops.

So what did we learn?

The approaches we chose consolidated learning and created accountability, and the journals served as a record of personal experiences. They also enabled the project team to respond to the strategic and operational evaluation questions we set ourselves. In our experience, using approaches that benefit everyone involved is key to high-quality data. Data collection stops being a burden: people focus and share their experiences generously, and we can draw out insights and learning.

We found that for a subset of participants, we needed to complement these approaches with interviews to draw out themes that emerged from the journaling data. This allowed us to probe, clarify, and extend our understanding.

If you want to learn more about our approach to rigorous evaluation in complexity, or to partner with us on a similar project, please get in touch.

For more information on the thinking that informed our approach:

  • See here and here for evaluation in complexity,
  • here for power in evaluation,
  • here and here for more information on developmental evaluation.
