How We Define Impact for our Data Innovation Lab

Pulse Lab Jakarta and Mark Fiorello (Solidaritas)

Our Research Dives have brought together more than 100 participants to date from diverse professional backgrounds.

One question that keeps us on our toes is: How do we measure the impact of what we do at our data innovation lab? It’s a big question, and one we have struggled with over the years. Last year we had the privilege of working with Clear Horizon and Solidaritas on setting up a results measurement framework for the Lab, and we debated long and hard about what counts as “impact” in an innovation lab. Particular thanks go to Mark Fiorello from Solidaritas for his expert guidance and patience with the team at PLJ, and for co-authoring this blog with us.

So, what’s your impact?

In particular, how do you measure impact in a lab that, by design, experiments, tests, prototypes, ditches failed prototypes… and typically hands successful ones over to partners for scaling up through their respective programmes? Whether these investments in experimentation result in significant positive change is a question very much worth asking, but that doesn’t make it a straightforward question to answer.

In this blog post, we thought we’d share our own reflections and experience on understanding our impact, in the hope that they might be relevant for other organisations asking similar questions of themselves. This piece focuses on defining what impact means to us, but we hope it will be the start of a series documenting our attempts to measure how the Lab has contributed to positive change.

As we began to jot down a few key points, one of our favourite past reads came to mind — a thought piece by the ODI Methods Lab that shows just how many different ways the term “impact” is used, and that in its most colloquial use, “impact” is often interchangeable with words such as “result”, “outcome” and “effect”. This led us to key lesson number one: the first step in understanding impact is to define it in our own context. In other words, when we talk about “our impact” with others, we need to make sure we’re on the same page.

Impact for us as an innovation lab means not only that our methodologies and platforms are taken up and used by others, but also that more and more of the collaborators we have worked with since our early days are investing in innovative practice, whether by building their own data analytics teams, bringing their own human-centred design experts on board, or simply drawing on specific expertise in design and data before embarking on a new project. To put it another way, we wholeheartedly endorse the Methods Lab’s point that “the scope of the definition of impact… must therefore be appropriately bounded”.

Easy, right? If only!

In the context of PLJ, we have encountered at least three main challenges when it comes to defining impact:

  1. Our core function as a data innovation lab is, in essence, to experiment and to apply our methods across a range of different issues. Ironically enough for a data lab, one of our main challenges arises when we are asked to quantify the effects of our work using the measures of change common to other development initiatives, such as the number of beneficiaries reached, the cumulative usage of a platform, or the number of projects commissioned. We can certainly keep track of all of these, but should they be used to define the value of what we do?
  2. The changes we might consider to be “impact” may not be apparent at the time we set out to do something. This is consistent with the OECD’s definition of impact as “positive and negative, primary and secondary long-term effects produced by a development intervention, directly or indirectly, intended or unintended” (in other words, basically anything that is an “effect”, as long as it occurs over the longer term).
  3. As a Lab, we typically contribute to changes we might consider to be “impact”, rather than cause them outright. This means that in the PLJ context we have to reject the definition of “impact” used by certain donors and organisations, which reserves the term for results or effects that are directly caused by, or attributable to, a particular intervention. Moreover, impact in this sense usually materialises at a later stage, so the notion that a data innovation lab ought to measure impact across the full life cycle of an innovation is not entirely realistic.

So where do these challenges leave us?

Based on our reflections alongside our partners over the years, we think there are, in the context of PLJ, three main types of “impact” (or, if we want a slightly less contested term, we are also comfortable calling these “influence”) through which our work brings value to our broader stakeholders. In our recently published 2017 Annual Report (see pages 42 and 43), we describe each of these using examples of projects we’ve worked on; here we summarise them for the purposes of this blog.

First is “Operational Impact”, which we define as the effect our analytics or prototypes have on the way our partner/client organisations work. Improvements in operational effectiveness and/or efficiency due to the adoption or adaptation of PLJ-inspired products, or due to an increased understanding of human-centred design, are among the aspects we consider highly relevant as PLJ’s “impact” — our Haze Gazer platform, which has been adopted by the Executive Office of the President of the Republic of Indonesia, is one example.

Second is “Methodological Impact”, which we define as the effects we have on the practice and application of data science. If we have somehow contributed to other people or organisations using existing data in new ways, or using new datasets or new analytical methods to address existing problems, we consider that to be “impact” as well — our approach of using Twitter data to infer commuting statistics in Greater Jakarta is a case in point.

Finally, there is “Ecosystemic Impact”, which is important given our mandate to support data innovation more broadly. We are very conscious that we exist as part of a much more complex ecosystem of data innovation — not only in Indonesia, but also regionally and globally. Where we contribute to key stakeholders participating differently within this ecosystem, for example through new collaborations or further research, we consider that an important form of “impact”. Our Research Dives are a good example: several former participants have continued the work they started during our dives and submitted papers to conferences and academic publications.

What’s next?

Well, uh, continuing to actually measure impact. We know that defining and keeping track of our impact is an ongoing process. The results measurement framework that we’ve developed with our friends from Clear Horizon and Solidaritas has equipped us with a set of tools that help us keep track of progress and identify significant changes resulting from our work. One of the approaches we’re trying out now is tracing what happens to our data analytics platforms, such as Haze Gazer and VAMPIRE, once they are handed over to our partners. We’re also testing this approach to track the impact of some of our human-centred research. We’ll share what we find with you later this year — stay tuned!

Pulse Lab Jakarta is grateful for the generous support from the Government of Australia.

