Weeknote — week 11
This week I’ve mostly been talking about impact
The big news for us at Power to Change this week is that we are to remain open for another five years, courtesy of a further £20m investment from the National Lottery Community Fund. The money will enable us to help community businesses navigate the Covid-19 recovery and build on our six-year legacy.
At Power to Change our interest in community businesses goes beyond those we directly support each year. We believe community ownership has radically transformative effects on the wellbeing of communities. Our ambition continues to be seeing more communities realising the benefits of this model.
So as we continue to grow the market, we want to understand the impact these businesses are having in their local neighbourhoods. But measuring place-based change is difficult, and despite years of debate and experimentation, the third sector all too often falls back on methods and approaches that don’t really show impact. They focus, for example, on the number and value of grants made in an area, rather than on the impact and social value of those investments. So we tried another way.
For the past six years we have been investing in data collection and analysis to build a rich picture of local economies and the challenges they face. These investments have come to fruition, as evidenced in our latest impact report.
This impact report is ground-breaking in many ways. For starters, the innovative design (thanks to our friends at Blue Stag) points to a different way of disseminating this insight. But its distinguishing feature, compared to many other organisational impact reports, is that it actually reports on whether tangible changes have been achieved. Not just counting outputs. Not just providing nice anecdotes and case studies. But actual changes.
For example, in the report you’ll see how we’ve pioneered the use of quasi-experimental design to measure whether community businesses actually improve the neighbourhoods in which they operate (see p87 for our novel use of the Community Life Survey). We’ve also pioneered the use of commercial data for public purpose, using credit and debit card transaction data to monitor the state of local economies in real time. Our impact report illustrates the usefulness of this, showing how Covid-19 lockdown restrictions created a ‘localisation’ effect for many community businesses, with their customers coming from less far afield than before (see p81). (The team at SIB have been developing uses of this data even further, and this week launched the High Streets Tracker, which is definitely worth checking out.)
And then we’ve also used robust methodologies to find out what our own impact has been on these community businesses. That includes creating a control group to monitor the additionality provided by our Trade Up programme on the proportion of trading income generated by community businesses (see p43). We’ve also used causal mapping to attribute change for Bright Ideas, our early-stage support programme (see p32). In both cases we’ve seen positive changes, but because of the research design, we can say with greater confidence that it is our support that has contributed to these changes. This is what most impact reports should be doing. The norm, rather than the exception.
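The control-group logic behind measuring additionality can be sketched in a few lines. To be clear, the figures and variable names below are invented for illustration and are not numbers from the report; this is just the difference-in-differences idea in miniature:

```python
# Hypothetical sketch of a control-group comparison, in the spirit of the
# Trade Up additionality measurement. All figures are made up.

def mean(values):
    return sum(values) / len(values)

# Proportion of income from trading, before and after the programme,
# for a supported group and a matched control group (invented numbers).
supported_before = [0.40, 0.35, 0.50, 0.45]
supported_after  = [0.55, 0.50, 0.60, 0.58]
control_before   = [0.42, 0.38, 0.48, 0.44]
control_after    = [0.45, 0.41, 0.50, 0.47]

# The supported group's change, minus the control group's change,
# estimates the additionality of the programme: what happened over
# and above what would likely have happened anyway.
supported_change = mean(supported_after) - mean(supported_before)
control_change = mean(control_after) - mean(control_before)
additionality = supported_change - control_change

print(f"Supported change: {supported_change:+.4f}")
print(f"Control change:   {control_change:+.4f}")
print(f"Additionality:    {additionality:+.4f}")
```

The point of the design is in the last subtraction: a raw before/after change for supported businesses would overstate the programme’s effect, because some of that change would have happened anyway.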
I think the time for glossing over uncomfortable truths is over. The scale of challenges we face, and their complexity, means that better understanding and scaling what works, and binning what doesn’t, has never been more important. So as we look to the future, we’re building on a solid evidence base to improve our own practice and support, as well as that of community businesses. I hope to see more and more organisations doing the same.