The Digital Practice work-stream’s first output: de-fragmenting digital maturity

In the first piece of work from The Catalyst’s Digital Practice workstream, Nissa Ramsay and Helen Lang draw lessons from their mapping of 50 digital maturity frameworks

Nissa Ramsay
Catalyst
6 min read · Sep 3, 2019


Over the last three months, Innovation Unboxed, Think Social Tech, and CAST have built a picture of the shape and nature of digital maturity frameworks. Following contributions from a range of stakeholders (responding to this blog post), we identified 50 unique digital maturity diagnostic tools, think pieces, surveys, codes and reports most relevant to the charity sector, listed in this Airtable. We analysed their content and identified patterns in how they describe what digital maturity should look like.

What surprised us most is that despite the ‘noise’ surrounding digital maturity, there is little shared vision of what best practice looks like. The evidence base for these tools is limited, and milestones of progress are poorly defined. As a result, it is incredibly challenging to identify the most appropriate pathways organisations could follow when taking their next steps in digital. At best, each framework simply offers a starting point for what organisations should be thinking about. At worst, they risk setting unfair standards and sending charities down an unhelpful path.

This blog shares the resources we’ve created to navigate this field, and makes recommendations on what is useful and where we have identified gaps. We hope it provides a foundation for a better cross-sector understanding of what digital maturity means.

Figure 1: An overview of the numbers of digital maturity guides reviewed in this research

Choosing a digital maturity assessment tool

When we embarked on this project we knew we’d find an array of digital maturity frameworks, but we didn’t realise quite how confusing these would be to choose between, compare and use. We concluded that there is no one-size-fits-all authoritative framework for the whole voluntary and charitable sector. Instead, we reviewed a shortlist of tools and developed the following resources to help navigate these:

  • A guide to digital maturity frameworks (‘the Guide’): to help charities to choose an appropriate digital maturity framework and diagnostic tool for their organisation. It recommends and describes 8 key frameworks, and signposts to additional resources appropriate for specific topics or types of organisations.
  • A public map of digital maturity frameworks: an overview of the tools on offer, listing the 50 unique frameworks, their host organisation, date, target audience, description, use case, evidence base and focus areas.
  • A slide deck: detailing all of our findings, with further reflections on the pros and cons of digital maturity frameworks and our recommendations.
  • A poster: summarising our research.

We hope that, with partners signposting to these resources, this work rapidly increases charities’ awareness of and access to existing tools, and that this, in turn, accelerates good digital practice. Longer term, establishing a common definition of digital maturity and best practice is essential if we are to see a marked improvement in the quality of these tools.

The 19 areas of digital maturity

Across the 50 digital maturity frameworks, we discovered 19 common focus areas used to break down and describe digital maturity. Many frameworks lacked detailed milestones and indicators of progress. However, we did find consistency in how the 19 areas were described as shown in figure 2 (also outlined in the Guide).

Figure 2: 19 Areas of digital maturity

Coverage of these 19 areas is incredibly patchy, both across the 50 frameworks and for the 8 that we recommend in the Guide, as shown in figure 3 below. We also found that:

  • The focus areas have changed over time. Newer tools place far greater emphasis on risks, culture and leadership, while earlier tools focus more on IT infrastructure.
  • New focus areas are emerging. In the past one to two years, frameworks developed by organisations promoting thought leadership and best practice in the social sector have started to include service design, accessibility, diversity and responsible tech. These topics are likely to recur and become more prominent in future diagnostic tools.
  • Niche focus areas may well be important. Although likely to be very relevant to the social sector’s use of digital, innovation, openness, responsible tech and social impact are omitted from the majority of tools. This could be due to a lack of knowledge of how to describe, define and measure these.
  • Focus areas are directed by tool developers. Most tools are developed on the basis of internal learning and priorities, reviews of other tools, and user testing.

Furthermore, a typical framework addresses only 7 of these areas of digital maturity. This divergent coverage adds another layer of complexity for organisations looking for guidance on what to prioritise. Those choosing and using a framework may not know what they don’t know, and areas important to them might be missing from the tool.

We are not suggesting that these frameworks should all cover 19 different areas: indeed, we believe this could make them less helpful. However, we do feel more attention is needed to what is prioritised across different tools and to where else organisations could go to learn about other areas. Many of these frameworks stand alone and need to be better connected to support, resources and guidance that help organisations decide what to do next.

Figure 3: Focus areas of the selection of digital maturity frameworks recommended in the Guide

Finally, it is important to note that diagnostic tools typically rely on self-assessment and scoring, on the basis of which they automatically suggest where organisations should invest their time, energy and resources. Benchmarking scores (for example, comparing scores with similar-sized charities) can help organisations set appropriate aspirations. However, scoring can underplay or overplay the significance of key areas, as well as the progress that needs to happen. Very few of the tools provide effective signposting to services or resources other than their own. This competitive support landscape results in fragmented provision: fragmentation that is ultimately unhelpful to the organisations these tools set out to serve.

Conclusion

It is hard to see how these tools can work more effectively without strategic investment for tool developers and a coordinated approach to developing evidence-based milestones and indicators of progress. We know that there are many more diagnostic tools actively in use that are not in our list, particularly those used by consultants, as well as new tools already in development.

We believe there is a need to increase the quality of digital maturity frameworks and the support surrounding them. To do this, we need to:

  • Understand the journeys different types of organisations (primarily larger and smaller charities) follow in developing their digital capacity
  • Identify key indicators of progress for different focus areas
  • Learn from organisations who have invested in their digital maturity about the most appropriate pathways to developing digital capacity
  • Support tool developers to share what works, including insights from the use of tools
  • Research how organisations come to a digital maturity tool and the extent to which scoring practices are a hindrance or a help
  • Coordinate a set of best practice resources, guides and referrals to relevant support for each of the 19 focus areas

We are currently exploring options to address some of these needs and would welcome your feedback, ideas and support for continuing this project. This may include building the evidence base on the contexts in which digital maturity guidance and support is useful, where it is not, and what this looks like in practice for charities and those providing diagnostic tools. Please do get in touch if you would like to take part in further research, or share your thoughts on Twitter.


Think Social Tech: Research, design and learning consultancy supporting tech for good initiatives (previously @comicrelief tech for good)