Revolutionizing data maturity assessments

Finally making it actionable

Willem Koenders
ZS Associates
Sep 18, 2023


Usually when I write articles for Medium or other platforms, they aren't sales-y; that is, they don't advertise a specific product or service. This one time I'm making somewhat of an exception, because I want to tell you about the Data Maturity Compass™ ("DMC") that we developed at ZS over the past year. It is a solution that revolutionizes data maturity assessments, a topic I have been deeply passionate about for over a decade and one in which I have executed many dozens of engagements. Supported by an extraordinary technical and leadership team, I personally led its design and development, and in this long-read I want to take the time to explain and demonstrate how we did it.

I’ll first briefly review the fundamentals of data maturity assessments in general. I will then explain how the solution works and introduce the guiding framework that structures the analysis. We’ll then unpack one-by-one the unique characteristics that drive its revolutionary character. We’ll discuss how it enables organizations to take charge of their own destiny as opposed to locking themselves in with external parties, and close with a brief review of a real-life case study.

What is a data maturity assessment again?

Data maturity models or frameworks are used to measure the degree to which organizations have developed data capabilities. They cover a number of dimensions that may include Strategy, Leadership, Culture, People, Governance, Architecture, Processes, and Technology. For each dimension, maturity is typically measured across 4 to 6 increasing levels. The lowest level is usually labeled something like "Lagging", "Initial", or "Ad hoc", and the highest level is called "Optimized", "Leading", or "Transformative." Figure 1 below presents an overview of the world's leading frameworks and models, and the dimensions and levels they use.

Figure 1 — Overview of existing data maturity assessment frameworks.

A data maturity assessment uses such a model or framework to analyze the current state practices and capabilities. It is a very common tool in the toolbox of data management experts and consultants. It provides a starting point for both advisors and the organization that is being analyzed to learn about current pain points and gaps, which then informs the creation of a roadmap towards an enhanced, future-state organization.

But common as they may be, they are also known to be tedious, expensive, and without a clear path-to-action. External advisors can spend weeks — if not months — reviewing existing documentation and interviewing dozens of stakeholders. All the while, the client organization has to be patient, waiting for the first results to come in. This is actually an added benefit for external advisors, as it provides them with a foot in the door, gets them exposure to a set of strategic leaders, while learning more about the target organization and… while getting paid for it!

On the rare occasion that external advisors are able to accelerate the process, it is because an individual has an extraordinary grasp of, and experience with, executing data maturity assessments. These folks have done this type of assessment several times before, which enables them to rapidly build a mental map of the organization, develop hypotheses, and create recommendations. But here, it's all in the head of the expert; it's not externalized. You have to pay the (usually highly inflated) rates of this individual to get access to the insights, and once you stop paying, you stop having access. So, how can we avoid this? How can we externalize this process?

How does it work?

The Data Maturity Compass™ solution consists of 3 functional building blocks, as illustrated in Figure 2 below.

Figure 2 — Solution building blocks.

The Intake module distributes structured surveys to selected recipients in the organization. These may be business unit leaders, process owners, data engineers, data quality analysts — anyone who may have insight into current-state data management practices. Depending on their role, these recipients receive specific questions, and the answers are gathered and stored for further analysis.

The Analysis module then takes over, calculating the current-state maturity in real time and auto-generating a proposed roadmap. In the Insights module, the results are made available in an interactive fashion, allowing for slicing-and-dicing across recipients, regions, capabilities, dimensions, and time.

Across the solution, a management console allows you to take charge of the end-to-end process. Strategic priorities can be calibrated, survey recipients can be added, completion of surveys can be tracked, and access can be provided to the results.

Across the solution modules, best practices, benchmarks, and AI are infused to make it all work. For example, the internationally renowned DAMA DMBOK2 framework provides the core structure of the data capabilities, and an international, continually updated database of benchmarks enables you to interpret your results against a set of relevant peer organizations.

The operating framework

Similar to some of the frameworks presented in Figure 1, and inspired by the DAMA DMBOK2 framework, we defined a set of 12 core capabilities that sit at the heart of the analysis (see Figure 3). Together, they form a largely mutually-exclusive, collectively-exhaustive ("MECE") framework to assess data management practices.

Figure 3 — The 12 capability areas that form the core of the Data Maturity Compass.

Now, defining the capabilities is one thing, but it does not yet enable you to assess organizations. For that, we need a set of dimensions that help us understand what good looks like for each of these capability areas. As part of our framework, we defined the following 5 maturity dimensions:

Figure 4 — The 5 maturity dimensions along which current practices can be assessed.
  • Strategy — The strategy and alignment with business goals and objectives, with specific relevance to the data capability in question.
  • People/Talent — The articulation of specific roles and their responsibilities, as well as required expertise, skills, and training.
  • Processes — The required processes, workflows, methodologies, and standards, as well as the establishment of quality control and review processes.
  • Technology — The tools required to support the required data capability and their integration with other systems and applications.
  • Adoption — The adoption and usage of the data capability within and across the organization and the tracking of metrics to measure the effectiveness and impact.

A critical insight here is that a minimum level of maturity is needed within each dimension. If maturity lags in just one dimension — no matter which one — it means the effectiveness of the entire data capability is compromised.
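This "weakest link" principle can be sketched in a few lines of code. The snippet below is an illustrative model, not the actual DMC™ scoring logic: assuming each dimension is scored on the 4 ascending maturity levels, a capability's effective maturity is capped by its lowest-scoring dimension.

```python
# Illustrative sketch (not the real DMC(TM) engine): a capability's
# effective maturity is the minimum across its 5 dimensions, so one
# lagging dimension compromises the whole capability.

DIMENSIONS = ["Strategy", "People/Talent", "Processes", "Technology", "Adoption"]

def effective_maturity(scores: dict[str, int]) -> int:
    """Effective capability maturity, capped by the weakest dimension."""
    return min(scores[d] for d in DIMENSIONS)

scores = {"Strategy": 4, "People/Talent": 3, "Processes": 3,
          "Technology": 4, "Adoption": 1}
print(effective_maturity(scores))  # -> 1: low Adoption caps the capability
```

Even with strong Strategy and Technology scores, the lagging Adoption score pulls the capability down to the lowest level.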

What is revolutionary about the Data Maturity Compass™?

There are a few components of the DMC™ that other solutions also have. This includes a survey process to gather inputs and visualizations to display results. Many offerings in the marketplace already provide benchmarks to juxtapose your results alongside a set of peers.

So far so good, but then what makes the DMC™ stand out? There are a few things that, to the best of my knowledge, no other offering has:

Figure 5 — DMC™ differentiators.

Let’s step through each of these.

Incorporation of strategic profiles

Every existing data maturity framework that I am aware of measures maturity relative to a somewhat arbitrary target state. That is, what good looks like does not depend on you, on the unique context and objectives of your organization. Whether you are a small retailer focused on selling low-cost consumer goods or a heavily regulated global bank, what good looks like doesn't change. As a result, the maturity assessment invariably produces general, broad recommendations that do not allow you to chart a course based on your unique situation.

We felt that this misses an important point, namely that data management does not generate any value in and of itself. It only drives impact relative to the overall strategy and objectives of the organization. To put strategy at the core of the assessment mechanism, we surveyed the marketplace to identify how competitive advantage correlates with specific data capabilities. We identified a set of specific strategic archetypes, across a spectrum of offensive-to-defensive postures. Three sample archetypes are presented in Figure 6 below.

Figure 6 — The 6 strategic archetypes that are used to calibrate target-state ambitions.

Our solution enables you to pick a primary and secondary strategic profile. For example, if you were to pick Innovate & Leadership as a profile, then the maturity assessment will be done relative to that archetype and show completeness towards it. And don’t worry — you can change this choice any time, and the results will be updated instantaneously.
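Conceptually, archetype-relative scoring means completeness is measured against the target levels of the chosen profile rather than against a fixed ceiling. The sketch below illustrates the idea; the capability names and per-archetype target levels are invented for this example, not the real DMC™ calibration data.

```python
# Hypothetical archetype targets: each strategic profile implies a
# different target maturity level (1-4) per capability area.
TARGETS = {
    "Innovate & Leadership": {"Data Strategy": 4, "Metadata Management": 3},
    "Operational Efficiency": {"Data Strategy": 3, "Metadata Management": 2},
}

def completeness(current: dict[str, int], archetype: str) -> dict[str, float]:
    """Completeness per capability, relative to the chosen archetype's targets."""
    targets = TARGETS[archetype]
    return {cap: min(current[cap] / tgt, 1.0) for cap, tgt in targets.items()}

current = {"Data Strategy": 2, "Metadata Management": 3}
print(completeness(current, "Innovate & Leadership"))
# -> {'Data Strategy': 0.5, 'Metadata Management': 1.0}
```

Switching the archetype simply re-evaluates the same current-state scores against a different set of targets, which is why the results can be updated instantaneously.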

Figure 7 — A sample outcome of measured maturity against the 3 sample strategic archetypes.

Predefined roles to streamline the intake process

A common pain point of existing maturity assessments is that the initial information-gathering phase is long and painful. Many stakeholders, typically >15, need to free up hours of time to be interviewed. Time is then spent consolidating all the inputs, and then even more time is needed to validate the results with a set of stakeholders.

A carefully configured survey can streamline the process, asking precise questions about specific data capabilities. But another risk looms here: asking targeted recipients with already busy calendars to spend a lot of time answering a large set of duplicative questions, many of which they don't even know the answer to.

To alleviate this, we defined specific "job profiles." When recipients open the survey, they select the profile that best matches their (current and/or past) role and experience. This ensures they are asked only those questions where they can reasonably be expected to provide an answer.
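The routing logic amounts to tagging each question with the profiles that can answer it and filtering on the respondent's selection. The profile names and question tags below are hypothetical, chosen only to illustrate the mechanism.

```python
# Sketch of role-based question routing (hypothetical tags, not the
# preprogrammed DMC(TM) job-profile mapping).
QUESTIONS = [
    {"id": "Q1", "text": "Is there a documented data strategy?",
     "profiles": {"executive", "data_leader"}},
    {"id": "Q2", "text": "Are data quality rules automated?",
     "profiles": {"data_quality_analyst", "data_engineer"}},
    {"id": "Q3", "text": "Is lineage captured for critical pipelines?",
     "profiles": {"data_engineer"}},
]

def questions_for(selected_profiles: set[str]) -> list[str]:
    """Return only the questions a respondent can reasonably answer."""
    return [q["id"] for q in QUESTIONS if q["profiles"] & selected_profiles]

print(questions_for({"data_engineer"}))  # -> ['Q2', 'Q3']
```

A respondent who selects multiple profiles simply gets the union of the matching question sets.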

Figure 8 — A complete set of job profiles has been preprogrammed. Survey recipients can pick the roles that most closely resemble their current job description or past experience, and the questions will be customized accordingly.

Auto-generated roadmap

In existing offerings, advisors manually craft a roadmap based on the gaps that are identified in the maturity assessment. This is an inherently complex task as it requires making tradeoffs and sequencing initiatives, which usually takes several weeks. The biggest innovation that we bring is that we automate this process. A roadmap is proposed in real time, the moment that the first results come in.

Depending on the results, the roadmap presents enhancement initiatives in about 3 to 10 phases (3 if there’s not a lot to do, and 10 when there’s a long road ahead). That is, some things you just can’t do straightaway. For example, quantifying the value of data for data assets across domains, or implementing enterprise-wide ethical AI standards, likely come some time after you have a confirmed basic data strategy, operating model, and high-level policies and standards in place. The roadmap does not randomly sequence initiatives — the initiatives in one phase unlock new ones for the next phase, gamifying the journey ahead. Each time new results come in, the proposed roadmap is updated automatically.
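The phase-unlocking idea described above resembles layering a dependency graph: each phase contains every initiative whose prerequisites were completed in earlier phases. The sketch below uses invented initiative names and dependencies to illustrate the sequencing principle; it is not the actual roadmap engine.

```python
# Sketch of dependency-driven phasing: initiatives become available
# only once all of their prerequisites have been completed.
DEPENDENCIES = {
    "Data strategy": [],
    "Operating model": ["Data strategy"],
    "Policies & standards": ["Data strategy"],
    "Data value quantification": ["Operating model", "Policies & standards"],
    "Enterprise ethical AI standards": ["Policies & standards"],
}

def build_phases(deps: dict[str, list[str]]) -> list[list[str]]:
    """Group initiatives into phases; each phase unlocks the next."""
    phases, done = [], set()
    while len(done) < len(deps):
        ready = sorted(i for i, pre in deps.items()
                       if i not in done and all(p in done for p in pre))
        if not ready:  # safeguard against circular dependencies
            raise ValueError("circular dependency in roadmap")
        phases.append(ready)
        done.update(ready)
    return phases

for n, phase in enumerate(build_phases(DEPENDENCIES), start=1):
    print(f"Phase {n}: {phase}")
```

Here, the data strategy unlocks the operating model and policies, which in turn unlock the more advanced initiatives such as data value quantification, mirroring the "each phase unlocks the next" behavior described above.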

It would be giving away too much to explain exactly how we’ve been able to engineer this, but rest assured that it’s self-learning and aligned with universally observed best practices. See Figure 9 below for a snapshot of a roadmap that was created for a sample organization.

Figure 9 — A snapshot of an auto-generated roadmap, showing results for only 3 out of 14 capability areas.

Unrivalled level of granularity and comprehensiveness

The level of detail behind the 14 capability areas is immense. Each capability area has between 3 and 7 so-called "sub-capabilities." For example, as Figure 10 shows below, the capability area Metadata Management is broken down into the sub-capabilities of Metadata Democratization and Usage, Systems and Applications Metadata, and Data Lineage. Each of these, in turn, can be analyzed through the 5 dimensions introduced earlier of Strategy, People, Process, Technology, and Adoption. And for each of those, 4 ascending levels of maturity are defined.

The breadth of the solution is differentiating too. There are frameworks that assess data governance, data storage, data privacy & security, and AI & data science, but as Figure 1 showed, there is none out there that brings these all together in a single framework and solution.

Figure 10 — Snapshot of the DMC™ that shows how capability areas are broken down into sub-capabilities, that in turn can be analyzed across 5 dimensions and 4 ascending levels of maturity.

Automated end-to-end process

The DMC™ is built on a modular, cloud-native infrastructure, allowing us to build and maintain the individual components without compromising the overall solution. This also enabled us to fully automate the process across the distribution of surveys, gathering of results, sending of reminders, consolidation and storage of results, calculation of maturity results and metrics, generation of the roadmap, and visualization of all results and outcomes.

It is this automation that allows us to offer the DMC™ at a meaningful discount compared to existing offerings in the marketplace, even though it effectively delivers more value and deeper insights.

Embedded AI

AI is embedded in several ways to drive enhanced precision, effectiveness, and actionability:

  • If a respondent selects an answer that seems at odds with their own earlier answers, or with answers provided by other respondents before them, this is detected and a prompt appears, asking the respondent to double-check the answer.
  • For some open questions, respondents can click a button to record a spoken answer. The audio is converted into text, which is then analyzed and summarized using large language model (LLM) techniques. Multiple-choice questions form the basis of the quantitative maturity assessment, but it's the open questions that provide the rich context of real-life experiences against which the results can be understood.
  • Various AI functionalities are used to analyze and summarize the overall results, and to provide a summary that can be copied into emails, PowerPoint slides, or any other reporting mechanism.
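The first bullet, the answer-consistency check, can be illustrated with a simple outlier test: flag a new answer that deviates strongly from what other respondents said for the same question. This is a deliberately simplified stand-in; the actual detection in the DMC™ is not disclosed and is likely more sophisticated.

```python
# Simplified sketch of an answer-consistency check: prompt the respondent
# to double-check if their answer deviates too far from the mean of
# prior answers to the same question. Threshold is an assumption.
from statistics import mean

def flag_outlier(new_answer: int, prior_answers: list[int],
                 tolerance: float = 1.5) -> bool:
    """True if the new answer should trigger a double-check prompt."""
    if not prior_answers:
        return False  # nothing to compare against yet
    return abs(new_answer - mean(prior_answers)) > tolerance

print(flag_outlier(4, [1, 1, 2]))  # -> True: prompt to re-check
print(flag_outlier(2, [1, 1, 2]))  # -> False: consistent with peers
```

The same comparison could equally be run against the respondent's own earlier answers to catch internal contradictions.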

Accelerated timelines — to action in a matter of days

In traditional approaches, the gathering of information and its subsequent interpretation and consolidation take several weeks. It's 4–6 weeks into the effort before the first discussions are held around a possible roadmap. Everything we outlined above enables you to get to that roadmap in a matter of days, not weeks, and at a fraction of the cost.

Of course, this roadmap is not a fully vetted and validated one. You should interpret it as a highly actionable ~70% version. Put this in front of the right stakeholders and you can have a highly meaningful discussion on what initiatives to prioritize first, and why. It is that engagement that will drive the much-needed buy-in for your data management transformation journey at a more general level.

Enabling YOU to take charge

The DMC™ can be used in a one-off fashion, where the assessment is executed once as an input into a revamp of your data strategy and roadmap. But it can also be deployed as a product, where you hold the keys to the solution. That is, you can:

  • Operate the survey management center, where you can add as many new recipients as you like
  • Repeat the analysis over time, so that you can track results and progress over time (something both high-level executives as well as regulators are typically interested in)
  • Manage access to the results, where instead of sending people static PowerPoint slides (although that is still an option too!) you can give them access to the interactive maturity dashboard and roadmap

A case study — a global technology company

This is not just theory. It's based on many years of executing data maturity assessments in practice. More importantly, it's also been tested in real life.

In one example, about a year ago, we started to work with a leading technology company. The survey was kicked off and sent to ~50 recipients, most of whom completed it within a few days. Within 2 weeks of initiation, the preliminary results and roadmap were presented to various C-suite executives of the global enterprise. They indicated that they appreciated the quantitatively expressed understanding of current maturity against a set of benchmarks; they suspected that there were gaps, but they did not know where, or how severe they were. They welcomed the initial roadmap, and responded to it by challenging the data team to come back with specific funding asks against it.

Now, a year later, the enterprise is in the middle of executing on a carefully sequenced data capability enhancement roadmap, which includes uplifts to master data, metadata, and data cataloging capabilities. The DMC™ did not produce the eventually adopted, multiyear roadmap, but it did provide the initial impetus behind it.

Want to know more?

If you are interested and want to learn more, reach out to me or Shri Salem, or see here. We would love to be a partner on your data management enhancement journey.



Willem Koenders
ZS Associates

Global leader in data strategy with ~12 years of experience advising leading organizations on how to leverage data to build and sustain a competitive advantage