Measuring the value of data management

A framework of now, next and near

Shri Salem
ZS Associates
13 min read · Feb 27, 2024

Written by: Shri Salem and Willem Koenders

In the realm of data management, we hear a common refrain from our clients: understanding the importance of foundational capabilities is one thing, but demonstrating tangible progress to secure the necessary investments for enhancement is another. It’s a narrative of needing to prove value to pave the way for further development. Data management and enablement, then, is not merely about orchestrating people, processes, and technology; it’s about crafting a compelling story of growth and potential through measurable milestones. This journey, far from being immediate, requires patience and precision. Without the ability to measure, we navigate blindly, risking wasted effort and investment.

This article introduces a strategic Now-Next-Near framework. It serves not just as a roadmap, but also as a crucial tool for securing the success of your data management journey by demonstrating clear, measurable progress at every step.

The Now-Next-Near Framework

When we talk about data governance activation, we’re referring to the process of implementing a data governance strategy — the activation of policies, the establishment of data stewardship, and the deployment of technology to manage, protect, and exploit data. Foundational capabilities are the bedrock upon which effective data governance is built. The exact minimally required foundations vary across sectors, regions, and unique organizational contexts, but they typically encompass things like defined data ownership and domains, a process around ensuring the fitness-for-purpose of critical data, and education of the workforce on data literacy. These foundational elements establish the operational readiness of an organization to handle data responsibly and strategically.

Evolving from operational readiness to strategic data governance and, ultimately, to business impact is a journey (often an iterative one). Each step builds on the previous one, and each set of metrics informs the next. A minimum set of foundational capabilities must be established before an organization can hope to see a sustainable impact from its data governance efforts. But an important pitfall looms, namely investing in foundational capabilities that will not have a positive ROI within a ~1 year time horizon. It is extremely common for data teams to spend upwards of a year on the implementation of a data catalog or the measurement of data quality, without being able to quantify the positive impact created for the business.

So, how do we ensure that the right foundational capabilities are in place while still achieving business impact in the near term? This is where our Now-Next-Near framework comes in:

While it’s not possible to measure the direct business impact of foundational data management capabilities immediately, you can (and in our view, should) measure the effectiveness of the activities that lead to it. Metrics serve as signposts on the road to data management maturity, ensuring that you keep making the right progress, at an acceptable speed, toward the right destination.

More precisely, first you can measure the extent to which operational data management is in place. Next, you can measure how your strategic abilities are evolving, for example how quickly critical data can be accessed and how new data-driven insights can be created. Finally, you can measure the business impact. Data governance does not directly enable business impact; rather, it enables you to create and grow the strategic abilities that drive value.

In selecting these metrics, we aimed for them to be SMART: Specific, Measurable, Achievable, Relevant, and Time-bound. This principle ensures that our goals are clear and reachable within a specified time frame. The criteria of being Specific, Measurable, and Relevant are particularly pivotal. Specificity ensures that each metric targets a particular aspect of data management. Measurability allows us to quantitatively assess progress, and Relevance ensures that the metrics align with the broader business objectives.
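
To make this tangible, here is a minimal sketch in Python of how a SMART metric definition could be captured, so that each metric carries its scope, target, and deadline alongside its current value. The field names and the example metric are our own illustrative assumptions, not a prescribed standard:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Metric:
    """A SMART metric definition: specific in name and scope,
    measurable via a numeric value, and time-bound via a target date."""
    name: str             # Specific: what exactly is measured
    scope: str            # Specific: the domain or process it applies to
    current_value: float  # Measurable: quantitative, tracked over time
    target_value: float   # Achievable: agreed with the accountable owner
    business_goal: str    # Relevant: the objective the metric supports
    target_date: date     # Time-bound: when the target should be met

    def on_track(self) -> bool:
        # Simple "more is better" check; invert for cost-type metrics.
        return self.current_value >= self.target_value

# Example: an operational 'Now' metric expressed as a SMART definition
activated_domains = Metric(
    name="Activated data domains",
    scope="Enterprise data governance program",
    current_value=4,
    target_value=6,
    business_goal="Manage critical data as an asset",
    target_date=date(2024, 12, 31),
)
print(activated_domains.on_track())  # False: 4 of 6 domains activated
```

Representing metrics this way makes the Specific, Measurable, and Time-bound criteria explicit, and it gives every metric an unambiguous owner-facing definition.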

Let’s now take a look at these three stages.

Operational (Now)

The ‘Now’ phase is all about establishing the operational capabilities that are necessary for effective data governance. In this phase, the organization focuses on getting the basics right — setting up the infrastructure, defining roles and responsibilities, and ensuring that data management processes are in place. This phase sets the tone for the data governance program and establishes the footing for future, more strategic efforts.

Driving initial foundational maturity also includes building specific capabilities. Which exact ones depends on your situation and objectives, but we can review a few examples, as well as risks associated with rushing their implementation:

  • Metadata management: Metadata is often referred to as “data about data.” It’s crucial for understanding critical data and information assets. Without standardized metadata, an organization will face challenges in understanding data quality, lineage, and usage (a minimal sketch of such a record follows this list). Without a consistent metadata management process, you risk inconsistencies in how data is interpreted and used, resulting in poor decision-making. Moreover, without at least a minimal metadata strategy, there’s a risk of creating redundant processes for metadata creation and maintenance, leading to inefficiency and wasted resources.
  • Interoperability standards: Interoperability refers to the ability of different systems and organizations to work together. Without a few agreed-upon standards, you will face challenges with data exchange and integration, creating systems that cannot communicate effectively with one another. This, in turn, leads to siloed information and the inability to leverage data across the enterprise. This not only hampers operational efficiency but also increases the long-term costs associated with integrating disparate systems and data.
  • Operating model and framework: An operating model and framework for data governance provide a blueprint for how data is to be managed across the organization. Driving data governance without a minimum operating model or framework all but guarantees that you develop inconsistent processes and practices. This results in a lack of clarity around data ownership, governance roles, and responsibilities, and ultimately in a governance structure that is ineffective and unable to adapt to changing organizational needs. Additionally, you are likely to end up “reinventing the wheel” for each new data initiative, leading to inefficiency and a piecemeal approach that fails to leverage economies of scale and enterprise-wide best practices.
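
As a concrete illustration of the metadata point above, here is a minimal sketch of what a standardized metadata record might contain. The structure and field names are illustrative assumptions rather than a reference model:

```python
from dataclasses import dataclass, field

@dataclass
class MetadataRecord:
    """Illustrative 'data about data' for a single data asset.
    A consistent structure like this is what prevents each team from
    inventing its own, incompatible metadata process."""
    asset_name: str           # e.g. "customer_master"
    business_definition: str  # what the asset means in business terms
    owner: str                # accountable data owner
    steward: str              # day-to-day data steward
    source_system: str        # where the data originates (lineage)
    refresh_frequency: str    # e.g. "daily", "monthly"
    quality_controls: list = field(default_factory=list)  # attached DQ checks

record = MetadataRecord(
    asset_name="customer_master",
    business_definition="Golden record of active customers",
    owner="VP Sales Operations",
    steward="Customer Data Steward",
    source_system="CRM",
    refresh_frequency="daily",
    quality_controls=["email_format_check", "duplicate_check"],
)
```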

Indeed, various foundational capabilities are integral to a stable and durable data governance activation. But the fact that they are foundational does not mean that you should not measure them, nor that you can’t achieve impact in the short term. In fact, if you do it correctly, the opposite is true — with the right metrics tracking the operational effectiveness of data governance, you’re more than halfway toward achieving impact.

The metrics in this phase are designed to measure the readiness and foundational aspects of data governance, and provide immediate feedback on the operational effectiveness of data governance initiatives. There are hundreds of metrics that you could define, but we recommend focusing on a small number of targeted metrics. Here are 5 of our favorite metrics for this phase, with a short computational sketch after the list:

  • Number of activated data domains: This metric tracks the number of data domains that have been identified and are actively managed within the governance framework. An activated domain is one where the data is (in the process of being) defined and cataloged, and has assigned owners and stewards. This metric evidences the extent to which critical data is managed as an asset, a precondition for enabling business success. In many organizations, a critical domain is Customer Data. If you can evidence that critical customer data is managed adequately, it follows that you’ll be able to understand and service your customers better and enable all sorts of data-driven use cases. It’s recommended to start with a manageable number of domains to avoid being overwhelmed.
  • Number of certified data assets: Certification of data assets involves validating the quality and reliability of the most vital data sets. This metric measures the number of data assets that meet established quality criteria and are approved for use across the organization, which is critical for building trust in the data. Rushing this process can lead to the use of unreliable data, undermining the data governance efforts and potentially leading to poor business decisions.
  • Number of activated domains cataloged: This is related to both activating data domains and certifying their data assets, and focuses on the extent to which critical data has been cataloged. Keeping a catalog of activated domains ensures that there is an inventory of data assets, their metadata, and their interrelationships. This directly aids in data discovery and management; without it, data assets will remain or become fragmented and difficult to locate.
  • Percentage of workforce with self-service enabled: This metric relates to the extent to which data users can fully serve themselves throughout their data usage journeys, highlighting the empowerment and independence of employees in accessing and utilizing data without direct intervention, for example from IT staff.
  • Number of data assets with data quality (“DQ”) controls: Data quality controls are mechanisms put in place to ensure the accuracy, completeness, and reliability of data. This metric tracks the number of data assets that have these controls in place. Absence of such controls can lead to DQ issues that may have far-reaching negative impacts on the organization. DQ controls do not always have to be DQ measures — the DQ dashboard is not a goal in itself. The goal is to ensure the fitness-for-purpose of the data. If you can embed controls in data capture, movement, and storage processes that guarantee that, for example, data complies with a given format, then measuring it after the fact may not be necessary.
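
Because these metrics are specific and measurable, they should be computable directly from your governance tooling. Here is a minimal sketch, assuming a hypothetical catalog export with the column names shown (your catalog’s schema will differ):

```python
import pandas as pd

# Hypothetical export from a data catalog; column names are assumptions.
assets = pd.DataFrame([
    {"asset": "customer_master", "domain": "Customer", "domain_activated": True,
     "certified": True, "cataloged": True, "dq_controls": 3},
    {"asset": "order_history", "domain": "Sales", "domain_activated": True,
     "certified": False, "cataloged": True, "dq_controls": 1},
    {"asset": "web_clickstream", "domain": "Marketing", "domain_activated": False,
     "certified": False, "cataloged": False, "dq_controls": 0},
])
workforce = pd.DataFrame({"employee_id": range(1, 101),
                          "self_service_enabled": [True] * 35 + [False] * 65})

now_metrics = {
    # Number of activated data domains
    "activated_domains": assets.loc[assets.domain_activated, "domain"].nunique(),
    # Number of certified data assets
    "certified_assets": int(assets.certified.sum()),
    # Number of activated domains whose assets are cataloged
    "activated_domains_cataloged": assets.loc[
        assets.domain_activated & assets.cataloged, "domain"].nunique(),
    # Percentage of workforce with self-service enabled
    "pct_self_service": 100 * workforce.self_service_enabled.mean(),
    # Number of data assets with at least one DQ control
    "assets_with_dq_controls": int((assets.dq_controls > 0).sum()),
}
print(now_metrics)
```

The point is not the code itself but that each ‘Now’ metric reduces to a simple, repeatable query, which makes weekly or monthly tracking cheap once the inventory exists.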

Such operational metrics serve as indicators of how well an organization is laying the groundwork for its data governance. They are not merely numbers but are reflective of the quality of the initial steps taken and serve to demonstrate early wins and areas for improvement, all of which are crucial for building momentum.

Strategic (Next)

After establishing a minimal operational foundation in the ‘Now’ phase, you progress to the ‘Next’ phase, where strategic implementation takes center stage. This phase is about converting the operational capabilities into actions that drive the organization forward towards its business objectives, leveraging data for competitive advantage.

Metrics in this phase aim to measure the organization’s ability to execute on its data strategy. They indicate how well the organization is utilizing its data assets to support strategic objectives and to foster a culture of data-driven decision-making. They are the connectors between the operational groundwork laid in the ‘Now’ phase and the anticipated business outcomes in the ‘Near’ phase.

Strategic metrics also serve to motivate and guide the organization’s data governance initiatives by providing clear, actionable insights into how well the strategy is being executed. They help to identify areas where the data governance framework needs to be refined or where additional resources may be required to meet strategic goals.

The exact metrics you define and prioritize will again depend on your unique context, but here are 5 of our favorites:

  • Time-to-action: This metric measures the speed at which data can be accessed and used to make decisions or take action. It is an indicator of how effectively the data governance framework facilitates quick and informed decision-making. Slow time-to-action leads to missed opportunities and a diminished ability to respond to market changes.
  • Data-driven innovation: This metric assesses the extent to which data is used to drive new products, services, or business models. The exact calculation will be unique to your context, but candidates include the time from insight to innovation, the number of patents filed based on data insights, and the number or speed of data-inspired product launches.
  • Data-driven decisioning / adoption: This metric evaluates the degree to which data influences strategic decisions and is adopted across the organization. It reflects the cultural shift towards valuing data as a key asset. This metric could be derived from the frequency at which specific reports or dashboards are accessed, but a simpler version might just ask critical business leaders to what extent they are able to make decisions based on data.
  • Data accessibility: This metric gauges how accessible critical data is within the organization. There are several variations of this metric, some measuring the turnaround time of data access approval and provisioning, and others focusing on the share of critical data assets that are accessible through preferred consumption mechanisms (e.g., Tableau or a specific API); a sketch of both variants follows this list.
  • Stakeholder data satisfaction: This metric measures how well the data meets the needs of its users. It is an extremely simple yet all-encompassing metric. If your key data users are satisfied, you know that various underlying foundations are indeed in place. An additional benefit is that measuring satisfaction actively engages users: it makes them feel heard, raises awareness of the data management program, and enables you to stay aware of emerging needs and pain points.
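
Some of these strategic metrics can likewise be computed from systems you already run. Here is a minimal sketch of the two data accessibility variants, assuming a hypothetical access-request log and asset inventory with the column names shown:

```python
import pandas as pd

# Hypothetical data-access request log; column names are assumptions.
requests = pd.DataFrame({
    "requested_at": pd.to_datetime(["2024-01-02", "2024-01-05", "2024-01-10"]),
    "provisioned_at": pd.to_datetime(["2024-01-04", "2024-01-12", "2024-01-11"]),
})

# Variant 1: turnaround time from access request to provisioned access
turnaround = requests["provisioned_at"] - requests["requested_at"]
print("median turnaround:", turnaround.median())  # 2 days

# Variant 2: share of critical assets reachable via preferred mechanisms
critical_assets = pd.DataFrame({
    "asset": ["customer_master", "order_history", "web_clickstream"],
    "preferred_access": [True, True, False],  # e.g. exposed via BI tool or API
})
share = critical_assets["preferred_access"].mean()
print(f"accessible share: {share:.0%}")  # 67%
```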

Metrics such as these establish a feedback loop, informing you of the success of your data governance initiatives and enabling you to make fact-based adjustments. With that, the strategic phase represents a transition point where the operational capabilities are being leveraged for strategic advantage, using data to innovate and make informed decisions.

Business Impact (Near)

In the ‘Near’ phase, the focus shifts to realizing and measuring direct business impact. The metrics in this phase are outcome-oriented and are designed to measure the tangible benefits that data management brings to the organization.

These business impact metrics are the culmination of the data governance journey. They reflect the realization of the program’s goals and provide a direct link between data governance activities and the organization’s broader business objectives. They demonstrate the return on investment for the data program and validate the efforts of everyone involved. In fact, more and more companies are attempting to measure an aggregated metric of ‘data ROI,’ based on various underlying business impact metrics, like the ones we list below.

For profit-driven companies, these metrics must, by definition, in one way or another map back to a finite set of benefit types: increased revenue, decreased costs, enhanced customer or client experience, and/or mitigated risk. For non-profit organizations, metrics would be related to the articulated mission statement and related objectives.

Let’s again review 5 of our favorite metrics:

  • Revenue impact from data: This metric assesses the contribution to revenue generation. It could be through data-driven product enhancements, targeted marketing campaigns, or new revenue streams enabled by data insights. Without an explicitly articulated connection to revenue, data governance can be seen as a cost center rather than a value driver.
  • Operational efficiency gain: This metric measures the improvements in efficiency that result from better data, such as streamlining processes, reducing redundancies, and automating data-related tasks. In one example, a globally operating technology company we worked with estimated that, based on a specific data asset, it could achieve over $20 million in cost savings by streamlining global shipping routes, automating supply chain planning processes, and preventing product returns.
  • Market responsiveness: This refers to the ability to respond to market changes, which is a significant competitive advantage. This metric evaluates how the right, fresh data enables the organization to adapt to market conditions. The exact calculation will be unique for each organization, but candidates include the time to launch new products, the (percentage of) revenue coming from data-driven products and services, customer feedback response times, or the adaptation rate to regulatory changes.
  • Customer experience score: Better data can in many cases drive an improved customer experience. This metric measures (the change in) customer satisfaction, for example as a result of data-driven insights and personalization. A common example is the Net Promoter Score (“NPS”). In one example, a financial services company was able to measure very precisely the decrease in the number of complaints caused by incorrect contact information, which had previously led to paper letters and physical credit cards being sent to the wrong addresses.
  • Data preparation hours saved: By improving data quality and accessibility, data governance can significantly reduce the time spent searching for and correcting data. This metric quantifies the hours saved as a result of these improvements. It is one of the quickest business impact metrics to realize, and the impact is substantial: many organizations have analytics teams that report spending over 50% of their time finding and cleaning data before they can use it effectively for analytics or AI. A simple roll-up sketch after this list shows how such savings can feed an aggregated data ROI figure.
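
To connect these metrics back to the aggregated ‘data ROI’ idea mentioned earlier, here is a minimal sketch of such a roll-up. All numbers, rates, and benefit mappings are illustrative assumptions, not benchmarks:

```python
# Illustrative roll-up of business impact metrics into a single 'data ROI'
# figure; the values and benefit-type mappings are assumptions.
impacts = [
    # (metric, benefit type, estimated annual value in $)
    ("Revenue from data-driven products",    "increased revenue",   1_200_000),
    ("Shipping route optimization savings",  "decreased costs",       800_000),
    ("Data preparation hours saved",         "decreased costs",
     5_000 * 85),  # 5,000 analyst hours at an assumed $85 loaded hourly rate
    ("Fewer complaints from bad addresses",  "customer experience",   150_000),
]
program_cost = 900_000  # assumed annual cost of the data management program

total_benefit = sum(value for _, _, value in impacts)
data_roi = (total_benefit - program_cost) / program_cost
print(f"total benefit: ${total_benefit:,}")  # $2,575,000
print(f"data ROI: {data_roi:.1%}")           # 186.1%
```

A roll-up like this forces every impact claim to map to one of the four benefit types, which keeps the aggregated figure honest and auditable.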

Such metrics help to solidify the position of data governance within the organization, showing that it is not just about compliance or risk mitigation; it’s about driving business growth, enhancing customer experiences, and improving operational efficiencies.

By explicitly identifying, measuring, and publishing business impact metrics, you can evidence the value of the data program and secure ongoing support and funding. It also provides an opportunity to celebrate the successes achieved, or conversely, return to the drawing board if no impact can be discerned.

In closing

The Now-Next-Near framework we presented above enables organizations to systematically approach their data enablement journey, ensuring that each phase builds upon the last and that the metrics used to measure success are appropriate for the maturity level of the organization. It provides clarity and direction, and ensures that efforts are aligned with the organization’s broader strategic goals.

The value of this phased understanding cannot be overstated. More often than not, we work with leaders who tend to either demand business impact too soon, when the lack of foundational capabilities will cause harm and frustration, or too late, when investments in data management are made without explicitly planning for near-term business impact. The key is to get the balance right.

Finally, organizations must also invest in robust processes to effectively measure and track these metrics — defining them alone is not enough. This investment ensures that the metrics are not just theoretical constructs but practical tools that drive progress and accountability.

With a framework like ours, you can chart a path toward data governance excellence — one that is measured, impactful, and aligned with the ultimate goal of driving business success.

Read more insights from ZS.
