A Modern Enterprise Data Management Framework

Cameron Langley
RightData
Apr 4, 2019

The four pillars of Enterprise Data Management:

  • Accountability & Governance
  • Metadata Management
  • Data Provisioning
  • Data Quality Control and Certification

Accountability & Governance Strategy

A strategic approach that defines the accountability and governance model for an organization's data domains, including master data and transactional data — e.g., Customers, Materials, Vendors, Financial transactions, Orders, and Deliveries.

As data travels through an organization's data fabric, it is transformed and enriched, and new data elements and metrics are added. The accountability and governance framework for each data domain should account for this shared ownership and accountability.

Below is a three-step process for incorporating an effective accountability and governance model into any organization's data platforms.

Metadata Management Strategy

The increasing adoption of distributed architectures (for analytics and other business applications) has introduced siloed systems, heightening the need for a holistic approach to metadata management. The functions of metadata management include:

  • Ability to trace the data from consumption layer back to the inception layer (Lineage)
  • Easy way to identify the accountable parties for each of the data domains (Data dictionary)
  • Capability to quickly identify the impact on all the integrated systems and processes for a planned change to an upstream data domain (Impact analysis)
  • Visualize the 360-degree view of the various systems and processes stitched together in the data fabric (Metadata discovery)
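The first three functions above — lineage, the data dictionary, and impact analysis — can all be served by one underlying structure: a directed graph of which data assets feed which. A minimal sketch follows; the system and domain names (CRM, EDW, Mart, etc.) are illustrative placeholders, not references to any real catalog.

```python
from collections import defaultdict

# Hypothetical lineage graph: each edge means "source feeds target".
lineage = {
    "CRM.customers": ["EDW.dim_customer"],
    "EDW.dim_customer": ["Mart.sales_by_customer", "Mart.churn_features"],
    "ERP.orders": ["EDW.fact_orders"],
    "EDW.fact_orders": ["Mart.sales_by_customer"],
}

def downstream_impact(node, graph):
    """Impact analysis: every asset affected by a planned change to `node`."""
    impacted, stack = set(), [node]
    while stack:
        for child in graph.get(stack.pop(), []):
            if child not in impacted:
                impacted.add(child)
                stack.append(child)
    return impacted

def upstream_lineage(node, graph):
    """Lineage: trace a consumption-layer asset back to its inception sources."""
    reverse = defaultdict(list)           # flip every edge, then walk forward
    for src, targets in graph.items():
        for t in targets:
            reverse[t].append(src)
    return downstream_impact(node, reverse)

print(downstream_impact("CRM.customers", lineage))
print(upstream_lineage("Mart.sales_by_customer", lineage))
```

The same reversed-edge trick covers both directions, which is why enterprise catalogs typically store lineage once and derive impact analysis from it.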

Data Provisioning Strategy

Throughout its lifecycle, data passes through several hops: it is entered in transactional systems, flows to downstream business applications, moves into the enterprise data warehouse, data lakes, and data hubs, and continues on to siloed data marts and analytics platforms. Along the way, it is enriched with additional regional and business-process-specific data elements. This redundancy is by design, but a side effect is that users struggle to identify the right version of the data for their business needs.

With what’s been learned from the implementation of the “Accountability & Data Governance” and “Metadata Management” strategies, the data management team should identify the authorized data sources (certified data provisioning points) for each of the key data domains (Global Material, Local Material, Global Customer, Local Customer, Orders, Financial Transactions, etc.) and the methods to orchestrate the pull/push solutions for provisioning the data.
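One lightweight way to realize this is a registry mapping each key data domain to its certified provisioning point and orchestration method. The sketch below assumes a simple in-memory dictionary; the system names and push/pull assignments are hypothetical examples, not prescriptions.

```python
# Hypothetical registry of certified provisioning points per data domain.
AUTHORIZED_SOURCES = {
    "Global Material": {"system": "MDM Hub", "method": "push"},
    "Local Material": {"system": "Regional ERP", "method": "pull"},
    "Global Customer": {"system": "MDM Hub", "method": "push"},
    "Orders": {"system": "Order Management System", "method": "pull"},
    "Financial Transactions": {"system": "General Ledger", "method": "pull"},
}

def provisioning_point(domain):
    """Return the certified source for a domain; fail loudly if none is registered,
    so consumers cannot silently fall back to an uncertified copy."""
    try:
        return AUTHORIZED_SOURCES[domain]
    except KeyError:
        raise LookupError(f"No certified provisioning point registered for {domain!r}")

src = provisioning_point("Orders")
print(f"Provision 'Orders' from {src['system']} via {src['method']}")
```

The deliberate failure on unregistered domains is the point: it forces every new consumer through the governance process rather than letting them source data from whichever copy is nearest.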

Data Quality Control and Certification Strategy

Trust is key to the adoption of any data platform. The main objective of a data quality control and certification solution is to provide insight into data quality and health metrics, boosting user confidence in the provisioned data.

Qualities of an effective data quality control and certification process:

  • Management of data quality requirements
  • Business rules for quality and integrity requirements
  • A plan to implement data controls
  • Monitoring of data controls
  • Reporting of findings
  • Change impact notifications and issue remediation

The two categories of data quality control assessments:

  • Is my data reconciling with the trusted/authorized data source?
  • Is my data passing all of my business rule specifications?
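The two assessment categories above can be sketched as two small checks: a reconciliation against the authorized source and a per-row business-rule validation. This is a minimal illustration only; the row shape, the `amount`/`currency` fields, and the rules themselves are assumed examples.

```python
def reconciles(provisioned_rows, source_rows):
    """Category 1: does my data reconcile with the trusted/authorized source?
    Here: compare row counts and a simple total over a key measure."""
    if len(provisioned_rows) != len(source_rows):
        return False
    return (sum(r["amount"] for r in provisioned_rows)
            == sum(r["amount"] for r in source_rows))

def rule_violations(rows, rules):
    """Category 2: is my data passing all of my business rule specifications?
    Returns (row_index, rule_name) for every failed check."""
    return [
        (i, name)
        for i, row in enumerate(rows)
        for name, rule in rules.items()
        if not rule(row)
    ]

# Illustrative business rules.
rules = {
    "amount_positive": lambda r: r["amount"] > 0,
    "currency_known": lambda r: r["currency"] in {"USD", "EUR", "GBP"},
}

data = [
    {"amount": 120.0, "currency": "USD"},
    {"amount": -5.0, "currency": "JPY"},   # fails both rules
]

print(reconciles(data, data))      # identical sets reconcile
print(rule_violations(data, rules))
```

In practice the reconciliation would compare checksums or full row hashes rather than a single total, but the split is the same: one check anchored to the certified source, one anchored to business specifications.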

Data Quality Control Flow

If you or your organization would like to learn more about Enterprise Data Management and Data Quality Control and Certification, please visit https://www.getrightdata.com/ or email contact@getrightdata.com.
