[Spotlight] A plea from the ODC’s IWG: Data standardisation matters

A recap of the ODC’s Implementation Working Group meeting held in September 2021

by Darine Benkalha, ODC’s Implementation Working Group co-chair


It’s 2022, and it is no secret that data is a powerful tool for building resilient, effective, and inclusive public policies and for responding to the world’s biggest social, economic, and environmental challenges. While data practitioners would agree with such a statement, it’s a safe bet they would add that timely, high-quality, interoperable data (see the ODC’s six principles for further reading) is what we need to make real change.

According to the Open Data Institute, standards are documented, reusable agreements that make it easier for people and organisations to publish, access, share, and use better-quality data. More specifically, standards detail the language, concepts, rules, guidance, or results that have been agreed upon by a group of people or a community of practice. In 2015, the ODC, along with governments, civil society, and experts from around the world, worked together to develop the ODC principles, with the goal of representing a globally agreed set of aspirational norms for how to publish data.

While the September 2021 ODC Implementation Working Group (IWG) discussion highlighted some of the common pitfalls that stem from a lack of data standardisation, it also showed that ODC adopters have been developing and implementing quality data standardisation initiatives that have allowed them to overcome some of these challenges.

Why does data standardisation still matter?

According to Silvana Fumega, our guest speaker from the Latin American Initiative for Open Data (ILDA), the quality of data is clearly reflected in the quality of the public policies developed to solve societal issues like feminicide. She was able to draw this conclusion through her contribution to the “Data against Feminicide” initiative, which aims to foster an international community of practice around feminicide data. The project led the team to identify common pitfalls in the data encountered while undertaking the study:

  • Data are not open
  • Data are incomplete
  • There is no connection between the institutions collecting and publishing the data
  • Terminology is inconsistent (e.g., criminal vs. political definitions of femicide)
  • Data are not timely
  • Data gaps (quantity) and a lack of disaggregation reveal the absence of an intersectional perspective in data collection and publication (demographic or gender considerations, for example)

According to our guest speakers at this IWG session (Silvana, the Government of South Korea, and the Government of Catalonia), data standardisation seems to be the solution to most of these challenges, as it forces organisations to think proactively about what data they need and how to collect, store, and use it.

The ILDA experiment: data standardisation to increase the visibility of feminicides

Silvana told the group about ILDA’s data standardisation initiative, which aims at combating feminicides more efficiently in the Latin America and Caribbean region. The ILDA team came up with a methodology to standardise data in the public sector with a gender perspective. While this work is ongoing and iterative, the experiment has so far allowed the researchers to draw key findings: the importance of defining the data collection process with a clear protocol, and of putting in place institutional mechanisms that define how to share the data as well as who will be accountable for managing it. Another interesting conclusion is the importance of having a strong data governance model in place, alongside an equally strong data infrastructure. Silvana concluded that, when done well, data standardisation allows for regional, national, and international comparisons. This means that countries will be able to improve information exchanges (i.e., in an interoperable way, on a more regular basis), thus ensuring a more efficient, timely, and better-adapted response to the scourge of femicide.

The South Korean approach: a strong policy and regulatory framework

Jieun Oh, our second speaker, from the Government of South Korea, shared with the IWG how their government has developed a highly sophisticated framework for data standards over the years. This work rests on four pillars:

First, they have put in place a strong legal and regulatory framework consisting of an open data law (2013), several directives (open government data standards, common terminology for the public sector), as well as guidelines for open data management and for standardising public sector organisation databases.

Second, South Korea has developed a public sector data management framework as well as open government data standards. At the policy level, this was done through the elaboration of an open data quality management plan encompassing mid- to long-term data standards policies. They have also worked on pan-governmental and institutional metadata management systems through which the items constituting metadata (data relationships, common terminology of the public sector, etc.) are collected, linked, and managed at different levels of government. Finally, South Korea built a strong data architecture and infrastructure applied across the whole data life cycle, from generation to release.
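As a purely illustrative sketch (the actual schema of South Korea’s metadata management systems is not described in this article, and all field names below are invented), a metadata item of the kind mentioned above, carrying data relationships and a common-terminology entry, could be modelled like this:

```python
from dataclasses import dataclass, field

# Illustrative sketch only: field names are hypothetical, not the real
# schema of South Korea's pan-governmental metadata management systems.
@dataclass
class MetadataItem:
    dataset_id: str
    title: str
    managing_institution: str  # the body accountable for managing the data
    standard_term: str         # entry from a common public-sector terminology
    related_datasets: list = field(default_factory=list)  # data relationships

    def link(self, other: "MetadataItem") -> None:
        """Record a two-way relationship between datasets so they can be
        collected, linked, and managed across levels of government."""
        self.related_datasets.append(other.dataset_id)
        other.related_datasets.append(self.dataset_id)
```

Linking records this way is what lets a central catalogue answer questions such as “which datasets, held by which institutions, relate to this one?”, which is the kind of cross-institutional view the Korean approach aims for.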

The Government of South Korea has also invested in developing capacity, both to ensure coherent implementation of the standards across institutions and to monitor their compliance with the legal and policy requirements. This has been done through regular data quality management training targeted at the Chief Information Officers (CIOs) and Chief Data Officers (CDOs) of all public sector organisations. They have also partnered with specialised private companies to develop and offer year-round technical support to all public sector organisations.

Finally, South Korea developed a framework to monitor and assess the implementation of the standards. This work targets central and local government and public institutions, and imposes a yearly quality control of the data held by public institutions. These evaluations are conducted by control examiners trained by the government and are based on data management framework indicators as well as data value management indicators. The results inform open government data policy implementation assessments and are used in public sector innovation evaluations to provide incentives to institutions where needed.

The Catalonian way: formalizing standardisation through open government action plans

The Government of Catalonia explained to the IWG that their standardisation journey started when they noticed that an increasing number of public bodies were publishing datasets on their open data portal. They realised it was crucial to develop standards to ensure that institutions employ the same criteria when publishing data on the portal, and that the portal serves its purpose of being a simple and convenient tool for users. An ODC adopter since 2018, Catalonia has advanced this work with two key considerations in mind: the need to align with international standards and with the ODC’s open data principles.

This led to the development of a guide for the standardisation of metadata, intended for public bodies publishing datasets on the open data portal. Instead of opting for a regulatory or policy framework approach, Catalonia formalised its data standardisation work through its 2019–2020 open government action plan. This strategy integrated the work into their broader goal of promoting common criteria and establishing standards that guarantee the quality of openly published data.

This project comprised multiple steps, starting with commissioning a theoretical study to understand which criteria and recommendations should be taken into consideration before creating the metadata catalogue. They then implemented the identified recommendations on their portal and did deeper work on data nomenclature by extending the guide to cover the metadata’s content and columns. The guide includes technical details about the datasets, and users can also find recommendations on how to name the columns of their datasets. Other interesting features include the possibility for users to publish in either English or Catalan (allowing federation with European Union data catalogues), as well as an automatic cleansing process that fixes errors without any direct supervision.
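To give a flavour of what automated cleansing of column names can look like, here is a minimal sketch. The rules below (strip accents, lowercase, collapse separators into underscores) are assumptions for illustration, not the actual rules of the Catalan guide:

```python
import re
import unicodedata

# Hypothetical normalisation rules, in the spirit of the column-naming
# guidance described above; the real Catalan guide's rules may differ.
def normalise_column_name(raw: str) -> str:
    """Normalise one column header: drop accents, lowercase, snake_case."""
    # Decompose accented characters (e.g. "ó" -> "o" + combining mark),
    # then discard the combining marks.
    text = unicodedata.normalize("NFKD", raw)
    text = "".join(c for c in text if not unicodedata.combining(c))
    text = text.strip().lower()
    # Collapse any run of non-alphanumeric characters into one underscore.
    return re.sub(r"[^a-z0-9]+", "_", text).strip("_")

def cleanse_headers(headers: list[str]) -> list[str]:
    """Apply the naming rules to every column header of a dataset."""
    return [normalise_column_name(h) for h in headers]
```

For example, `cleanse_headers(["Codi INE ", "Població Total"])` yields `["codi_ine", "poblacio_total"]`. Unsupervised fixes like these work precisely because the rules are agreed in advance, which is the point of a standard.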

Key Takeaways

The approaches presented by ILDA, the Government of South Korea, and the Government of Catalonia confirmed how important standardisation initiatives are for restoring data’s true value and power. Without better data, it will be difficult to tackle some of modern society’s biggest challenges efficiently.

Thankfully, the experiences shared by our guest speakers have also helped us identify some of the key elements that make for a successful data standardisation initiative:

  1. Strong institutional mechanisms and frameworks (e.g., laws, directives, or action plans) formalising data standardisation give authority to the standards and ensure that public bodies comply with the requirements.
  2. Healthy collaboration between all institutions involved in the data life cycle (so that, for example, terminology is agreed upon) is key to developing and implementing the standards.
  3. Data standardisation can’t be successful without a robust data governance model in place and well-thought-out architectures and infrastructures.


If you would like to be part of our Implementation Working Group, please don’t hesitate to get in touch.

Darine Benkalha is a Policy Analyst, Program Implementation and Intergovernmental Relations, Open Government, Office of the Chief Information Officer for the Government of Canada. She is also the co-chair of ODC’s Implementation Working Group.