The Quality of Data

I was wondering how often we evaluate or measure the quality of our data. An article just published by Harvard Business Review addresses exactly that, and what’s surprising is how poor companies’ data quality is: only 3% of companies’ data meets basic quality standards.
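
As a minimal sketch of what such a measurement could look like in practice (the field names and validation rules below are hypothetical examples, not taken from the article):

```python
# Minimal sketch: count the share of records that pass basic quality checks.
# The fields and rules are illustrative assumptions, not from the HBR article.
import re

def is_clean(record):
    """A record counts as 'clean' only if every basic check passes."""
    checks = [
        bool(record.get("name", "").strip()),              # name present
        re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+",
                     record.get("email", "")) is not None,  # plausible email
        record.get("order_total", -1) >= 0,                 # non-negative amount
    ]
    return all(checks)

records = [
    {"name": "Ada Lovelace", "email": "ada@example.com", "order_total": 120.0},
    {"name": "", "email": "not-an-email", "order_total": -5},
]

clean = sum(is_clean(r) for r in records)
print(f"{clean}/{len(records)} records pass basic quality checks "
      f"({100 * clean / len(records):.0f}%)")
```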

While the article mainly analyzes the manual processing and assembly of data records, another aspect should be taken into consideration. It is not enough to have a good database design or good processes and workflows in place. When trying to solve data quality problems, integration plays a major role: the quality of integration will often determine the quality of the data.

Big Data usually comes from multiple sources, and there is no reason to assume that all of these sources will integrate with each other seamlessly by default. They typically follow different standards, different approaches, and different ways of storing data, and sometimes these do not fit well together. A good integration layer collects data from all of these sources and transforms it in a way that allows analysis across standards, as sketched below.
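
Here is a minimal sketch of such a transformation step, assuming two hypothetical sources that disagree on field names, date formats, and units (none of these schemas come from a real system):

```python
# Minimal sketch: normalize records from two hypothetical sources into one
# common schema so they can be analyzed together. Field names, date formats,
# and units are illustrative assumptions.
from datetime import datetime

def from_source_a(raw):
    # Source A: US-style dates, amounts in dollars, customer field "cust".
    return {
        "customer": raw["cust"].strip().title(),
        "date": datetime.strptime(raw["date"], "%m/%d/%Y").date().isoformat(),
        "amount_usd": float(raw["amount"]),
    }

def from_source_b(raw):
    # Source B: ISO dates, amounts in cents, customer field "customer_name".
    return {
        "customer": raw["customer_name"].strip().title(),
        "date": raw["order_date"],          # already ISO 8601
        "amount_usd": raw["amount_cents"] / 100,
    }

unified = [
    from_source_a({"cust": "ada lovelace", "date": "09/14/2017",
                   "amount": "120.00"}),
    from_source_b({"customer_name": "ALAN TURING", "order_date": "2017-09-14",
                   "amount_cents": 9950}),
]
print(unified)  # both records now share one schema and one set of units
```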

This requires that the integrator understand all the sources well, along with the ways in which they will be used together, in order to provide the best analysis and deliver value to the business.


Originally published at dataandtechnology.wordpress.com on September 14, 2017.