Making data easily accessible and fully understood

Pieter Delaere
dScribe data
Jan 31, 2022 · 4 min read

When we first started working in earnest on our data knowledge product in January 2021, we had a clear picture of the customer challenge we wanted to help tackle. However, it was not until we started preparing to pitch to investors and officially launch as a company that we summarized it in one phrase:

“to make data easily accessible and fully understood”

So where did that come from? We could talk about the explosion of data volumes available to organizations. We could mention the increasing variety of structured and unstructured data formats. We could point to the many ‘self-service’ tools used by knowledge workers, making it easier than ever to access and analyze data. But those insights very likely would not be new to you.

The data market is booming, and terms like data lakehouse, data literacy and data catalog were ubiquitous in data-related news over the past year. Rather than talk about major market trends, however, I prefer to be more pragmatic when meeting customers.

Market trends can be inspiring, but they become empowering only when you are able to apply them to solve a customer’s specific challenge.

Value is recognized by advising customers on how a solution can help in their specific context. Here are 3 tangible stories that have lit up customers' eyes during past conversations.

An industrial holding, consisting of 5 largely independent divisions, was struggling to consolidate its reporting. For his revenue reports, the group's CEO received over 30 spreadsheets monthly, each with its own format and calculation methods. To tackle this, the group offered its divisions a shared data platform, free to use for data storage and analysis. However, it was only when a dedicated effort was launched to harmonize the definition of revenue and all underlying data elements that significant gains in efficiency and data accuracy were realized. A common language, not shared technology, was the breakthrough solution.

An international food company aimed to roll out harmonized KPI reporting across the top 500 of the organization. Since the company was structured as a matrix (functional departments and geographic regions), great efforts went into gathering and aligning requirements from all stakeholders for each KPI. To translate these requirements into straightforward input for the development team, 2 spreadsheets were created: one containing the KPIs (including a definition, calculation and various labels) and the other containing definitions of the individual data elements.

Although working towards clearly structured requirements got business and IT on the same footing, heavy reliance on the 2 spreadsheets presented a few challenges of its own. Definitions were frequently updated without review by other stakeholders, no history of changes was visible, and at one point the shared files were lost, forcing the company to continue from a 2-month-old backup. Moving to a tool dedicated to documentation and collaboration around data solved these growing pains, and focus could return to the task at hand: establishing clear, harmonized data definitions as a foundation for consolidated KPI reporting.
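The two-spreadsheet setup described above can be sketched as a tiny data model: one table of data elements, one table of KPIs that reference them. The class and field names below are illustrative assumptions, not the company's actual column layout, but they show the kind of cross-reference check that a dedicated tool can automate and a pair of shared files cannot.

```python
from dataclasses import dataclass, field

# Hypothetical model of the two spreadsheets: data elements on one side,
# KPIs (definition, calculation, labels) referencing them on the other.

@dataclass
class DataElement:
    name: str
    definition: str

@dataclass
class KPI:
    name: str
    definition: str
    calculation: str                                    # formula over element names
    elements: list[str] = field(default_factory=list)   # referenced data elements
    labels: list[str] = field(default_factory=list)

def undefined_references(kpis: list[KPI], elements: list[DataElement]) -> set[str]:
    """Return element names referenced by a KPI but never defined --
    the kind of inconsistency that silently creeps into shared files."""
    defined = {e.name for e in elements}
    return {ref for kpi in kpis for ref in kpi.elements if ref not in defined}

elements = [DataElement("net_sales", "Invoiced sales minus returns and rebates")]
kpis = [KPI("Gross Margin %", "Margin as a share of sales",
            "(net_sales - cogs) / net_sales", ["net_sales", "cogs"])]

print(undefined_references(kpis, elements))  # → {'cogs'}
```

A documentation tool effectively runs checks like this continuously, with review and change history on top, which is exactly what the unmanaged spreadsheets lacked.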

At a mid-sized chemical recycling plant, a financial controller was in charge of providing the management and sales teams with weekly reporting on sales, receivables and payables. Having worked at the company for many years, he knew the ins and outs of its business. Via manual data extracts and a number of Excel macros, he tried to automate his weekly efforts. That worked well at the start: the reports were built quickly and no business user objected to working with figures in Excel.

Over the next 4 years, however, additional business requests and growing data volumes caused the number of manual actions needed to finalize the reports to increase. What started as one file became 5 (2 with the actual reports and 3 with intermediate transformations). It certainly didn't help that each file now took around 5 minutes to load and the original financial controller had left the company. Eventually, a data warehouse and visualization tool were implemented, replacing the Excel reports. The trickiest part of this new setup was not rebuilding the Excel logic; it was understanding why certain transformations had been done in the first place.

Documentation is often the last thing planned and the first thing cut when looking for cost optimizations. But people's memory is not limitless, and when people leave, knowledge leaves with them. Not capturing and storing knowledge almost always means higher costs to fill knowledge gaps in the future. In the end, the new reporting implementation took 6 months: less than a month to implement the new tools, but nearly 5 months to analyze the existing logic and validate the new results.

Referring to major trends in the data market is energizing. It allows you to stimulate people to adopt new capabilities and dream up inspiring use cases. But to really make a dent in the data space, let’s not neglect the need to make our ideas specific and tangible enough for people to be not only inspired, but empowered to move forward. For us, that is all about offering a pragmatic solution to “make data easily accessible and fully understood”.


Fascinated by the possibilities of data. On a mission to enable everyone to generate value through data as CEO of dScribe.