Which do you choose — data quality or data quantity? At McDonald’s, we choose both

Global Technology
McDonald’s Technical Blog
Apr 17, 2023

Managing data quality is critical to informed business decision-making, but McDonald’s scale means massive amounts of data. That’s where our data-quality monitoring tool helps us deliver consistent and accurate insights.

by Nicole Sheridan, Data Quality Engineering, and Molly Sullivan, Senior Manager, Data Analytics

At first glance, the consequences of poor data quality may seem subtle, but poor data quality costs organizations an average of $12.9 million* every year and makes it harder to enhance the customer experience and improve operational efficiency.

Consistent delivery of accurate, valid, and complete data is critical for the insights and analytics needed to run McDonald’s ever-growing business.

But how do you ensure you are using quality data? Proactive monitoring.

McDonald’s relies on proactively monitoring and remediating data-quality incidents in real time to address bad data, strengthen our data community’s trust in our data, and drive critical business and operational decisions.

Incorporating data-quality monitoring as a foundational pillar of digital initiatives can decrease the time it takes to scale them across markets and reduce the time it takes to resolve data issues.

But as the company grows, our data grows with it, which means McDonald’s needs a solution built for that scale.

How it can make an impact
Proactive monitoring was an important factor for the pilot and launch of the MyMcDonald’s Rewards program.

During the pilot, we used the Lightup application to measure data quality and audit the entire data flow supporting the program, which helped us identify and remediate issues before the broader rollout. This let us streamline our monitoring and resolution processes, scale the program more efficiently, and achieve 99.99% accurate loyalty-points data once the program was fully scaled.
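
The post doesn’t spell out what those audits looked like, so here is a minimal sketch of the general idea, assuming a simple two-stage flow: reconciling record counts between a raw and a curated layer to catch dropped records before rollout. The table names are hypothetical, and plain sqlite3 stands in for Lightup.

```python
# A minimal sketch of an end-to-end pipeline audit, assuming a two-stage flow.
# Table names (raw_loyalty_events, curated_loyalty_events) are hypothetical;
# this is not McDonald's schema or Lightup's API.
import sqlite3

def stage_count(conn: sqlite3.Connection, table: str) -> int:
    """Count the records currently present in one pipeline stage."""
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

conn = sqlite3.connect(":memory:")
for table in ("raw_loyalty_events", "curated_loyalty_events"):
    conn.execute(f"CREATE TABLE {table} (event_id TEXT PRIMARY KEY)")

# Simulate three events landing in the raw layer but only two surviving curation.
conn.executemany("INSERT INTO raw_loyalty_events VALUES (?)",
                 [("E1",), ("E2",), ("E3",)])
conn.executemany("INSERT INTO curated_loyalty_events VALUES (?)",
                 [("E1",), ("E2",)])

raw = stage_count(conn, "raw_loyalty_events")
curated = stage_count(conn, "curated_loyalty_events")
if curated != raw:
    # In a monitored pipeline, this mismatch would raise an alert for triage.
    print(f"Audit failed: {raw - curated} event(s) lost between raw and curated layers")
```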

Lightup’s out-of-the-box and custom SQL data-quality rule capabilities enabled us to arm stakeholders with the intelligence needed to resolve data issues as they occur. Data-quality rules were productionalized 80% faster with significantly fewer development resources, while real-time alerting and intelligent aggregation of issues shortened the time to resolution.
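
As a rough illustration of what a custom SQL data-quality rule can look like, the sketch below flags loyalty transactions whose awarded points deviate from an assumed earn rate of 100 points per dollar. The earn rate, table, and column names are invented for the example, and the plain sqlite3 harness stands in for Lightup’s rule engine rather than reproducing its actual API.

```python
# Hypothetical custom SQL data-quality rule: flag orders whose awarded points
# deviate from an assumed earn rate of 100 points per dollar. The table,
# columns, and earn rate are illustrative only; this is not Lightup's real API.
import sqlite3

CUSTOM_RULE_SQL = """
SELECT order_id, points_awarded, order_total
FROM loyalty_points_ledger
WHERE points_awarded <> CAST(order_total * 100 AS INTEGER)
"""

def run_rule(conn: sqlite3.Connection, sql: str) -> list:
    """Execute a data-quality rule and return the violating rows."""
    return conn.execute(sql).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE loyalty_points_ledger "
             "(order_id TEXT, points_awarded INTEGER, order_total REAL)")
conn.executemany("INSERT INTO loyalty_points_ledger VALUES (?, ?, ?)",
                 [("A1", 599, 5.99), ("A2", 1200, 5.99), ("A3", 250, 2.50)])

violations = run_rule(conn, CUSTOM_RULE_SQL)
if violations:
    # In production, real-time alerting would notify stakeholders here.
    print(f"Rule violated by {len(violations)} row(s): {violations}")
```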

Under the hood
Through our data-monitoring solution’s self-service capabilities and transparent data quality processes, our data providers and curators have more confidence in our business-critical data.

The solution’s no-code environment and push-down querying capabilities across numerous data sources help teams expand their data-quality intelligence and make decisions confidently. We’ve decreased time to market through out-of-the-box rule creation and AI-powered data-quality checks, as well as automated issue alerts and noise reduction for the incidents that are generated.
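
To make push-down querying concrete: instead of extracting raw rows for inspection, the quality metric is computed inside the data source with one aggregate query, and only the resulting number travels back to the monitoring layer. The sketch below shows a null-rate check written that way, with an invented table, column, and alert threshold.

```python
# Push-down querying illustration: the null-rate metric is computed inside the
# source database by a single aggregate query, so raw rows never leave the
# store. The table, column, and threshold are invented for the example.
import sqlite3

NULL_RATE_SQL = """
SELECT 1.0 * SUM(CASE WHEN customer_id IS NULL THEN 1 ELSE 0 END) / COUNT(*)
FROM orders
"""

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id TEXT, customer_id TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("O1", "C1"), ("O2", None), ("O3", "C3"), ("O4", "C4")])

THRESHOLD = 0.10  # alert when more than 10% of rows lack a customer_id
(rate,) = conn.execute(NULL_RATE_SQL).fetchone()
if rate > THRESHOLD:
    # Only the aggregate crossed the wire; an alert fires on the breach.
    print(f"ALERT: customer_id null rate {rate:.0%} exceeds threshold {THRESHOLD:.0%}")
```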

What’s next
The ability to monitor the data flowing into and out of analytical models has the potential to improve business outcomes at greater speed. We are in the process of expanding Lightup’s data-quality monitoring capabilities to our markets and enhancing the user experience.

With automated alerting and aggregation of incidents already in place, the next step is the ability to remediate data issues directly in a unified view.

* Source: https://www.gartner.com/smarterwithgartner/how-to-improve-your-data-quality
