Industry-Wide Standards Are Necessary to Resolve Provider Data Quality Issues

Slalom Healthcare & Life Sciences
Slalom Daily Dose
Published in
6 min read · Jul 27, 2021

There are many types of data that flow in and out of healthcare systems: claims data, provider data, patient/member data, medical data, etc. Managing each type is vital for healthcare businesses to function successfully. Each type of data has general patterns or issues that need to be understood to ensure the information consumed downstream is accurate.

Provider data, specifically, has quality issues that are well understood but are tackled differently across organizations, even though industry-wide standards would help resolve them.

Photo by Irwan Iwe on Unsplash

Provider Data: The Need for Management

Provider data is a crucial part of both provider and payer workflows. A provider can be an individual or a healthcare entity (hospital, facility, etc.). Provider data represents attributes such as name, location, network, specialty, National Provider Identifier (NPI), whether they are accepting patients, etc. Within the payer space alone, this data is used for various capabilities:

  • Connecting members with in-network providers
  • Conducting analysis of providers and health plans
  • Tracking information related to the various PCPs
  • Credentialing and claims adjudication processes

Not surprisingly then, the provider data management software market is expected to grow from $430.7 million in 2020 to $823.4 million by 2027, signifying the importance of keeping provider data accurate and accessible to users¹. Although the data is so widely used, it is not maintained in a central location, and there are no industry-wide standards for how it should be collected or stored. For reporting and business needs, organizations pull data from the Centers for Medicare & Medicaid Services (CMS), third-party vendors, and/or roster data generated within their organization. Because provider information changes frequently due to shifting contracts or provider status, there is no guarantee that the data collected from these available sources is up to date.

Reasons Behind Provider Data Quality Issues

The lack of a single “source of truth” and industry standards, along with infrequent updates, contributes to the data quality issues organizations encounter. When using data from multiple sources, duplicate provider records are common. For instance, since there is no standard for formatting or for what constitutes a complete provider record, there could be two records for the same provider where one includes the middle name and title while the other does not. Resolving the duplicates to a single, accurate record becomes more difficult when the differences are significant, such as two records for the same provider with different address locations.
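The middle-name-and-title scenario above can be sketched in a few lines. This is a minimal illustration with hypothetical records (the names, NPIs, and credential list are assumptions, not real data): normalizing both names before comparing lets the two representations of the same provider match.

```python
import re

# Two records from different sources for the same physician: one includes a
# middle initial and credential suffix, the other does not. Hypothetical data.
record_a = {"name": "Jane A. Smith, MD", "npi": "1234567890"}
record_b = {"name": "Jane Smith", "npi": "1234567890"}

# A small (illustrative, not exhaustive) set of credential suffixes to strip.
CREDENTIALS = {"md", "do", "np", "pa", "dds"}

def normalize_name(name: str) -> str:
    """Lowercase, drop punctuation, credential suffixes, and single-letter
    tokens (middle initials) so superficially different records can match."""
    tokens = re.sub(r"[.,]", " ", name.lower()).split()
    tokens = [t for t in tokens if t not in CREDENTIALS and len(t) > 1]
    return " ".join(tokens)

def likely_duplicates(a: dict, b: dict) -> bool:
    """Flag two records as duplicates when NPI and normalized name both match."""
    return a["npi"] == b["npi"] and normalize_name(a["name"]) == normalize_name(b["name"])

print(likely_duplicates(record_a, record_b))  # True
```

A real matching engine would go further (address standardization, fuzzy string similarity, handling providers who legitimately share a name), but the core idea is the same: canonicalize first, then compare.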

Organizations rely on data quality engines, either built internally or purchased as a product, to resolve duplicates and run their downstream processes on clean data. These engines use rules or machine learning to identify similar records and hierarchically prioritize sources based on business rules. Without clean and reliable data, payers will have trouble gaining accurate insights from their data models or will encounter blocks in the claims adjudication process, resulting in denied claims. Inaccurate insights could lead to poor financial business decisions, such as prioritizing a network that is costly to the payer.
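The hierarchical source prioritization described above can be sketched as follows. The source names, rankings, and tie-breaking logic here are assumptions for illustration; each organization defines its own business rules.

```python
# Assumed trust ranking: lower number = more trusted source. A real engine
# would load this from configurable business rules.
SOURCE_PRIORITY = {"roster": 0, "cms": 1, "vendor": 2}

def resolve(duplicates: list[dict]) -> dict:
    """Pick the surviving 'golden' record from a set of matched duplicates:
    first by source rank, then by most recent update as a tie-breaker."""
    return min(
        duplicates,
        key=lambda r: (SOURCE_PRIORITY.get(r["source"], 99), -r["updated"]),
    )

# Hypothetical duplicates that survived the matching step.
dupes = [
    {"source": "vendor", "updated": 2021, "phone": "555-0100"},
    {"source": "roster", "updated": 2020, "phone": "555-0199"},
]
print(resolve(dupes)["source"])  # roster
```

Note the design trade-off this exposes: a rigid hierarchy keeps the roster record here even though the vendor record is newer. That is exactly why incorrect source data remains a problem no matter how good the resolution logic is, as the next section discusses.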

From a patient perspective, choosing the wrong record from a duplicate pair can make it difficult to find a provider. For example, a patient may have trouble contacting a provider if the chosen record lists an older location where the physician no longer practices. This creates patient frustration with the payer for providing erroneous information.

While data quality engines can run algorithms to match records and clean data, they do not solve the issue of incorrect source data. Between 2017 and 2018, CMS analyzed the provider directories of 52 Medicare Advantage Organizations.

As seen in Figure 1 below, after reviewing 5,602 providers and the 10,504 locations they are listed under, the study found that information for 50% of the providers and 49% of the locations was inaccurate².

Source: www.CMS.gov

Major inaccuracies were related to incorrect location or phone details and information regarding whether the provider is accepting new patients. These errors can prevent patients from reaching providers, effectively creating barriers to healthcare access.

The frequency of these errors and additional minor errors found can be seen in Table 5 below.

Source: www.CMS.gov

Paths to Resolve Provider Data Quality Issues

CMS’s Patient Access Rule paves the path to addressing provider data quality issues. The rule requires CMS-regulated payers to provide public access to their provider directories through a FHIR-compliant API by July 1, 2021³. Plans must also update their data within 30 days of receiving updated provider information, allowing for more accuracy. The rule puts standards on data availability by requiring the following fields: provider names, addresses, phone numbers, and specialties⁴.
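To make the requirement concrete, here is a sketch of pulling the rule's required directory fields (name, address, phone, specialty) out of a FHIR R4 Practitioner resource. The sample resource is hypothetical, and real directories often carry specialty on a linked PractitionerRole resource rather than on the Practitioner itself; a client would fetch resources from a search such as `GET {base}/Practitioner?name=smith`.

```python
# A hypothetical FHIR R4 Practitioner resource, as a payer's directory API
# might return it. Field shapes follow the FHIR R4 specification.
practitioner = {
    "resourceType": "Practitioner",
    "name": [{"family": "Smith", "given": ["Jane"]}],
    "telecom": [{"system": "phone", "value": "555-0100"}],
    "address": [{"line": ["123 Main St"], "city": "Chicago", "state": "IL"}],
    "qualification": [{"code": {"text": "Cardiology"}}],
}

def directory_entry(p: dict) -> dict:
    """Flatten the FHIR fields the Patient Access rule requires into one row."""
    name = p["name"][0]
    addr = p["address"][0]
    phones = [t["value"] for t in p.get("telecom", []) if t["system"] == "phone"]
    return {
        "name": " ".join(name.get("given", []) + [name["family"]]),
        "address": ", ".join(addr.get("line", []) + [addr["city"], addr["state"]]),
        "phone": phones,
        "specialty": [q["code"]["text"] for q in p.get("qualification", [])],
    }

print(directory_entry(practitioner))
```

Because every compliant payer must expose the same resource shapes, a consumer can run one parser like this across many directories instead of writing per-payer integrations, which is the standardization benefit the rule aims for.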

Enforcing this requirement will push organizations to build robust processes to better maintain and update their provider data. These processes lay a foundation for success, especially for organizations that are not tackling these data quality issues effectively today.

In a similar effort, healthcare leaders joined forces to create the Synaptic Health Alliance to tackle provider data issues. The alliance explored using blockchain to allow collaboration between organizations to create a secure provider data exchange. Through their initial pilot, Alliance members were able to successfully identify 88% of necessary demographic data corrections using shared data. The shared data helped members clean up inaccuracies in their own data that they would not have been able to identify alone⁵.

While CMS’s rule will put some standardization in place and encourage organizations to follow better practices, it is not a solution for all players in the healthcare space. The Alliance’s mission to explore the use of blockchain in healthcare will offer solutions in various capacities, but it does not help those outside the Alliance, nor does it guarantee high-quality data: the exchange is available only to organizations that choose to participate, and if all members carry inaccuracies in their data, they will simply share those inaccuracies with one another.

An Industry Wide Solution

An industry-wide standard and process would better enable accuracy and success among payers, providers, EHR systems, vendors, etc. A single location where providers can submit and update their details for specific locations in a timely manner would reduce the administrative burden on providers. The standard format would also allow consumers of this data, such as payers, to trust the information.

However, since standards are not yet defined or adopted industry-wide, it is important for organizations to rely on provider data management software (Availity, LexisNexis, CAQH, etc.), data quality engines, shared data exchanges, and organizational best practices to maintain high-quality provider data.

Meet the Author:

Delu Vimalesvaran is a Principal in Slalom Chicago’s Data & Analytics practice. Find her here on LinkedIn.

Slalom is a modern consulting firm focused on strategy, technology and business transformation. Our healthcare and life sciences industry teams partner with healthcare, biotech and pharmaceutical leaders to strengthen their organizations, improve their systems, and help with some of their most strategic business challenges. Find out more about our people, our company and what we do.



We are Slalom's diverse group of healthcare and life sciences consultants, who bring industry expertise and a passion for driving change to this publication.