Epic vs. Particle: A Case of Improper Data Sharing in Healthcare

Shawn Flaherty
Tranquil Data
Jul 17, 2024 · 5 min read

At Tranquil Data, we help companies use data correctly, in line with the complex rules found in regulations, contracts, and privacy policies. One such regulation is HIPAA, which covers both the security and privacy of protected health information. While the security rule deals with threats like hacker groups and ransomware, the privacy rule is equally important but often overlooked.

In the recent Epic vs. Particle saga, the HIPAA privacy rule took center stage, highlighting a ubiquitous problem in healthcare that doesn’t get enough attention.

The Background

Under the HIPAA privacy rule, certain groups, like payers and providers, can exchange information with each other with little to no friction, as long as they are doing so for treatment, payment, or healthcare operations. This makes sense: we want our providers to have all the information they need to treat us, and we want our insurance companies to cover our treatment so that we don’t have to pay out of pocket.

While this free exchange of data between payers and providers empowers the healthcare system, it also creates a culture of nonchalance. For example, ask any GRC or technology leader at a health system or payer for a list of every third party with whom they share protected health information, what data was shared with them, when, and for what purpose. This quote from a CTO in a recent FTC enforcement action is the standard answer:

“[W]e need to strengthen our policies and procedures to ensure that we are consistent about what data we share to whom.” and acknowledged, “What we do not have is the data we are sharing by partner along with its purpose.”
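The gap that quote describes, a per-partner record of what was shared, when, and why, can be sketched as a minimal append-only ledger. The class and field names below are illustrative only, not Tranquil Data’s actual schema:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class SharingEvent:
    """One instance of PHI leaving the organization."""
    partner: str       # third party that received the data
    data_types: tuple  # e.g. ("diagnoses", "prescriptions")
    purpose: str       # e.g. "treatment", "payment", "operations"
    shared_at: datetime

class SharingLedger:
    """Append-only log that can answer: what did we share,
    with whom, when, and for what purpose?"""
    def __init__(self):
        self._events = []

    def record(self, partner, data_types, purpose):
        self._events.append(SharingEvent(
            partner, tuple(data_types), purpose,
            datetime.now(timezone.utc)))

    def shared_with(self, partner):
        """Everything ever shared with a given partner."""
        return [e for e in self._events if e.partner == partner]
```

An organization with a ledger like this could answer the regulator’s question in minutes instead of admitting it has no idea.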

The recent Change Healthcare breach further exemplifies the problem of not tracking where data flows. Many months post-breach, the company is still identifying affected parties, and the full scope of the affected data remains uncertain.

The problem of ensuring proper use and sharing is further exacerbated by a tool in healthcare called a health information exchange (HIE). These tools allow entities to exchange data through a middleman rather than one-to-one. This also makes sense, as patients can’t remember details like doctor names, conditions, and prescriptions that would be required to enable a one-to-one exchange of information.

With HIEs in the picture, the question posed above gains an added layer of complexity: what data has the HIE shared on behalf of our patients, to whom, and for what purpose? The answer to the original question, “What data have I shared, with whom, and for what purpose?” was a resounding “No idea.” The answer to the same question with an HIE in the middle is “Absolutely no idea.”

At this point, it’s fair to ask, “How can this be?” First, healthcare is complicated, and in response, a lot of data is used and exchanged with many different groups for different purposes. At scale, it’s impossible to ensure proper use and sharing manually. Second, no one (other than us) has built a system of record to automate proper use and sharing in a way that is transparent to non-technical roles.

Let’s flesh this out through the Epic vs. Particle example.

Epic vs. Particle

Particle is a company that gathers health information on behalf of its customers across all of the HIEs and other places patient data hides. One HIE, Carequality, noticed anomalies in patient record exchange patterns, such as requests for large numbers of records within a certain geographic region, and discovered that Particle was gathering personal health information on behalf of organizations asserting the “treatment” purpose of use that, at face value, don’t do treatment.
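The kind of anomaly Carequality spotted, unusually large request volumes from one requester in one region, can be caught with even a simple threshold check over an exchange’s audit log. This is a hypothetical sketch; the field names and threshold are made up for illustration, not Carequality’s actual monitoring:

```python
from collections import Counter

def flag_bulk_requesters(requests, threshold):
    """Return (requester, region) pairs whose request volume meets
    or exceeds `threshold` -- a crude stand-in for pattern-based
    anomaly detection on a record-exchange audit log."""
    counts = Counter((r["requester"], r["region"]) for r in requests)
    return sorted(key for key, n in counts.items() if n >= threshold)
```

In practice the threshold would be statistical rather than fixed, but even this crude version surfaces the “many records, one region” pattern described above.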

For example, one of these companies was Integritort, a company that is “Transforming Mass Tort Integrity,” which some argued was using the patient data to try to identify potential class action lawsuit participants.

To understand how this happened, consider at least three points of failure.

Failure One: Particle Health

When Particle Health gathers PHI on behalf of a customer asserting a purpose, it has a duty to perform due diligence on who its customers are and how they plan to use the data. It’s reasonable to assume Particle isn’t a bad actor. So how, then, could it work on behalf of a mass tort tech company asserting the treatment purpose?

The answer goes back to the complexity of ensuring proper use and sharing at scale, and the lack of a system of record for data use that promotes transparency. Someone at Particle needed a dashboard (like the one we empower) showing each company they were acting on behalf of, what purposes those companies were asserting, and what data they were accessing. Companies with names like Integritort asserting a treatment purpose should have raised immediate red flags for someone responsible for ensuring they weren’t violating the law.

Failure Two: Carequality (the HIE)

Carequality eventually discovered that Particle was acting on behalf of customers asserting the “treatment” purpose that don’t treat patients. But why did this happen in the first place? As with Particle’s failure, the root cause was the absence of a system of record (which we’ve built) that tracks where data came from and the rules for its use, and enforces proper use in real time. Like Particle, Carequality needed an automated enforcement tool to ensure that only the proper data flows to the proper parties, and a dashboard showing in real time who was requesting data and for what purposes. This dashboard could also have been sliced into views for individual health systems and individuals, so they could see and monitor how their patient data is being used and shared.
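An automated enforcement gate of the kind described here would sit in front of every record release and reject requests whose asserted purpose doesn’t fit the requester. A minimal sketch, assuming a simple participant taxonomy; the purpose table is illustrative, not the actual Carequality permitted-purpose list:

```python
# Illustrative mapping of participant type to permitted purposes.
ALLOWED_PURPOSES = {
    "provider": {"treatment", "payment", "operations"},
    "payer": {"payment", "operations"},
}

def authorize_request(participant_type, asserted_purpose, delivers_care):
    """Decide, before any data flows, whether a record request may
    proceed. A "treatment" assertion from an organization that does
    not actually deliver care is rejected outright."""
    allowed = ALLOWED_PURPOSES.get(participant_type, set())
    if asserted_purpose not in allowed:
        return False
    if asserted_purpose == "treatment" and not delivers_care:
        return False
    return True
```

Under this check, a mass tort company asserting “treatment” would be denied at request time rather than discovered after the fact.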

Failure Three: Secondary Use

It’s commonplace for healthcare stakeholders to assert a valid purpose (e.g., treatment), use the data for treatment, and then re-use it for other purposes. One way to stop secondary use is to ban it for HIE participants, but the only enforcement mechanism is the threat of being removed from the exchanges, and that won’t stop bad actors or groups operating in grey areas.

There’s another problem with secondary use that doesn’t get talked about enough: inadvertent wrongful secondary use. Take a hospital system seeing a patient as a common example. After a patient is seen, a record (likely many records) is created and sent to different databases, where the context in which the data was collected is lost as it’s moved and co-mingled. At some point, someone uses this data for another purpose with no idea where the data came from, why they have it, or the rules for its use.

We call this problem “losing context on ingest.” The solution is the same system of record for data context, one that captures where data came from, why you have it, and the rules for its use, so that inadvertent wrongful secondary use isn’t committed. The opposite is also true: risk-averse good actors that lock valuable data down for fear of secondary misuse can unlock the value of that data while ensuring it’s used correctly.
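To make “losing context on ingest” concrete: if each record carried its provenance and permitted purposes as it moved between databases, downstream use could be checked mechanically instead of on faith. This is a hedged sketch of that idea; the class and field names are ours for illustration, not a real Tranquil Data API:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DataContext:
    """Context captured at ingest and carried with the record."""
    source: str                  # where the data came from
    collected_for: str           # why we have it
    allowed_purposes: frozenset  # the rules for its use

@dataclass
class ContextualRecord:
    payload: dict
    context: DataContext

def use_record(record, purpose):
    """Refuse any use outside the purposes attached at ingest."""
    if purpose not in record.context.allowed_purposes:
        raise PermissionError(f"purpose {purpose!r} not permitted")
    return record.payload
```

Because the context travels with the record, a later user who has no idea where the data came from still can’t misuse it: the check fails without anyone needing to remember the original circumstances.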

At Tranquil Data, we’ve built a system of record for data context that automates correct use and builds trust and transparency. If ensuring correct use and building trust and transparency are challenges you face, we would love to talk: info@tranquildata.com.
