Operationalizing the AMA Privacy Principles to Increase Engagement in Digital Health

Shawn Flaherty
Tranquil Data
Jan 20, 2023 · 5 min read

Data is at the core of how digital health companies create value through online services. To win patient engagement, digital health companies must therefore earn patients' trust with their data.

According to a study by the Stanford Center for Digital Health and Rock Health, 23% of patients trust digital health companies with their data versus 77% for physicians.

The AMA found that 60% of potential users decline to use digital health applications because of privacy concerns. These statistics illustrate that digital health companies have work to do to earn patient trust, and that trust is key to unlocking engagement (i.e., revenue).

One reason for the gap between physicians and digital health companies is the presence, and absence, of applicable regulation. Over the past 26 years, HIPAA (the law that governs how physicians, hospitals, and health plans may use and share personal data) has built the foundation for patient-physician data trust. In contrast, there is no holistic framework regulating what digital health companies do with the data they collect from users, including sensitive health and personal information that many users do not realize falls outside HIPAA's protection. Academics and associations have begun to raise awareness of the problem:

  • One study investigated the privacy features of 25 pregnancy / period tracking apps and wearables. Most of the products collect vast amounts of personal data, and then share it widely.
  • Another study examined 35 diabetes mobile apps, all of which were sharing data with third parties, 16 of which were impermissibly sharing personal data.
  • Another paper found that of the 36 top-ranked apps for depression and smoking cessation available in public app stores, 29 transmitted data to services provided by Facebook or Google, but only 12 accurately disclosed this in a privacy policy.
  • Another study looked at 10 opioid addiction apps and concluded that, “… the sheer amount of data available to the majority of the apps we studied raises questions about the privacy and security practices of telehealth apps … Most troubling, however, is the access of unique identifiers by the majority of these telehealth apps and the capability for sharing these identifiers with third parties.”

To help fill the regulatory void for healthcare data that falls outside of HIPAA, and to help developers know where to start when designing for privacy, the AMA created a set of Privacy Principles and a Developer Checklist. The Principles go beyond HIPAA to meet the unique opportunities and challenges in digital health for collecting, storing, and sharing patient data. Their stated goal is to, “ensure that as health information is shared — particularly outside of the health care system — patients have meaningful controls over and a clear understanding of how their data is being used and with whom it is being shared.” The Developer Checklist also highlights both the business case and the health equity implications of building privacy into digital health solutions. For digital health companies looking to drive engagement through trust, the AMA Principles and Developer Checklist are best in class.

The survey statistics and studies outlined above are strong evidence that there is a trust and transparency gap in digital health, and that solving for trust drives engagement. This should amount to a win-win for patient privacy and digital health companies: investment in implementing the AMA Principles increases both patient privacy and engagement. Additionally, research indicates that patients are increasingly asking their physicians which digital health applications to use to help manage their health. Digital health companies that publicly attest to compliance with the AMA Principles are more likely to be recommended.

Even though investment in trust and transparency is a win-win for patients and companies, there has been little adoption of the AMA Principles because of the complexity of implementing them. Most early stage digital health companies have lightweight technical teams or outsource their technology requirements. With the resources they do have, they are rightly committed to deploying the best tech stack they can to get an application finished and in front of patients. These teams don't have the time, focus, or skills to build a core data platform capable of enforcing the AMA Principles.

As digital health companies transform from the early stage into mature platforms, they begin to invest in building technology teams. However, the difficulty of managing the collection, use, and sharing of sensitive data at scale far outpaces that investment. For example, see this Facebook internal memo estimating 600 engineering years to fix their privacy infrastructure at scale. While Facebook's scale makes it an unfair comparison for any digital health company, it illustrates how hard it is to get data collection, storage, and sharing right at scale when the requirements are complex. Even for technology teams actively working on their data platform strategy, the complexity of implementing the AMA Principles at scale is a nonstarter, and until now there has not been a solution in the market for them to adopt.

Digital health companies looking to invest in trust and transparency need a trusted partner that specializes in “ensuring that as health information is shared … patients have meaningful controls over and a clear understanding of how their data is being used and with whom it is being shared.”

At Tranquil Data, we've been working on this problem since 2017, and have built software purpose-built to create trust and transparency in data use and data sharing for companies from the earliest stages to the enterprise. We solve for this complexity by acting as a system of record for data context: a new kind of metadata (graph) model that captures where data came from and why you have it. Context is then used as input to Tranquil Data policies, which are machine-enforceable versions of common data requirements outlined in agreements and regulations. We use this context to automate correct use and sharing, to support strict regulatory requirements, and to enable scalable, flexible consent management. Every policy decision is audited so it can be shared with internal and external stakeholders to show exactly why any given decision was made, building trust, lowering risk, and creating new data-driven opportunities.
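To make the idea concrete, here is a minimal sketch of the pattern described above: a context record capturing where data came from and why it is held, a policy expressing a machine-enforceable requirement, and an auditable decision for every requested use. All names and fields are illustrative assumptions for this example, not Tranquil Data's actual API or data model.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DataContext:
    """Context for a piece of data: provenance and the uses consented to."""
    source: str                 # where the data came from, e.g. "patient_app"
    purpose: str                # why it was collected, e.g. "care_management"
    consented_uses: frozenset   # uses the patient has agreed to

@dataclass(frozen=True)
class Policy:
    """A machine-enforceable version of a requirement from an agreement or regulation."""
    name: str
    allowed_purposes: frozenset

def evaluate(policy: Policy, ctx: DataContext, requested_use: str) -> dict:
    """Allow a use only if it is both consented to and permitted by policy,
    and return an auditable record explaining the decision."""
    allowed = (requested_use in ctx.consented_uses
               and requested_use in policy.allowed_purposes)
    return {
        "policy": policy.name,
        "source": ctx.source,
        "requested_use": requested_use,
        "allowed": allowed,
        "reason": ("use is consented and policy-permitted" if allowed
                   else "use lacks consent or policy permission"),
    }

# Example: data collected via a patient app, consented only for care management.
ctx = DataContext(source="patient_app",
                  purpose="care_management",
                  consented_uses=frozenset({"care_management"}))
policy = Policy(name="no-third-party-advertising",
                allowed_purposes=frozenset({"care_management", "billing"}))

print(evaluate(policy, ctx, "care_management")["allowed"])  # True
print(evaluate(policy, ctx, "ad_targeting")["allowed"])     # False
```

The key design point the sketch illustrates is that the decision itself is data: every call to `evaluate` produces a record that can be stored and later shown to a patient, auditor, or partner to explain why a use was allowed or blocked.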

If you believe in the AMA Principles, or more broadly that building trust and transparency drives engagement and data sharing, we are happy to share more details about our solution.
