Privacy = patient engagement: insights from AMA’s patient data privacy survey

Shawn Flaherty
Tranquil Data
Jan 20, 2023

The AMA recently released a survey of 1,000 patients titled “Patient perspectives around data privacy.” The survey focused on privacy for patient data that is shared outside of the HIPAA framework (i.e., with digital health entities not covered by HIPAA). While academics, non-profits, and advocacy groups have been pushing to address the trust and transparency gap between current practices and patient expectations, little has been done to date to understand patients’ perspectives on their privacy in the context of digital health, and how patient privacy affects engagement.

The survey results are a strong proof point that privacy is about more than risk: trust and transparency help build the engagement necessary to realize the full potential of scalable and accessible care. Below, I highlight and provide commentary on some of the key insights from the survey.

“When considering whether to use health applications (apps), technologies or platforms in light of privacy concerns, nearly 70% of patients hesitate at least sometimes, and more than 60% decide not to start using the tool.”

According to the survey, more than 60% of patients have decided not to start using an application because of privacy concerns. Although this percentage will vary depending on the sensitivity of the data being collected, and on whether the application requires patients to import additional EHR data, 60% is a staggering number.

Digital health companies that incorporate transparency best practices will minimize the number of patients lost to privacy concerns. In the context of onboarding and engagement, transparency best practices include educating users on what data is being collected, for what purpose, and ultimately what will be done with that data. Far too many applications bury these details in terms and conditions written in legal language patients don’t understand. Likewise, marketing and product teams balk at putting anything in front of patients during onboarding that might cause them to drop out of the funnel. In this case, the opposite seems to be true: educating users on data collection and privacy in a way they can understand will bring more patients onto the platform.

The second transparency best practice goes beyond telling patients what will be done with their data. Instead, enterprising digital health companies should show patients what is being done with their data. This means operationalizing policies and preferences in a user interface that can show patients the data being collected about them, and what’s being done with that data in real time.
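As an illustration of what powering such a “show, don’t tell” interface could involve, here is a minimal sketch in Python. All names and fields are hypothetical (not from the survey or any specific product): the idea is simply that every use of patient data is logged with its purpose, so a patient-facing screen can render that activity back to the patient.

```python
from datetime import datetime, timezone

# Hypothetical in-memory event log; a real system would persist this
# durably and surface it in the patient-facing UI.
access_events = []

def record_access(patient_id: str, data_type: str, purpose: str) -> None:
    """Record every use of patient data so it can be shown back to the patient."""
    access_events.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "patient_id": patient_id,
        "data_type": data_type,
        "purpose": purpose,
    })

def activity_for(patient_id: str) -> list:
    """What a 'your data activity' screen could render for one patient."""
    return [e for e in access_events if e["patient_id"] == patient_id]

# Example: two accesses, only one of which belongs to patient-123.
record_access("patient-123", "lab_results", "care_coordination")
record_access("patient-456", "medications", "refill_reminders")
```

The design choice that matters here is that transparency is a byproduct of normal operation: the same log that drives the patient-facing view can also serve audits.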

“Almost 80% of patients want to be able to opt-out of sharing some or all their health data with companies.”

To give patients the confidence that only necessary data is being collected, digital health companies should start with strong data minimization principles, “limiting the collection of personal information to what is directly relevant and necessary to accomplish a specified purpose.” Beyond data minimization, discretionary consent management is about empowering patients to select the type of health record information they are comfortable sharing. A discretionary consent best practice allows patients to toggle on and off the types of data they wish to share for different purposes. To drive transparency, the consent management system must become a system of record that is trackable, auditable, and transparent to patients.
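To make the “system of record” idea concrete, here is a minimal sketch in Python, under assumptions of my own (the data categories, purposes, and class names are hypothetical, not drawn from the survey or any product): consent changes are appended to a log rather than overwritten, the latest toggle per data type and purpose wins, and anything not explicitly granted defaults to deny, consistent with data minimization.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical data categories a patient can toggle on or off per purpose.
DATA_TYPES = {"medications", "lab_results", "mental_health_notes", "fitness_activity"}

@dataclass
class ConsentRecord:
    """Append-only consent log: every change is recorded, never overwritten,
    so the full history stays trackable and auditable."""
    patient_id: str
    log: list = field(default_factory=list)

    def set_consent(self, data_type: str, purpose: str, granted: bool) -> None:
        if data_type not in DATA_TYPES:
            raise ValueError(f"unknown data type: {data_type}")
        self.log.append({
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "data_type": data_type,
            "purpose": purpose,
            "granted": granted,
        })

    def is_allowed(self, data_type: str, purpose: str) -> bool:
        # The most recent entry for (data_type, purpose) wins; the default
        # is deny, consistent with data-minimization principles.
        for entry in reversed(self.log):
            if entry["data_type"] == data_type and entry["purpose"] == purpose:
                return entry["granted"]
        return False

# Example: a patient shares medications for care coordination but
# explicitly declines sharing mental health notes for analytics.
record = ConsentRecord("patient-123")
record.set_consent("medications", "care_coordination", True)
record.set_consent("mental_health_notes", "analytics", False)
```

Because the log is append-only, the record doubles as the audit trail the paragraph above calls for: auditors and patients alike can see not just the current setting but every change that led to it.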

“93% of patients want health app developers to publicize if and how their product adheres to industry standards for handling health data. Patients and consumers are demanding transparency.”

Patients want to know that digital health applications are following privacy best practices (some of which are covered above). The AMA and the CARIN Alliance have both published opt-in privacy principles for digital health companies. Neither set of principles sets the bar unreasonably high, and both are opportunities to promote trust with patients.

“More than 75% of patients want to receive requests prior to a company using their health data for a new purpose.”

Using data for new purposes is about continued trust and engagement with patients. Unfortunately, it is commonplace for terms and conditions to allow digital health companies to change their data use policies whenever they want without notifying patients. A best-in-class privacy practice notifies patients of a new use and allows them to choose whether to opt in.

What not to do

Cerebral is one example (among many) of a company whose data practices have put it in the news too often recently.

Conclusion

The survey does a nice job of illuminating key insights into patients’ privacy perspectives that fall outside of the standard data breach and risk frameworks. It also begins to frame the link between privacy and trust as a way to drive better patient engagement. Leveraging privacy to increase engagement is part human-factors design, part operationalizing those changes in systems. We have some great partners working on the human-factors work necessary to drive engagement with privacy. Our product operationalizes the necessary changes in systems by enforcing correct use and sharing, enabling scalable and flexible consent, surfacing transparent tracing about why data was used for which purposes, and creating integrity over the lifecycle of data, from the moment data is created or onboarded to the moment it is shared with external partners, customers, and ecosystems.
