Why mHealth apps are bad at privacy, and what they can do about it

mHealth apps and devices offer big benefits to consumers, patients, and healthcare providers. But many of those apps are letting users down with their unhealthy privacy practices.

Rachel Dulberg
CodeX
14 min read · Sep 30, 2021


This hurts adoption and market growth and undermines vital consumer trust in these technologies. Here are all the tips, tricks and tools for mHealth startups that want to become great at privacy.

Image by freepik.com

“Although becoming globally data compliant can be seen as a big cost centre, it has also shown for us to be a revenue driver. Health tech startups should take note that being compliant with various regulations isn’t only necessary but can be a boon for business.”

Ariel Katz, CEO and co-founder of H1

Privacy has so far been an afterthought for many mHealth app developers. But consumers and regulators alike are rapidly catching up, making it increasingly hard for health apps to get away with poor privacy habits and driving developers to up their privacy game.

Privacy shouldn’t just be treated as a risk factor. It’s time to think of privacy not as a chore but as a competitive differentiator and business driver. The market is clearly heading that way.

THE TROUBLE WITH mHEALTH APPS

We’re in a golden age of healthcare innovation. The health tech sector is expected to continue to grow by 15% a year and triple in value to US$300 billion in the US alone by 2028 (up from US$110 billion today). The digital health market has grown by 12% in the past year thanks to the covid-19 pandemic driving huge demand for telehealth solutions.

Telehealth isn’t the only segment that’s booming. mHealth apps are used by billions of people around the world — there are now over 350,000 mHealth apps available in major app stores.

The global mHealth apps market was valued at US$40.05 billion in 2020 and is forecast to grow by 17.7% a year through 2028, reaching almost US$150 billion.

"mHealth" covers a broad range of smartphone apps and wearable devices that let consumers track and monitor anything from fitness and nutrition to menstruation and sleep patterns, and enable physicians to communicate, monitor and diagnose more quickly and with fewer errors.

These apps and devices are — mostly — a good thing; they enhance the overall healthcare experience, improve patient health and provide consumers and healthcare providers with increased efficiencies.

But data privacy, security and trust concerns are hampering the adoption of mobile health technologies and impeding market growth.

Source: Accenture, "How can leaders make recent digital health gains last?"

While over 50% of American consumers believe that technology can improve their healthcare, privacy and security concerns have a direct impact on their willingness to use health technology, according to a recent Oxford Economics study.

Data breaches and malicious attacks are an obvious issue. A 2019 report published by Fortified Health Security showed that the number of U.S. healthcare-sector data breaches had increased by 16% year over year.

In Australia, the OAIC (the national data and privacy regulator) recently reported that the healthcare sector notified the highest number of data breaches of any sector between January and June 2021.

But security breaches are only part of the problem. Many mHealth apps are simply bad at privacy. In the past few years, a slew of health apps have been exposed for a host of privacy offences:

Source: US Federal Trade Commission (FTC) guidance, "Does your health app protect your sensitive info?"

THE GLOBAL PRIVACY LANDSCAPE IS CHANGING

It’s important to note that mHealth apps are not necessarily doing anything illegal (yet).

In some jurisdictions, while “health data” and medical devices are heavily regulated, laws haven’t been updated to reflect the collection of health data by new technologies. So mHealth apps were in some cases operating in a grey area of sorts, not technically caught by regulations. Consequently, consumers have been left to fend for themselves.

In the U.S. for example, data collection by mHealth apps has been — until recently — as regulated as your average Instagram ‘like’.

The Health Insurance Portability and Accountability Act (HIPAA), the law that governs how health data is collected and used by healthcare providers, doesn’t strictly apply to many direct-to-consumer apps that provide health and pharmaceutical information. It also doesn’t technically apply to heart-rate data generated by a sports watch or a Fitbit, information entered into period-tracking apps or fitness data held by running and cycling apps.

Similarly, in many other jurisdictions, such as Australia and the UK, there has been no clear regulatory guidance for mHealth apps and, even where guidance has been issued, it hasn't been strictly enforced.

But that’s all changing rapidly as both regulators and consumers around the world are raising the bar on privacy and data protection.

Six key benefits of data privacy compliance. Source: TechTarget

Here are a few key recent regulatory developments that mHealth app developers should know about. Note that these may apply to your app even if you're not based in a particular jurisdiction, for example if you collect data from US, EU or Australian residents.

U.S.

FTC policy statement re mHealth apps

Earlier this month (September 15, 2021), the FTC announced its intention to rein in mHealth apps' privacy problems. In a new policy statement, the FTC specifically seeks to ensure that entities not covered by HIPAA face accountability when consumers' sensitive health information is breached.

The new health breach notification rule means that companies that hold fertility, heart health, glucose levels and other kinds of health data must notify consumers in the event of a data breach.

The FTC specifically noted that “health apps and other connected devices that collect personal health data are not only mainstream — and have increased in use during the pandemic — but are targets ripe for scammers and other cyber hacks. Yet, there are still too few privacy protections for these apps.” Companies that fail to comply with the rule could be subject to monetary penalties of up to US$43,792 per violation per day.

CCPA

The California Consumer Privacy Act (CCPA), which became effective on January 1, 2020, was the first U.S. state regulation to address data privacy and protection. Modelled after the GDPR, the CCPA focuses on companies that operate in California or collect/use data on California residents, even if the business is located outside the state. For consumer-focused apps that fall outside HIPAA, such as mHealth apps, the law requires significant changes, ranging from updating privacy policies to implementing a consumer right of erasure.

Australia

Australia is in the midst of bringing in major changes to various laws that will result in more onerous privacy and data protection requirements, and potentially impact mHealth apps.

TGA

In Australia, mobile apps that merely provide information or tools to manage a healthy lifestyle, and software that doesn't meet the definition of a "medical device", aren't regulated by the Therapeutic Goods Administration (TGA).

However, as of February 2021, the TGA has begun to regulate a broad set of software known as "software-based medical devices", and in September 2021 it issued guidance on what amounts to "regulated software". According to the TGA, any software intended to automate diagnosis or treatment decisions, or otherwise replace the judgment of a health professional, is likely to be considered a 'software-based medical device' and therefore regulated (with its data collection practices scrutinised).

The TGA’s guidance lists many types of apps and software that won’t be subject to regulation, but it also sets out a very broad definition of what will amount to ‘regulated software’. This includes:

  • ‘low risk’ software that, for example, monitors a patient’s disease state or progression or makes treatment recommendations (e.g. for migraines);
  • 'medium risk' software, such as diabetes diagnosis tools, blood pressure analysis tools or cardiac MRI scanners; and
  • ‘high risk’ software (which will attract the most regulatory scrutiny) that recommends specific treatment or diagnosis for life threatening conditions, such as a consumer smartphone app that helps screen for and detect malignant melanoma.

Importantly, the TGA's regulated-software list is not exhaustive, so consider carefully whether your mHealth app could amount to a 'software-based medical device' under this new regulation. Your app may be 'low' risk today, but it's worth keeping an eye on this and re-assessing, especially if new features or functionality are added down the line.

Privacy Act

Regardless of whether your mHealth app is covered by the TGA, mHealth apps that operate in Australia or target Australian users need to comply with Australia’s privacy laws — the Privacy Act and Australian Privacy Principles (APPs) — including requirements around how health data is collected and used.

A breach of an APP that’s regarded as a ‘serious interference with the privacy of an individual’ could lead to civil penalties of up to A$2.1 million per breach.

Additionally, in March 2019, the Australian government announced wide-ranging reforms of its privacy laws that impose much higher penalties for breaches of the Privacy Act and bring Australia's laws more into line with Europe's GDPR. The reforms are expected to become law in 2022 and will increase penalties for privacy breaches to the greater of A$10 million, three times the value of any benefit obtained through the misuse of the information, or 10% of the company's annual domestic turnover.

Consumer Data Right

Lastly, the new Consumer Data Right (CDR), designed to give consumers greater access to and control over their own data along with better privacy protections, currently applies only to the banking sector. It's expected to expand across the Australian economy over the next few years, eventually covering all sectors, including health.

Breaches of the 'privacy safeguards' required under the CDR Rules will attract financial penalties similar to those under the new privacy laws (i.e. the greater of A$10 million, three times the value of the benefit obtained from the breach, or 10% of annual domestic turnover).

EU

In Europe, data protection and privacy of personal health data are governed by the General Data Protection Regulation (GDPR). The GDPR is now considered the gold standard of global data protection and imposes high standards for collecting, storing and using personal data, a broad category that covers all manner of medical and personal information mHealth apps may collect. Among other things, the GDPR requires data protection, data minimisation, and privacy by design and by default.

Fines for getting it wrong are high and can reach €20 million (US$22m) or 4% of a company's annual global turnover, whichever is higher.

Swedish healthcare provider Capio St. Göran received a €2.9 million (US$3.4 million) GDPR fine for failing to carry out appropriate privacy risk assessments and implement effective data access controls.

Worldwide

Brazil, Japan and Hong Kong have enacted GDPR-like privacy laws that apply not only to companies headquartered in those jurisdictions but also to those that store data there or target consumers who live there. In the past decade, over 60 countries have enacted new privacy laws.

TOP PRIVACY TIPS FOR mHEALTH APPS

So we’ve established that meeting privacy and data compliance requirements can be complex but getting it wrong can be fatal to your mHealth app’s success. Here’s what you can do:

  1. Don’t rely on data monetisation as a business model monetising mHealth apps is notoriously difficult. While selling user data to third parties may seem tempting, it’s not the most advisable strategy from a privacy and user trust perspective. Not only will you be risking sharing sensitive data and destroying your brand’s reputation, but you’ll also be exposed to regulatory fines if you’re not transparent about it with your users and/or you haven’t obtained appropriate user consent to do so, as the HealthEngine case has shown. Other models such as subscriptions, advertising, sponsorships or transaction fees may be more ethical and avoid privacy pitfalls.
  2. Incorporate privacy by design and by default — don't leave it until your app is ready to launch on the App Store. Adopt privacy engineering approaches, include privacy in the product roadmap and requirements from the outset, and build privacy into the software development life cycle (SDLC). The market is awash with tools and technologies that can make this process a breeze:
  • TerraTrue, which helps product teams automate privacy compliance;
  • Privitar, which enables enterprises to engineer privacy protection into their data projects and leverage large, sensitive datasets while complying with regulations and ethical data principles;
  • OneTrust, a data privacy and compliance management platform that helps businesses adhere to the growing array of regulations around the world, including GDPR and CCPA;
  • Skyflow's Healthcare Vault, which helps keep sensitive health data isolated, encrypted and fully secure, with built-in data tokenisation, masking and redaction functions; and
  • innovative privacy-enhancing technologies, such as differential privacy and federated learning, which are becoming mainstream and help clear data access and privacy bottlenecks, particularly for AI applications in healthcare.

The other benefit of embedding privacy by design into your product is that it gives you bragging rights: you can advertise that you're a privacy-first app and win consumers' trust, which is especially useful if your competitors aren't on top of their privacy game.
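To make this concrete, here's a minimal sketch in Python (illustrative only, not tied to any of the tools above; all names are hypothetical) of one privacy-by-design habit: pseudonymising a direct identifier before a reading ever reaches your analytics store.

```python
import hashlib
import hmac
from dataclasses import dataclass

# Assumption: in production the key would live in a secrets manager, not in code.
SECRET_KEY = b"rotate-me-and-keep-in-a-secrets-manager"

@dataclass
class HeartRateReading:  # hypothetical reading captured by an mHealth app
    user_id: str
    bpm: int
    timestamp: str

def pseudonymise(user_id: str) -> str:
    """Replace a direct identifier with a keyed hash (a pseudonym).
    Unlike a plain hash, an HMAC can't be reversed by brute-forcing
    known user IDs without the key."""
    return hmac.new(SECRET_KEY, user_id.encode(), hashlib.sha256).hexdigest()

def to_storage_record(reading: HeartRateReading) -> dict:
    # Privacy by design: the raw user ID never reaches the analytics store.
    return {
        "user_pseudonym": pseudonymise(reading.user_id),
        "bpm": reading.bpm,
        "timestamp": reading.timestamp,
    }
```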

3. Carry out a Data Protection Impact Assessment (DPIA) — if you've never put your product to a privacy test, a DPIA is a good place to start. It's a useful exercise for flushing out potential data and privacy risks to your users and figuring out how to reduce them. DPIAs are also vital for demonstrating compliance and meeting the GDPR 'accountability' requirement. The good news is that DPIAs can be largely automated: tools such as TrustArc's privacy assessments can help you implement a DPIA programme.

4. Identify your lawful basis and purpose for processing personal and medical data — under GDPR and other laws and regulations around the world, you must have a lawful basis for collecting and processing each and every item of personal data. For sensitive health data this generally means the user's express consent, and personal data can't then be processed for a purpose that's different from the original purpose for which consent was given. In practical terms (see the sketch after this list), this means you need to:

  • Identify all personal data that your app is collecting and using
  • Be accountable and transparent — let users know what data you’re collecting and using
  • Get the user's clear and express consent to all of the above
  • Continuously monitor how user data is being used to ensure your app isn’t deviating from original data uses for which you have user consent (for example when new features or services are introduced).
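As a rough illustration of purpose limitation in code (a sketch only; every name here is hypothetical), processing can be gated on the purposes recorded in the user's consent:

```python
from dataclasses import dataclass, field

class PurposeError(Exception):
    """Raised when data is about to be used beyond the consented purpose."""

@dataclass
class ConsentRecord:
    user_id: str
    purposes: set[str] = field(default_factory=set)  # e.g. {"sleep_tracking"}

def process(data: dict, purpose: str, consent: ConsentRecord) -> dict:
    # Purpose limitation: refuse any use the user hasn't expressly agreed to.
    if purpose not in consent.purposes:
        raise PurposeError(f"no consent recorded for purpose: {purpose!r}")
    return data  # placeholder for the real processing step

consent = ConsentRecord(user_id="u-123", purposes={"sleep_tracking"})
process({"hours_slept": 7}, "sleep_tracking", consent)   # OK
# process({"hours_slept": 7}, "ad_targeting", consent)   # raises PurposeError
```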

5. Minimise personal data processing — many app developers tend to collect too much personal data. Think about how much of the data you process is really needed for the app's functionality and features. Aim to collect only the data you need, and consider whether it can be anonymised or at least pseudonymised. Check whether you really need users to register and, if so, minimise the personal information collected for that purpose.
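One simple way to enforce minimisation, sketched below with hypothetical field names, is an explicit allow-list, so anything a feature doesn't need is dropped before storage:

```python
# Fields this feature actually needs (hypothetical example).
ALLOWED_FIELDS = {"bpm", "timestamp"}

def minimise(payload: dict) -> dict:
    """Drop everything not on the allow-list before it is stored."""
    return {k: v for k, v in payload.items() if k in ALLOWED_FIELDS}

raw = {"bpm": 72, "timestamp": "2021-09-30T10:00:00Z",
       "email": "user@example.com", "gps": (151.2, -33.9)}
print(minimise(raw))  # {'bpm': 72, 'timestamp': '2021-09-30T10:00:00Z'}
```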

6. Consider leaving the data on the user's device — many COVID contact-tracing apps have tackled privacy issues by taking a decentralised approach that keeps the personal data they analyse on the user's device, with only a minimal amount of information uploaded to the government's server. Google and Apple used this method to make their exposure-notification framework more privacy friendly. Medtech AI tools have recently shown that federated learning, differential privacy and encrypted computation are promising techniques for preserving privacy while delivering healthcare innovation at scale.
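To give a flavour of differential privacy (a toy sketch only, assuming a simple count with sensitivity 1), a device can perturb a value with Laplace noise before upload, so the server never sees the exact figure:

```python
import random

def dp_noisy_count(true_count: int, epsilon: float = 1.0) -> float:
    """Perturb a count on-device with Laplace noise (scale = 1/epsilon)
    before upload; smaller epsilon means stronger privacy and more noise."""
    scale = 1.0 / epsilon
    # The difference of two exponential draws is Laplace-distributed.
    noise = random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)
    return true_count + noise

print(dp_noisy_count(42))  # e.g. 41.3; the exact count never leaves the device
```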

7. Sort out your data ops — this covers a host of operational policies, systems and processes (see the access-control sketch after this list), including:

  • Appoint a data privacy officer or privacy manager to plan and manage your privacy programme and be the face of your brand’s privacy-first approach both internally and externally. Privacy evangelism is critical for building your brand’s trust capital as well as improving internal privacy education and cementing your privacy culture.
  • Develop identity- and access-management practices for employees according to their roles, with security-access levels determined for different data categories.
  • Prioritise data security.
  • Have a crisis response plan for when a data breach occurs, so you can respond quickly, notify users and regulators, and limit the damage. Under the GDPR, for example, you'll have to report a breach to the regulator within 72 hours.
  • Develop clear, standardised procedures to deal with requests for the removal or transfer of data. These should ensure that you can expedite compliance and response to user requests for the identification, removal, and transfer of data. This is a GDPR requirement, and will also apply under the new Australian Consumer Data Right regime. Consider automation tools for data subject request management to save time and resources.
  • Work closely with third parties, affiliates, and vendors you may share data with, to understand how and where data is stored and used. Make sure any third party contract you enter into gives you the protections and assurances you need to comply with your own obligations and ensure you’re not exposed to unnecessary risk.
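For the identity- and access-management point, a minimal deny-by-default sketch might map roles to data categories (the roles and categories are hypothetical; a real system would enforce this through an IAM or policy service):

```python
from enum import Enum, auto

class DataCategory(Enum):
    CONTACT = auto()   # email, phone number
    HEALTH = auto()    # readings, conditions, medications
    BILLING = auto()   # payment details

# Hypothetical role-to-category mapping.
ROLE_ACCESS = {
    "support":   {DataCategory.CONTACT},
    "clinician": {DataCategory.CONTACT, DataCategory.HEALTH},
    "finance":   {DataCategory.BILLING},
}

def can_access(role: str, category: DataCategory) -> bool:
    """Deny by default: unknown roles get access to nothing."""
    return category in ROLE_ACCESS.get(role, set())

print(can_access("clinician", DataCategory.HEALTH))  # True
print(can_access("support", DataCategory.HEALTH))    # False
```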

8. Sort out your data infrastructure — create an infrastructure environment that can readily accommodate the increasing volumes of data you’re collecting and cater for the particular sensitivity of health and personal data. Store data in a limited number of systems, depending on data type or classification.

Amazon HealthLake is a newly launched, HIPAA-eligible service offering healthcare and life sciences companies a complete view of individual or patient population health data for query and analytics at scale. GE Health Cloud is a healthcare solution designed to be a scalable, secure, connected cloud ecosystem to help manage the volume, velocity and variety of healthcare data.
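As a starting point (a sketch only, assuming the AWS SDK for Python, boto3, with credentials already configured; check the current HealthLake API documentation before relying on it), creating a dedicated FHIR data store might look like this:

```python
import boto3

# Assumes AWS credentials and region are configured in the environment.
healthlake = boto3.client("healthlake")

# One dedicated, HIPAA-eligible store for patient records, rather than
# health data scattered across general-purpose databases.
response = healthlake.create_fhir_datastore(
    DatastoreName="mhealth-app-records",
    DatastoreTypeVersion="R4",  # FHIR R4 is the supported standard version
)
print(response["DatastoreId"], response["DatastoreStatus"])
```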

WHY PRIVACY COMPLIANCE CAN BE A BOON FOR BUSINESS

When it comes to privacy, market signals are loud and clear:

  • Privacy regulation isn't going anywhere. If anything, it's only getting stricter and broader in scope, and eager regulators are vigorously handing out hefty fines for non-compliance.
  • Global consumers are willing to walk away from brands that don’t respect their privacy, particularly when it comes to medical data.

The bottom line is that if your mHealth app targets users globally or in any of the above jurisdictions, you'll need to prioritise compliance with a growing web of privacy and data protection regulations.

Failing to have robust privacy protections can not only lead to massive fines but can impact your:

  • Ability to gain your customers’ trust
  • Brand reputation
  • Chances of obtaining VC funding
  • Regulatory approvals or licences
  • Ability to enter new markets

As Ariel Katz, CEO and co-founder of H1, a health tech platform, points out, when competitors weren’t GDPR compliant, his company was able to swoop in and win over clients with a comprehensive global data privacy infrastructure.

Privacy today is no longer seen as an unpleasant chore but rather as a competitive differentiator and a business driver. While the growing regulatory burden has caused some companies to exit certain markets or completely shut down, others are rising to the challenge, setting up robust internal data protection processes and building privacy-first products. It’s time for mHealth apps to do the same.
