Why every digital health startup should hire a privacy expert

Here’s a radical thought: In the current Age of Privacy, privacy product managers are a must-have. Particularly in digital health.

Rachel Dulberg
CodeX
7 min read · Oct 25, 2021


Image by jcomp — www.freepik.com

Digital health startups are transforming the future of healthcare. If you’re reading this, you’re probably at one of the more than 8,000 healthcare startups across 18 countries that are building groundbreaking products and helping improve healthcare outcomes, quality and access.

The 150 leading global digital health startups for 2020. Source: Digital Health 150: The Digital Health Startups Transforming The Future Of Healthcare

Healthcare innovation has massive potential. But it’s also one of the most difficult industries to disrupt. It involves a complex web of incumbent stakeholders (service providers, clinicians, payers, patients, researchers etc.). It’s known for its deep fragmentation and data silos. And it’s a highly regulated sector.

Obviously, getting the tech and product right is paramount. As is figuring out the path to commercialisation, finding product-market fit, navigating regulatory approvals and hiring a stellar team.

But many growing digital health companies seem to lack an important — and newly emerging — role on their leadership teams: a privacy expert. And by that I don’t mean a legal or security subject-matter expert (although you may need those too). Think of it more as a privacy product manager.

What’s a privacy product manager?

A privacy product manager oversees the full lifecycle of how privacy is (or isn’t) observed within an organisation. The role is similar to that of a traditional product manager, and to the more recent specialised “data product manager” and “AI product manager” roles now common in Silicon Valley.

The idea is that instead of treating privacy like a risk factor or a compliance burden, it’s treated like a product.

The privacy product manager acts as the in-house privacy expert and is generally hired with a strong combined background in privacy law and regulation, product management and technology.

The privacy PM’s key function is to balance the product strategy, governance and implementation of anything privacy- and ethics-related, and to facilitate conversations between all relevant stakeholders: the executive/leadership team, engineers, analysts, product teams, marketing teams, regulators, external partners and customers/users.

Just as traditional PMs were said to represent the user’s voice in the product development lifecycle, privacy PMs represent the end-user’s trust expectations. Their job is to ensure that values such as privacy and ethics are practically implemented in your organisation and embedded into your product. They’re also your company’s internal and external privacy evangelist. This is far broader than simply providing legal and regulatory advice, although that’s obviously part of the value-add.

5 reasons why building a privacy-first product is key in digital health

Privacy is an emerging issue that goes well beyond compliance: it cuts across the software development lifecycle, engineering, product design, data flows, analytics, marketing and tech architecture. In short, privacy is a foundational element that needs to be factored into every aspect of your product’s design and commercialisation path, particularly in digital health. Here’s why:

1. Medical data is a high-stakes game

We’re living in the “Age of Data Privacy”. The world’s biggest brands are moving beyond privacy as a risk factor and treating privacy and data ethics as a brand-critical USP. Nowhere is this more relevant than in digital health, where companies trade in the most sensitive of personal data.

Given the sensitivity of clinical data, digital health is a high-stakes game. The consequences of unethical or unauthorised data sharing, prediction errors in medical AI models or data breaches could be dire.

Digital health companies, unlike other tech companies, cannot afford to move fast and break things.

A clumsy slip-up like Facebook’s Cambridge Analytica data scandal would be ruinous in the healthtech space: it could directly harm people in a very real way, and lead to hefty regulatory fines and legal liability claims.

2. Trust will be a critical feature of digital health products

Google, Apple and Facebook have come full circle: from monetising personal data and dismissing privacy as passé ten years ago, to pouring huge amounts of cash and resources into shiny new privacy agendas.

This is all driven by significant shifts in consumers’ privacy expectations when it comes to data control, transparency and ethics. Digital trust is becoming a board priority — digital businesses that lead in privacy, data ethics, security, and reliability are expected to be the titans of tomorrow.

3. Regulatory approvals & healthcare procurement will depend on privacy

In April 2021, the Australian Therapeutic Goods Administration (TGA) issued its revised cybersecurity guidance for pre- and post-market medical devices, including software as a medical device (SaMD) and AI-driven devices. The guidance advises healthcare providers’ procurement teams to ask detailed questions about a developer’s privacy and security protocols, including:

  • How is data from the device logged and stored?
  • Are third-party cloud services used and, if so, what are their privacy and security policies?
  • Is the data stored onshore?
  • How will the developer respond if a data breach occurs in the future?

In the US, the FDA’s newly proposed “Software Precertification Program” and its “Artificial Intelligence/Machine Learning (AI/ML)-Based Software as a Medical Device (SaMD) Action Plan” both highlight that the FDA’s evaluation of a developer’s safety and quality standards will now include security, privacy and data protection systems, processes and controls.

Also, the FTC’s recent policy statement on digital health app developers, connected devices and other health products aims to protect users’ privacy: developers must now comply with the Health Breach Notification Rule, whether or not they’re covered by the HIPAA privacy and security rules.

4. Privacy compliance is key in digital health

Recent years have brought a raft of new privacy regulations in more than 60 countries, motivated by the need to protect consumers’ data in the digital age and inspired by the EU’s General Data Protection Regulation (GDPR) and the huge fines it entails.

The GDPR imposes a raft of strict obligations on anyone who collects, stores and uses health data (a category that is very broadly defined), including the obligation to embed privacy by design into your product from the outset.
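What does embedding privacy by design look like in practice? As a hedged illustration (the field names and key handling below are hypothetical, not a prescription), a Python ingestion step can apply data minimisation and keyed pseudonymisation so that downstream systems never see direct identifiers:

```python
import hashlib
import hmac
import os

# Hypothetical key for pseudonymisation. In production this would come from
# a secrets manager, outside the analytics environment, so pseudonyms can't
# be reversed by the people analysing the data.
PSEUDONYM_KEY = os.environ.get("PSEUDONYM_KEY", "demo-key-only").encode()

# Data minimisation: the only fields this (hypothetical) product needs.
ALLOWED_FIELDS = {"age_band", "diagnosis_code", "observation_value"}

def pseudonymise(patient_id: str) -> str:
    """Replace a direct identifier with a keyed, one-way pseudonym."""
    return hmac.new(PSEUDONYM_KEY, patient_id.encode(), hashlib.sha256).hexdigest()

def minimise(record: dict) -> dict:
    """Drop every field the product doesn't need and swap in the pseudonym."""
    slim = {k: v for k, v in record.items() if k in ALLOWED_FIELDS}
    slim["subject"] = pseudonymise(record["patient_id"])
    return slim

print(minimise({
    "patient_id": "MRN-000123",
    "full_name": "Jane Citizen",  # dropped: direct identifier
    "age_band": "40-49",
    "diagnosis_code": "E11",
    "observation_value": 7.2,
}))
```

The point isn’t the specific code, it’s that minimisation happens at the boundary, by default, rather than being bolted on later.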

The Australian government, for example, is in the throes of bringing in wide-ranging privacy law reforms that will align Australia’s dated privacy laws with the GDPR, impose much higher penalties for breaches and directly impact the digital health industry.

But there’s also a complex web of new global laws, regulations and standards on the horizon that will govern how AI companies use data and have a huge impact on the global medtech sector.

5. Gaining access to quality data is critical

Medical AI products are only as good as the data that fuels their models. They require close, long-term collaborations with hospitals, labs and clinicians, who are under strict legal, regulatory and ethical obligations when it comes to handling patient data.

To foster partnerships with healthcare stakeholders, whether for data access or for product evaluation and validation (such as this recent multi-jurisdictional collaboration between Rhino Health, a startup, and 20 hospitals across five continents), you’ll need robust privacy technologies and practices deeply embedded into your product or platform’s architecture (in addition to security).
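One example of a privacy technology that can live at the architecture level is federated learning, where model updates travel between sites instead of raw patient records. Here is a minimal, purely illustrative sketch of federated averaging in Python (the numbers and site structure are hypothetical, and real platforms add secure aggregation, governance and much more):

```python
import numpy as np

def federated_average(site_weights: list[np.ndarray],
                      site_sizes: list[int]) -> np.ndarray:
    """Combine models trained locally at each hospital, weighted by dataset
    size, so raw patient records never leave the site that holds them."""
    total = sum(site_sizes)
    return sum(w * (n / total) for w, n in zip(site_weights, site_sizes))

# Hypothetical example: three hospitals train locally and share only weights.
local_models = [np.array([0.2, 1.1]), np.array([0.4, 0.9]), np.array([0.3, 1.0])]
dataset_sizes = [1200, 800, 2000]
global_model = federated_average(local_models, dataset_sizes)
```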

Whether your product helps with process or service delivery automation, diagnosis and treatment, clinical trials, disease management or virtual care delivery, you will likely have multiple sensitive data touchpoints. Your business partners won’t just want to know that your product works reliably; they’ll also expect it to be private, ethical and secure.

What can a privacy expert do for your digital health startup?

Having bulletproof privacy practices isn’t just useful for dealing with regulators and ticking the compliance box. It also makes good business sense. A privacy expert could:

  1. Incorporate privacy & ethics into the product development & design lifecycle: ensure you’re adhering to privacy-by-design principles and that potential biases are properly controlled, both during model training, testing and validation, and once the product is in production, through continuous monitoring.
  2. Operationalise privacy — they’ll develop clear standard procedures, policies and systems to incorporate privacy into your data collection, development and design process, handle user requests for removal or transfer of personal data (see the sketch after this list) and prepare your infrastructure environments for these future needs.
  3. Drive growth — transparency and ethics are already becoming a competitive advantage and help build trust in your product. Particularly with medical AI, both clinicians and patients will want to understand how the model reached its prediction or why a particular treatment or action is recommended. This could impact adoption, reputation, industry collaborations, data access and market penetration.
  4. Help fundraising/investment — VC due diligence looks at data privacy, particularly in the medtech sector, and this could affect your future valuation and ability to enter new markets. When it comes to private equity investment in medtech (for more mature companies), PE firms will look at how you acquire and repackage proprietary data as a key indicator of scalability. Data privacy regulations, the need for patient consent for data sharing, intellectual property, data quality or simply a lack of customer participation often prevent companies from achieving significant growth.
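To make point 2 concrete, here is a minimal, hypothetical Python sketch of what handling user requests for data export and erasure can look like. The in-memory store is a stand-in: in practice, every system that holds personal data (app database, analytics, backups, third-party processors) needs its own handler.

```python
import json
from datetime import datetime, timezone

# Stand-in for your real data stores.
USER_RECORDS: dict[str, dict] = {}
AUDIT_LOG: list[dict] = []

def export_personal_data(user_id: str) -> str:
    """Data portability: return everything held on the user in a portable format."""
    return json.dumps(USER_RECORDS.get(user_id, {}), indent=2)

def erase_personal_data(user_id: str) -> None:
    """Erasure: delete the record, keeping an audit entry (without the data itself)."""
    USER_RECORDS.pop(user_id, None)
    AUDIT_LOG.append({
        "action": "erasure",
        "subject": user_id,
        "completed_at": datetime.now(timezone.utc).isoformat(),
    })
```

The audit trail matters as much as the deletion: it’s what regulators and enterprise customers will ask to see.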

3 tips for hiring privacy experts into startup leadership roles

Look for someone with:

  • A broad and diverse skill set — given privacy’s multiple touchpoints across the business and the need to collaborate with a multidisciplinary team, it’s important for your privacy expert to be proficient in the laws and regulations of all relevant jurisdictions. But they’ll also need a solid understanding of data and product principles, technology, commercial frameworks and the healthcare industry landscape.
  • Startup experience — anyone who’s ever worked in or founded a startup (or watched HBO’s Silicon Valley) will know that building the plane in mid-flight is a common feature of startup life. A solid corporate background is useful, but so are adaptability, resourcefulness and the ability to deal with the fast-paced, unpredictable nature of fast-growing businesses.
  • A diverse background — it’s well established that diversity and inclusivity are important. But when building AI products, diversity also helps ensure that bias is controlled and that your product is built with a variety of perspectives and approaches. This is especially relevant if the product is intended to be used by, or for the benefit of, a particular user group or specific patients (e.g. women).

Medical AI companies are doing important, transformational work that will benefit society as a whole. It’s vital that they set themselves up for success by preparing for key market and regulatory shifts and building privacy-first products.


Rachel Dulberg
Privacy, data + product nerd. Former tech lawyer + founder. I write about issues at the convergence of innovation, technology, product & privacy.