Working with Children’s Data in Digital Health

Shawn Flaherty
Tranquil Data
Jan 18, 2023

I previously wrote about the explosive growth in digital health, and the trust and transparency gap here. Digital health companies that build solutions for, or whose solutions are accessible by*, children face an adjacent set of challenges worth exploring in more detail. Companies building solutions for children have an exciting opportunity based on the attributes of those patient cohorts, but also a host of unique challenges based on children’s aptitude to consent, their digital acumen, and the involvement of family or other personal representatives in the care journey.

*It’s worth calling out that even if a digital health company believes it is not targeting or working with children, there is precedent that it might be doing so without knowing it, and the FTC is paying attention. More on that here.

Opportunities

The market for digital pediatric health solutions is nascent, but investor interest has grown over the past few years. Per 7 Wire Ventures, private investment grew from $205M in 2020 to $300M in 2021, with companies like Brightline, Akili, and Cognoa leading the pack. As in the broader digital health market, most of the funding and progress is in mental health.

The bad news, or perhaps the opportunity, is that the prevalence of mental health issues among U.S. adolescents and young adults has risen sharply over the last decade. There is a growing consensus that these trends are connected to the rise in technology use. Digital health offers a counterbalance: a chance to use technology for good. The good news for digital health companies focused on children is that Generation Z is, and will continue to be, more educated and digitally native than earlier generations. As this generation matures into adulthood, digital health has the opportunity to be widely accepted, much like mobile solutions were by the mobile-native Millennial generation before them.

Challenges

It is a cultural norm that children have an elevated right to protection in all aspects of life, including healthcare and privacy. Part of the motivation is philosophical — we inherently care more about how our children are treated. Another motivation for protection is aptitude — at what age can children meaningfully understand consent and its associated consequences?

While the federal government has struggled to come up with a national privacy framework, it did so for children over 20 years ago with the Children’s Online Privacy Protection Act (COPPA). COPPA includes a number of protections for children, including plain-language notice, parental consent, strict rules for data sharing, the right to review and delete (much like the GDPR), data minimization, and minimum security requirements.

Part of the opportunity for digital health is its ability to reach a larger audience than traditional care through digitally native solutions. This allows digital health companies to scale solutions to more users across geographies with less friction than traditional care. Comparing and contrasting traditional therapy with digital therapeutics is illustrative of this advantage. Traditional therapists must be registered in each state in which they practice. There is little reciprocity between states, creating a barrier to entry for therapists looking to expand across state lines. In contrast, digital therapeutics can reach clients in all 50 states with far less friction.

However, scaling digital therapeutics across state lines isn’t without regulatory challenges, especially for those focused on children. Much like HIPAA, COPPA is a regulatory floor, not a ceiling. This has left room for individual states to create their own privacy rules for children. Unfortunately, states have created a patchwork of inconsistent laws that protect children’s health information for certain categories (e.g., mental and reproductive health), at varied ages, and when adolescents have certain attributes (e.g., emancipation, pregnancy, or marital status). The problem gets more dynamic as children age, families move across state lines, and care situations change.

The historical lack of a solution to this complexity has led digital health companies (as well as incumbent payers and health systems) to some bad practices. Some ignore state-law nuance; others use blunt instruments to sidestep the complexity. One common instrument is creating a single login for each child with no associated parent login. Anyone building a purpose-built solution for children understands the active role parents play in children’s healthcare, and that usernames and passwords are often created by parents or shared with them. For purpose-built children’s solutions, this practice is akin to willful ignorance and clearly violates children’s privacy. The lack of a parent login may also run afoul of the HIPAA mandate to give personal representatives access to the health information of those under their charge.

Another blunt instrument is to mandate that children waive their right to privacy, or attest that they are of a certain age, on signup. This is also dangerous, as it may run up against explicit COPPA mandates for plain-language notice and consent frameworks. See here for precedent on the FTC regulating this type of instrument.

Both ignoring regulation and building blunt instruments are poor choices. While regulators almost always lag the rate at which technology progresses, it will not be long before state regulators and the FTC wake up. When they do, digital health companies focused on children will be the first to fall under the microscope. Of note, there is also a push from groups like the AMA, Congress, and academics (x1, x2, x3) for increased regulation and enforcement in digital health.

Recommendations

Working with children in digital health is a hard problem, and one that, if not done right, will invite regulatory scrutiny and destroy value. Here are some best practices for digital health companies that work with minors:

· Work with your general counsel or outside counsel to create a landscape of the regulations that affect you (links to helpful resources for FTC, HIPAA, COPPA, and the state law landscape). The good news is that the rules are semi-static and don’t need to be updated often.

· Craft clear, plain-language privacy policies and fine-grained consent interfaces to match the requirements of your regulatory landscape.

· Build the capacity for personal representatives to have adjacent accounts, and build appropriate privacy policies and consent frameworks to match parental and/or personal representatives’ rights (a minimal account-linking sketch follows this list).

· Operationalize the landscape by deploying a set of policies, independent from code, that automate enforcement. If engineering can’t show non-technical roles the policies, their outputs, and the way enforcement works, there’s a black box of risk (see the policy-as-code sketch below).

· Double down on metadata. Policies are semi-static and should only need to be updated when regulations change. Metadata like state of residence and age (which often change) should drive when a policy comes into enforcement. An oversimplified example: capture a user’s date of birth to appropriately age children out of COPPA and various state laws (the policy-as-code sketch below includes this age-out check).

· Treat consents, policies, and policy decisions as first-class datasets — be able to convey what was consented to, what rules were in effect, and what was done with data at any given point in time (see the audit-log sketch below).
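
To make the adjacent-accounts recommendation concrete, here is a minimal Python sketch. The `RepresentativeLink` record, the `AccessRight` categories, and the relationship labels are hypothetical names for illustration, not a standard schema:

```python
# A minimal sketch of "adjacent" accounts: the child and the personal
# representative each get their own identity, linked by an explicit
# relationship record that scopes what the representative may do.
from dataclasses import dataclass, field
from enum import Enum, auto


class AccessRight(Enum):
    VIEW_RECORDS = auto()       # read the child's health records
    MANAGE_CONSENT = auto()     # grant or revoke consent on the child's behalf
    MESSAGE_PROVIDERS = auto()  # communicate with the care team


@dataclass
class Account:
    account_id: str
    email: str


@dataclass
class RepresentativeLink:
    """Links a representative's account to a child's account with scoped rights."""
    child_account_id: str
    representative_account_id: str
    relationship: str  # e.g. "parent", "legal_guardian"
    rights: set[AccessRight] = field(default_factory=set)

    def can(self, right: AccessRight) -> bool:
        return right in self.rights


# Usage: a parent with broad rights; the rights set can be narrowed later.
parent_link = RepresentativeLink(
    child_account_id="child-123",
    representative_account_id="rep-456",
    relationship="parent",
    rights={AccessRight.VIEW_RECORDS, AccessRight.MANAGE_CONSENT},
)
assert parent_link.can(AccessRight.MANAGE_CONSENT)
```

Because the relationship is explicit data rather than a shared password, rights can be narrowed when, for example, state law gives an adolescent independent control over mental health records.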
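The policy-as-code and metadata recommendations pair naturally, so a single sketch can cover both. COPPA’s under-13 parental consent threshold is real; the state-specific rule, the policy names, and the `Policy` structure are hypothetical simplifications:

```python
# A minimal sketch of policies kept as data, separate from application code,
# with user metadata (date of birth, state of residence) deciding which
# policies apply at any moment in time.
from dataclasses import dataclass
from datetime import date


@dataclass
class User:
    user_id: str
    date_of_birth: date
    state: str  # two-letter state of residence, e.g. "MA"


def age_on(dob: date, today: date) -> int:
    """Whole years between dob and today."""
    return today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))


@dataclass
class Policy:
    """A declarative rule: applies when the user's age is under max_age,
    optionally only in certain states."""
    name: str
    max_age: int
    states: frozenset[str] | None = None  # None means nationwide

    def applies_to(self, user: User, today: date) -> bool:
        in_scope_state = self.states is None or user.state in self.states
        return in_scope_state and age_on(user.date_of_birth, today) < self.max_age


# Policies live as data, so compliance can review them without reading code.
POLICIES = [
    Policy(name="coppa_parental_consent", max_age=13),  # federal floor
    Policy(name="hypothetical_state_minor_rule", max_age=18,
           states=frozenset({"CA"})),  # illustrative only
]


def active_policies(user: User, today: date | None = None) -> list[str]:
    """Names of every policy in force for this user today."""
    today = today or date.today()
    return [p.name for p in POLICIES if p.applies_to(user, today)]
```

Because applicability is computed from metadata at decision time, a child ages out of the hypothetical `coppa_parental_consent` policy automatically on their 13th birthday, with no code change or policy redeploy.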
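Finally, treating consents and policy decisions as first-class datasets can be as simple as append-only records with timestamps and policy versions. The field names here are hypothetical:

```python
# A minimal sketch of consents and policy decisions as first-class,
# append-only datasets, so you can reconstruct what was consented to,
# which rules were in effect, and what was done with data at any time.
from dataclasses import dataclass
from datetime import datetime


@dataclass(frozen=True)
class ConsentRecord:
    user_id: str
    scope: str            # e.g. "share_with_care_team"
    granted: bool
    policy_version: str   # version of the privacy policy presented
    recorded_at: datetime


@dataclass(frozen=True)
class PolicyDecision:
    user_id: str
    action: str           # e.g. "export_records"
    policy_name: str      # the rule that was evaluated
    allowed: bool
    decided_at: datetime


# Append-only logs: records are never updated in place, so history survives.
CONSENT_LOG: list[ConsentRecord] = []
DECISION_LOG: list[PolicyDecision] = []


def consent_as_of(user_id: str, scope: str, when: datetime) -> bool:
    """Most recent consent for this user and scope at time `when`; default deny."""
    relevant = [c for c in CONSENT_LOG
                if c.user_id == user_id and c.scope == scope and c.recorded_at <= when]
    return max(relevant, key=lambda c: c.recorded_at).granted if relevant else False
```

An append-only log preserves the ability to answer “what was consented to, and what rules were in effect, at time T,” which is exactly the question a regulator or auditor will ask.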

This article focuses heavily on risk-mitigation strategies that require real lifting from engineering, a group that is not always built to understand this type of risk. That usually creates a dynamic where legal or compliance roles fight for budget and attention from engineering, especially in startups, where risk can be a tricky sell against competing priorities (see the recent Cerebral complaint for a case where even leadership was allegedly fighting this battle).

At Tranquil Data, we have seen that platforms built on trust and transparency create cross-functional value that far surpasses cost. For example, value can be unlocked in existing data as it becomes clear what can and cannot be done with varied data segments. Marketing can explicitly market trust and transparency to customers. Product teams can understand what new products can be built in line with requirements. Business development can focus on new data-driven partnerships. Sales and account management can show current and prospective customers (especially payers) that you are a better steward of member data than they are. This level of trust and transparency will enable MSA, BAA, and diligence negotiations to close faster, and payers will be more likely to share sensitive and valuable data with your platform.

Investment in trust and transparency should be positioned as much about competitive advantage as it is about risk mitigation, especially when working with children.
