Key lessons for digital health leaders arising out of Meta’s FTC mess

Shawn Flaherty
Published in Tranquil Data
5 min read · May 26, 2023

In 2020, Facebook reached a $5 billion settlement with the FTC over its deceptive practices in sharing data with third parties, most notably Cambridge Analytica. While $5 billion is a significant sum, it amounted to less than 1% of Facebook’s market cap, making it effectively a cost of doing business.

One overlooked aspect of the settlement was the appointment of an independent auditor tasked with evaluating Meta’s commitment to establishing an “effective privacy program.” The program’s primary objective was to ensure Meta’s compliance with relevant regulations on data use and sharing, and to guarantee alignment with what Meta disclosed to users. This is the same kind of program that digital health companies Premom, GoodRx, and BetterHelp signed up for in their respective FTC settlements earlier this year.

In a surprising turn of events, the FTC recently issued an order to Meta demanding proof that it is adhering to the terms of the agreement. The order came after Meta’s independent auditor, Protiviti, found “significant gaps and shortcomings” in Meta’s privacy program, detailed throughout a heavily redacted report. The order shocked Meta: the company has invested billions in its privacy program, and this was the first time the FTC had ever issued such an order after a settlement. Meta fired back, calling it a “political stunt” that came right after “the Commission lost its bipartisan membership.”

Many believe Meta is a political punching bag that is unfairly targeted; others believe Meta is a bad actor that doesn’t care. Those opinions aside, Meta has the same problem almost every digital-native company has: how do you ensure that data is used and shared properly, in line with complex multinational regulations, business agreements, and user consents? And how do you make that process transparent to non-technical stakeholders, to demonstrate that intricate requirements are met? The truth is that Meta is grappling with a complex problem many companies struggle to solve. Getting it right demands a significant investment of time and resources that most companies don’t have. Case in point: in a leaked internal email, Meta estimated it would take 650 engineering years to “have an adequate level of control and explainability over how our systems use data.”

The current best-in-class process for ensuring data is used and shared properly involves documenting requirements (legal advice amounts to “Document, Document, Document!”), implementing access controls, and training the engineers who have access on how to use and share data properly. A significant challenge arises, however, from the disconnect between legal and compliance roles and technical implementation. Legal and compliance professionals typically cannot read or write code, making it difficult for them to verify that their requirements are met. Conversely, engineers struggle to determine whether they have correctly implemented the legal requirements, because they aren’t lawyers. This lack of transparency, combined with the inherent complexity of ensuring appropriate data use and sharing, consistently leads to problems, as witnessed in recent actions against Premom, Twitter, GoodRx, BetterHelp, and TikTok in the UK.

Meta has 30 days to respond to the FTC’s demand, and should it fail to meet that burden, the FTC will move to modify the settlement with some very harsh provisions, including:

1. A blanket prohibition on monetizing data from children and teens under 18, affecting approximately 5% to 10% of Meta’s user base.

2. A pause on the launch of new products, services, or features without written confirmation from their independent assessor that their privacy program fully complies with the order’s requirements and shows no material gaps or weaknesses (a brutal provision).

3. Strengthening existing requirements, including third-party monitoring, data inventory management, access controls, employee training, and self-disclosure of any violations.

As part of their FTC settlements, digital health companies Premom, GoodRx, and BetterHelp committed to annual independent audits of the effectiveness of their privacy programs for the next 20–30 years. While these companies will not face challenges at Meta’s scale, they must allocate time and resources appropriate to their own. Meta, despite investing over $1B annually, failed to meet its obligations. Scaled to their size, GoodRx and Teladoc (BetterHelp’s parent company) would need to spend a minimum of $4 million and $8 million per year, respectively, just to match the level of investment at which Meta still fell short.

To fulfill their obligations, Premom, GoodRx, and BetterHelp (and all digital health companies) should lean on the work Meta has already done, outlined in this leaked internal memo. The memo describes Meta’s approach, and the challenges it faced, in building a system capable of scalably enforcing policy to ensure “an adequate level of control and explainability over how [its] systems use and share data.” If internal teams lack the expertise to build such a program, they should partner with a team of domain experts who can.

The realization of how hard it is to ensure proper data use and sharing inspired the founding of Tranquil Data in 2017. Over the past few years, we anticipated that the FTC would place greater scrutiny on digital health, given the sector’s relatively early stage of development and its inherent need to use and share sensitive data. In response to this trend, we recently introduced a streamlined version of our platform software tailored to the unique requirements of digital health companies.

That anticipation has been borne out by the unprecedented settlements involving three digital health companies since January 1st, and by the increased importance digital health leaders now place on correct data use and sharing. With our purpose-built solution, we provide digital health companies with a robust program that automates responsible data use and upholds the promises made to users. Moreover, our software makes policies and policy decisions transparent, allowing non-technical stakeholders to see that their intricate requirements are met.

If you would like to take the software for a spin, please get in touch.
