Why re-consent is hard, and what the FTC’s latest enforcement action means

Shawn Flaherty
Tranquil Data
Sep 18, 2023 · 5 min read

The Federal Trade Commission recently reached a settlement with genetic testing company 1Health. The settlement included monetary penalties, 20 years of stringent information security requirements, and independent third-party assessments.

The key FTC claim was that 1Health retroactively changed its privacy policy in a way that was considered “material,” and should have obtained customers’ re-consent to the changes. Specifically, before April 2020, the company’s privacy policy allowed limited sharing of personal information with medical professionals and business partners. Later that year, it expanded the scope of sharing to a wide range of third parties, such as pharmacies and supermarkets, for marketing purposes, and applied the change to customers who had consented under the earlier policy.

Under FTC case law and policy, companies must provide prominent disclosures and obtain opt-in re-consent before using consumer data in a “materially different manner” than was originally disclosed. What counts as “materially different” has been open to interpretation for over a decade. In 2010, the FTC sought comment “on the types of changes companies make to their policies and practices, and what types of changes are regarded as material.” After the comment period closed, the FTC published guidance that did little to settle the matter: “at a minimum, sharing consumer information with third parties after committing at the time of collection not to share the data would constitute a material change. There may be other circumstances in which a change would be material, which would have to be determined on a case-by-case basis, analyzing the context of the consumer’s interaction with the business.”

The FTC’s guidance on what constitutes a “material change” sounds a bit like the famous “I know it when I see it” test: not very helpful. The guidance does not elaborate on its one example of materiality, nor does it provide criteria for what else may or may not be material. A simple interpretation is that as a data use expands, some intervention is required: picture materiality on the x-axis and the strength of the intervention (e.g., notification or re-consent) on the y-axis, rising together. The slope of that line is unknown, so which actions fall above and below it is also unknown.

The safest practice is to require re-consent whenever any change is made, regardless of materiality. In practice, companies are constantly developing new products and services that involve new data uses, and working with partners in new ways. While safe, re-consenting users on every change would be a significant disruption to them.

The re-consent process is also a massive lift for the engineering teams charged with operationalizing the changes. Depending on the scope of the change and the complexity of the company, the required work ranges from weeks to months, and from tens of thousands to millions of dollars of effort.

The opposite extreme is to try to avoid the need for re-consent entirely with blanket privacy policy language reserving the right to change the policy at any time. As covered by The Verge, this means that “even if you do a careful read of a privacy policy before signing up … and even if you feel really comfortable with that policy — sike, the company can go back and change that policy whenever they want.”

The right approach lies somewhere between the two extremes of “doing nothing” and “always re-consent.” Finding it requires collaboration and alignment between product, marketing, and compliance. These stakeholders need to agree on when notification or re-consent is required, and when it is not. A simplistic example of one such framework is sketched below.
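As a minimal sketch, and only as an illustration: the change categories and the mapping from category to intervention below are assumptions a team might start from, not legal guidance and not a description of any particular product. The one case the FTC has named explicitly — sharing with third parties after promising not to — lands squarely in the re-consent bucket.

```python
from dataclasses import dataclass
from enum import Enum


class Intervention(Enum):
    NONE = "no action"
    NOTIFY = "prominent notification"
    RECONSENT = "opt-in re-consent"


@dataclass
class PolicyChange:
    """Describes how a proposed policy revision differs from the consented version."""
    new_third_party_sharing: bool   # sharing with parties not named at collection
    new_purpose: bool               # use for a purpose not disclosed at collection
    new_data_categories: bool       # additional data elements collected
    clarification_only: bool        # wording changes with no change in practice


def required_intervention(change: PolicyChange) -> Intervention:
    """Map a proposed change to the intervention this illustrative framework requires.

    The thresholds are assumptions for the sake of example, not legal advice.
    """
    if change.new_third_party_sharing or change.new_purpose:
        return Intervention.RECONSENT
    if change.new_data_categories:
        return Intervention.NOTIFY
    if change.clarification_only:
        return Intervention.NONE
    return Intervention.NOTIFY  # default to the cautious side for anything else
```

The value of writing the framework down, even this crudely, is that product, marketing, and compliance are forced to agree on the categories and the mapping before a change ships, rather than after.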

Once the logical framework is established, the team needs to collaborate with engineering to understand the time and costs associated with operationalizing the required intervention. Relevant processes that need to be considered include:

· Building a baseline by reconstructing the history of consents to understand what has been consented to, by whom, and when

· Using the baseline to ensure that only the necessary users re-consent at the right time to avoid consent fatigue

· Planning, testing, and staging the intervention

· Building the ability to fork users’ data into separate groups (e.g. those who have re-consented and those who have not), to move users into the “re-consented” bucket as re-consents are received, and to ensure that only those who have re-consented have their data collected or used in the respective new ways (a simplified sketch of this gating logic follows this list)

· Ensuring that the logical framework is always on, and always correct

· Creating a first-class data set for consent that accurately reflects what was consented to at any given time, under any given version of the consent, and the details of how data was used under each version of consent

· Building transparent audit capabilities that allow users and all external stakeholders to access comprehensive information, including timing, consent versions, and data use
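To make the last few items concrete, here is a minimal sketch of a first-class consent history and a gating check that forks data use by consented policy version. The record shape, version labels, permitted uses, and in-memory log are illustrative assumptions, not a description of any particular system:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional


@dataclass(frozen=True)
class ConsentRecord:
    """One immutable entry in a user's consent history."""
    user_id: str
    policy_version: str          # e.g. an earlier "2020-04" vs a later "2020-11" policy
    granted_at: datetime
    revoked_at: Optional[datetime] = None


# Stand-in for the first-class consent data set described above.
CONSENT_LOG: list[ConsentRecord] = []

# Which policy version first permits each use; also an assumption for illustration.
USE_REQUIRES_VERSION = {
    "share_with_medical_professionals": "2020-04",
    "share_for_marketing": "2020-11",
}


def active_consent(user_id: str, at: datetime) -> Optional[ConsentRecord]:
    """Return the latest consent the user had granted (and not revoked) at a point in time."""
    candidates = [
        r for r in CONSENT_LOG
        if r.user_id == user_id
        and r.granted_at <= at
        and (r.revoked_at is None or r.revoked_at > at)
    ]
    return max(candidates, key=lambda r: r.granted_at, default=None)


def may_use(user_id: str, use: str, at: Optional[datetime] = None) -> bool:
    """Gate a data use: only users whose consented policy version covers it pass."""
    at = at or datetime.now(timezone.utc)
    record = active_consent(user_id, at)
    if record is None or use not in USE_REQUIRES_VERSION:
        return False
    return record.policy_version >= USE_REQUIRES_VERSION[use]
```

Under these assumptions, a user whose only consent record is on the earlier policy version passes the check for the originally disclosed uses but fails for the new ones, which places them in the “needs re-consent” bucket until a newer consent record arrives.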

Given the thousands of engineering hours required to do this right, it is no surprise that many companies try to sidestep the process with the kind of blanket privacy policy language described above. What we now know from the 1Health enforcement action is that such practices violate FTC policy if the change is material.

The good news: building a robust and flexible consent process is about more than regulatory avoidance. 75% of customers want to be asked to re-consent before their data is used for a new purpose, 80% want to be able to opt out of sharing some or all of their data, and 46% consider other brands when it is unclear how their data is handled.¹ These statistics show that investing in a robust consent program builds trust with potential customers, strengthening the top of the customer acquisition funnel. Trust is also a series of promises kept: as you engage users in their privacy choices, their trust grows, and so does their willingness to share valuable data about themselves.

The team at Tranquil Data has been talking about consent for years. We saw regulatory, market, and user consciousness growing, and designed a fundamentally new solution that puts scalable automation, consent, transparency, and audit at the heart of data platforms. Re-consent is a great example: our product handles all the complexity of the re-consent process with a simple checkbox.

If consent, or any of the regulatory frameworks that include consent, is a topic of interest, we would love to talk: info@tranquildata.com

  1. Why Digital Trust Truly Matters, McKinsey & Co. (2022); Patient Perspectives Around Data Privacy, AMA (2022)
