Tech For Good: Making Digital Innovation More Ethical
In January 2019, as part of Global Week at Castilleja School, 12th graders took a four-day workshop called “Tech For Good: Making Digital Innovation More Ethical.”
The workshop was led by Natasha Singer, a technology reporter at The New York Times, and Jen Kagan, a technologist at Coworker.org.
Over the course of the workshop, students studied, discussed and debated the promise, and potential pitfalls, of emerging technologies including facial recognition and digital phenotyping — the use of social media and other non-health data to make health predictions.
At a moment when Congress is preparing to draft comprehensive consumer privacy legislation, the students in the Tech for Good workshop embarked on their own privacy regulation project.
First the class divided into three teams — representing companies, consumers and legislators — and researched state, federal and European privacy rules. Each of the three teams interviewed at least one industry, academic, legal or consumer rights expert. Then each team drafted its own privacy legislation. Finally, the teams negotiated among themselves to reconcile their bills into one privacy law.
Students on the industry team said they wanted to make sure that companies would be free to continue innovating with data. But they also pushed for more transparency around company use of personal data with the idea of bolstering consumer trust. They also insisted that consumers not be given the right to individually sue companies for potential violations of the privacy law, fearing that such a private right of action might open companies up to untenable liability risk.
Students on the consumer team worked to ensure increased data rights for individuals, pushing for consumer rights to access, correct and delete their personal data. In particular, their draft bill gave consumers the right to clear explanations of how companies would use their data as well as the right to be able to access online services even if consumers declined to share their personal data.
Students on the legislative team had the most challenging task, trying to reconcile and balance the priorities of their classmates representing consumers and industry. In the end, the students in the workshop came up with remarkable compromises.
They decided to allow data collection by default for most consumer personal information. Yet they carved out an exception for data they deemed “sensitive,” including health-related, religious, political or precise location data.
They denied consumers an express private right of action — instead directing the Federal Trade Commission to expand its definition of privacy harms to include predictive analytics, or algorithmic decision-making, that could negatively affect a consumer’s life trajectory.
Students also decided to double the F.T.C.’s budget, to fund additional lawyers and technologists to enforce their law. And they gave companies two years to prepare to comply with the law.
What follows is the final version of the students’ privacy law, followed by the initial versions of the law drafted by the consumer, industry and legislator teams.
Final Compromise Version of the Privacy Law:
The CRISPI Act
The Consumer Rights, Integrity, Safety, and Privacy in Information Act
The United States is one of the few industrialized nations to lack a comprehensive data privacy law. This regulation aims to remedy that deficit. This law is intended to give consumers more autonomy over their data while also protecting industry interests.
The CRISPI Act intends to strike a balance between consumers’ privacy rights and continued economic growth in industry. The law gives consumers the right to access their personal data and to understand how companies use it.
To empower the consumer, companies must make clear how they will use a consumer’s personal information before beginning to collect and track that user’s data. Users will have the ability to opt out of data tracking while still being able to utilize the majority of a company’s application or online service.
To support industry, the law will allow sufficient time for companies to comply with these new rules and provide a supportive transition.
Be it enacted that:
Consumers have the right to access and correct their data. Consumers deserve to know what information is being collected about them and how it is being used or shared.
This law awards consumers data autonomy, giving them the right to access and correct copies of data held about them by companies. This law will allow consumers to better understand and control the data being collected about them, enabling greater transparency between companies and consumers. This law will require all companies of more than 400 employees to create a system to provide consumers with a copy of their data and a process by which individuals may request correction of their data.
Companies must clearly explain how consumer data will be used and where it will go. Companies that collect consumer data must clearly and concisely explain where the tracked data will go and how it will be used. In this law, clear and concise language is defined as language that is easily understandable, transparent, and not dense. Consumers should be given this clear explanation before they agree, or decline, to have their data collected.
Just as laws like HIPAA enable people to access their medical records and understand where the data is going, this law gives consumers the right to know how their online data is used and shared. Notwithstanding the differences with medical records that are processed and protected under HIPAA, online data is also deeply personal and merits special protections.
Companies may collect consumer data by default — except for the collection of sensitive data, which will require specific opt-in consent. In order to maintain a smooth, user-friendly experience, companies may collect consumer data by default. The collection of this information would in no way violate the rights of the consumer; and if they felt it necessary, consumers could simply opt out.
However, for sensitive data, companies must ask for consumers’ permission before collecting such information. Such an opt-in system respects consumers’ rights to data privacy, transparency and security. Under this law, sensitive data is defined to include health or health-related, political, religious and precise location data. (Companies may continue to collect country, state or city-level location data).
This law requires companies to minimize the amount of consumer data they collect and store. Companies may record and store only the minimum data necessary to perform the services provided to consumers and may not hold data indefinitely. This clause aims to minimize the amount of data that companies store, limiting the amount of data that would be vulnerable in a data breach.
This law prohibits a “take it or leave it” policy. Companies are prohibited from employing unfair terms that render an app or another digital program ineffective if consumers refuse to share their data.
Individuals cannot be penalized for deciding not to share their private data and should not be forced to decide between relinquishing their data and engaging with a given system, or not using a service.
This policy will ensure that consumer access to online services is equitable and nondiscriminatory. Companies cannot require users to pay an extra fee to keep their data private, a form of differential treatment that would result in inequitable access. This may mean that an app will need to charge all users for services or that the quality of certain data-reliant apps decreases.
Exception: Certain services, like mapping apps, which require data to fully function are exempt from providing full functionality to consumers unwilling to provide their information.
This law prohibits consumers from filing individual lawsuits against companies over perceived harms stemming from the collection, use or sharing of their data. Under the CRISPI Act, consumers will not have a private right of action. This means consumers, either as individuals or as a class, do not have the right to sue companies for alleged violations of the act.
Consumers may instead privately pursue complaints using a company’s arbitration process. This limitation is important because it protects companies from being slandered, flooded with lawsuits, or unduly fined.
This law empowers the Federal Trade Commission to enforce companies’ compliance with this law. In the past, the F.T.C. requested an annual budget of $310 million. This law will more than double that budget, providing the agency with an additional $350 million per year to fund additional technology and legal staff to enforce the CRISPI Act. This act empowers the F.T.C. to enforce the law and also expands the definition of harm under which the agency may pursue violators.
This law defines harm to include any clear potential risks — including the use or analysis of consumers’ data to influence major life-changing decisions such as the selection of or eligibility for jobs, educational opportunities, insurance, loans, or access to health care. Under this law, consumers have the right not to be subjected to algorithmic decision-making when it relates to life-changing effects on consumers’ financial status, health, jobs or education.
This law provides companies with two years to comply with the new rules. To avoid harming business interests, specifically small tech companies and start-ups of fewer than 400 employees, the law will allow sufficient time before going into effect on January 1, 2021.
This law intends to ensure that companies have adequate time to adjust their practices to comply with the legislation, and the law aims to allot enough time for companies to negotiate the details of regulatory implementation. Companies that do not comply with these requirements may be sued and fined by the F.T.C.
Consumer Team Draft Bill
Protection of Consumer Data Act (POCDA)
As there is no existing comprehensive privacy law, there is a need for a federal law to protect the privacy of consumers and their digital data. As of now, companies and large organizations have the ability to collect and distribute consumers’ data without real repercussions.
In order to ensure fair data practices, companies must provide a transparent explanation of data use; consumers must actively consent to share their data; and, following collection of their data, users must have the right to access, delete, and correct their own data. Companies and consumers would be held to fair data practice through consumers’ private right of action as well as oversight and enforcement by the Federal Trade Commission along with state attorneys general.
Be it enacted as follows:
Section 1: Algorithmic Transparency.
When registering for new products or services, consumers must be clearly notified when and how their data will be collected and/or distributed to third parties. Companies should consistently ask for specific, affirmative consent, rather than simply relying on one-time blanket consent during initial registration. Additionally, consumers must be notified whenever a company’s policy regarding data collection or distribution changes. Further, individuals will have the right to access, correct and delete information held about them by companies.
The “Terms and Conditions” page may continue to be the main method of communicating a company’s product terms; however, concerning any and all personal data collection, use, and sharing, a company must separately and specifically ask for affirmative consent to access consumer data. This process will be referred to as “opt-in.” When users register, instead of having to opt out, users will now have the right to opt in to data collection. The above conditions will ensure transparency between the company and consumers, and allow users to regulate their own privacy and exercise control over their own information.
Section 2: Prohibiting “Take it or leave it.”
Companies are prohibited from employing unfair terms that render an app or another digital program ineffective if consumers refuse the sharing of their data. Individuals cannot be penalized for deciding not to share their private data and should not be forced to decide between retaining their data, or ceding their data and engaging with a given system. This policy will ensure that consumer access to online services is equitable and nondiscriminatory regarding their personal privacy preferences. Companies cannot require users to pay to keep their data private, which would result in inequitable access. This may mean that apps need to charge all consumers for services or that the quality of certain data-reliant apps decreases.
Section 3: Enforcement Mechanisms.
In order to hold large corporations and businesses accountable to the fair data practices set forth in this law, individuals are awarded an express private right of action. If a corporation fails to abide by the restrictions outlined in Sections 1 and 2, the consumer may sue the corporation that has violated their privacy rights. This is important because it allows consumers to protect themselves against massive corporations that have the bandwidth to overpower them. Additionally, giving consumers the right to individually sue will hold companies more accountable and give them an incentive to abide by the law due to the potential financial threat that lawsuits by millions of consumers could pose.
The F.T.C.’s Bureau of Consumer Protection (BCP) would be responsible for regulating, and prosecuting violations of, consumer privacy and data protection rights. Additional funding will be directed to the Bureau of Consumer Protection to adequately support compliance with the new regulation. Any companies found to violate the aforementioned law may be fined an amount per violation to be determined by a judge.
Industry Team Draft Bill
The CRISPI Act
Consumer Rights, Integrity, Safety, and Privacy in Information
The United States lacks a comprehensive consumer privacy law. The CRISPI Act remedies this by outlining thorough privacy legislation to protect the rights of consumers, bringing transparency, clarity, and consistency. The CRISPI Act aims to balance the rights of consumers with the interests of industry, simultaneously allowing for privacy, prosperity, and innovation.
BODY OF THE LAW
SECTION 1. Companies should be transparent with regard to what data is collected.
- In order to maintain a smooth, user-friendly experience, companies may collect consumer data by default. The collection of this information would in no way violate the rights of the consumer; and if they felt it necessary, they could simply opt out.
- However, for sensitive data, companies must ask for consumers’ permission before collecting such information. Such an opt-in system recognizes consumers’ rights to security of their personal information.
- Sensitive data includes health information, political affiliations, and religious beliefs and affiliations.
- This provision allows for the best possible outcomes for both consumers and the industry. While the industry will still be able to benefit from data collection, consumers will have autonomy over their personal information, particularly sensitive data.
SECTION 2. Once a company has collected consumer data, the company may use that data internally without obtaining renewed consumer consent. Companies may also use consumers’ personal data for research and innovation purposes in collaboration with other companies they have contracts with.
- In order to be able to innovate, companies must be able to use the data they have collected to conduct research and to test new products and features. Under more stringent provisions that require consumer consent to collect and use each individual piece of data, even for research purposes, users would have a far worse experience and the companies would be stifled in their ability to innovate.
- Flexibility that allows for new industry innovation, and the ability to adapt to the challenges and opportunities that new technologies bring, is a must for a consumer privacy protection law.
- We believe in the protection of consumers from new potential harms. But equally so, technology can benefit so many and save so many lives, and the best innovations of today require room to develop.
SECTION 3. Companies should not be penalized if consumers are not demonstrably harmed by data collection, sharing or use.
- This law prohibits fining companies if consumers are not demonstrably harmed (by identity theft, etc.) by certain collection, use, or sharing of their personal data. For example, if data is exposed during a data breach and sent to other companies but does not impact the consumer in any significantly negative way, then a company may not be fined or penalized.
- The law protects the rights of industry to develop new technologies. Companies should be focused on preventing data abuse and harm rather than on abstract precautions, because the impact of new technologies is unforeseeable.
- This law will allow for efficient innovation without unnecessary concern from consumers about potential negative impacts of technology. As a result of efficient innovation, individuals will be able to reap the benefits of new technologies sooner.
SECTION 4. This law prohibits consumers from filing individual lawsuits against companies over perceived harms stemming from the collection, use or sharing of their data.
- Consumers do not have an express private right of action when it comes to consumer data privacy; meaning they cannot sue a company in public court. All issues must be taken up privately using the company’s arbitration process.
- This law denies individuals the right to sue a company for privacy violations. The law allows consumers to settle their complaints with the company only in a strictly private environment, allowing the company to remain unharmed by the citizen’s complaint.
- This limitation is important because it protects companies from being slandered or unduly fined, keeping the industry’s reputation intact. The industry and consumers will be able to handle matters privately, securing the company’s integrity and individual’s privacy.
- This law will allow for a fair and consistent process of justice for both the consumer and industry. The law ensures protection of the industry while also meeting consumers’ needs.
Legislator Team Draft Bill
Digital Protection Act (DPA)
The United States is one of the few developed countries to lack a comprehensive data privacy law. This law will, for the first time, lay out consumer protection for data privacy nationwide. It will give consumers autonomy over their data, while also protecting the interests of the tech industry.
This law intends to strike a balance between consumers’ privacy rights and continued economic growth in industry. Consumers will have the autonomy to access, delete, and be privy to the usage of their data. To empower the consumer, companies must make clear to users how they will use the personal data they harvest — prior to tracking that user data. Users will have the opportunity to opt out of data tracking while still being able to utilize the majority of the company’s application or service.
To support the tech industry, the law will allow sufficient time for industries to comply with these new rules and provide a supportive transition.
Consumers have the right to access and delete their data.
This law will give consumers autonomy over their data, allowing them access to their data and the right to delete it. Consumers deserve to know what information companies are collecting about them and should have the right to correct their personal data or delete information they do not want public.
This law will allow consumers to better understand and control the data being collected, enabling more transparency between companies and consumers. Consumers should understand and be able to control the data collected about them — even if it means companies receive less information.
This law will require companies to create a system to provide consumers with a copy of their data and a process by which individuals may request correction or deletion of their data.
Companies must clearly explain who will access consumer data and where it will go
Companies that collect consumer data must clearly and concisely explain where the tracked data will go and how it will be used. In this law, clear and concise language is defined as language that is easily readable, transparent, and not dense. Consumers should be given this clear explanation before they are asked to agree, or decline, to have their data tracked. Just as laws like HIPAA give citizens the right to understand where their medical data is going, consumers have the right to know how their online data is used. Despite having different content from the information protected under HIPAA, online data is still deeply personal and deserves to be highly protected.
This law intends to prevent discrimination against consumers who deny an app the right to collect, use and distribute their data (“app” here meaning the medium collecting consumers’ data).
Companies must obtain consumers’ consent before collecting their personal information. If consumers do not consent, they should still be allowed full use of an app. Consumers should be allowed full use of an application even if they decide to exercise their rights surrounding the collection and disclosure of their data. This stipulation ensures that consumers are not coerced into agreeing to a company tracking their data out of fear of not being able to use the service.
This law provides companies with two years for compliance with the new rules
To avoid harming business interests, specifically small tech companies and start-ups of fewer than 400 employees, the law will allow sufficient time before being enforced on January 1, 2021. This law intends to ensure that companies have adequate time to adjust their practices to comply with the legislation, and the law aims to allot enough time that companies can negotiate the details of regulatory implementation. Companies that do not fulfill these requirements may be sued and fined by the F.T.C. Due to this added responsibility, the F.T.C. will be provided with substantially increased funding.
This law requires companies to minimize the amount of consumer data they store
Companies may only record and store data that is the minimum necessary to perform the service provided to consumers. Companies may not hold data indefinitely. This clause aims to minimize the amount of data that companies store and limit the amount of data that would be vulnerable in a data breach.
This law empowers the F.T.C. to enforce companies’ compliance with the provisions described above. The F.T.C. requested a budget of $310 million per year prior to this legislation. The agency would therefore need an additional $350 million annually to fund additional staff to enforce the DPA.