The EARN IT Act: Sen. Graham & Sen. Blumenthal’s Anti-Security Bill That Will Backfire
Senators Lindsey Graham and Richard Blumenthal are political opposites, but they found common ground when it comes to giving the government new powers to access our devices and digital data — even when we’ve chosen to encrypt our data. Unfortunately for Americans, that bipartisan belief could be enshrined into law if the duo’s recent legislation makes its way to the president’s desk.
The EARN IT Act (Eliminating Abusive and Rampant Neglect of Interactive Technologies) gives the Attorney General new powers to access private communications and dictate how online services operate.
Businesses that don’t follow the Attorney General’s preferred practices could face billions in lawsuits and liability under federal and state law. In essence, the EARN IT Act forces businesses to create backdoors to encrypted data, devices, and services — undermining Americans’ privacy and chilling free speech online.
The big picture:
Although billed as a tool needed to fight online child exploitation, the EARN IT Act is chock full of unintended consequences that will actually make it harder to fight child exploitation. And as a backdoor to encryption, it’ll leave Americans’ digital security vulnerable to attack.
If passed, the EARN IT Act would:
- Make it easier for bad actors to attack and disclose Americans’ digital lives;
- Drive child predators underground and out of law enforcement’s reach;
- Undercut efforts to detect and combat child exploitation online;
- Stifle legal speech online; and
- Entrench large online services and make it harder for new competitors to enter the market.
Senators Graham and Blumenthal surely didn’t intend these harms when they drafted EARN IT. And surely no rational American would disagree with the goal of fighting online child exploitation and abuse. But like many well-intended laws, the EARN IT Act does more harm than good.
Here’s how the EARN IT Act causes more harm than good:
- The bill expands legal liability for businesses like Dropbox, Etsy, Pinterest, Craigslist, Facebook, and Twitter. Under the act, businesses will be held liable when users violate 18 U.S.C. § 2252 (child-exploitation crimes) and those businesses are allegedly “reckless” in preventing it. In other words, businesses will be held liable for what others do.
- Under the act, a business acts “recklessly” when it doesn’t comply with the government’s best practices. So if encryption is “reckless,” then businesses could encrypt only if they’re willing to face billions in civil damages under federal and state law.
- Businesses can’t accept that risk, so encryption would effectively be lost.
- Without encryption, vulnerable groups — including children — would be put at risk. Child predators, for example, could more easily hack a family’s iCloud account and download photos of young children. Or hack a video monitoring system that helps parents keep watch over their young children at home.
- At the same time, weakened encryption would risk adults’ security, too. Everything from medical records to emails would be exposed to new security threats.
Undermining Encryption Protections
The EARN IT Act creates a backdoor that undermines true encryption. Senator Graham, who chairs the Senate Judiciary Committee, and other members of the committee, including Senator Blumenthal, have repeatedly tried to prevent businesses from providing encryption services. These encryption systems are used for many purposes, including securing private conversations, personal financial transactions, and confidential corporate documents.
Unfortunately, Senators Graham and Blumenthal agree with U.S. Attorney General Barr that businesses must intentionally weaken their encryption so that the government has a “backdoor” into their users’ devices, accounts, and conversations.
But businesses repeatedly told the government that they don’t currently have the ability to let the government in through a backdoor without also letting in malicious hackers. With public concern over data privacy and security growing — and with ever-more sophisticated attempts by Russia, China, and others to attack America’s cybersecurity — businesses have responded by moving to encryption.
But the EARN IT Act undermines that effort. First, it creates new liability for businesses that seek to secure their users’ data. At the same time, the EARN IT Act provides a “safe harbor” from liability only when businesses comply with “best practices” created by a new commission of unelected individuals.
But this group could easily turn these optional-in-theory-only best practices into a backdoor to encryption by simply declaring that businesses violate the best practices by offering encryption to users.
The EARN IT Act therefore gives businesses the choice of accepting billions in liability or providing a backdoor to the government. Given the choice between protecting users’ security and going out of business entirely, many businesses would abandon encryption efforts.
Losing full encryption would be dangerous for every American. Encryption may be a deceptively simple concept, but it is a strong defense of our data. The EARN IT Act threatens encryption protections used to secure:
- Photos that parents take of their newborns or young children, and store on their phones or online in cloud accounts;
- Videos recorded by baby-monitoring devices installed in schools and nurseries;
- Sensitive corporate documents stored in Google Drive or Microsoft Office 365;
- Confidential communications between attorneys and their clients;
- Survivor-to-advocate communications between victims of domestic violence and those who may be able to help;
- Private communications between reporters and their confidential sources;
- Whistleblower complaints sent to authorities or to media;
- Text messages between significant others;
- Health records and lab results that are sent and stored online; and
- Banking and investment records.
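For readers who want to see what “encryption” means at the byte level, here is a minimal sketch: a one-time pad, the simplest cipher with a formal security guarantee. This is a teaching toy only, not production cryptography; real services use vetted schemes such as AES-GCM, and nothing here reflects any particular company’s implementation.

```python
import secrets

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    """XOR each plaintext byte with the matching key byte (one-time pad)."""
    assert len(key) >= len(plaintext), "pad must be at least as long as the message"
    return bytes(p ^ k for p, k in zip(plaintext, key))

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    """XOR is its own inverse, so decrypting repeats the same operation."""
    return bytes(c ^ k for c, k in zip(ciphertext, key))

message = b"lab results: all clear"
key = secrets.token_bytes(len(message))  # random key, used exactly once

ciphertext = encrypt(message, key)           # unreadable without the key
assert decrypt(ciphertext, key) == message   # round-trips with the key
```

The debate over the bill maps directly onto this sketch: a “backdoor” is any second way to recover `message` without `key`, and once such a path exists, it exists for attackers as well as for the government.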
As our lives become ever-more digital, encryption not only protects our privacy but also protects our security. Even if someone withdraws from today’s digitized society, that person likely has personal information stored by a third party somewhere online. So everyone benefits from encryption.
And encryption protects children from predators. But the EARN IT Act’s attack on encryption makes it easier for child predators to gain access to a family’s photo albums. Or to learn about that child’s daily routine. Or to watch that child inside the home. Or to steal that child’s personal documents.
To be sure, encryption makes it harder for law enforcement to access digital evidence of child exploitation and abuse. Tech businesses like Facebook flagged over 12 million unencrypted images of potential child exploitation for the government last year. So by banning or weakening encryption, the government would have an easier time collecting digital images as evidence. But that outcome relies on unrealistic premises.
The EARN IT Act ignores these realities:
- Child predators will adapt and move to encrypted services developed abroad;
- We still don’t provide enough resources to investigate and prosecute predators and abusers;
- Law enforcement’s other investigatory powers, including witness interviews, are sufficient; and
- Tech businesses will keep improving best practices to detect, report, and combat illegal content.
EARN IT also raises important questions: Does the United States really trust that China won’t try to hack American business records or hack Americans’ personal files (again)? Or that Iran, Russia, North Korea, China — or even an ally — won’t try to hack firms that work with our military and intelligence services?
In other words, the government may initially have an easier time prosecuting crimes without encryption. But a harder time prosecuting crimes is not the same as preventing prosecution altogether. And given that encryption protects children, privacy, and national security, the long-term benefits of encryption outweigh short-term challenges faced by law enforcement.
But wait, how exactly is this bill a “backdoor to encryption”?
EARN IT would hold online services accountable for users’ behavior if the businesses don’t follow the government’s best practices. That would hold true even if deviating from those best practices was deemed necessary to protect children. Again, the EARN IT Act forces businesses to choose between incurring billions of dollars in new legal liability and giving the government a backdoor to their users’ encrypted communications, documents, and images.
EARN IT creates this problem by amending Section 230 of the Communications Decency Act. Section 230 was passed in 1996 and codified traditional liability principles — namely that those who harm others are responsible for their actions. The law was a reaction to courts that considered expanding liability to cover online communications services in the ’90s. The internet was so new at the time — and many judges were so new to using it — that it wasn’t clear how long-held legal principles should apply to bad actors who used the internet to harm others.
Enter Section 230. First, the law made clear that individuals are responsible for their own content, and that online services are responsible only when they actively contribute to creating harmful content. Second, the law sought to encourage online services to develop their own filters and content-curation policies.
In other words, if a website wanted to be “family friendly,” Congress wanted to give that website the power to filter content without getting labeled as a co-creator for any harmful content that might slip through the cracks.
Both supporters and critics of Section 230 mistakenly describe these provisions as a form of “immunity.” In reality, these provisions are just traditional rules of liability enshrined in federal law. (Congress has been enshrining liability rules for over a century.)
Congress also went out of its way to ensure that Section 230 holds online services liable whenever they break the law. In fact, the law explicitly allows prosecutors to go after online services for any violation of federal law. That’s how federal prosecutors took down the notorious sex-trafficking site Backpage.com.
But the EARN IT Act turns this understanding of liability on its head. It conditions 230’s principles on a business’s compliance with whatever best practices the government comes up with. If a business doesn’t comply — and the bill doesn’t fully explain how compliance is determined — the business would face nearly unlimited liability. Indeed, the bill would amend the law to:
- Allow states to prosecute businesses for the real criminals’ behavior if that behavior is illegal under federal and state law; and
- Create a new federal and state cause of action that would let plaintiffs sue businesses for harm those bad actors cause.
To be sure, no one is saying that a business deserves immunity when it breaks the law. But the bill doesn’t merely restate that principle (doing so would be redundant). Instead, it conditions fairness — the traditional notion that those who are actually guilty or responsible are held liable for what they do — on whether businesses do what the government wants.
With billions on the line, businesses will be compelled to follow the best practices. Although the plaintiffs’ bar will love the new avenues for class-action lawsuits, the bill shifts responsibility from the bad actors to others. All Americans will then have to pay the price for those actors’ illegal behavior.
Hm, but “best practices” doesn’t sound so bad, especially if they protect children . . .
True, “best practices” is a seemingly harmless corporate buzzword. After all, most industries develop guidelines for relevant businesses to follow. But the EARN IT Act’s best practices aren’t industry-created and aren’t industry-imposed. Instead, they’d be the product of a government commission stacked with law enforcement and short on tech experts. They also would not be optional, despite the bill’s suggestion that they are.
On paper, the commission seems fairly reasonable. Although stacked with non-tech experts, the commission can recommend best practices to the Attorney General for adoption only if 14 out of the 19 members vote in favor of them.
But here’s the catch: Aside from bypassing the normal notice-and-comment rulemaking process, which is designed to give the public a voice in federal rulemaking, the bill would disincentivize businesses from tweaking or experimenting with new methods of preventing or removing harmful content.
- The best practices will encourage a race to the bottom. As mentioned above, businesses will feel compelled to follow the best practices and will be discouraged from experimenting with new procedures. Under threat of litigation and federal prosecution, businesses won’t want to deviate from the best practices even if their internal research shows that another method may work better in protecting children.
- Although the commission can issue “alternative best practices” that consider a company’s “size, type of product, and business model,” the commission is likely to issue industry-wide, one-size-fits-all best practices. What works best for Facebook is likely different from what works best for Wikipedia, or Yelp, or your local community’s online Neighborhood forum. So those with the best knowledge of how their sites work will have little to no say in developing and implementing procedures that respond best to their unique circumstances.
- Even if the commission issues different best practices for different companies, the bill assumes the commission will develop the best best practices. That’s hardly a given — indeed, the bill applies so broadly that it sweeps in all corners of the internet. Even if the commission were staffed with the most knowledgeable experts, it’s a daunting task to develop best practices that are responsive to the entire internet ecosystem.
- And those best practices will not be responsive to unforeseen developments. Child predators will find ways around those best practices and businesses will be unable to respond in real time if responding would mean deviating — even if ever so slightly — from the best practices. The businesses will instead have to wait for the government to act. And for better or for worse, the government is not known for its agility or its rapid response.
The bill would stifle legally protected speech
As discussed above, the EARN IT Act would condition Section 230’s liability rules on adherence to the government’s preferred business model. But it also contains ambiguous terms that would encourage businesses to take a heavy-handed approach to censoring content out of an abundance of caution.
Without Section 230, we wouldn’t have social media, Wikipedia, Yelp, Pinterest, or even eBay. Section 230 also protects your ability to forward an email without assuming liability for its content. And it protects news organizations that allow readers to comment on stories. Section 230, it bears emphasizing, is responsible for all the innovations behind today’s internet.
But the bill threatens free speech in another way, too. It requires the commission to issue best practices that regulate “child sexual abuse material” (CSAM). And it defines CSAM as having “the same legal meaning as the term ‘child pornography,’ as that term was used in Federal statutes and case law.”
Although child pornography has been illegal for decades, and although it is not protected under the First Amendment, there’s a risk that the commission’s best practices — and businesses’ compliance with them — will be overly broad, sweeping protected speech under the umbrella of illegal content.
Why? Because the definition is coupled with a reduced mens rea. Under existing law, it’s already illegal for online businesses to knowingly let users upload, share, or store child pornography. But under EARN IT, businesses will be liable for reckless behavior. Because recklessness is a much lower standard — no actual knowledge of illegal content is required — businesses will respond by taking a heavy-handed approach to censorship, blocking both legal and illegal content.
Although that’s not certain to occur, very recent history suggests it will. In 2018, Congress tweaked Section 230 to hold online services liable for “facilitating” or “promoting” human trafficking. Opponents of the bill, including NetChoice, pointed out that under existing law, human trafficking was already illegal and that the bill would stifle legal speech.
Two years later, that warning became reality. The D.C. Circuit Court of Appeals recently held that plaintiffs had a credible claim to challenge the law’s constitutionality. Specifically, the Court found that the law’s use of the words “facilitating” and “promoting” was sufficiently ambiguous that it may violate the First Amendment’s Free Speech Clause (overly broad terms that chill protected speech) and the Fifth Amendment’s Due Process Clause (too vague to give notice of what’s illegal).
Although we’ll have to wait to hear the Court’s full analysis on the merits, its opinion is striking.
First, it’s worth remembering that the U.S. Constitution applies only to the government. FOSTA/SESTA was ostensibly aimed at private conduct — the facilitation or promotion of illegal human trafficking. But because the law’s terms are broad and open to multiple interpretations, businesses responded with an abundance of caution: They banned speech that could, from a distance or from an overzealous prosecutor’s perspective, look like “facilitation” or “promotion.” The Court reasoned that although the businesses were private, they were censoring users’ speech only because they feared government prosecution — and that was reason enough for the plaintiffs to have standing to challenge the law’s constitutionality.
The same problem may occur here.
The result? An internet with less free speech. Although less free speech may not trouble everyone, it should concern us all that the government can coerce private actors into censoring legal speech, even if it’s only because the businesses are (sensibly) overly cautious. Curtailing free speech out of fear of the government is a loss for every American.
And again, child pornography is already illegal under federal law, and Section 230 makes clear that online businesses are subject to prosecution under federal law. But with a reduced mens rea and with state law enforcement empowered to prosecute, businesses will censor illegal and legal content alike.
Okay, that sounds pretty bad; is there anything else I should know?
Fourth Amendment Concerns: The bill suggests the executive branch develop best practices for “coordinating with non-profit organizations” “to preserve, remove from view, and report material relating to child exploitation or child sexual abuse.” The non-profit organization in mind is almost certainly NCMEC, the National Center for Missing & Exploited Children, a nonprofit created by Congress to act as a clearinghouse before law enforcement gets involved.
Although businesses already report suspected instances of child exploitation and abuse to NCMEC, this may run afoul of the Fourth Amendment, which requires the government to have probable cause before it searches or seizes a person or that person’s constitutionally protected property.
It’s still open for debate whether someone’s digital records are constitutionally protected, let alone when those digital records are shared with a third party like Facebook or Google. But the Supreme Court has found that the contents of someone’s cell phone are protected, and recently found that historical cell phone location data are protected (at least when seven days’ worth of data are at issue). So if the government issues a best practice that requires businesses to coordinate with NCMEC, and if law enforcement doesn’t have a warrant, there’s reason to believe such evidence would be suppressed at trial. That would mean victims of child exploitation and abuse would have an even harder time seeing their perpetrators brought to justice in criminal court.
And this isn’t merely theoretical: The Tenth Circuit, in an opinion written by then-judge Neil Gorsuch, found that NCMEC was a government entity and that the Fourth Amendment applied to it.
Mission Creep: Although the EARN IT Act allegedly responds to concerns over child exploitation, which is already governed by federal criminal law, it nudges the commission toward regulating online speech in ways that have nothing to do with exploitation — and that the Supreme Court has already struck down. For example, the bill suggests the commission develop best practices for “employing age limits and age verification systems,” “employing age ratings and related disclosures,” and “offering parental control products that enable customers to limit the types of internet websites and content accessible to children.”
These suggestions underscore the bill’s breadth: This isn’t a narrow attempt to protect children from exploitation. It’s another example of Congress’s repeated attempts to regulate the internet in ways that violate the First Amendment. It also underscores the potential for mission creep: At some point, the commission will view everything related to the internet as something that falls within its ambit.
Anti-competitive Concerns: Section 230 allows start-ups and small tech firms to compete against larger competitors. Without 230’s liability rules, these firms will struggle to innovate and compete. And that means the law will either drive these firms out of the market or will prevent them from entering it to begin with. Given Congress’s renewed interest in antitrust enforcement and competition, it’s curious that Congress would want to entrench market players and add a significant barrier to entry for their competitors.