Can Terrorists Tweet?

Or, how the judiciary’s interpretation of CDA 230 is shaping the future of user-generated content, censorship, liability, and competition

Michael Kiley
The Startup

--

Update 12/31/2020: This has come to the fore in recent weeks as a result of President Trump’s demands for CDA 230 changes as part of the latest stimulus bill. Enjoy this refresher on the possible ramifications!

This past Tuesday (10/13), the Supreme Court denied a petition for a writ of certiorari in the case of Malwarebytes v. Enigma.

In simpler terms, fewer than four Justices agreed to review the case, so the 9th Circuit Court’s ruling was left to stand. While it may seem like a case the Supreme Court didn’t even rule on can’t have much importance, it is in fact another important step in the federal judiciary’s interpretation of Section 230 of the Communications Decency Act, a statute with broad ramifications for the internet as we know it. Even though the Court did not issue an opinion, its denial represents tacit support for the 9th Circuit’s ruling and one more step on the road to defining the interpretation of CDA 230. Furthermore, Justice Thomas’ published opinion on the case provides a key glimpse at the future of this important legislation.

What is CDA 230?

Section 230 of the Communications Decency Act provides some protection against liability for technology companies dealing in user-generated content.

Its most important points come in 230(c), where the following two protections are provided:

No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

and:

No provider or user of an interactive computer service shall be held liable on account of any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected

Practically, these provisions have been taken to mean that technology companies dealing in user-generated material are not responsible for the content of that material. And if they do decide to censor some user-generated content, they can’t be sued for those actions either, as long as the actions are taken “in good faith.”

As Justice Thomas describes in his comments, the CDA was enacted against the backdrop of pre-internet publishing and distribution law. In that era, “publishers” of information, such as newspapers, were held more responsible for their content than “distributors” of information, such as newsstands or libraries. Section 230 of the CDA serves to extend this system to the internet, ensuring internet companies won’t be held responsible as “publishers” of the third-party content they host.

How has CDA 230 been shaped by the courts so far?

The protections provided by Section 230 of the CDA have come before the federal judiciary in a multitude of cases, providing ample opportunity to determine its interpretation and application. Importantly, it has never made it as far as the Supreme Court; all rulings have come from the Circuit courts or below. The following is a sampling of noteworthy cases:

Fields v. Twitter

In a case opened in 2016, Twitter was sued for providing “material support” to the terrorist organization ISIS by allowing its members to create and use Twitter accounts, along with the direct messaging and information-sharing capabilities that come with them. A district court ruled in favor of Twitter, stating that CDA 230 protected it from being held responsible as a publisher of the terrorists’ content, and furthermore that the plaintiffs did not provide enough evidence of causation. On appeal, the Circuit Court reached the same decision, although it ruled only on the grounds of the causation argument.

Force v. Facebook

In another case attempting to hold a social media company responsible for terrorist activity, Facebook was sued for displaying messages from Hamas encouraging violence in Israel, with the added wrinkle that their news feed algorithms promoted this content to some users susceptible to its message. Force argued that this action by the algorithm rendered Facebook responsible as a publisher. The Second Circuit Court found that Facebook was protected under CDA 230, despite their algorithm’s activity.

(The preceding cases are just two examples of the many suits brought against internet companies attempting to hold them responsible for terrorist activity; further examples include Gonzalez v. Google, Crosby v. Twitter, Pennie v. Twitter, Cohen v. Twitter, and the list goes on.)

Sikhs for Justice v. Facebook

While the previous two cases were about the portion of CDA 230 that protects internet companies from liability for what they do publish, this case was fought over liability for what they elect not to publish. Here, Sikhs for Justice sued Facebook, arguing that the social media company discriminated against the organization by removing its posts. The courts found that Facebook was indeed protected under CDA 230(c) for the removal and could not be sued for discrimination.

Doe v. MySpace

Here, MySpace was sued after a user was sexually assaulted by someone she had met online through the website. The courts found that the social media company was protected under CDA 230 and was not responsible for injuries arising from information exchanged on the website. The protections provided by this ruling, however, were narrowed by the subsequent Doe v. Internet Brands case. There, the courts found that while an internet company cannot be held responsible as a publisher for information that leads to an in-person crime, it can be held liable on a “failure-to-warn” theory if it knows of a plot to commit such a crime and is negligent in warning users.

Goddard v. Google

In this case, it was found that the CDA protects Google from being held responsible for showing third-party ads that lead to fraudulent websites or services.

Gentry v. eBay

eBay was sued when a user marketed and sold forged sports memorabilia on its site. The courts found that Section 230 of the CDA once again applied, protecting eBay from responsibility.

Why is CDA 230 so important now?

As of October 2020, the ten most-visited websites in the US are as follows:

  1. Google.com
  2. YouTube.com
  3. Amazon.com
  4. Facebook.com
  5. Zoom.us
  6. Yahoo.com
  7. Reddit.com
  8. Wikipedia.org
  9. Myshopify.com
  10. Ebay.com

Data pulled from Alexa’s list of top websites by traffic; see the updated list here

What does each of these websites rely on as a cornerstone of the product it offers to the hordes of visitors it receives? User-generated content.

Without the protection provided by CDA 230, these companies could potentially be taken to federal court for every instance of libel, slander, or illegal activity that occurs on their platforms (and we all know there is plenty). Furthermore, any steps they took to curb this activity could expose them to liability as well, as litigants could argue that their free speech rights were being infringed upon.

It is no coincidence that 9 out of these 10 companies are based in the US, as this affords them the protections provided by this statute.

Furthermore, as you might have noticed, certain bits of user-generated content have been making quite the splash in the news lately. Our most recent election cycles have brought the problem of fake news to the national political stage, and social media companies are under increasing pressure to take steps to curtail it. The protection CDA 230 provides for “good faith” attempts to limit content deemed “objectionable” shields these companies from being taken to court by those whose posts are taken down in the fight against fake news.

Finally, Section 230 of the CDA has even extended its influence into the sphere of anti-competition litigation, another issue at the forefront of public debate. This application to anti-competition lawsuits was a key point of argument in Malwarebytes v. Enigma, providing an excellent segue into our final topic…

Malwarebytes v. Enigma and the Future of CDA 230

In last week’s Malwarebytes v. Enigma case, the antivirus software company Malwarebytes was sued by its competitor, Enigma, after Malwarebytes configured its software to block Enigma’s. Enigma argued that this was anti-competitive. The 9th Circuit Court found that the CDA does not protect companies from suits alleging anti-competitive behavior, and thus did not protect Malwarebytes in this case.

Malwarebytes then attempted to bring the case to the Supreme Court, which declined to hear it, letting the 9th Circuit Court’s ruling stand. This in itself is important, as it leaves standing precedent that CDA 230 does not protect internet companies from anti-competition lawsuits. This could prove crucial as public scrutiny of the anti-competitive side of big tech grows to a fever pitch.

The Court did comment on the case and the future of CDA 230 in the form of an opinion from Justice Thomas, in which he outlines why the Supreme Court has good reason to take up a case involving the CDA in the near future. His main points are as follows:

Destruction of Distributor Liability

Justice Thomas states that Section 230 of the CDA was enacted under “specific background legal principles,” i.e., the rules governing publisher vs. distributor liability for traditional print media. He goes on to argue that “this modest understanding is a far cry from what has prevailed in court,” and that the federal judiciary has fallen into the habit of “reading extra immunity into statutes where it does not belong.”

Before the birth of internet social media and the enactment of the CDA, Thomas argues, distributors of content still bore some responsibility for what they distributed; the standard of responsibility was merely less strict than for those deemed publishers. He goes on to state that judicial interpretation of the statute has served to eliminate this distributor liability as well, protecting companies even when they knowingly distribute illegal materials, and he questions whether this was the intention of the law when it was enacted.

Internet companies and their own content

His next area of concern involves what he sees as the freeing of internet companies from liability for their own content. He argues that any alteration of content makes the company a content provider rather than a mere distributor; thus, it should be liable. The courts, however, have repeatedly granted protection to companies even when they modified content. These concerns call to mind Twitter’s recent steps toward flagging fake or questionable posts, which could arguably amount to Twitter participating in the creation of the content.

Freedom to censor

Justice Thomas argues that by interpreting CDA 230 “to protect any decision to edit or remove content,” the courts have “curtailed the limits Congress placed on decisions to remove content.” Internet companies, Thomas worries, can now effectively censor or leave up any content they want, on any grounds. He argues that the language of CDA 230 was never intended to allow such sweeping freedom for distributors of third-party content on the internet.

Freedom from product-defect claims

Finally, Justice Thomas argues that the CDA has been overused in court to protect companies even when the suits brought against them seek to hold them responsible for flaws in their own products, not for the content they display. For example, the CDA has been used to bar suits over the actions of the algorithm that distributes Facebook’s news feed (Force v. Facebook), over a lack of built-in safeguards on a dating app (Herrick v. Grindr), and over product features that encouraged reckless driving (Lemmon v. Snap). In Thomas’ view, this goes well beyond what the law initially intended.

The Future of CDA 230

The protections provided by CDA 230 are undoubtedly crucial to the continued operation of the free internet as we know it today. The extent of their application, however, is still being actively determined by our federal courts, with important repercussions for the future of user-generated content. These protections touch a variety of hot-button issues, from terrorism to discrimination to anticompetitive behavior to the level of responsibility algorithms bear for their actions. Justice Thomas’ opinion has now set the stage for the Supreme Court to take up a case involving this statute and determine the future of this influential legislation.

Sources and further reading:
Justice Thomas’ opinion
Full text of CDA 230, available here
“The Ten Most Important Section 230 Rulings,” Tulane Journal of Technology & Intellectual Property (read the full version here)
Eric Goldman’s Technology and Marketing Law Blog, especially this article on Malwarebytes v. Enigma
Washington Post article on Sikhs for Justice v. Facebook
The Electronic Frontier Foundation, especially this section of the website dedicated to CDA 230
Columbia Global Freedom of Expression post on Fields v. Twitter
