Section 230 Needs to be Adapted for Changing Times

Edwin Covert
11 min read · Sep 5, 2021


Photo by Karolina Grabowska from Pexels

Congress passed the Communications Decency Act (CDA) in 1996 to “protect the public from the misuse of the telecommunications network and telecommunications devices and facilities” (Communications Decency Act, 1995). Specifically, Section 230 of the CDA “protects providers and users of interactive computer services from liability for defamatory content posted to their platforms by third parties” (Murcia, 2020, p. 235). While the US Supreme Court ruled large parts of the CDA unconstitutional, it held Section 230 itself to be constitutionally valid (Reynolds, 2019).

This article will review several aspects of Section 230. First, it will review the potential users affected by Section 230. The article will then examine the ethical issues around Section 230, specifically which ethical framework best suits reviewing the section. From there, it will pivot to an abbreviated legal analysis of Section 230, contrasting the legal findings with the previous ethical considerations. Finally, this article will propose mitigations to address legal and ethical concerns arising from the section.

Users Affected Under Section 230

Section 230 was Congress’s attempt to bridge competing interests. On one hand was the newly emergent Internet and the potential it provided for unfettered communication and commerce. On the other hand was the desire to protect society’s most vulnerable members, namely children (Leary, 2018). As Leary (2018) notes, some see Section 230 as a key element of what the Internet is today; others see that as the exact problem. They see it enabling sex trafficking, child pornography, and exploitation, giving equal opportunity to the legal and illegal economies of the world alike (Leary, 2018; Carney, 2018). This is because Section 230 states, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider” (Cornell University, n.d.).

Section 230 protects many categories of users whom society feels require special protection. For example, it provides liability protection for providers who host protected speech, such as activists who create potentially controversial content, enabling the free expression of ideas (Electronic Frontier Foundation, 2021). It also protects more mundane content, such as uploaded YouTube videos, Instagram pictures, and comment sections on websites (Electronic Frontier Foundation, 2021). Section 230 does this by ensuring content hosted by a third party cannot be removed just because someone does not appreciate it; legal action must be directed against the actual creator of the content. Of course, there are exceptions, such as the Digital Millennium Copyright Act (DMCA), which allows providers to avoid liability for copyright infringement if they remove the offending content within a certain time limit (Reynolds, 2019). The DMCA, however, is focused on intellectual property rights.

However, as with any technology-focused law, Section 230 of the CDA is double-edged. For all the protection it affords protesters and regular users looking to comment on a chicken cordon bleu recipe, it also allows for the posting of nefarious content with no ramifications for the hosting provider. Section 230 enables sex and human trafficking via many Internet-enabled methods: “[t]he impunity for facilitating sex trafficking that the Internet offers goes beyond advertising to include so called ‘hobby boards,’ where purchasers rate prostituted people and victims of trafficking as they would rate a restaurant on Yelp — except with graphic, vulgar, and violent detail” (Leary, 2018, p. 572). This is because the court system has broadly interpreted Section 230 to create immunity for hosting websites for all content, legal and illegal (Leary, 2018). Clearly, the range of users affected by Section 230 is broad; it covers all users of the Internet in the United States (Cornell University, n.d.).

Ethical Analysis of Section 230

When discussing ethical issues, it is valuable to have a framework for gauging the actions undertaken. Velasquez et al. (2015) offer five such frameworks through which to view the goals and effects of Section 230: utilitarian, rights, fairness/justice, virtue, and common good. The utilitarian argument says that to be ethical, an action or set of actions should create the greatest good for the greatest number of people or the least harm for the fewest people (Velasquez et al., 2015). A rights-based approach begins with the idea that “humans have a dignity based on their human nature per se or on their ability to choose freely what they do with their lives” (Velasquez et al., 2015, para. 5) and that these rights need to be protected via the actions undertaken and laws enacted.

The fairness/justice view of ethical questions starts with the simple idea that people should be treated equally or, barring that, fairly, while the common good view states that actions should be performed in the best interest of the community at large (not to be confused with the utilitarian definition of the greatest good for the greatest number of people) (Velasquez et al., 2015). Finally, the virtue approach requires that actions or laws be consistent with idealistic concepts such as honesty, courage, and tolerance so society can maximize its potential (Velasquez et al., 2015).

When considering Section 230, the virtue framework clearly does not apply, as the ideals to which it aspires are not universally defined and agreed upon; courage and tolerance, for example, mean different things to different audiences. However, the other frameworks are all viable candidates for reviewing Congress’s actions and the effects of the law. The utilitarian approach, with its focus on the greatest good for the greatest number of people (and the converse about harm), makes sense: Congress was attempting to limit the harm to hosting platforms and make sure content was widely accessible.

The rights framework for ethical consideration says individuals have dignity, and that dignity is derived from choosing how to live one’s life: “the best ethical action is that which protects the ethical rights of those who are affected by the action” (Bonde & Firenze, 2013, para. 12). Here, the question of whose rights are being protected arises. Did Congress enact Section 230 to protect the right of commenters to express themselves freely, or did it enact Section 230 to protect the business rights of platforms hosting content? The text of the statute makes no claim to protect rights (only the potential of the technology as perceived in 1996), so perhaps the rights approach is not the best framework (Cornell University, n.d.).

Under a fairness/justice approach to evaluating Section 230, all content is treated equally. Specifically, clause A(4) states the Internet is a “benefit to all Americans” (Cornell University, n.d., para. 5); therefore, Americans benefit from unfettered access to the content covered by Section 230. Perhaps treating both legal and illegal activity equally is fair, but is it just? Intuitively, this argument feels wrong. Letting human traffickers post, as Leary (2018) alleges, with the same impunity as someone commenting on the latest Britney Spears legal drama seems to tear at the collective sense of right and wrong. However, as Bonde and Firenze (2013) note, the fairness/justice approach does not treat the consequences of actions as a relevant concern.

The last framework under consideration is that of the common good. This view focuses on the idea that “actions should contribute to ethical communal life” (Bonde & Firenze, 2013, para. 9). Here, Congress determined the community to be the whole of the nation. However, there are subcommunities that need to be discussed. The community of content creators and commenters engaging in legal free expression coexists with the community of criminals using the Internet to facilitate crime. Are both considered equally? Congressional intent on this aspect is unknown based on the text of Section 230.

When viewed within the context of the four viable frameworks, it is clear the utilitarian approach is the best one through which to view Section 230. Congress was attempting to maximize the value the Internet could bring to the lives of everyday Americans, as it represented “an extraordinary advance in the availability of educational and informational resources” (Cornell University, n.d., para. 2). Through the lens of utilitarianism, the facilitation of the free expression of ideas creates the greatest good.

Legal Analysis of Section 230

Under Section 230, hosting platforms are not considered publishers of content and are immune to legal action, unlike traditional media (Cornell University, n.d.; Burke, 2011). This is due primarily to a Fourth Circuit Court of Appeals interpretation in Zeran v. America Online (Burke, 2011). In its ruling, the court held that a print newspaper could be liable for defamation if it printed a defamatory letter to the editor, but a website could not (Burke, 2011). The law defines defamation as the sullying of one’s character or standing (Garner & Black, 2021).

Prior to the enactment of Section 230, under a case known as Stratton Oakmont, hosting providers became liable for content if they exercised editorial control over the content on their platforms (Burke, 2011). With the passage of Section 230, platforms were no longer liable for defamation because they were not considered publishers in the traditional sense (Cornell University, n.d.). Many, including Leary (2018) and Burke (2011), feel the courts are misinterpreting Section 230. As Butler (2000) notes, the original intent of the law was to “prevent the Internet from becoming a ‘red light district’ and to ‘extend the standards of decency which have protected users to new telecommunications districts’” (pp. 251–252).

If this was the original intent of Congress, some consider it a failure. According to Carney (2018), “buying a child for sex is as easy as ordering a pizza online” (p. 353) via sites such as Backpage.com. While Backpage.com is now defunct (the US Department of Justice eventually seized it for facilitating prostitution under the US Travel Act (Loew, 2021)), sites like it are still up and selling sexual encounters online (Gamiz, 2019). Under Section 230, Backpage.com and its brethren can claim they are not responsible for the content posted on their websites because they exercise no editorial control, i.e., they are not publishers as defined by the law (Carney, 2018). State courts have also ruled against those seeking a remedy to remove offending or defamatory material from platforms (Murcia, 2020).

Ethical Versus Legal Considerations

There is a fundamental difference between ‘can’ and ‘should.’ Section 230 illustrates that difference in very stark terms. Under the law, courts can interpret a web platform as possessing immunity from the effects of posting defamatory material or hosting illegal activity. However, should they? Perhaps Congress should lean more into the utilitarian nature of the argument rather than take what Barque-Duran et al. (2017) suggest is a half-hearted deontological approach, one “prompted by the emotional content of a given dilemma” (p. 184).

If the argument under utilitarianism is for the greatest good or the least amount of harm, Congress’s efforts could be seen as a failure. By removing the ability to hold anyone accountable for defamatory material and illegal activities, Carney (2018) suggests, Section 230 de facto promotes abusive behavior toward women, who make up roughly 80% of prostituted people (Lubin, 2012) and are nearly twice as likely as men to be targets of online abuse (Duggan, 2020). How does this reconcile with the least-harm goal when it appears Congress privileged Internet expansion as a greater good for the country?

Mitigating Ethical Concerns

The obvious question to ask is how to address these concerns about problematic content while still fostering the innovation the Internet made possible. One solution was FOSTA-SESTA, signed into law by President Trump in 2018. Congress designed this law to address the challenges Leary (2018), Carney (2018), and others raise. It amended the CDA to “not stand in the way of civil and criminal action against those sites that violate sex trafficking laws” (Carney, 2018, p. 365). However, recent reports on FOSTA-SESTA’s effectiveness show it is struggling; federal law enforcement has brought only a single case to court using it (Pexton, 2021). Of course, this change in the law brought unintended consequences as well. FOSTA-SESTA drove many voluntary sex workers underground and made their lives more dangerous (Tung, 2020). One solution to address this unintended consequence is to legalize voluntary sex work, i.e., prostitution, thus removing it from the reach of FOSTA-SESTA.

However, FOSTA-SESTA was not designed to deal with the defamatory content issues (at least under California law) that Murcia (2020) notes. While not as vile as child sex trafficking, reputational damage to one’s livelihood and standing can have significant consequences (Murcia, 2020). Murcia (2020) proposes a solution that would allow those claiming defamation to seek injunctions against platforms that host defaming material. Lacking the benefit of a formal legal education, this author cannot determine whether the proposal would pass constitutional muster, but it seems appropriate.

Conclusion

Section 230, while crafted with arguably good intentions, has caused several problems through its legal interpretation in the courts. First, it allows defamatory content to remain online because there is no legal way to force platforms to remove it. This has specific ramifications for those targeted by the defamation. Second, it prevents federal law enforcement from targeting online sex trafficking (and remedies to this problem have proven less than useful). Section 230’s effectiveness can be improved in these two areas by clarifying Congressional intent through additions to the law around defamation and by replacing the dragnet approach to all sex work with a focus specifically on the illegal trafficking of humans. Section 230 served a purpose at the birth of the Internet, but it needs to be adapted for changing times.

References

Barque-Duran, A., Pothos, E. M., Hampton, J. A., & Yearsley, J. M. (2017). Contemporary morality: Moral judgments in digital contexts. Computers in Human Behavior, 75, 184–193. https://doi.org/10.1016/j.chb.2017.05.020

Bonde, S., & Firenze, P. (2013, May). A framework for making ethical decisions. Brown University, Science and Technology Studies. https://www.brown.edu/academics/science-and-technology-studies/framework-making-ethical-decisions.

Burke, M. (2011). Cracks in the armor?: The future of the Communications Decency Act and the potential challenges to the protections of section 230 to gossip web sites. Boston College of Law, 17.

Butler, C. (2000). Plotting the return of an ancient tort to cyberspace: towards a new federal standard of responsibility for defamation for internet service providers. Michigan Telecommunications and Technology Law Review, 6(1). https://repository.law.umich.edu/mttlr/vol6/iss1/6/.

Carney, E. (2018). Protecting internet freedom at the expense of facilitating online child sex trafficking? An explanation as to why CDA’s Section 230 has no place in a new NAFTA. Catholic University Law Review, 68(2), 353–378.

Communications Decency Act, S. 314, 104th Cong. (1995). https://www.congress.gov/bill/104th-congress/senate-bill/314/text.

Cornell University. (n.d.). 47 U.S. Code § 230 — protection for private blocking and screening of offensive material. Legal Information Institute. https://www.law.cornell.edu/uscode/text/47/230.

Duggan, M. (2020, September 18). Online harassment 2017. Pew Research Center: Internet, Science & Tech. https://www.pewresearch.org/internet/2017/07/11/online-harassment-2017/.

Electronic Frontier Foundation. (2021). Section 230 of the Communications Decency Act. https://www.eff.org/issues/cda230.

Gamiz, M. (2019, April 6). Backpage is gone, but a more graphic version is again fueling prostitution busts in the Lehigh Valley. The Morning Call. https://www.mcall.com/news/breaking/mc-pol-sex-trafficking-backpage-shutdown-skip-the-games-20190322-story.html.

Garner, B. A., & Black, H. C. (2021). Black’s law dictionary. Thomson Reuters.

Leary, M. G. (2018). The indecency and injustice of section 230 of the Communications Decency Act. CUA Law Scholarship Repository, 41, 553–622.

Loew, M. (2021, August 30). Trial begins Wednesday in the case against Backpage.com founders. AZFamily. https://www.azfamily.com/news/investigations/cbs_5_investigates/trial-begins-wednesday-in-the-case-against-backpagecom-founders/article_37997c1e-0952-11ec-8367-33fa6e6b719a.html.

Lubin, G. (2012, January 17). There are 42 million prostitutes in the world, and here’s where they live. Business Insider. https://www.businessinsider.com/there-are-42-million-prostitutes-in-the-world-and-heres-where-they-live-2012-1.

Murcia, E. A. (2020). Section 230 of the Communications Decency Act: Why California Courts interpreted it correctly and what that says about how we should change it. Loyola of Los Angeles Law Review, 54(1), 235–274.

Pexton, P. B. (2021, June 29). Advocates for sex workers vindicated in Section 230 debate by new report. Roll Call. https://www.rollcall.com/2021/06/29/advocates-for-sex-workers-vindicated-in-section-230-debate-by-new-gao-report/.

Reynolds, G. W. (2019). Ethics in information technology. Cengage Learning.

Tung, L. (2020, July 10). FOSTA-SESTA was supposed to thwart sex trafficking. Instead, it’s sparked a movement. WHYY. https://whyy.org/segments/fosta-sesta-was-supposed-to-thwart-sex-trafficking-instead-its-sparked-a-movement/.

Velasquez, M., Moberg, D., Meyer, M. J., Shanks, T., McLean, M. R., DeCosse, D., Andre, C., & Hanson, K. O. (2015, August 1). A Framework for Ethical Decision Making. Markkula Center for Applied Ethics. https://www.scu.edu/ethics/ethics-resources/ethical-decision-making/a-framework-for-ethical-decision-making/.
