Internet vs Government: Is banning a solution?

DeCode Staff
Published in DeCodeIN
3 min read · Apr 18, 2019

With the Internet playing an ever larger role in our daily lives, from work to information dissemination to recreation, cases of obscene user-posted content are increasingly coming to light. A small number of people use web services irresponsibly, creating a large amount of content that is offensive or unsuitable for young minds. The Indian government has resorted to banning and moderating the services on which this content thrives. One example is the ban on TikTok earlier this week, after the platform gave free rein to pornographic and violent content. Without sufficient checks and balances in place, the government felt the need to step in.

Is there a better solution?

Criticism over hate speech, extremism, fake news, and other content that violates community standards has the largest social media networks strengthening policies, adding staff, and reworking their algorithms. Meanwhile, the government is working to regulate and remove content it deems unfit for wide audiences. But there is another solution. In a recent white paper, the New York University Stern Center for Business and Human Rights suggested a different option based on its research: moderation by the social media companies themselves, with limited government involvement.

When companies act responsibly to regulate the content posted on their platforms, fewer interventions from the government will be needed. The report, Harmful Content: The Role of Internet Platform Companies in Fighting Terrorist Incitement and Politically Motivated Disinformation, looks specifically at political propaganda and extremism. While the group says social media platforms shouldn't be held liable for such content, its research suggests the platforms can, and should, do more to regulate it. We can start with some basic steps and hope that they sort out the current issues. Read on!

Cooperation on No Go Zones

The first issue is cooperation. Extremist content, instructional terrorist material, and fundraising campaigns for terrorist groups can be found across all parts of the internet, with varying degrees of accessibility. Regulation of the internet will therefore only be possible with the cooperation of multiple government agencies, private sector companies, and end users, particularly when it comes to removing harmful or hateful material and content that threatens national security, and doing so in a timely manner.

Platform Responsibility

We can only determine the legal liability of online platforms for the content they host after deciding how best to deal with unacceptable online content, namely content of an extremist and/or illegal nature. By setting a baseline response time for acting on objectionable content, we can ensure that every piece of reported content has action taken on it (including a decision to take no action) promptly. This would limit the spread of content promoting violence, abuse, and the sharing of private information. Governments should retain the right to act against platforms only when they fail to meet these norms in a timely manner.

Transparency

For their part, technology companies need greater transparency from the government about its definitions of terrorism, obscenity, and other objectionable content. If the definitions are vague, so will be their interpretation. Clear definitions will help platforms work through disputes consistently and resolve problems as they arise down the line.

Some companies are already taking steps to control the spread of objectionable content on their platforms. However, the fear of government intervention should remain just that: fear. Fear is a good motivator to get platforms to act faster and more responsibly. Unfortunately, this is a fine line often crossed by governments that default to heavy-handed bans, as in the case of TikTok. Earlier bans on pornographic websites haven't held up in court. It remains to be seen whether these actions are simply posturing or something more substantial.
