Censorship Resistant Moderation
The Decentralized Moderation Paradox — Part 1
It can be argued that a free decentralized platform should be completely open: accept any content and let every user access everything, with no roadblocks or barriers whatsoever. If that is how the majority feels, then moderation is probably unnecessary.
However, most people probably want some degree of control over the content: to ensure proper tagging and categorization, or to prevent children from accessing adult content, for example. Respect for copyright and compliance with laws in general should also be taken into consideration.
While a zero-control policy is definitely a possibility, and will probably be implemented by some projects, this article presents the reasons why some form of content management is necessary for a global media distribution platform.
In the context of blockchain-based platforms, moderation should be decentralized, meaning that the moderation rules have to be created by consensus.
The statement above, “most people probably want some degree of control”, is vague and can hardly justify any action. “Most people” and “some degree” would need to be quantified. Although we intuitively know it to be true, let’s be a bit more precise about this specific point.
Decentralized moderation is all about building consensus, but the first consensus to be reached is indeed whether or not moderation is required in the first place. The community will have to agree on the degree of control to be exerted over the content. Using an arbitrary scale, we could imagine the content control levels to range from “0: no control at all” to “100: full control”.
“Most people probably want some degree of control” means that it is extremely likely that the consensus content control level will be greater than 0.
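As an illustration, suppose each community member votes for a preferred control level on the 0–100 scale above, and the consensus level is defined as the median of the votes. This is a minimal sketch; the function name and the median rule are assumptions of this example, not any project’s actual protocol:

```python
from statistics import median

def consensus_control_level(votes):
    """Hypothetical consensus rule: the median of all members'
    preferred control levels on the 0 (none) to 100 (full) scale.
    The median resists manipulation by extreme minorities better
    than the mean would."""
    if not votes:
        return 0  # no votes cast: no control is imposed by default
    return median(votes)

# Even if most members only want light control, the consensus
# level ends up strictly greater than 0.
votes = [0, 5, 10, 10, 20, 30, 100]
print(consensus_control_level(votes))  # 10
```

Note how a single extreme vote (0 or 100) barely moves the outcome; the result reflects the middle of the community, which is the point of requiring consensus rather than letting any one party set the level.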
By not offering any method of moderation, a platform would make the decision for the community, impose its own views, and lock the content control level down to 0. Not giving the users a choice would effectively replicate the very centralized operation mode that these platforms set out to combat.
The response to too much moderation (censorship) is not the absence of moderation. Moderation needs to be inclusive and decentralized in order to give everybody a voice and ensure that real consensus is reached.
We all live in lawful societies. Whether or not we individually agree with the laws in place and the way they have been legislated is very subjective and beyond the scope of this article. The fact is that laws exist, and they are enforced by governments (more or less strongly depending on where you live).
Although it is frequently perceived as such, the law is not (just) a way for a government to impose its will on its citizens. In most cases, laws are aligned with the general (local) consensus. Additionally, “the law” could refer to a court ruling in favor of an individual, ordering a piece of content to be removed. In that case, a decentralized platform might want to comply with the court order and respect the citizen’s privacy, right to be forgotten, and so on.
In theory, a decentralized platform has the potential to ignore the law. However, the law remains a critical aspect to take into consideration when designing a content management strategy. Attracting the attention of law enforcement would surely hinder the development of a platform.
Let’s not forget that there is still a lot of R&D going into decentralized platform architectures, and a lot remains to be figured out. The bulletproof media distribution solution doesn’t exist yet. And while on paper decentralized projects cannot be tampered with, in reality they are still fragile. Today, a powerful enough adversary would still be able to shut most platforms down, or dramatically reduce their reach.
An interesting alternative could be a moderation system that takes the law into account, but still gives the community a say and allows it to overrule the decision.
If content censored by a government is made available on a platform, it would be with the support of the community, giving the platform more legitimacy to distribute it. More importantly, such a capability would let the platform remain censorship resistant, by giving the last word to the community.
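One way to sketch such an overrule mechanism: a legal takedown request (a court order, for instance) flags the content, but the content is only removed if the community fails to reach a supermajority in favor of keeping it. Everything below, including the two-thirds threshold and the function signature, is an illustrative assumption, not any project’s actual protocol:

```python
def takedown_decision(legal_takedown_requested, keep_votes=0, total_votes=0,
                      overrule_threshold=2 / 3):
    """Decide whether a piece of content stays up.

    legal_takedown_requested: a court order or law-based flag exists.
    keep_votes / total_votes:  community vote on keeping the content.
    overrule_threshold:        supermajority needed to overrule the
                               takedown (an illustrative value, to be
                               set by community consensus).
    """
    if not legal_takedown_requested:
        return "keep"  # nothing to decide
    if total_votes > 0 and keep_votes / total_votes >= overrule_threshold:
        # The community gets the last word: the content stays up,
        # backed by explicit community support.
        return "keep (community overrule)"
    # Default behavior: comply with the takedown request.
    return "remove"

print(takedown_decision(True, keep_votes=80, total_votes=100))
# keep (community overrule)
```

The key design choice is the default: absent a strong community signal, the platform complies with the law, which keeps it out of trouble; censored content is only restored when the community explicitly stands behind it.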
There is a mindset in the decentralized world that opposes anything that could benefit big, powerful corporations. Copyright sometimes falls into that category, as it is one of the main tools used by big movie studios and major record companies to secure their revenue streams.
Still, copyright enforcement shouldn’t be perceived as incompatible with decentralized solutions. On the contrary, if a platform wants to distribute major shows, movies, songs, or books, it will have to guarantee that copyright is respected and that the revenue associated with a piece of content is directed to its rightful owner(s). As advertised! For big and small content providers alike.
Copyright is often associated with big productions because they are more likely to see their content illegally distributed. But it is equally important for smaller content creators. YouTube is launching its “Copyright Match” tool, which finds full re-uploads of original videos on other YouTube channels. Similar features will be necessary for platforms whose mission is to provide better monetization and protect content creators.
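To give an idea of what re-upload detection involves, here is a deliberately naive sketch that compares exact content hashes. Real systems such as YouTube’s matching tools rely on perceptual audio/video fingerprints that survive re-encoding and cropping; the hash-based approach below only catches byte-identical copies, and all names are assumptions of this example:

```python
import hashlib

def fingerprint(video_bytes: bytes) -> str:
    """Toy fingerprint: SHA-256 of the raw bytes. Production systems
    use perceptual fingerprints instead, which match content even
    after re-encoding; this only detects byte-identical re-uploads."""
    return hashlib.sha256(video_bytes).hexdigest()

def find_reuploads(original: bytes, uploads: dict) -> list:
    """Return the ids of uploads whose fingerprint matches the
    original's. `uploads` maps an upload id to its raw bytes."""
    target = fingerprint(original)
    return [uid for uid, data in uploads.items()
            if fingerprint(data) == target]

original = b"...original video bytes..."
uploads = {"chanA/vid1": b"...original video bytes...",
           "chanB/vid9": b"...a different video..."}
print(find_reuploads(original, uploads))  # ['chanA/vid1']
```

On a decentralized platform, such fingerprints could be registered on-chain at upload time, so that ownership claims on a later re-upload can be checked against an existing record.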
Copyright management will be one of the key factors for adoption by the big players. “Providing a better content distribution infrastructure to existing online video providers” has been clearly identified as mission-critical by several projects.
Big movie studios and traditional broadcast channels will eventually join decentralized platforms, and so will infrastructure providers (datacenters, network operators, CDNs, etc.). But those major corporations and their armies of lawyers will not expose themselves to any liability. A platform that can demonstrate an efficient and reliable moderation process is likely to be popular among the top players of the current industry.
However, the risk here is to see big players favor platforms that enable bad old censorship.
The balance between fair moderation and censorship will be extremely hard to find.
Censorship: the suppression or prohibition of any parts of books, films, news, etc. that are considered obscene, politically unacceptable, or a threat to security.
Moderation:
— General: the action of making something less extreme, intense, or violent.
— Internet: the action of enforcing the rules of a forum.
In order to moderate a collection of content (articles, videos, books), or in other words, to make the ensemble less extreme (cf. the definition of moderation), we need to suppress its most extreme, intense, or violent components. Therefore, the moderation of the whole is achieved by suppressing (or “censoring”) some of the parts.
In many ways, moderation is censorship. This is why the moderation process has to be designed carefully. Moderation inevitably opens the door to bad old censorship.
And this is where the paradox lies. If moderation is censorship, the title of the article: “Censorship Resistant Moderation” now becomes:
Censorship Resistant Censorship
Not possible! But this is indeed what we need!
The goal of a decentralized platform is not, in fact, to be censorship resistant, but rather “centralized power resistant”.
The problem is not that some piece of content is suppressed or censored. The problem arises when a minority makes that decision against the common consensus.
To be continued…
In future articles, we will go through some of the requirements for an efficient and decentralized moderation system. We will also look at some of the technical hurdles that make that task challenging.