Crowdsourcing Crowd-Control: A New Frontier

Nimses · Dec 28, 2018 · 4 min read

Major social media platforms have become places of mistrust. This mistrust is top-down more than it is peer-to-peer: Silicon Valley fears and mistrusts the behavior of the average user more than average users fear each other. Under government pressure, Silicon Valley has rightfully begun taking more responsibility for the kind of content that appears on its platforms. Yet in this noble gesture, the elitism of the system is apparent.

A centralised monitoring system suggests that the public cannot be trusted, that it is gullible, that it creates and shares fake news and cannot discern between legitimate and inappropriate content. The public is often presented as a grey mass that must be protected because it cannot protect itself.

This is fundamentally flawed. Social media of all forms, be it Facebook, Instagram, Twitter or even new platforms such as Nimses, will always be communities where the loudest dominate. The loudest (i.e. the most active) are so visible that we forget the silent majority is even there. But it is the silent majority that is often the most reasonable and moderate. They fear the judgement of their peers if they share a controversial article or post a boring photo. These are the people we should trust. The time they spend on social media is a vast, purely passive resource; what if we activated just a small part of it to make the entire society cleaner and more trustworthy? Let’s create a social media ‘neighbourhood watch’: the ‘police’ in California already have too much on their plate.

Nimses, an integrated social-media and economic platform, is taking a new approach to content control: it has decided to crowdsource content policing. The app’s users not only report content; they can also be the ones to ‘judge’ whether a post should be removed and what ‘disciplinary’ measures should be taken.

Nimses has guidelines for inappropriate content (hate speech, pornography, etc.), like any platform. It encourages users to report posts that break these guidelines, like any other platform. However, when a post is reported, it is not reviewed exclusively by an over-burdened support team. Instead, the ‘case’ is reviewed by a ‘jury’ of trusted users of the app, who independently judge whether the complaint was justified according to the guidelines.
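
To make the flow concrete, here is a minimal sketch of how such a report-and-jury process could work, assuming a simple majority vote. The names, the quorum size and the voting rule are illustrative assumptions, not Nimses’ actual implementation.

```python
from dataclasses import dataclass, field
from enum import Enum, auto
from typing import List, Optional


class Verdict(Enum):
    """Outcome of a jury review of a reported post."""
    JUSTIFIED = auto()    # the complaint matched the guidelines
    UNJUSTIFIED = auto()  # the complaint did not hold up


@dataclass
class Report:
    """A user's complaint about a post, plus the jurors' independent votes."""
    post_id: str
    reporter_id: str
    reason: str                                      # e.g. "hate speech", "pornography"
    votes: List[bool] = field(default_factory=list)  # True = juror finds the complaint justified


def jury_verdict(report: Report, quorum: int = 5) -> Optional[Verdict]:
    """Return a verdict once enough independent jurors have voted.

    Returns None while the jury is still short of a quorum; otherwise a
    simple majority of the votes decides whether the complaint was justified.
    """
    if len(report.votes) < quorum:
        return None
    justified_votes = sum(report.votes)
    if justified_votes > len(report.votes) / 2:
        return Verdict.JUSTIFIED
    return Verdict.UNJUSTIFIED
```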

How does Nimses incentivise them to do this? Well, the whole Nimses platform revolves around ‘nims’, a currency based on the value of one minute of a human life. The jury is rewarded in nims for being upstanding citizens. If you make a justified complaint, you gain nims; if you are the one who posts something deemed inappropriate, you are fined in nims. And if you make a rogue, unjustified complaint, you will be fined as well.
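
In code, these incentive rules might look something like the sketch below; the nim amounts and function names are placeholders chosen for illustration, not Nimses’ real values.

```python
from typing import Tuple


def settle_case(complaint_justified: bool,
                reporter_balance: int,
                poster_balance: int,
                reward: int = 10,
                fine: int = 10) -> Tuple[int, int]:
    """Apply the nim incentives after a verdict.

    A justified complaint rewards the reporter and fines the poster;
    an unjustified ("rogue") complaint fines the reporter instead.
    The amounts are placeholders, not real Nimses figures.
    """
    if complaint_justified:
        return reporter_balance + reward, poster_balance - fine
    return reporter_balance - fine, poster_balance


# A justified complaint moves nims from the offender to the reporter.
print(settle_case(True, reporter_balance=100, poster_balance=100))   # (110, 90)
# A rogue complaint costs the reporter instead.
print(settle_case(False, reporter_balance=100, poster_balance=100))  # (90, 100)
```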

In most cases it never gets this far: the content does not ‘go to court’ because the offender simply agrees they made a mistake, removes the post right away and pays a minimal fine. The magic of the ‘nim’ is that it incentivises users to take care of the community. Users are rewarded for good behavior in the same way that they are rewarded for good content, and they are rewarded by their peers, not by a centralised body. Of course, the Nimses support team is always available for complicated cases, but allowing users to take part in the maintenance of their own community means that they are more invested in the platform and, crucially, more discerning in how they consume content.

Imagine giving each tabloid reader a share of the editorial responsibility. They would make sure every ‘t’ was crossed and that every fact was checked. They would engage with the content as a co-creator, not just as a consumer. This is the environment that Nimses seeks to create.

Nimses is still in its relatively early days, but the ‘Court’ feature has already been a great success among its users. Users are actively engaging with content, reporting inappropriate material and volunteering to be on the ‘Jury’ for cases. A little bit of trust goes a long way. A little bit of nim incentive doesn’t go amiss either. Nimses combines the two by crowdsourcing content control. This is a social network built on responsibility and respect. You too can take part.
