User Moderation

Podium
4 min read · May 21, 2019


User moderation is the only solution to social media toxicity. No centralized solution could ever cover the volume of content involved, and AI may never be ready to fill the void.

Previously on Podium:
1. Introducing Podium

But empowering everyone also means empowering the very users who are creating toxicity — how could that possibly work?

We start with the premise that most people are good. If they’re not, then we’re all doomed anyway — so we might as well assume otherwise. But, even if bad actors are a small minority, how do we stop them corrupting any process of user moderation?

Juries

It would be infeasible to have every user vote on every moderation decision. Therefore, when content gets reported, we delegate that judgement to a Jury of users.

Juries are selected at random, and Jury voting is anonymous — so there’s no way for bad actors to target reports they want to corrupt. The only factor in the selection process is each user’s Bias — the measurement of which will be covered in a future article.

As a Jury grows, it keeps track of the aggregate bias of all its members. If that bias skews in one direction, users with a similar bias become less likely to be chosen — keeping the overall Jury bias-neutral, while ensuring every user has the same opportunity to participate.
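
To make the selection mechanism concrete, here is a minimal sketch in Python. The bias scale (-1 to +1), the weighting formula, and the 0.05 floor are illustrative assumptions, not Podium’s published design:

```python
import random

def select_juror(candidates, jury):
    """Randomly pick one juror, down-weighting candidates whose bias
    leans the same way as the jury's current aggregate bias."""
    # Aggregate bias of the jury so far, on an assumed -1..+1 scale.
    aggregate = sum(j["bias"] for j in jury) / len(jury) if jury else 0.0

    def weight(user):
        # Positive when the candidate's bias points the same way as the
        # jury's skew -- such candidates become less likely to be chosen.
        alignment = user["bias"] * aggregate
        return max(0.05, 1.0 - alignment)  # small floor keeps everyone eligible

    pool = [u for u in candidates if u not in jury]
    return random.choices(pool, weights=[weight(u) for u in pool], k=1)[0]
```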

There is no fixed size for a Jury — users will be added continually until a bias-balanced, statistically significant result is reached (with a minimum number of users to ensure a robust result).
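
One way to picture that stopping rule is a simple sequential test: keep adding jurors until the verdict’s confidence interval is tight and clearly excludes a tie. The minimum size, confidence level, and margin below are placeholder values, not Podium’s actual thresholds:

```python
import math

MIN_JURORS = 12   # assumed floor for a robust result
Z = 1.96          # ~95% confidence
MAX_MARGIN = 0.1  # stop once the estimate is this precise

def verdict_is_significant(votes):
    """Return True once enough votes are in for a robust verdict.

    `votes` is a list of booleans: True = "violates the Law".
    Uses a normal approximation for the vote proportion.
    """
    n = len(votes)
    if n < MIN_JURORS:
        return False
    p = sum(votes) / n                       # share voting "violation"
    margin = Z * math.sqrt(p * (1 - p) / n)  # half-width of the ~95% CI
    # Significant when the interval is narrow and excludes a 50/50 split.
    return margin <= MAX_MARGIN and abs(p - 0.5) > margin
```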

At scale, thousands of users will be online simultaneously — meaning toxic content can be reported, judged, and addressed in minutes, not the weeks sometimes taken by Facebook and Twitter.

Laws

If a platform adds a new rule, that rule will generate more reports.

This means centralized platforms (Twitter and Facebook) must either spend more, employing more people to process those reports, or allow the already-low standard of moderation to drop. They are actively incentivized to keep their rulebooks as narrow as possible.

User moderation has no such limitations — so the number of Laws Podium can have is limited only by what keeps the system accessible and effective.

A dog, participating in judgement.

As such, we can apply far more granular, precise reporting to abuse, harassment, and intimidation. And we can cover offences omitted by other platforms entirely — like misinformation, propaganda, and anti-transparency.

Furthermore, we can implement features that would be abused elsewhere (like an Edit Button) and police that abuse through the same system of moderation. We can even apply different rules to different users — so verified accounts can be held to a higher standard — and use the Jury process to perform the checks required to become verified in the first place.

Different people will apply different definitions to the same terms (e.g. two people may have very different ideas on what qualifies as “racism”). As such, Laws cannot require Juries to make purely qualitative judgements on the suspect content.

Instead, each Law comes with a series of tests that apply the platform’s definition and break the judgement down into clear, binary decisions.
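
As an illustration, a Law could be represented as a checklist of yes/no tests, with a violation found only when every test is answered yes. The example Law and its tests here are invented for demonstration:

```python
# A hypothetical Law, decomposed into binary tests for jurors.
harassment_law = {
    "name": "Harassment",
    "tests": [
        "Is the post directed at a specific person?",
        "Does the post contain an insult or a threat?",
        "Is the post part of a repeated pattern against that person?",
    ],
}

def judge(law, answers):
    """A post violates the Law only if a juror answers yes to every test."""
    assert len(answers) == len(law["tests"])
    return all(answers)

# Example: yes, yes, no -> not a violation under this Law.
print(judge(harassment_law, [True, True, False]))  # False
```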

No system of Laws will be perfect — it needs to evolve and adapt alongside society and our community.

Therefore, while Podium will write the initial Laws (with much input from our community), everything will ultimately be handed over to users — to amend or expand, as the community sees fit, via platform Governance (a topic for another time).

Sanctions

Laws would be useless without some penalty for violating them. But Twitter has long demonstrated the ineffectiveness of short-term bans and the pointlessness of permanent bans that just cause the offenders to open a new account.

“And I would’ve got away with it too, if it weren’t for those meddlin’ users…”

Podium’s philosophy is that violations should cost offenders influence.

Every user begins with a certain set of platform Rights — governing how many users can follow them, who they can engage with, and how prominently their posts appear to other users.

These Rights can be expanded by earning “Integrity”. Integrity is Podium’s reputation system and it is principally earned via Jury duty and by reporting content that infringes the platform Laws. This incentivizes users to take part in moderation and demonstrate consistent commitment to upholding the rules.
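
A rough sketch of how Integrity might gate Rights, in the same spirit. Every name, threshold, and reward value below is an illustrative assumption, not Podium’s actual tuning:

```python
# Integrity needed to unlock each Right (baseline Rights cost nothing).
RIGHT_THRESHOLDS = {
    "baseline_reach": 0,
    "max_followers_10k": 50,
    "boosted_prominence": 120,
}

JURY_DUTY_REWARD = 2      # Integrity per completed Jury vote
UPHELD_REPORT_REWARD = 5  # Integrity per report a Jury upholds
SANCTION_PENALTY = 20     # Integrity lost per upheld violation (assumed)

def rights_for(integrity):
    """Return the set of Rights a user's Integrity currently unlocks."""
    return {r for r, needed in RIGHT_THRESHOLDS.items() if integrity >= needed}

integrity = 3 * JURY_DUTY_REWARD + 9 * UPHELD_REPORT_REWARD  # = 51
print(rights_for(integrity))  # baseline_reach and max_followers_10k
```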

We’ll discuss Integrity and Rights in a future post. In the meantime, we want you to set the agenda for these blogs — so help us decide where to focus first.

NEXT: Bias Mapping

Podium is actively seeking angel and pre-seed investment to help build our MVP. You can start a conversation by emailing hello@podium-network.com.
