500 million tweets are sent every day. That’s an absurd volume of content — almost as absurd as the idea that a single company could ever hope to moderate it.

Facebook has over 15,000 people tasked with moderation. And no one could reasonably claim it’s working.

This is all horrible, so here’s a picture of a kitten who agrees with you.

We’ve all seen the results. Abuse and harassment are rampant. Misinformation and propaganda are ubiquitous. And that’s without even discussing issues like privacy and transparency.

Bad actors have learned how to game these systems — corrupting them to their own ends. And there has been little substantial response.

AI has been “a few years away” for more than a few years, and — if it ever arrives — we’d just be replacing one opaque, corruptible algorithm with another.

Legacy social media will never solve this problem.

Every rule they add costs more to enforce, and every account they ban reduces the quantity of engagement (regardless of what it does to the quality). They are economically incentivized to do the bare minimum, and they always will be: their business models depend on it.

So what can be done?

Look — it’s dawn (signifying hope) and a tree (signifying a network, sort of).

Only the userbase of social media can scale to match the volume of content it generates.

User moderation is the only viable solution, but how can we stop that being gamed and corrupted like everything else?

Well, it’s actually quite simple — literally.

The simpler a system, the harder it is to game. And the more transparent, the easier it is to trust. With these principles as a foundation, we created Podium.

Future blogs will go into detail about how the platform will work, but — to give a brief overview:

On the surface, Podium will be a microblogging site like Twitter. Users post content, and if you think a post violates the rules, you can report it.

Voting, a demonstration.

But — unlike Twitter — these reports are decided by bias-balanced Juries of users who vote on whether that report is upheld according to a series of tests for each Law in the Podium Code of Conduct. If the Jury supports the report, the user will face Sanctions that directly reduce their influence on the platform.
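To make the flow above concrete, here is a minimal sketch of how a jury verdict might be tallied. Podium’s actual data model isn’t public, so everything here (the quorum size, one uphold/reject vote per juror, simple-majority decision) is an illustrative assumption rather than the real mechanism:

```python
def decide_report(votes: list[bool], quorum: int = 5) -> str:
    """Tally a jury's votes on a single report.

    Hypothetical sketch: each juror applies the relevant Law's tests and
    casts one uphold (True) or reject (False) vote. A simple majority of
    the jury decides; below the assumed quorum, no verdict is reached.
    """
    if len(votes) < quorum:
        return "undecided"  # not enough jurors have voted yet
    upheld = sum(votes)     # count of jurors supporting the report
    return "upheld" if upheld > len(votes) / 2 else "rejected"
```

In this sketch, a verdict of `"upheld"` is what would trigger Sanctions against the reported user; the quorum rule simply prevents a tiny, unrepresentative jury from deciding anything.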

Because we don’t need to worry about the volume of reports, Podium can have a more extensive ruleset than any other social platform. It will also mean reports are handled rapidly, resolving in minutes rather than weeks, as is currently the case.

This is all possible thanks to Podium’s Bias Mapping functionality — which allows us to randomly populate Juries that fairly represent the userbase while factoring out bias and protecting your data.

The possibilities unlocked by this system are truly revolutionary and we can’t wait to tell you about them. (But we will, because you have things to do.)

In the meantime, we want you to set the agenda for these blogs, so please help us decide where to focus first.

Podium is actively seeking angel and pre-seed investment to help build our MVP. You can start a conversation by emailing hello@podium-network.com.

a Social Network with Integrity
