The size of a user’s platform should depend only on the quality of their content and the integrity of their conduct. Podium’s “Integrity” system of reputation ensures this is the case while also preventing other systems from being gamed.

Previously on Podium:
1. Introducing Podium
2. User Moderation
3. Bias Mapping

The intent behind Integrity is to characterize users’ consistency of conduct.

Gaming the System

There are few ways to game the Bias Mapping algorithm because it is intentionally simple — only the most basic data goes in and only the most basic data comes out. It’s also a fully transparent, yet data-opaque, process — so users can see precisely how it works and verify the results, while never actually seeing the underlying data.

The only way to effectively game the system, therefore, is to intentionally feed it false data — i.e. reacting to content in the opposite (or generally different) way to your true reactions. This would place the user in the wrong location in Bias Space, tricking Jury selection into seating them on Juries where they don't belong.

While this approach already has some serious limitations, we can make it wholly pointless by introducing a system of platform reputation — which we call “Integrity”.

Changing the Game

Plenty of existing communities use systems of reputation — Wikimedia, StackOverflow, etc. — with varying degrees of hierarchy and oversight. At the core is some base expectation that the user either exceeds (gaining reputation) or falls short of (losing reputation).

The simpler the game, the harder it is to cheat.

For StackOverflow, for example, this expectation is that the user will solve the coding problems with which they engage. Providing bad or unworkable solutions costs reputation, whereas providing better solutions than other users gains it.

For Podium, this expectation is Bias.

Whenever a user takes part in moderation, their Bias provides an estimate of whether they will agree (or not) with the reported post. Meeting that bias is considered a neutral act — but defying that bias tells us one of two things:

  1. The user is defying their bias to prevent the fair enforcement of the rules — indicating that their bias coordinates are incorrect.
  2. The user is ignoring their bias to enforce the rules instead — indicating an intent to be fair and objective.

We can determine which is the case from the outcome of the Jury decision. If the user voted with the majority, they are considered to have voted fairly and gain Integrity. If they voted with the minority, they lose Integrity.

To keep this process robust, each change to Integrity will be very small — requiring a consistent pattern of behaviour to move the needle. The magnitude of the change will also depend on the confidence of the Jury's result — so a 90/10 outcome is a clear signal and produces a larger change, whereas a 51/49 split produces almost no change at all.
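As a rough sketch, the update rule above — gain for voting with the majority, loss for voting with the minority, scaled by the decisiveness of the result — might look like the following. The function names and the step size are illustrative assumptions, not Podium's actual parameters.

```python
# Hypothetical sketch of the Integrity update. K and all names are
# assumptions for illustration, not Podium's real values.

K = 0.01  # small base step, so only a consistent pattern moves the needle

def integrity_delta(voted_with_majority: bool, majority_share: float) -> float:
    """Change in Integrity after one Jury vote.

    majority_share is the winning side's fraction of the vote (0.5 to 1.0).
    A 90/10 outcome is a clear signal; a 51/49 split is nearly neutral.
    """
    confidence = 2 * majority_share - 1  # 0.0 at 50/50, 1.0 at unanimity
    sign = 1 if voted_with_majority else -1
    return sign * K * confidence
```

Under this sketch, a decisive 90/10 Jury moves Integrity forty times further than a 51/49 split, in either direction.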

In this way, users are not only incentivized to remain objective, but also to consider which vote is best for the community instead of just themselves.


Once we’ve measured each user’s Integrity, we need to apply it. And we can do far more than simply using it to control a user’s influence over moderation — we can use it to control a user’s influence over everything.

[Image: a puppy attempting to influence you]

Integrity weights a user’s vote — so the higher your Integrity, the more your vote counts. Users who act to undermine the platform ultimately end up with very little power to do so.
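A minimal sketch of that weighting, assuming each vote arrives as a (choice, integrity) pair — the names and the simple linear weighting are illustrative assumptions, not Podium's actual scheme:

```python
# Hypothetical Integrity-weighted tally; Podium's real scheme may differ.
from collections import defaultdict

def tally(votes):
    """votes: iterable of (choice, integrity) pairs. Returns the winner."""
    totals = defaultdict(float)
    for choice, integrity in votes:
        totals[choice] += integrity  # high-Integrity votes count for more
    return max(totals, key=totals.get)

# Three low-Integrity users trying to sway a decision are outweighed
# by two high-Integrity users:
votes = [("remove", 0.2), ("remove", 0.2), ("remove", 0.2),
         ("keep", 0.9), ("keep", 0.9)]
```

Here the three "remove" votes total 0.6 against 1.8 for "keep" — a raw headcount majority loses to a smaller group with a record of fair conduct.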

Integrity also acts as a gating mechanism to Podium’s system of platform Rights — meaning you have to earn Integrity to expand your platform.

Rights can be unlocked and upgraded by reaching the required Integrity and staking some amount of POD against them. Violating Laws or losing Integrity causes Rights to be revoked and the stake lost — requiring them to be earned anew.
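The lifecycle above — unlock a Right by meeting an Integrity threshold and staking POD, with violations revoking the Right and forfeiting the stake — can be sketched as follows. All thresholds, amounts, and names are illustrative assumptions.

```python
# Hypothetical sketch of the Right lifecycle; values are not Podium's.
from dataclasses import dataclass

@dataclass
class Right:
    name: str
    required_integrity: float
    stake_required: float  # POD that must be staked against the Right
    unlocked: bool = False

def try_unlock(right: Right, user_integrity: float, staked_pod: float) -> bool:
    """Unlock the Right only if both the Integrity and stake bars are met."""
    if user_integrity >= right.required_integrity and staked_pod >= right.stake_required:
        right.unlocked = True
    return right.unlocked

def revoke(right: Right) -> float:
    """Revoke the Right after a violation; the stake is forfeited."""
    right.unlocked = False
    return 0.0  # stake returned to the user: none — it must be earned anew
```

The key design point is that revocation returns nothing: the stake makes bad behaviour costly, while the Integrity threshold makes the Right slow to regain.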

Upon joining Podium, users will start with a standard set of Rights governing things like:

  • How many users you can follow
  • How many users can follow you
  • The prominence of your content in curation/search
  • Which users you can engage with, depending on their relative Bias

And other controls — all helping to determine your platform influence.

For typical users, this provides a clear path to a large platform and should not limit the experience at all — with most people needing time to build a follower-base anyway.

But for those seeking influence for nefarious purposes, the system makes influence hard to earn and easy to lose. Only a consistent and transparent pattern of healthy behaviour can lift someone to prominence — yet all of that can be lost with a single post if that behaviour turns toxic.

The greater the power, the greater the responsibility.

This fundamentally changes the risk-reward for bad actors. A small return from a long-term, high-risk process makes it simply not worth the effort. Any attempt becomes a harmless waste of time.

Now that we’ve covered the core aspects of Podium, we can start talking about what matters to you — our future users. So get in touch and tell us what to write about next.

We want you to set the agenda for these blogs — so help us decide where to focus first.

Podium is actively seeking angel and pre-seed investment to help build our MVP. You can start a conversation by emailing us.

