Reputation and Identity in Decentralized Systems
Summary of reputation systems and alternatives
“It takes 20 years to build a reputation and five minutes to ruin it. If you think about that, you’ll do things differently.” — Warren Buffett
Centralized Reputation Systems
- Government: passports, ID cards, trademarks, licenses
- Commercial, academic: credit scores, CVs and resumes
- Online: eBay ratings, Google PageRank, Twitter follower counts, Yelp reviews, Reddit upvotes
Each of these systems needs to be mediated by a central authority because reputation is valuable and it attracts cheaters. The algorithms and data that underpin these systems must be kept secret in order for the central authority to keep the upper hand in the inevitable arms race. When cash is involved these systems generally leverage government-mediated systems, e.g. requiring a local bank account (online => commercial => gov’t). This moves the enforcement against fraud into the local legal system.
Decentralized Reputation Systems
Decentralized reputation systems have all of the challenges of centralized systems plus a few more. The main challenge is that they’re open to anyone; without a central entity it’s very difficult to restrict access. Bad actors can create new and/or multiple identities for themselves.
“The Sybil attack in computer security is an attack wherein a reputation system is subverted by forging identities in peer-to-peer networks.” — Wikipedia
The system can’t use negative history to punish bad actors, because anyone with a bad reputation will simply abandon that identity. The system can’t even reliably use positive reputation, because in a distributed application a bad actor can cheaply create new identities and program those puppet accounts to engage in whatever type of activity leads to good reputation. There are some techniques that mitigate this weakness (network analysis, proof-of-burn, transaction fees), but everyone seems resigned to the fact that these systems are ultimately vulnerable to Sybil attacks.
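To make the weakness concrete, here is a minimal sketch (my own illustration, not taken from any particular system) of how a naive average-rating scheme collapses once identities are free to create:

```python
from collections import defaultdict

class NaiveReputation:
    """Reputation = average of ratings received; identities cost nothing."""
    def __init__(self):
        self.ratings = defaultdict(list)  # identity -> scores received

    def rate(self, rater, target, score):
        self.ratings[target].append(score)

    def reputation(self, identity):
        scores = self.ratings[identity]
        return sum(scores) / len(scores) if scores else 0.0

rep = NaiveReputation()

# An honest seller earns reputation slowly, one real trade at a time.
for buyer in ["alice", "bob", "carol"]:
    rep.rate(buyer, "honest_seller", 5)

# A Sybil attacker mints free puppet identities and programs them to
# give the main account perfect scores.
for i in range(100):
    rep.rate(f"puppet_{i}", "attacker", 5)

print(rep.reputation("honest_seller"))  # 5.0
print(rep.reputation("attacker"))       # 5.0, with zero real trades behind it
```

Mitigations like transaction fees raise the marginal cost of each puppet rating, but they make the attack expensive rather than impossible.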
Other issues to consider
Accountability vs. Privacy
Another complication with reputation systems is that you may want to avoid accruing a reputation at all. For example, as each of us uses the internet we leave a trail of tracking data, and that data is collected, aggregated, and monetized by advertisers. This is a case of an identity being built without your consent and then deployed against you to influence your behavior.
Positive vs. Negative Identity
In many cases positively identifying a user (“having the password for this account proves you’re the account holder”) is not enough — we may also want to prove uniqueness (“the account holder only has one account”) to prevent collusion.
“Bash it until it looks like money”
Vinay Gupta has published two must-read pieces over the last few months about identity and trust and reputation: “Tell Me Who You Are” & “A Blockchain Solution for Identity”. He makes a strong case that identity cannot be secured by simplifying the problem through ‘reductive logic’ (e.g., “identity == birth certificate”, “identity == public/private key-pair”) — there are simply too many edge cases in the human condition to build a strong identity system on top of a reductive model.
His solution is to define ‘identity’ in terms of its practical use: accountability (and more precisely: financial accountability). When a merchant checks your identity they’re not interested in “who you are” in some metaphysical sense — they are establishing your identity in order to reduce their financial risk. In these cases we could keep our identity private, and instead provide the merchant with an insurance contract to cover their risk. Because they’re ultimately interested in money anyway, we can just put identity and trust and reputation in a pile and, in Gupta’s words: “bash it until it looks like money” : )
Leveraging Social Collateral
Social capital acts as collateral for borrowing
“This paper builds a theory of trust based on informal contract enforcement in social networks. In our model, network connections between individuals can be used as social collateral to secure informal borrowing.”
Trust is Risk
OpenBazaar recently featured a post by Dionysis Zindros on a concept called Trust is Risk. As I understand it, it goes like this:
- You trust your (real-life) friends, and would be willing to put some money at risk to demonstrate that you trust them. So you put money into an escrow that your friends could take at any time. By putting actual money at risk you’ll prove that you trust them.
- Your friends will prove that they trust their friends, and so on, and the system will form an extended web of connections. Each link in the web will represent a relationship between friends, and each relationship will have a monetary value attached to it that will represent the amount of trust in that particular relationship.
- Using a maximum-flow algorithm you can calculate the amount of transitive trust between any two people and, further, prove that their risk of interaction is always equal to or less than the calculated amount of transitive trust (a sketch in code follows this list).
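Here is a rough sketch of that calculation using networkx’s max-flow solver; the graph, names, and amounts are illustrative, and the actual Trust is Risk construction (see the links below) is more involved:

```python
import networkx as nx

g = nx.DiGraph()

# Directed edges: truster -> trusted. Capacity = coins the truster has
# placed in an escrow that the trusted friend may take at any time.
g.add_edge("alice", "bob", capacity=10)
g.add_edge("alice", "carol", capacity=5)
g.add_edge("bob", "dave", capacity=8)
g.add_edge("carol", "dave", capacity=5)

# Transitive trust from alice to dave is the maximum flow between them.
trust, _ = nx.maximum_flow(g, "alice", "dave")
print(trust)  # 13: 8 routed through bob plus 5 through carol
```

As I read the paper, the guarantee is that if alice caps her dealings with dave at this max-flow value, her worst-case loss never exceeds money she had already put at risk with her direct friends.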
Links: the OpenBazaar blog post, and the paper “Trust Is Risk: A Decentralized Financial Trust Platform” (link 1 (.pdf), link 2 (.pdf)), both of which provide clearer explanations of Trust is Risk and many examples.
Trust in Friends
- Same directed graph as “Trust is Risk” but the amounts are escrowed in order to (optionally) reimburse the person in case of cheating.
- If you are cheated you can “call in” those escrows from your friends. If they choose to reimburse you, then your friends will call in their own pledges, and so on down the line. The reimbursement requests travel back through the graph and (assuming the chain is not broken) arrive back at the cheating party. If the chain is broken, it implies that someone’s trust was misplaced — the breaking party values the cash more than the relationship. (A toy sketch of this propagation follows.)
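A toy sketch of that call-in propagation, assuming a single trust path from victim to cheater (the function, parties, and honor/refuse flags are all hypothetical; the paper linked below describes the real mechanism):

```python
def call_in(path, amount, will_honor):
    """path runs victim -> friend -> ... -> cheater. Each intermediary
    who honors their pledge reimburses the party downstream of them,
    then calls in their own pledge from the next party upstream."""
    holder = path[0]  # whoever is currently out of pocket
    for party in path[1:-1]:
        if will_honor.get(party, False):
            holder = party  # party reimburses the holder, taking on the loss
        else:
            # The chain breaks: this party kept the cash at the cost of
            # the relationship, and the current holder eats the loss.
            return f"{party} refused; {holder} eats the loss of {amount}"
    return f"the loss of {amount} arrives back at the cheater, {path[-1]}"

# The victim was cheated by eve; the trust path runs through bob and carol.
print(call_in(["victim", "bob", "carol", "eve"], 20,
              {"bob": True, "carol": True}))
print(call_in(["victim", "bob", "carol", "eve"], 20,
              {"bob": True, "carol": False}))
```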
Link: “Implementing a Decentralized Trust Inference System” (.pdf)
Implementation Concerns with Social Collateral
- The concept centers on ‘staking’ money on your friends, which requires cash in escrow. This is a fairly high barrier to entry, and it may make the system difficult to bootstrap.
- A system based on staking will exclude people who can’t afford to tie up their money in escrow. A related concern is that it’s a very local system, and if an economic shock affects your whole network this ‘credit’ could dry up when it’s needed most.
- Staking makes the value of your social bonds legible; this may be awkward, socially. All of your friends will know exactly how much you trust them (or don’t).
- On-chain transactions are currently slow and expensive. In the short-term this system may only be appropriate for large transactions and/or networks with very stable connections (i.e., cheating is rare). This issue will likely diminish as the technology improves.
“…in the old days, if you were broke but respected, you wouldn’t starve; contrariwise, if you were rich and hated, no sum could buy you security and peace. By measuring the thing that money really represented — your personal capital with your friends and neighbors — you more accurately gauged your success.” — Down and Out in the Magic Kingdom, Cory Doctorow