Algodiversity: Thoughts on Decentralized Algorithmic Governance

When our lives are governed by powerful algorithms, we need to fight back on the same level.

Chris Tse
Cardstack
6 min read · May 8, 2018


Governance consists of rules, judgment calls, and action. On the Internet, your digital life is increasingly governed by algorithms: scripts or programs that follow instructions to make judgment calls across a variety of situations, and then respond with some kind of action in each instance.

In our networked world, algorithms have increasing influence over the distribution of information and wealth. Often these algorithms will sit between two parties, but only one party will have control over the algorithm — or even know how it works.

Being on the weaker side of this kind of asymmetric relationship has economic consequences. You are reduced to praying that you will be “blessed by the algorithm.”

Algorithms Create Asymmetric Power Relationships

Algorithms have material consequences. In the case of Facebook, a secret algorithm called EdgeRank determines what shows up on our social media feeds. Facebook also uses proprietary algorithms to identify and block spammers, and control how third parties access its users’ data.

Facebook’s control over its algorithms means its users have no real say over how Facebook works. It means people were unable to stop the platform from being manipulated by Russian agents in the 2016 U.S. election, or to prevent Cambridge Analytica from collecting and disseminating their private data.

The problem is not that Facebook lacked the resources to prevent either of these situations from happening; it’s that they weren’t interested in doing so. Their algorithms were working as intended. Facebook didn’t implement fairer or more transparent governance, because they saw no reason to.

The point here is not that algorithms are bad. It’s that the algorithms that control our lives are controlled by centralized entities that don’t have our best interests at heart. Algorithms instructed to maximize one thing (such as Facebook’s profits) are unconcerned with the externalities (say, privacy, or democracy). And on the platforms of digital superpowers like Facebook, the parties affected by an algorithm — the end users — are unable to adjust the balance.

Who can people appeal to? When Silicon Valley does something bad, users rely on the press and public pressure to force governance adjustments, maybe once every year or two. That’s not enough, considering the damage these algorithms cause.

Even government bodies that have the will to rein in Silicon Valley, like the EU, lack the tools for effective regulation: on one side you have companies running advanced analytics to maximize profitability on every level, and on the other side you have a bureaucrat saying something like “you need cookies.” We also don’t want the other extreme — an authoritarian state like China using algorithms to intervene in every aspect of its citizens’ lives.

If we’re serious about decentralizing the Internet then we need to develop an approach to algorithmic self-governance.

Adversarial Algorithms

What if we could counter the algorithms that control our digital lives with algorithms that defend us? Algorithms should be matched against adversarial algorithms that can use the same data, the same processing capabilities, and the same open-source machine learning libraries to produce different outcomes.

Remember that an algorithm doesn’t just make a judgment call; it also takes action based on that call. Tech companies’ algorithms take action against users (banning, burying, demonetization, etc.), so why can’t we do the same to tech companies that break their promises to us? What if we could encode binding agreements as logic, so that if something goes wrong, an algorithm could enact consequences against a company?
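As a thought experiment, here is a minimal sketch of what an “agreement as logic” might look like, written in plain Python rather than any real smart-contract language. Every name and threshold in it is invented for illustration.

```python
# Illustrative sketch only: a promise made by a platform is encoded as a
# check, and a detected breach triggers an automated consequence.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Agreement:
    description: str
    promise_holds: Callable[[dict], bool]  # judgment: did the platform keep its promise?
    consequence: Callable[[], None]        # action: what happens on a breach

def enforce(agreement: Agreement, evidence: dict) -> None:
    """Make the judgment call and, if the promise was broken, take the agreed action."""
    if not agreement.promise_holds(evidence):
        agreement.consequence()

# Hypothetical clause: "the platform will not share user data with third parties."
no_data_sharing = Agreement(
    description="No sharing of user data with third parties",
    promise_holds=lambda evidence: evidence.get("third_party_transfers", 0) == 0,
    consequence=lambda: print("Breach: revoking API access and flagging deposit for slashing"),
)

enforce(no_data_sharing, {"third_party_transfers": 3})  # prints the breach consequence
```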

The idea of adversarial algorithms could allow us to create our own regulatory bodies that run at the speed of the digital economy rather than lag perpetually behind it. Instead of relying on governments to regulate tech, we should draw on the principles of open source to create room for all kinds of people to contribute to algorithms, so that their interests are represented in a system of shared self-governance.

Toward Algodiversity

To fix the asymmetry in algorithmic relationships, we need algodiversity: diversity in algorithms along three dimensions.

1. What type of algorithm is used?

To take an example from ecology: if every plant had the same DNA, one fungus could wipe out everything. Similarly, if something is very important to us, we shouldn’t have just one kind of algorithm controlling it; that’s a single point of failure. Traders who use a mix of algorithms to make decisions will be familiar with this concept: you need a variety of approaches to ensure resilience.

2. Who is running it?

An algorithmic ecosystem should have a proper balance of power. An algorithm that optimizes for something should be balanced out by equally powerful algorithms that represent its externalities.

3. How is its judgment executed?

An algorithm’s judgment call is only as valuable as its maximum consequence. For algodiversity to work, adversarial algorithms must be able to have a direct, and possibly immediate, impact on governance, especially since these algorithms will have a material impact on people’s economic destinies.
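Taken together, the three dimensions might look something like this toy sketch (assumptions only, not a real protocol): several independently authored judgment algorithms vote, and the verdict is wired directly to a consequence.

```python
from typing import Callable

Judgment = Callable[[dict], bool]  # True = the observed behavior is acceptable

# Dimensions 1 and 2: different kinds of algorithms, authored by different parties
judges: list[Judgment] = [
    lambda obs: obs["spam_score"] < 0.8,        # a simple heuristic filter
    lambda obs: obs["user_reports"] < 100,      # a community-driven signal
    lambda obs: obs["model_prob_abuse"] < 0.5,  # a statistical model's estimate
]

def verdict(observation: dict) -> bool:
    """Majority vote across diverse judges, so no single algorithm decides alone."""
    votes_for_acceptable = sum(judge(observation) for judge in judges)
    return votes_for_acceptable > len(judges) / 2

# Dimension 3: the judgment is wired directly to a consequence
observation = {"spam_score": 0.9, "user_reports": 250, "model_prob_abuse": 0.7}
if not verdict(observation):
    print("Consequence executed: activity penalized under the shared rules")
```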

Algodiversity on the Decentralized Internet

For this to work, we need fungibility between algorithms. Currently, it’s difficult to say, “I want my Facebook news feed to work more like Twitter.” But in law, a contract can take clauses from multiple sources to create something new and unique.

In the same way, we need an approach that makes algorithms more composable and layered so that we can allow governance to evolve through mixing and matching.
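As a hedged illustration of that mixing and matching, here is a sketch in which feed-ranking “clauses” are blended into a new, user-chosen algorithm. The scoring functions and weights are invented for this example.

```python
from typing import Callable

Clause = Callable[[dict], float]  # scores a post; higher means shown earlier

recency: Clause = lambda post: -post["age_hours"]                        # Twitter-like: newest first
engagement: Clause = lambda post: post["likes"] + 2 * post["comments"]  # Facebook-like: most activity first

def compose(weighted_clauses: list[tuple[float, Clause]]) -> Clause:
    """Blend several ranking clauses into one new algorithm."""
    return lambda post: sum(w * clause(post) for w, clause in weighted_clauses)

# "I want my feed to work more like Twitter": weight recency heavily.
# (The weights are arbitrary; the two scores live on very different scales.)
my_feed = compose([(1.0, recency), (0.01, engagement)])

posts = [
    {"id": 1, "age_hours": 2, "likes": 500, "comments": 40},
    {"id": 2, "age_hours": 30, "likes": 900, "comments": 200},
]
for post in sorted(posts, key=my_feed, reverse=True):
    print(post["id"])  # the newer post ranks first under this blend
```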

The good news is we now have a foundation for this in the emerging decentralized Internet, where governance is not defined by single authorities, but by the values encoded into a diverse and growing set of protocols — each agreed upon and enforced by a decentralized network of participants.

Smart contracts can make these community agreements more powerful by expressing them as logic. A smart contract’s role is to take action: slashing a deposit, banning an activity, imposing a penalty. If a decentralized application were to harm users, a smart contract could revoke the application’s access to user data, or trigger the application to switch from one algorithm to another.

A smart contract does not make the judgment calls; smart contracts as implemented today are poor at complex pattern detection. Judgments could be made using machine learning and AI systems — fed by data processed using diverse formulas, designed by different people, with different interests.
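Here is a simplified sketch of that division of labor, with plain Python standing in for the on-chain contract. The class, its methods, and the threshold are hypothetical, not an existing smart-contract API.

```python
class GovernanceContract:
    """Stands in for an on-chain contract: it holds a deposit and executes outcomes."""

    def __init__(self, deposit: float):
        self.deposit = deposit
        self.data_access = True

    def slash(self, fraction: float) -> None:
        """Slash a fraction of the application's deposit as a penalty."""
        penalty = self.deposit * fraction
        self.deposit -= penalty
        print(f"Slashed {penalty:.2f}; remaining deposit {self.deposit:.2f}")

    def revoke_data_access(self) -> None:
        """Cut off the application's access to user data."""
        self.data_access = False
        print("Application's access to user data revoked")

def judge_harm(signals: dict) -> bool:
    """Off-chain judgment call: in practice an ML model, here a stand-in threshold."""
    return signals["predicted_harm"] > 0.9

contract = GovernanceContract(deposit=1000.0)
if judge_harm({"predicted_harm": 0.95}):  # the judgment system says harm occurred...
    contract.slash(0.10)                  # ...and the contract executes the consequences
    contract.revoke_data_access()
```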

The key is to create the opportunity for people to choose. In most current token ecosystems, participants stake monetary value in exchange for using software and sharing in the ecosystem’s benefits. This makes users true stakeholders who have the power to influence the technology they use.
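As a toy example, stake-weighted choice could be as simple as the following sketch; the algorithm names and amounts are invented, and no specific token protocol is implied.

```python
from collections import defaultdict

stakes: dict[str, float] = defaultdict(float)

def stake(algorithm: str, amount: float) -> None:
    """Record tokens staked behind a candidate algorithm."""
    stakes[algorithm] += amount

stake("chronological_feed", 500.0)
stake("engagement_feed", 300.0)
stake("chronological_feed", 250.0)

# The community's choice: the algorithm with the most stake behind it becomes active.
active = max(stakes, key=stakes.get)
print(f"Active algorithm: {active} ({stakes[active]:.0f} tokens staked)")
```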

The coming digital world will have more algorithms in it, not fewer. It’s not enough for us to wait for governments to make rules, or to take tech companies at their word when they claim to be more inclusive. We need a more elastic, dynamic process, where we can try out algorithms and react when they prove to be unfair. We need to be part of the decisions behind how computers make decisions. With the dawn of the decentralized Internet, we can jump ahead of the problem and design an approach to algorithms that puts people first.


Join our community

To learn more about Cardstack, visit https://cardstack.com.

Join the Cardstack community channel on Telegram at https://t.me/cardstack

Chris Tse (@christse) is a technologist and designer who has been working to humanize blockchain technology since its early days. In 2014 he founded Cardstack, where he leads a team of blockchain architects and open-source contributors to build the experience layer of the decentralized Internet. He is also a co-founder of blockchain companies Monegraph and Dot Blockchain Media, and has more than a decade of experience leading R&D and innovation teams for Fortune 500 companies. Chris has a degree in Computer Science from Columbia University.

Illustration: Chris Gardella / Cardstack
