Towards Autonomous Organizations — Past, Present, Future

Jubin Jose
Published in a-mma
Oct 25, 2019

https://linkedin.com/in/jubin-jose-dev

Introduction

Autonomous Organizations are one of the near-future promises of current technology and a buzzword in the fields of decentralized networking and blockchain-oriented smart contracts. But when we look closer and analyze the details, we see that the technology required to engineer Autonomous Organizations is not limited to the aforementioned fields; it spans many others, including Artificial Swarm Intelligence and the Internet of Things. We also see that the terminology itself is not limited to closed organizations controlled by shareholders, but extends to an open network of tiny organizations governed by simple rules written by individuals, contributing to a self-identifying, communicating, and self-organizing swarm of collective intelligence. We believe that Web 3.0 and advancements in Artificial Intelligence will be the greatest influences on this. We observe that progress towards Autonomous Organizations has already started; it will improve slowly and will win by adapting to, and iteratively improving, existing technology, especially the Web. This approach will also let end users switch gradually, without ever noticing the technical complexities along the way.

One of the main challenges we face today is the lack of proper standards for building this infrastructure. Being at the bleeding edge of all the potentially contributing technologies, it is not yet possible to define a final standard for everything. So in this paper we look into the proposed high-level building blocks of the infrastructure, break them down into smaller components by consulting current technologies, and suggest improvements to be made. We hope that each of them can eventually be standardized with the help of the community, while remaining open to the community.

Related Works

One of the oldest remarkable works in Multi-Agent Systems (MAS) was done by the Foundation for Intelligent Physical Agents (FIPA), formed in 1996 as an independent Swiss-based organization and later merged into the IEEE Computer Society standards organization. The FIPA specifications were originally written for multi-agent communication on the Web. Due to the eventual inactivity of the organization, the specifications are currently inactive and mostly outdated. However, the specifications (which are publicly available, though possibly still bound to undisclosed patent restrictions) provide, at their core, enough inspiration to start building MAS-related systems.

The FIPA specifications cover three low-level modules upon which a MAS can be built: Agent Communication, Agent Management, and Agent Message Transport. Agent Communication details the data structure of the messages sent between two agents in a network, as well as the shared ontology between the two agents in communication. A message exchange is the fundamental element of a MAS; it facilitates a transaction between two agents. Ontology brings in context and background knowledge, enabling self-identification between agents, and plays an important role in establishing the meaning and mutual understanding agents need to accomplish a goal. Agent Management covers agent management services, the agent management ontology, and agent platform message transport. It specifies a model for the creation, registration, location, communication, migration, and retirement of agents, which can be condensed into an agent lifecycle supported by an agent location service (the Directory Facilitator). Agent Message Transport describes the on-wire protocol for message exchange. We are not interested in Agent Message Transport here because it is outdated and there are better methods to replace it.
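
To make the Agent Communication module concrete, the sketch below models a FIPA-ACL-style message in TypeScript. The field names follow the FIPA ACL message structure specification; the TypeScript/JSON encoding, the agent names, and the "task-management" ontology are our own illustrative assumptions, not part of the specification.

```typescript
// A minimal sketch of a FIPA-ACL-style message as a TypeScript type.
// Field names follow the FIPA ACL Message Structure specification; the JSON
// encoding is an assumption for illustration (FIPA originally defined
// string and bit-efficient encodings, not JSON).

type Performative = "inform" | "request" | "agree" | "refuse" | "query-if" | "propose";

interface AgentId {
  name: string;        // globally unique agent name
  addresses: string[]; // transport addresses where the agent can be reached
}

interface AclMessage {
  performative: Performative; // the communicative act being performed
  sender: AgentId;
  receiver: AgentId[];
  content: string;            // expression whose meaning depends on language + ontology
  language: string;           // the content language, e.g. "fipa-sl"
  ontology: string;           // shared vocabulary that gives the content meaning
  conversationId?: string;    // groups messages belonging to one interaction
  replyWith?: string;
  inReplyTo?: string;
}

// Example: one agent asking another for the status of a shared task
// (agent names, addresses and the ontology are placeholders).
const msg: AclMessage = {
  performative: "request",
  sender: { name: "agent-a@example.org", addresses: ["https://example.org/agents/a"] },
  receiver: [{ name: "agent-b@example.org", addresses: ["https://example.org/agents/b"] }],
  content: "(action agent-b (report-status task-42))",
  language: "fipa-sl",
  ontology: "task-management",
  conversationId: "conv-001",
};
```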

Recent advancements in efforts to build Knowledge Graphs over the semi- or fully-structured Web and connected devices have led to two popular open standards: Schema.org (RDF is another popular standard) and the W3C WoT specification. With these, any agent (human or machine) can identify and interpret the ontology and contents of a web-oriented site, app, or connected device, and interact with it directly without third-party translation.

Schema.org solves the problem of ontology to a large extent for MAS. Adapting the Web to it will eventually create a planet-wide, dynamic Knowledge Graph (think of it as a database) that any agent can access at any time. It is up to the agent to decide which part of it to access at a given time (localization). Today, it is mostly available as embedded information within rich documents. The scope of Schema representations is not limited to documents: it extends to APIs as well, making an ontology description available to an agent at a given location on the Web.
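
As a brief illustration of what that embedded information looks like, the sketch below shows Schema.org vocabulary expressed as JSON-LD, plus a trivial extractor an agent might use instead of scraping HTML. The Organization, Person, name, url, and member terms are real Schema.org vocabulary; the concrete names, URLs, and the extractor function are placeholders of our own.

```typescript
// Schema.org markup embedded as JSON-LD in a page (values are placeholders).
const orgMarkup = {
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Collective",
  "url": "https://example.org",
  "member": [{ "@type": "Person", "name": "Jane Doe" }],
};

// A page typically ships this inside:
//   <script type="application/ld+json"> ... </script>
// A simplified extractor: an agent parses the JSON-LD blocks directly
// instead of scraping the surrounding HTML (attribute order and spacing
// in real pages vary; this regex is only a sketch).
function extractJsonLd(html: string): object[] {
  const re = /<script type="application\/ld\+json">([\s\S]*?)<\/script>/g;
  const blocks: object[] = [];
  let m: RegExpExecArray | null;
  while ((m = re.exec(html)) !== null) {
    blocks.push(JSON.parse(m[1]));
  }
  return blocks;
}
```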

W3C WoT on the other hand solves the problem of self identification (primitive ontology) and message exchange (Transactions) within a limited scope — electronic devices connected to the Web. It allows the agent to first retrieve the device information — to understand the capabilities of that device (ontology) and then read / update the device parameters or run actions on behalf. Mozilla’s adaptation of this specification includes message exchange details as well.
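
The sketch below shows that flow from an agent's point of view: fetch a WoT Thing Description (the JSON document a device advertises), look up a property's interaction form, and read its current value. The overall shape (properties, forms, href) follows the W3C WoT Thing Description model, but this is a heavily simplified sketch; the device URL and the "temperature" property are illustrative assumptions.

```typescript
// Minimal sketch of an agent consuming a W3C WoT Thing Description (TD).
// Simplified types: real TDs carry much more metadata and security details.
interface WoTForm { href: string; }
interface WoTProperty { type?: string; forms: WoTForm[]; }
interface ThingDescription {
  title: string;
  properties: Record<string, WoTProperty>;
}

async function readProperty(tdUrl: string, propertyName: string): Promise<unknown> {
  // 1. Retrieve the Thing Description to learn what the device can do.
  const td: ThingDescription = await (await fetch(tdUrl)).json();

  // 2. Look up the property's interaction form (where and how to read it).
  const prop = td.properties[propertyName];
  if (!prop) throw new Error(`Thing "${td.title}" has no property "${propertyName}"`);

  // 3. Read the current value from the form's target (assuming plain HTTP GET).
  const res = await fetch(prop.forms[0].href);
  return res.json();
}

// Usage against a hypothetical device:
// const value = await readProperty("https://device.local/td", "temperature");
```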

Current self-proclaimed Web 3.0 standardization efforts are clustered around blockchain technology. Two notable projects are DAOStack and Aragon. There are more projects exploring the same domain, but we ignore them for now because they either follow similar patterns or are at a primitive stage. Both DAOStack and Aragon put effort into specifying the internal workings of a DAO in detail, but leave the networking details open, explicitly mentioning only that a mesh network of DAOs and agents may coexist. This suggests that DAOs can be built as independent nodes or agents in a network, exposing a common interface for communication. That opens up a window of opportunity, offering full freedom to choose multiple network communication protocols as well as seamless integration with other nodes in Web 3.0. It ensures interoperability with a diverse app ecosystem and upgradability from Web 2.0 to 3.0 without the end user even noticing.

DAOStack is a framework (they call it an operating system for DAOs, which is true in a sense when Ethereum is considered a virtual Turing machine) for building DAOs on the Ethereum blockchain. It is built on Arc, an abstraction over a set of smart contracts (an open library of governance modules and templates) implemented on the Ethereum blockchain. This abstraction decomposes governance systems into actions, schemes, and global constraints from which every agency can be built. External agents are limited at the DAO boundary: they are only allowed to send inputs through subscribed schemes to influence an action through voting. This includes self-modification of the DAO's governance itself. Aragon is very similar to DAOStack in its architecture, with governance disputes overseen by Aragon Court decisions.
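
To illustrate that decomposition, here is a hypothetical sketch, not DAOStack's actual contract interfaces, of how actions, schemes, and global constraints relate: schemes are the only entry points through which external agents submit and vote on proposals, and global constraints bound what any resulting action may do.

```typescript
// Hypothetical sketch of the action / scheme / global-constraint decomposition.
// Names and signatures are our own; they do not mirror the Arc contracts.

interface Action {
  describe(): string;
  execute(): void;                 // e.g. transfer funds, change a parameter
}

interface GlobalConstraint {
  allows(action: Action): boolean; // checked before any action executes
}

interface Scheme {
  propose(proposer: string, action: Action): string;              // returns a proposal id
  vote(voter: string, proposalId: string, approve: boolean): void; // external influence
}

class Dao {
  constructor(
    private schemes: Scheme[],               // the only entry points for external agents
    private constraints: GlobalConstraint[], // bounds every action must respect
  ) {}

  proposeThrough(schemeIndex: number, proposer: string, action: Action): string {
    return this.schemes[schemeIndex].propose(proposer, action);
  }

  executeIfAllowed(action: Action): void {
    if (this.constraints.every(c => c.allows(action))) {
      action.execute();
    }
  }
}
```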

The Web by design decouples logic from data. Any logic can access any data as long as the data is directly or indirectly reachable and the logic has the right permission. The InterPlanetary File System (IPFS) is a promising content-addressing protocol that best matches the decentralized nature of the Web. Through content addressing, IPFS by design prevents data redundancy, keeps data permanently available (as long as at least one node in the network is serving it), and enables offline-first data distribution (allowing seamless communication within a temporarily fragmented network via eventual consistency).
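
The principle behind content addressing can be shown in a few lines: the address of a piece of data is derived from the data itself, so identical content resolves to the same address no matter which node serves it. Note that real IPFS CIDs are multihash/multibase encoded and data may be chunked into a DAG; the plain SHA-256 digest below only demonstrates the idea.

```typescript
// Illustration of the content-addressing principle behind IPFS
// (not a real CID computation).
import { createHash } from "crypto";

function contentAddress(data: string): string {
  return createHash("sha256").update(data).digest("hex");
}

const a = contentAddress("hello, web 3.0");
const b = contentAddress("hello, web 3.0"); // same content, possibly from another node
console.log(a === b);                       // true: the address is the content's fingerprint
// Changing a single byte yields a completely different address,
// which is what prevents silent duplication and tampering.
```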

The IPFS protocol makes itself transport-agnostic with the help of the libp2p module, which offers a drop-in solution for Web 2.0–3.0 compatibility. libp2p brings a single network transport layer through which agents on the Web can communicate over diverse transports via protocol negotiation. IPLD is another notable module that emerged out of IPFS; it tries to make all hash-linked data structures available under a common data model. One use case of IPLD allows the agents in a network to explore and analyze public blockchains through a unified interface. IPFS has already received wide acceptance from the developer community for decentralized data storage; one notable application is distributed key-value databases, which have been further extended to accept SQL queries.

Decentralized Identity (DID) management is another important dependency for building trust between Autonomous Organizations. Self-explanatory code is more trustworthy than any authority that issues IDs. Digital identities will become an inevitable part of any organization, agent (human or bot), or device. A DID, when created, carries zero credibility (unlike real-world IDs) but gradually accumulates credibility and increases trust in the system. The credibility associated with a DID will exert varying influence, from ID to ID, over the governed decisions made by an Autonomous Organization; at the collective level, this is the driving force for maintaining democracy within the Web and keeping it sustainable.

One notable effort in standardizing DIDs (which is in its final stages of becoming an official standard) is led by a W3C Community Group. The specification relies heavily on distributed ledger technology (blockchain): the DID registry is a distributed ledger used to build a decentralized public key infrastructure. A DID is a text string composed of the URL scheme identifier, an identifier for the DID method, and the DID-method-specific identifier. The DID resolves to a DID document, which contains the context of that document, cryptographic authentication information for that DID, and services that can be used to interact with the entity. DIDs support create, read/verify, update, and deactivate operations on the DID document.
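
For example, a DID such as did:example:123456789abcdefghi resolves to a document like the sketch below. The overall shape (@context, id, authentication, service) follows the W3C DID draft; the "example" method, the identifier, the key material, and the service endpoint are placeholder values taken from or modeled on the specification's examples.

```typescript
// A minimal sketch of a DID and the DID document it resolves to.
const did = "did:example:123456789abcdefghi";
//           ^scheme ^method ^method-specific identifier

const didDocument = {
  "@context": "https://www.w3.org/ns/did/v1",
  "id": did,
  // Cryptographic material the controller can use to prove ownership of the DID.
  "authentication": [{
    "id": `${did}#keys-1`,
    "type": "Ed25519VerificationKey2018",
    "controller": did,
    "publicKeyBase58": "H3C2AVvLMv6gmMNam3uVAjZpfkcJCwDwnZn6z3wXmqPV",
  }],
  // Ways other agents can interact with the entity behind this DID.
  "service": [{
    "id": `${did}#agent`,
    "type": "AgentService",
    "serviceEndpoint": "https://agent.example.org/8377464",
  }],
};
```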

Migration to Web 3.0 will be slow and steady. We currently have parts of the technology that, combined, will give birth to the first generation of Autonomous Organizations in the near future. The web applications we see today will eventually become autonomous agents and services with just enough intelligence (even though intelligence is not strictly necessary). At the same time, innovation in Artificial Intelligence is growing fast and may add more organic behavior to this collective ecosystem of Autonomous Organizations. What we should be doing right now is establishing a standard engineering path to properly combine what is available and invent whatever else is needed to achieve the final goal.

This article is an excerpt from ongoing open research by Jubin Jose, progressing under a_മ്മ. Support it, point out mistakes, and watch for updates on GitHub at http://bit.ly/TAO-github
