Cybernetic Reemergence

Millennial Glyph
Jun 14

Adapted from my talk given at the 2019 RadicalxChange conference

In Radical Markets, Eric Posner and E. Glen Weyl introduce data as the next frontier of labor. This would have huge repercussions for social media networks. But it doesn’t stop there.

We’re not the only ones being watched. Farm animals are monitored through wireless collars. Fishery stocks are quantified. Wild animals are tagged and tracked by satellite.

How do we approach the data collected from non-humans? Are they laborers as well? This may seem like a silly question. A cow is a cow. But what about a national park, a river, or a marine sanctuary? What about an AI, or a cyborg assemblage? If value can be exchanged and held without human oversight, what’s to stop the more-than-human world from owning its labor power?

This may seem like a fringe concern, but as granular-scale carbon markets become more likely, it’s not difficult to imagine individual trees getting paid for their carbon sequestration services. Fifty years ago, cybernetics was forcing such questions to be considered. Today, blockchain protocols are making them eminently important.

I’ll explore three case studies that highlight different facets of these questions. I begin by looking back to early cybernetics in order to highlight the performative aspects of biological computing. Next, I’ll present a project that is emblematic of the way blockchain groups engage with the more-than-human world. Finally, I’ll present a case study that brings these two threads together. I’ll conclude by noting that we tend to employ the labor of the more-than-human in the wrong way: surveying, modeling, overlaying AI on top. I’ll highlight the way cybernetic perspectives show us how we can move forward, opening computation to a multiplicity of intelligences and employing the more-than-human not just as an opportunity to mine data, but as an active constituent in the computational process: opening the Ethereum world computer to be just that, a world computer.

The first project I’ll be exploring comes out of the tradition of early British cybernetics. Known as the cybernetic pond, this project was undertaken by Stafford Beer in the early 1960s. As the name suggests, it sought to employ a pond within a management system. At its heart, the project was designed to employ the homeostatic intelligence of an ecosystem in the management of a factory. As the pace of the market increased after the Second World War, Beer believed that only adaptive systems would be able to survive. To this end, he hoped to develop mechanisms that would allow a factory to dynamically adjust to shifting market conditions.

Beer set about linking the biological functions of the pond to the computer managing factory operations. Collecting water from local ponds, he installed tanks full of daphnia. He envisioned an ecosystem acting as a bridge between two aggregate switchboards (similar to what we would now refer to as neural networks). The inputs, which could consist of changes in added nutrients, electrical signals sent through the water, or the addition of heavy metals to the habitat, all represented data from the factory. The behavior exhibited by the ecosystem as it reacted to these environmental changes would then be read as the output. Adapting to the factory’s influences, the ecosystem would do the work of computing.
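To make the architecture concrete, here is a minimal sketch of that feedback loop in Python. It is illustrative only: Beer’s apparatus was a living analog system, not software, and every name and number below is a hypothetical stand-in. The point is structural: the factory encodes its variables as a perturbation, the black box responds in its own way, and the settled response is read back as a management signal.

```python
import random

class PondBlackBox:
    """Stand-in for the daphnia ecosystem. We never model its internals;
    we only perturb it and observe how it settles."""
    def __init__(self):
        self.state = 0.5  # an observable, e.g. turbidity or activity level

    def perturb(self, signal: float) -> float:
        # The ecosystem absorbs the disturbance and drifts toward a new
        # equilibrium; that drift is the "computation" we read out.
        noise = random.uniform(-0.05, 0.05)
        self.state += 0.3 * (signal - self.state) + noise
        self.state = min(max(self.state, 0.0), 1.0)
        return self.state

def encode_factory_data(demand: float, capacity: float) -> float:
    """Map factory variables onto a perturbation (added nutrients,
    electrical current through the water, and so on)."""
    return demand / (demand + capacity)

def control_loop(pond: PondBlackBox, demand: float, capacity: float) -> float:
    """One pass of the loop: factory -> pond -> factory."""
    signal = encode_factory_data(demand, capacity)
    response = pond.perturb(signal)
    # Read the settled response back as a production adjustment.
    return capacity * (0.8 + 0.4 * response)

pond = PondBlackBox()
print(f"adjusted production target: {control_loop(pond, demand=120.0, capacity=100.0):.1f}")
```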

As one of the early pioneers of computerized management systems, Beer saw this as eminently practical for improving decision making and management efficiency. To accomplish it, Beer was more than happy to employ non-human intelligences. This was advantageous, as daphnia were willing to work for free. Moreover, non-human intelligences have something digital computing machines lack: they want to survive. In this way, Beer was intent on linking the organism’s drive to survive with that of the factory. He was going to harness the adaptive qualities of life to ensure that the factory would not fail.

Crucially, Beer wasn’t interested in understanding how the pond maintained its equilibrium. He was only concerned with the fact that it could perform this balancing act, that it was able to account for changes and respond accordingly. In this way, the pond was a kind of computational black box. Instead of creating a model, Beer was allowing a scenario to be performed. This mattered in the early days of computation, when the requirements of even basic modeling outstripped the capabilities of the hardware. Not only could cyberneticians achieve results without large amounts of computational power, the results were also extremely high fidelity, the only limiting factor being the sensitivity of the instruments used. This made it an economical solution; at least, that’s how Beer sold it to his clients.

This project, and British cybernetics in general, proposed a different way of looking at the world: one that acknowledged the agency of non-human life. It continues to represent a shift away from the representational logic that tends to characterize computation, and it implies a fundamentally different ontology. For the cybernetician, the world was a place vibrant with intelligence. The biological systems doing the managing were considered just as important as the source code running on digital hardware. This can be seen in the interdisciplinary work of the time, with cyberneticians building EEG machines, artificial brains, systems of computation using ammonia crystals, and other transdisciplinary experiments.

This brings me to my second case study. Nature 2.0, a proposal by Trent McConaghy, lays out his vision for a sustainable future. McConaghy imagines a vibrantly transactional world in which human and non-human stakeholders actively participate in the market through overlaid protocols, the non-humans represented by AI DAOs.

McConaghy envisions a future that capitalizes on these mechanics, layered over ecosystems and economies, allowing more-than-human systems of plants, animals, autonomous vehicles, and even power grids to own themselves and engage in complex market interactions. Not only could a forest own its own market value, selling its timber through selective, sustainable harvest; a road could also own itself, charging the autonomous vehicles that pass over it.

Augmenting DAOs with AI, McConaghy proposes a richly interactive system. With an advanced AI, a self-owning forest might not only be able to expand through the purchase of adjacent property, but might also be able to influence the local conditions that determine its ecological health. For example, a fertilizer plant may sit in close proximity to the forest. Because of leaks, fertilizer enters the local watershed, causing toxic algal blooms. This damages the waterways and the ruderal environments that provide crucial buffer zones for the forest. In order to eliminate this harm, the forest may decide to purchase the fertilizer company. In control of the organization, the forest can either invest in upgrades that reduce the danger, or liquidate the existing infrastructure, leveraging maximum profit from the assets while eliminating the possibility of future leaks.
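As a way of grounding the scenario, here is a toy sketch, in Python, of the decision logic such an agent might run. Everything here is hypothetical: the names, the thresholds, and the fact that it runs as plain off-chain code; in McConaghy’s proposal the agent would be an AI DAO holding its own keys and assets.

```python
from dataclasses import dataclass

@dataclass
class SelfOwningForest:
    """Toy sketch of the decision logic in the scenario above; all names
    and thresholds are invented for illustration."""
    treasury: float            # revenue from sustainable timber sales
    ecological_damage: float   # estimated harm from upstream leaks, 0..1

    def receive_timber_revenue(self, amount: float) -> None:
        self.treasury += amount

    def consider_acquisition(self, plant_price: float, upgrade_cost: float) -> str:
        """Decide whether to buy the leaking fertilizer plant and what to do with it."""
        if self.ecological_damage < 0.3 or self.treasury < plant_price:
            return "keep accumulating treasury"
        remaining = self.treasury - plant_price
        # Upgrade the plant if affordable; otherwise liquidate its assets.
        return "buy and upgrade" if remaining >= upgrade_cost else "buy and liquidate"

forest = SelfOwningForest(treasury=0.0, ecological_damage=0.6)
forest.receive_timber_revenue(500_000)
print(forest.consider_acquisition(plant_price=400_000, upgrade_cost=150_000))
```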

Though much of this is speculative at this point, there are some concrete points. The network is particularly suited to this task because of what McConaghy calls its “disintermediating” quality. A user doesn’t know who’s on the other side; as he says, it could be half plant, half robot. In other words, the anonymity of the network is advantageous to the opening of ontologies. Value can be exchanged seamlessly between beings as long as it can pass through standard protocols.

Not only would the more-than-human world surface as an agent through market activity; frameworks are already in place that would facilitate the legal recognition of more-than-human systems augmented by AI DAOs. As McConaghy writes, the easiest method would be to transfer corporate ownership to a DAO. Because the corporation is already recognized as a legal person, this would instantly extend rights to such an assemblage. This could fundamentally alter our understanding of personhood, inherent value, and agency. It gives us a glimpse of a future that is surprisingly non-modern: an animist vision of the earth in which the voices of non-humans are heard within the marketplace.

Nature 2.0 proposes such an ontology, but stops short of performing it. There’s a crucial distinction that must be made. In McConaghy’s proposal, the more-than-human world is not acting; AI is acting on its behalf. While agency has shifted in a non-anthropocentric direction, it is imperative not to confuse an AI acting for the benefit of a forest with the forest acting for itself. This is but a further displacement. And while it is important and necessary to explore and employ the agency of artificial intelligence, it is myopic to focus solely on it. AI is but one of a vast world of unexplored forms of intelligence.

My third case study takes a different approach. Concluded in the fall of 2018, Flowertokens, a project by Terra0, is an experiment connecting crypto collectables to the tangible. The project, accessed via Terra0’s website, was centered around a live stream recording the growth of 100 dahlias. Users were able to buy and trade Flowertokens, each of the 100 tokens corresponding to one of the 100 plants. The state of the plant is interwoven into the smart contract; proof of work is in the hands of the plant. It has to bloom for the flower, and the investor’s share, to enter the flowering pool. When the flower blossoms, the investor holding the share is compensated from a pool of capital.
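Here is a minimal, off-chain sketch of that mechanic in Python. It is a reconstruction from the description above, not Terra0’s contract: the class names, the payout rule, and the pool size are all invented for illustration.

```python
class FlowertokenPool:
    """Off-chain sketch of the mechanic described above: 100 tokens map to
    100 plants, and a confirmed bloom moves the token holder into the payout
    pool. Names, amounts, and rules are hypothetical, not Terra0's contract."""
    def __init__(self, n_plants: int = 100, pool_capital: float = 10.0):
        self.owners = {token_id: None for token_id in range(n_plants)}
        self.bloomed = set()
        self.pool_capital = pool_capital  # capital set aside for payouts

    def buy(self, token_id: int, buyer: str) -> None:
        self.owners[token_id] = buyer

    def report_bloom(self, token_id: int) -> None:
        """Called when the camera feed confirms that plant `token_id` flowered."""
        if self.owners[token_id] is not None:
            self.bloomed.add(token_id)

    def payouts(self) -> dict:
        """Split the pool among holders whose plants have bloomed."""
        if not self.bloomed:
            return {}
        share = self.pool_capital / len(self.bloomed)
        totals: dict = {}
        for token_id in self.bloomed:
            owner = self.owners[token_id]
            totals[owner] = totals.get(owner, 0.0) + share
        return totals

pool = FlowertokenPool()
pool.buy(7, "alice")
pool.report_bloom(7)
print(pool.payouts())  # {'alice': 10.0}
```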

The plant acts as a black box within the system. Its intelligence is employed: the way it grows and flourishes within its environment plays a crucial role in determining the value of the investor’s token. And enmeshed in the smart contract, the flower also plays a role in consensus.

But this is where Flowertokens ran into problems. Connecting the tangible to the blockchain requires consensus among multiple nodes: a multiplicity of proofs. Measuring living plants with cameras from multiple angles, however, proved imprecise, and Terra0 was not able to achieve reliable consensus. In the proof of concept, only one camera was used, leaving the system open to security flaws.
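To illustrate why this is hard, here is a hypothetical agreement check in Python. In this sketch, a bloom only counts if every camera’s estimate of the flower’s state falls within a small tolerance of the others; with noisy, angle-dependent measurements that condition fails often, and with a single camera it is trivially satisfied and therefore easy to game. This is not Terra0’s scheme, just a way of making the problem concrete.

```python
def observers_agree(measurements: list[float], tolerance: float = 0.05) -> bool:
    """Accept a state update only if all camera estimates of the flower's
    'openness' (0 = closed, 1 = fully bloomed) agree within `tolerance`."""
    return max(measurements) - min(measurements) <= tolerance

# Three cameras watching the same dahlia from different angles:
print(observers_agree([0.91, 0.93, 0.92]))  # True:  consensus, payout can trigger
print(observers_agree([0.91, 0.70, 0.92]))  # False: no consensus, no state update
```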

Despite these flaws, the project is an important proof of concept. It took the bold step of employing more-than-human life as a constituent in computing. Flowertokens also took advantage of the network, creating specific and sensitive relationships between the investor and the plant. Through the project, the plants were given a value beyond that of raw material. As active participants in value creation, their agency was reinforced and given authority within the network setting.

This issue of connecting the tangible to the blockchain was never fully resolved. Perhaps it never will be. Yet this doesn’t spell an end to cybernetic explorations within the blockchain. Rather than employing the more-than-human within the blockchain, I see the primary role of these computational black boxes as one of management. While there is widespread excitement about the union of AI and blockchain technology, AI is but one of many kinds of intelligence. Just as we hope AI can aid in autonomous governance, more-than-human intelligences, included within the computational process, could do much the same: governing a coin, a factory (as Stafford Beer suggested), or a self-owning forest (as Terra0 hopes to one day achieve).

There is endless potential. Just as scientists search the rainforest and study indigenous medicine, looking for plants and animals with potentially life-saving chemical compounds for medical applications, the same can be done with more-than-human intelligence. There is a whole world of performative computation waiting to be discovered.

So I suggest developers look to the natural world and to projects that employ more-than-human intelligence within the computational process. If you’re interested in this subject, it is worth investigating Benjamin Hertz’s robot controlled by a cockroach pilot, and earth computer, a generative project. More specifically, I suggest blockchain developers consider such cybernetic outlooks because I believe there is a kinship between these two technological trajectories. While Beer had to cut the cybernetic pond short due to financial constraints and a lack of institutional support, the token mechanics that blockchain enables provide a template that could sustain even the most radical projects. Furthermore, the disintermediation of blockchain networks that McConaghy highlights could be even more impactful with DAOs not only observing, but being managed by, the intelligence of more-than-human stakeholders. A whole world of human and non-human interaction could exist, far beyond what the cyberneticians could have imagined, amplified by the infrastructure of the blockchain.

At the same time, we must steer clear of overlaying protocols on top of existing systems, inscribing the more-than-human world within yet another commodifying layer. This points us in an interesting direction: towards a kind of computation that looks less like computation and more like life itself. Stafford Beer, later in life, described the Irish Sea as the most perfect computer. There is no better way of calculating the effects of the wind and the heavenly bodies and the ocean waters, Beer said. This is what I think of now when I hear the phrase “Ethereum world computer.” The earth is a system calculating homeostasis on a global scale. It doesn’t need our help with this. We need its help.

Thinking about the changing climate, Michael Taussig asks, “is it possible that subjects will become objects and a new — which is to say ‘old’ — constellation of mind to matter, body and soul, will snap into place in which writing,” and here we can read code, “will be neither one nor the other, but both?”

My thoughts as I conclude are these: we are at a kind of plane of immanence, where forms digital and biological can emerge and perhaps converge within a common system. Including non-human intelligence in the act of computing, investigating DNA and other forms of embodied computing, even inviting AI into the DAO: these are all acts emanating from this obscure plane. And perhaps this is the way forward: systems with no distinction between the biological and the digital, employing multiplicities of intelligences; plants acting, lichen calculating, AI trained on indigenous languages traversing networks, humans and DAOs all interacting together, forming a vibrant and immanently material discipline of computation.