Social Dynamics of Second Wave Automation

Dec 10, 2017


Jason Potts, Ellie Rennie, Julian Thomas

RMIT University

Second-wave automation promises to reshape not only economies but also societies. We consider how some of these social dynamics might unfold, and the risks involved.

Jacquard Loom (WikiMedia Commons)

Second-Wave Automation

First-wave automation began around the period of the Jacquard loom in 1804 and runs through to today. It includes tractors replacing draft horses, hydraulic actuators controlled by an operator replacing shovels and pickaxes on construction sites, electric motors taking the place of a treadling foot or a cranking hand. The industrial robots operating Toyota production lines and propelling goods around Amazon warehouses are also part of first-wave automation, which uses machines to augment or replace the labour of human or beast, or what economists call capital-labour substitution.

Automation is an application of technology. Technology theorist W. Brian Arthur[1] defines technology as the harnessing of natural phenomena (i.e. the discoveries of science) into technical elements. He explains that technologies are essentially made of combinations of other technologies, and therefore evolve through the discovery of new and useful combinations of these elements. From the 1940s a new class of cybernetic technologies (then called ‘servo-controlled’, now called ‘smart’) developed in which machines augmented by information technologies dynamically adapt as they receive feedback from the environment, blurring the line between human and machine and between machine and environment.[2] First-wave automation occurs when these technologies substitute or augment tasks performed by human hands (which can manipulate the world) and minds (which can process information). The result has been a vast increase in human productivity, realized as the growth of real income.[3]

Second-wave automation continues this process through a new class of technologies built around the convergence of digitization, data and analytics. However, in this second wave the target of ‘capital substitution’ is not just individual human labour (incorporating both physical and mental work), but higher-order systems and social institutions. The target of second-wave automation is not machine governance but rather human governance. To simplify this distinction: in the first wave, automated machines made things; in the second wave, they are making decisions. And we are just at the beginning of this new era of technological transformation.

What are second-wave technologies?

Second-wave technologies refer to a convergent cluster of digital technologies with origins in the 1970s to 1990s that have come of age in the early 21st century. These include[4]:

· Artificial intelligence (including machine learning, deep learning neural nets, cognitive computing, semantic web)

· Internet of things (sensing technologies combined with networking technologies)

· Virtual reality (including augmented and mixed reality, combining computer visualization and representation technologies)

· Distributed ledger technologies (peer-to-peer computing, cryptography, consensus protocols, smart contracts and decentralized apps)

· Data analytics (supercomputing plus big data)

· 5G (very high-bandwidth, low-latency mobile internet and cloud computing)

Each technology in this cluster is developing somewhat independently — although powered by similar underlying drivers of increased processing power and hardware improvements. However, the significance of second-wave automation comes from the convergence of this suite of technologies into new capabilities for social organization and institutional governance. These technologies afford new ways of organizing and coordinating human activity. Like human institutions, they provide systems for making decisions. So second-wave automation extends beyond labour-capital substitution into the domain of institutional-capital substitution.

In this way, second-wave automation pushes ‘smart’ technologies deep into economic, regulatory and organizational infrastructures in order to complement human decision-making by reducing laborious administrative processes and paperwork, and furnishing that decision-making with vastly increased inputs and information processing power. It boosts productivity, but also alters or removes management structures. These technological capabilities enable partial or even total substitution for human agency through automated analysis and decision-making.

The impact will be both visible and invisible. In first-wave automation robots replace people doing ‘work’. Second-wave automation seeks to build deep and powerful autonomy into things — such as self-driving vehicles, smart infrastructure, digital assistants — in order to create vast new infrastructures of services. These will range from autonomous public transport services using smart community governance to green cities and smart government. Importantly, these infrastructures might break from current public and private ownership models; rather than public agencies contracting private providers to build and maintain utilities, these infrastructures could feasibly be designed to return profits back into their own maintenance and development, removing both sets of middlemen.

How is this achieved? Second-wave automation will be embedded into the human environment through hardware infrastructure and will operationally depend on vast arrays of sensors, ubiquitous connectivity, and continuous development in computational processing, data storage and network technologies. These are necessary infrastructural foundations. However, the actual advances and capabilities of second-wave automation are almost entirely carried by software, particularly in new protocols. This means that a significant part of the development cost lies in talented people (and teams), and that the speed of adoption can be extremely rapid.

Another key feature of second-wave automation is that it harnesses the powers of deep data processing. When data processing is combined with automated transactions (smart contracts), it creates new capabilities and architectures for public services and public goods. For instance, both private insurance and social insurance (viz. welfare) may be reconstructed not as risk pooling, where parties pay into a common pool of funds with drawing rights triggered by some probabilistic event, but as counterparty contracts with a distinct holder of the reciprocal contract (as has long existed, for instance, in reinsurance markets). Common pools are efficient when transaction costs are high, but they are exploitable and prone to moral hazard. Counterparty contracts, on the other hand, can in principle be created with smart contracts on the Ethereum blockchain. This enables an entirely new architecture of insurance.
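To make the contrast concrete, here is a minimal sketch of the counterparty logic, written in plain Python rather than an actual Ethereum contract language; the class, the amounts and the event feed are all hypothetical, and a real smart contract would receive the event from an oracle rather than a function argument.

```python
from dataclasses import dataclass

@dataclass
class CounterpartyPolicy:
    """One policyholder, one distinct counterparty: no common pool.

    Settlement is triggered by an agreed, externally observed event (in a
    blockchain setting this would be reported by an oracle). Illustrative
    logic only, not real smart-contract code.
    """
    premium: float    # paid in by the policyholder
    cover: float      # locked up by the counterparty bearing the risk
    settled: bool = False

    def settle(self, event_occurred: bool) -> dict:
        """Pay out automatically according to whether the insured event occurred."""
        if self.settled:
            raise RuntimeError("contract already settled")
        self.settled = True
        if event_occurred:
            # Insured event happened: the policyholder claims the locked cover;
            # the counterparty keeps only the premium.
            return {"policyholder": self.cover, "counterparty": self.premium}
        # No event: the counterparty keeps the premium and reclaims its cover.
        return {"policyholder": 0.0, "counterparty": self.premium + self.cover}

# A hypothetical parametric flight-delay policy: premium 50, cover 500.
policy = CounterpartyPolicy(premium=50.0, cover=500.0)
print(policy.settle(event_occurred=True))   # {'policyholder': 500.0, 'counterparty': 50.0}
```

There is no shared fund to exploit: each agreement is settled bilaterally, which is what changes the moral-hazard profile relative to a common pool.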

The shift to second-wave automation will not be immediate, but will be achieved through an incremental combining of existing governance mechanisms (law, government, corporate structures) with automated processes. One example already in development is Mattereum, a code-based contractual infrastructure for automated transactions, accompanied by systems for cross-jurisdictional adjudication when disagreements or unanticipated outcomes eventuate.
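The general pattern is code by default, humans on exception. The sketch below illustrates that hybrid (a hypothetical, simplified pattern, not a description of Mattereum's actual design): a transaction executes automatically unless a party raises a dispute, in which case it escalates to a pre-agreed human adjudicator.

```python
from enum import Enum, auto

class Status(Enum):
    PENDING = auto()
    EXECUTED = auto()
    ESCALATED = auto()

class HybridTransaction:
    """Hypothetical hybrid of automated execution and human adjudication."""

    def __init__(self, terms: dict, adjudicator: str):
        self.terms = terms
        self.adjudicator = adjudicator   # e.g. an arbitration body named in the contract
        self.status = Status.PENDING
        self.disputes = []

    def raise_dispute(self, reason: str) -> None:
        """Either party can halt automatic execution by recording a dispute."""
        self.disputes.append(reason)

    def settle(self) -> Status:
        """Execute automatically unless a dispute forces human adjudication."""
        if self.disputes:
            self.status = Status.ESCALATED   # handed to the named adjudicator
        else:
            self.status = Status.EXECUTED    # code alone completes the transaction
        return self.status

tx = HybridTransaction(terms={"item": "licence", "price": 100},
                       adjudicator="agreed arbitration body")
tx.raise_dispute("delivered asset did not match its description")
print(tx.settle())   # Status.ESCALATED: the automated layer defers to human governance
```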

Even with these considered steps, it might still be difficult for non-specialists to evaluate and monitor these new protocols. It can be hard to appreciate their full emergent implications, and the hazards they pose. When people are making decisions and governing social systems, simply by being human they offer the prospect of legibility: they are understandable, observable and controllable through threats, constraints and incentives. But what happens when deeply complex software is performing those same social governance roles?

The Existential Threat of Second Wave Automation

Second-wave automation promises to reorganise economies, but in so doing it also threatens to reorganise societies, as new software protocols replace many aspects of institutions that have evolved and adapted over hundreds of years: bureaucracies, courts, cities, management processes, healthcare, economic management, diagnostics, high-skill organization, creative adaptation. Over centuries, and particularly in recent decades, liberal polities have collectively built a great many protections into not only the legal and political institutions of our societies, but also into the moral and ethical codes of culture and behaviour that uphold our shared deep human values. There is a broader global concern here too. Hard-won human rights frameworks and institutionalised societal protections are among the great achievements of recent centuries. The existential concern is that these protections might not survive second-wave automation, or may be distorted in unexpected ways.

Second-wave automation brings machine decision-making into realms that, at their best, have been organized using human attributes of sympathy, judgment, ethical reasoning, expert discrimination and the wisdom of experience. While fallible, costly and slow, these nevertheless buffer and protect our intrinsic humanity and values in the governance of human systems.

When society relies on machine learning, unless that learning is guided by ethical considerations it will construct futures based on the data to hand — datasets of prior actions, including our discriminations and mistakes. Such data discrimination is, of course, already part of our reality (an algorithm that directs job advertisements to particular groups, another that directs police to particular neighbourhoods). However, new identity technologies are appearing that promise to give users control over how their data is used. These will pose new questions for data ethics: What does data agency entail? What incentives will underpin data markets? Will open data create better datasets, and how will we know? One thing is for sure: those who are not connected in the present will be invisible to that which governs them.
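A minimal sketch with made-up numbers shows the mechanism: a system trained only on records of prior decisions will reproduce whatever pattern, including discrimination, those decisions contained.

```python
from collections import defaultdict

# Hypothetical historical hiring records: (neighbourhood, was_hired).
# The pattern in this data reflects past discrimination, not applicant ability.
history = ([("north", True)] * 80 + [("north", False)] * 20
           + [("south", True)] * 20 + [("south", False)] * 80)

# A naive "learner": predict the majority outcome previously recorded for each group.
counts = defaultdict(lambda: [0, 0])          # group -> [rejections, hires]
for group, hired in history:
    counts[group][int(hired)] += 1

def predict(group: str) -> bool:
    rejections, hires = counts[group]
    return hires > rejections

print(predict("north"), predict("south"))     # True False: yesterday's bias becomes tomorrow's rule
```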

The promise of second-wave automation to reduce transaction costs and to facilitate economic and societal coordination is enormous. The same promise also creates a significant risk: if we lose the institutions that previously performed these tasks, we might also lose the gains and protections of liberty, equality, diversity, privacy, inclusiveness and social justice, however partial and imperfect, that have accumulated and been built into them.

Our modern institutions have evolved and adapted to be broadly protective of our human values. Will they continue to be so when automated?

We have a well-developed narrative around tail events for new technologies — from rogue AI to viral outbreaks — that elaborates on our catastrophic scenarios of natural events such as earthquakes or asteroid strikes (e.g. the Centre for the Study of Existential Risk). But what happens when these strike our social, political and economic institutions? We have mostly tended to think of these terrible scenarios in terms of the rise of bad actors (e.g. various dictators). But bad protocols could be equally destructive without necessarily being coupled to bad actors or intentions.

Second-wave automation will have an impact on institutional complexity. We therefore need to better understand how digital society and digital ethics interact with complex institutions, through an integrated humanities and social science research agenda. We are only now beginning to describe and understand what these new modes of coordination might be.[5] How do we make this transition without the risk of transforming society in unexpected and possibly harmful ways?

Civil Rights March Washington DC (WikiMedia Images)

Evolution of Social Governance

There is a familiar economic historical narrative about the importance of institutions: from the ancient to the modern era the structure of incentives that provide social governance has evolved and developed through connected institutional complexes of legal, political and cultural mechanisms. All societies need to reward cooperation, punish rule violations, and protect people from predation by the powerful. All things being equal, societies with cultural, legal, social and political institutions that can facilitate cooperation and overcome free-riding problems seem to flourish,[6] and the institutions that enable that growth thus tend to replicate.[7]

But with second-wave automation (and notably blockchain technology) we are about to enter a world in which what are still largely experimental protocols for economic incentives may provide the operational forces and mechanisms to shape social governance.

Economic incentives for social governance may have some advantages: they could prove, in some cases at least, more targeted and efficient than political, legal or cultural incentives. However, we have very little experience in using economic incentives in this way, and little understanding of the individual, social, and global risks involved.

There is a fundamental need for deep foundational interdisciplinary research to seek to understand the nature of these new technologies and the risks and opportunities they bring.

[1] Arthur, W.B. (2009) The Nature of Technology: What it is and how it evolves. Free Press: New York.

[2] Mirowski, P. (2001) Machine Dreams. Cambridge University Press: Cambridge. Mindell, D. (2017) Our Robots, Ourselves: Robotics and the myths of autonomy. Viking: London.

[3] McCloskey, D. (2016) Bourgeois Equality. University of Chicago Press: Chicago.

[4] This is part of what is sometimes called the fourth industrial revolution, or Industry 4.0, and also includes 3D printing (or additive manufacturing), autonomous vehicles, robotics, quantum computing, wireless power, nanotechnologies and virtual personal assistants.

[5] Couldry, N. and Hepp, A. (2016) The Mediated Construction of Reality. Polity Press: Cambridge.

[6] Wilson, D.S. and Wilson, E.O. (2008) ‘Evolution “for the good of the group”’. American Scientist. 96(5): 380–389.

[7] North, D. (1990) Institutions, Institutional Change, and Economic Performance. Cambridge University Press: Cambridge.


Jason Potts is Director of the Blockchain Innovation Hub at RMIT University.