Law as a Complex System

Eight Books on Complexity Theory for Jurisprudents

Campbell Law Innovation Institute
Assembling
16 min read · Sep 18, 2021

--

Key points

  • Complexity theory is a trend in the social sciences.
  • Few jurisprudents (and fewer everyday lawyers) have heard of it, and it likely was not covered in law school.
  • Nonetheless, the focus on complexity theory across many disciplines has implications for what can and cannot be modeled in a theory of law.
  • Complexity theory will encourage the emergence of new approaches, such as experimental jurisprudence, which need a more robust model, one that challenges philosophical perspectives on the ontological and epistemological foundations of social thought.

Introduction

It is useful to take a long view of history. For example, if one discounts the long view of the history of law and legal thought, one might conclude that law and its ordered principles are eternal truths. But the reality of history suggests that law is a phenomenon of the agricultural age. Law did not exist until there were settled communities engaged in farming, only about 15,000 years ago. Prior to that, for nearly 200,000 years, men and women met, fell in love, raised families, organized for production, traded, loaned, had disputes, died, and passed along their possessions without the benefit of law. The population was low: some estimate that the population in 10,000 BCE was only 2.4 million, and it did not reach 1 billion until the nineteenth century. Moreover, when we think of law in its modern sense, we think of a written form. Legal positivists sometimes argue that law is what is written in books, fixing the origins of law with the origins of writing, about 5,000 years ago, when the total human population was still only about 115 million. Viewed in this light, law is a relative latecomer to human history. It should be understood as a phenomenon of complex human societies that emerged as population centers grew and complex social interactions became the norm. Law is a recent development associated with changing modes of production and growing social complexity. It was a means for skillfully coping with complex systems that brought human beings into new relationships with the environment and with evolving networks of social complexity.

Complexity is now explored in systems theories, which view it in terms of subtle phenomena like emergent behavior and nonlinear scaling, often described with the tools of statistical mechanics. Today, complexity theory is an influential approach to understanding similar phenomena that occur across many different areas in the natural and social sciences. While it has only recently been applied to law, the field is growing in influence. But it is hampered by the lack of a thorough jurisprudential theory, at least in the United States, where legal theories are still highly reductionistic and non-realist, two commitments that are at odds with complexity theory. There has been little interest, so far, among jurisprudents, who typically fall into one of two camps: the analytic jurisprudents and the postmodern jurisprudents (with some license, I include neopragmatists here for their common views about realism and rationality). Also, the barrier to entry into this fascinating, nuanced, and productive field of complexity theory is quite high. The mathematics required for a comprehensive understanding is prohibitive for most jurisprudents, who tend to be math-shy. And the philosophy is difficult, requiring knowledge of ontology, epistemology, philosophy of mind, and philosophy of mathematics. Particular systems theorists, like Niklas Luhmann and Gilles Deleuze, are important to understanding contemporary debates in the social sciences, which are significant for thinking about the nature of law.

Complexity theory is also a challenge for the ethicists who investigate these issues because it does not fit neatly into traditional moral theories. Complexity theory poses substantial challenges to utilitarian theories and some deontological human rights theories because it suggests that human beings are not capable of fully understanding the moral meaning of their actions, and this is not simply a failure of knowledge, but an unavoidable trait of complex social systems. And, similarly, virtue ethics will struggle to accommodate the radical evolutionary nature of the person that is suggested by complexity theory.

Readings

The following list of books is a starting place for understanding the significance of complexity theory today. It is a place to begin a journey that is intellectually demanding and yet joyful, with many insights and perspective-changing encounters with profoundly brilliant minds and advanced technology. I offer it here in the hopes of stimulating some interest among jurisprudents, social theorists, and perhaps a few lawyers who can look up from their immediate task and wonder about the meaning of their work.

The Information: A History, a Theory, a Flood, James Gleick, (Pantheon Books, 2011).

In this magnificent work, which won many awards for science writing when it was published in 2011, James Gleick tells the story of information science with an artful blend of scientific knowledge, clear exposition, and a storyteller’s craft. He begins with the humble example of the Kele-speaking peoples of the Congo, who used drums to communicate across the rainforest. He ends with the flood of information brought on by the modern information age, which was set in motion by Claude Shannon’s mathematical theory of information, published in 1948, the foundation of modern communications theory.

Shannon achieved a mathematical description of information and communication by separating the signal from its meaning. We need not understand a signal to send and receive it, he argued. We only need to know that the channel is sufficient (has the bandwidth) to carry the intended signal. This was a breakthrough at a time when meaning was viewed as essential to communication. By separating signal from meaning, Shannon allowed for the flood of information that exists today. And it was the mathematical expression of Shannon’s theory that connected information to the second law of thermodynamics and the concept of entropy. Information has a physical expression, which the physicist John Wheeler captured with the quip, “It from bit.” This realization has allowed for the rapid creation of the information and communications technologies (ICT) that now shape our lives.
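To make the separation of signal and meaning concrete, here is a minimal sketch in Python (my illustration, not Gleick’s) of Shannon’s entropy measure: it depends only on the statistical distribution of symbols in a message, never on what the message means.

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Average information per symbol, in bits.

    The measure uses only symbol frequencies; the meaning of the
    message plays no role at all.
    """
    counts = Counter(message)
    total = len(message)
    return sum((n / total) * math.log2(total / n) for n in counts.values())

# A repetitive message carries no information per symbol;
# a varied one carries more, regardless of what either "means."
print(shannon_entropy("aaaaaaaaaaaaaaaa"))                # 0.0 bits per symbol
print(shannon_entropy("it from bit, as Wheeler put it"))  # roughly 3-4 bits per symbol
```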

Now, we mere humans find ourselves overwhelmed by vast seas of meaningless information. Quoting from a short story by Jorge Luis Borges, “The Library of Babel,” Gleick concludes his book with this depiction of our contemporary existential dilemma:

We walk the corridors, searching the shelves and rearranging them, looking for lines of meaning amid leagues of cacophony and incoherence, reading the history of the past and of the future, collecting our thoughts and collecting the thoughts of others, and every so often glimpsing mirrors, in which we may recognize creatures of the information.

We frail humans are the only creatures that can give meaning to the sea of information that now overwhelms us.

The Fourth Revolution: How the Infosphere is Reshaping Human Reality, Luciano Floridi, (Oxford University Press, 2014).

In this book, the Italian philosopher Luciano Floridi continues the story of information, adding his own keen insights. He begins by covering some of the same ground as Gleick, describing the development of information from pre-history, to recorded history, to the period we are now in, which he calls “hyper-history.” What is distinct about our present age of hyper-history is what it is telling us about our own human nature. This is, he says, the fourth revolution in human self-understanding, where each successive revolution discloses another dimension of the artificiality of the distinction between the natural and the human — a distinction that Bruno Latour identifies as a theme in modern thought.

As Floridi tells the story, successive revolutions of transformative revelation have “decentered” human self-understanding by showing that human beings are a part of nature. We are with nature, not against or above it. The first of these revolutions was Copernicus’s theory, which displaced the earth from its principal place at the center of Creation. This was followed by Darwin’s displacement of the human from the apex of living beings. The third was Freud’s revolution, which suggested that primitive motives may dominate human behavior through the veiled workings of a subconscious mind.

The fourth revolution in human self-understanding is being brought about by the information age, which has once again shown the divide between the human and the natural to be artificial. The achievement of Alan Turing was to prove that rules can be followed by relatively simple, naturally occurring systems. Following a rule is not a feat of cognitive skill or understanding. It is, as Wittgenstein would say in his Philosophical Investigations, merely doing what is expected within a practice. Understanding the significance of this for human nature has resulted in a tremendous transformation of human self-understanding, which is still unfolding. Floridi makes several important observations about the emerging understanding of the person. First, human beings are information agents. He calls such a being an inforg (an informational organism) existing in an information environment, which he calls the infosphere.

We now live within information networks, connected for most of our lives. Floridi says we no longer go online; we must now see ourselves as having an “onlife.” And for Gen Z and the following generations, who were born onlife, the world is experienced differently. Critically, Floridi notes that being onlife changes the way we understand ourselves. He argues that our personal identities emerge from group similarities. That is, since we collect so much data about individuals, personal identities form around the traits we share with others. We become that person of a particular gender having a particular set of preferences for food, music, books, and other consumer goods. Since so much of onlife is directed toward consumption, we become marketing profiles determined by the data exhaust of our onlife activity. This is what it means to be an inforg. Floridi explains:

“In a proxy culture, we may easily be de-individualized and treated as a type (and) used to reidentify us as specific consumers for customizing purposes.”

To live decent lives in this onlife, we must pay attention to the collective sum of the information environment and try to shape it for human well-being. Critically, we must remain mindful of the need for a human rights theory that protects the dignity of persons.

Complexity: A Guided Tour, Melanie Mitchell, (Oxford University Press, 2011).

Onlife, as Floridi describes it, occurs within an evolving network made possible by dramatic increases in the amount of available data and the powerful systems that continuously analyze it. Since the 1990s, the ability to detect patterns in that data has led to a growing awareness of complex systems. Mitchell concisely describes the collection of concepts and practices that make up an evolving theory of complexity. The book is broken into five parts and 19 chapters, which present an accessible overview of the topic. The first chapter describes a complex system with rare clarity and grace. This is particularly difficult since complex systems have been recognized and described (somewhat differently) in many fields of natural and social science.

As Mitchell describes complex systems, they are characterized by a number of common traits. Notably, they involve a large number of components interacting within a system of internal feedback that creates equilibrium states coupled to the external environment. They are characterized by emergent behaviors that evolve, without a central authority, from the collective action of individual component agents. The sum of the action is more than the parts, as in a synergistic reaction, but unlike a system in synergy, the emergent behavior is different from the behavior of the components. An example of emergence is the behavior of a flock of birds. The flock has a collective behavior that emerges from the relationships among the birds. It is not the same behavior as that of the individual birds; it is not a super bird. A second feature of complex systems is nonlinear scaling — the idea that a tiny change in a variable can have a system-wide influence. This is the so-called “butterfly effect,” which holds that even the flapping of a butterfly’s wings can shape an entire weather system.
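A standard toy model of nonlinear dynamics, the logistic map, makes this sensitivity easy to see. The sketch below (my own illustration, with arbitrarily chosen numbers) starts two trajectories that differ by one part in a billion and watches them diverge.

```python
def logistic_map(x0: float, r: float = 4.0, steps: int = 50) -> list[float]:
    """Iterate the simple nonlinear rule x -> r * x * (1 - x)."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# Two starting values differing by one part in a billion.
a = logistic_map(0.200000000)
b = logistic_map(0.200000001)

for step in (0, 10, 25, 50):
    print(f"step {step:2d}: {a[step]:.6f} vs {b[step]:.6f}")
# Within a few dozen iterations the two trajectories bear no resemblance
# to one another: a tiny change in one variable has system-wide effects.
```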

Two themes in Mitchell’s book are particularly useful. The third chapter focuses on information and classical thermodynamics. Mitchell shows that, using statistical mechanics, complex systems can be described in terms of Shannon information and the concept of entropy. Information entropy is a necessary feature of complex systems. The fourth chapter deals with Kurt Gödel and Alan Turing. Turing’s proof demonstrates that not all propositions are decidable (i.e., effectively computable). Mitchell writes for the reader with little background in math or computer science. Carefully following her guidance here is well worth the effort because she shows that complexity theory is fundamentally connected to information theory and computer science.

This allows for a better understanding of the nature of, and limits to, reductive explanations of complex systems. Such systems are not “compressible,” in the sense that computer science uses that term, because any informational description of them always includes Shannon entropy, the random signals that are a necessary part of a complex system. The implications of Shannon entropy for complexity theory in the social sciences are particularly important.
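The computer-science sense of “compressible” can be seen directly with a general-purpose compressor. The small sketch below (my illustration, not Mitchell’s) compares a highly ordered byte string with a random one of the same length.

```python
import os
import zlib

ordered = b"abcdefgh" * 1000   # 8,000 bytes of pure repetition
noise = os.urandom(8000)       # 8,000 random (high-entropy) bytes

print(len(zlib.compress(ordered)))  # a few dozen bytes: the regularity can be summarized
print(len(zlib.compress(noise)))    # roughly 8,000 bytes: randomness resists compression
```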

Emergence: Contemporary Readings in Philosophy and Science, Mark A. Bedau & Paul Humphreys, eds. (MIT Press, 2008).

This book takes a deep dive into the concepts of emergence that exist in the philosophy of science. It provides more detailed background on the philosophical and scientific questions about why emergence has become an important interdisciplinary issue. The editors of this collection of essays view theories about emergence as needing to be tested in actual scientific work in physics, biology, and computer science. They distinguish contemporary theories from earlier ones, which arose in philosophy, outside science, in connection with the mind-body problem. Since the 1970s, that debate has centered on the concept of “supervenience,” which was meant to capture the logical meaning of emergence. The editors here argue that emergence is relevant for philosophy of science more than for philosophy of mind or metaphysics, and they imply that philosophers interested in emergence should engage with scientific theorizing on the topic.

The book contains many examples of emergence in the sciences, not just consciousness, showing that findings from throughout science are crucial for the philosophy of emergence. This is essential for the first task of philosophy, which is to conceptualize emergence accurately in order to meet arguments that emergence cannot or does not take place. One aspect of this is determining whether emergence is a discrete property or refers to a spectrum of traits. Drawing from a wide range of scientific examples, the volume demonstrates that new techniques of modeling have altered our understanding of emergence and its pervasiveness. Of particular interest in the social sciences is a well-known paper that shows how racial segregation arises through non-racist individual choices. But the conclusions of this work are far from settled. Skeptics of emergence will find support, as will the enthusiasts. In both cases, though, the nuanced analysis will change their views of the topic.
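The paper alluded to is presumably Thomas Schelling’s model of residential segregation. The sketch below is a deliberately minimal one-dimensional variant of that idea (my own simplification, not drawn from the volume): agents with only a mild preference for similar neighbors nonetheless tend to produce a visibly sorted arrangement.

```python
import random

def run_schelling(n: int = 60, tolerance: float = 0.34, rounds: int = 50,
                  seed: int = 1) -> str:
    """Minimal 1-D Schelling-style model: agents of two types ('A', 'B')
    relocate to a random empty cell when fewer than `tolerance` of their
    nearby neighbors are of their own type."""
    random.seed(seed)
    cells = ["A"] * 25 + ["B"] * 25 + [None] * (n - 50)
    random.shuffle(cells)

    def unhappy(i: int) -> bool:
        neighbors = [cells[j] for j in range(max(0, i - 2), min(n, i + 3))
                     if j != i and cells[j] is not None]
        return bool(neighbors) and neighbors.count(cells[i]) / len(neighbors) < tolerance

    for _ in range(rounds):
        movers = [i for i in range(n) if cells[i] is not None and unhappy(i)]
        if not movers:
            break
        empties = [i for i in range(n) if cells[i] is None]
        for i in movers:
            if not empties:
                break
            j = empties.pop(random.randrange(len(empties)))
            cells[j], cells[i] = cells[i], None
            empties.append(i)
    return "".join(c or "." for c in cells)

print(run_schelling())
# No agent demands more than about a third of its neighbors be like itself,
# yet same-type agents tend to end up clustered together: segregation emerges
# from individually tolerant choices, without any central coordination.
```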

Reassembling the Social: An Introduction to Actor-Network Theory, Bruno Latour, (Oxford University Press, 2005).

Bruno Latour is a French philosopher known for his writings on the philosophy of science. He argues that sociologists conceive of the social as something that already exists. He suggests that we have taken this category for granted, assuming it is an established entity or something that is predetermined like a “natural kind.” Latour, however, argues that the social must be formed or produced. Social scientists should be asking about how the social comes to be. According to Latour, the social is generated in networks. This is what Actor-Network Theory (ANT) is about. It is a description of how people and objects come together to form the social, which includes culture and knowledge.

One key point to remember about Latour’s theory, then, is that it is concerned with how humans interact with non-human actors. ANT seeks to understand how things like machines or technological systems come to interact with our society. Think of society as a mix of these different actors. Ultimately, Latour is interested in the connections between people and things. Objects are very important in this framework.

ANT proposes that any system we encounter can be most effectively approached if we look at all of the parts — whether they are natural, technological, or human — as active members of the system. According to ANT, each human, each piece of technology, and each natural factor (such as sunlight, air movement, temperature, etc.) has a part to play in the system and must be considered.

Assemblage Theory, Manuel DeLanda, (Edinburgh University Press, 2016).

Manuel DeLanda is one of the most interesting thinkers to work on the philosophy of Gilles Deleuze. This book follows several penetrating interpretations of Deleuze’s work, alone and with Félix Guattari, which develop the concept of assemblage that Deleuze and Guattari originated in A Thousand Plateaus. Over the years, DeLanda has expanded the original conceptualization of assemblage by bringing in empirical studies from theorists ranging from Erving Goffman to Fernand Braudel. In the current formulation, DeLanda shows that societies work on several levels of social ontology.

Earlier, in A New Philosophy of Society, DeLanda affirmed a realist approach to social ontology. He writes, “a realist approach to social ontology must assert the autonomy of social entities from the conceptions we have of them.” For philosophy to account for the autonomous existence of social entities, social ontology must attribute objective reality to them. That is to say, social entities, from the individual person, to the “networks” in which he or she is directly involved, to the larger “organizations and governments,” to “cities and nations,” are each ontologically real and not reducible to other levels.

This perspective DeLanda calls a “flat” ontology. In Intensive Science & Virtual Philosophy DeLanda describes flat ontology thus:

…while an ontology based on relations between general types and particular instances is hierarchical, each level representing a different ontological category (organism, species, genera), an approach in terms of interacting parts and emergent wholes leads to a flat ontology, one made exclusively of unique, singular individuals, differing in spatio-temporal scale but not in ontological status.

In a flat ontology, there are ultimately only individuals. Species and organisms are groups of individuals situated in time and space. If species are not eternal essences or forms defining what is common to all particulars of that species, if they exist in space and time, then this is because species, as conceived by biology, are not types but really existing reproductive populations located in a particular geography at a particular point in time. For DeLanda, then, being is composed entirely of individuals. Everything is an assemblage, and “assemblages can be component parts of other assemblages.”

Complexity Theory and the Social Sciences: The State of the Art, David S. Byrne & Gillian Callaghan, (Routledge, 2013).

This book brings together information theory, complexity theory, and the social sciences. Starting with complexity theory, it describes some of the leading work in the field among social scientists. After introducing the concept of a complex system, the authors note a distinction, described by Edgar Morin, between restricted and general complexity. By “restricted” theories, Morin means theories that anticipate the possibility of mathematical modeling and universal rules for a complex system. This approach is found, for example, in agent-based modeling, which seeks to describe agents with mathematical features like nonlinear scaling and genetic evolution. Byrne & Callaghan describe these types of models as “attempts to develop accounts of emergent social reality which can be expressed in terms either of mathematical formalisms…or sets of rules governing the behavior of agents….” They note that a “key distinction” between restricted complexity and general complexity, as Morin describes them, is that the restricted complexity theories are ontologically realist.

The authors seek a middle path between what they take to be overly simplistic restricted theories and anti-realist general theories. For them, a “soft-fundamentalism” acknowledges the concerns of anti-foundationalism and yet respects the ontological commitments of the mathematical formalism of restricted theories. They investigate the ontological status of groups and individuals, viewing it as an open issue among social scientists, one that provokes debates in mereology about whether mathematical formalisms reify social entities. This yields effective arguments against both overly reductive modernist theories (like analytic jurisprudence and formalism) and anti-realist theories (like the ones that might be collectively called postmodern, including neo-pragmatism, critical theory, structuralism, and post-structuralism).

Complexity Theory and Law: Mapping an Emergent Jurisprudence, Jamie Murray, Thomas Webb, & Steven Wheatley, eds. (Routledge, 2018).

This collection of essays contains some significant statements about how complexity theory is being experienced and understood among legal scholars. The introduction, by the editors, contains critical material framing the essays within the debates in the social sciences. Another important essay is J.B. Ruhl and Daniel M. Katz’s “Mapping Law’s Complexity with Legal Maps,” which introduces the concepts of complexity and suggests an analytic framework. The other essays take up applications of complexity theory to various areas of law, ranging from criminal law to professional ethics. A useful contribution to jurisprudence is Minka Woermann’s essay, “Complexity and the Normativity of Law.”

What is lacking in this collection is a thoroughgoing engagement with the major approaches to jurisprudence. At a minimum, the theoretical approaches that have been dominant in the United States are analytic jurisprudence, critical legal studies, and neo-pragmatism (albeit with some, like Brian Tamanaha, still attached to the classical pragmatism of Peirce and Dewey). Complexity theory challenges the epistemological foundations of each of these, and therefore a new jurisprudence of complexity theory cannot be achieved simply by reading law’s complexity through any one of them. Nor can complexity theory be ignored, since it is being applied in empirical analyses of the law, for both academic and commercial purposes. Simply put, it has arrived as a reality of the current state of legality, even if jurisprudents have not been aware of it. Since this knowledge of complexity is new, it challenges us to think anew. The way ahead requires rejecting the dogmas of the past and freeing ourselves from outdated beliefs.

Key take-aways

  • Information theory developed over a long time as human beings sought to communicate over distance. The key to the modern theory known as Shannon Information Theory is the separation of the signal from meaning.
  • Information theory changed the way people understand themselves and their world. We are inforgs in an information environment.
  • Complexity theory is a part of systems theory that explores how evolving systems arise and behave. Complexity theory interprets such systems as having emergent behavior and nonlinear scaling. There are information-theoretic interpretations of complex systems that view the order and chaos in such systems in terms of the compressibility of information.
  • Social theorists like Bruno Latour and Manuel DeLanda have applied systems theory to understanding society. Latour’s Actor-Network Theory (ANT) and DeLanda’s Assemblage Theory view society as a complex system of systems with multiple inputs and outputs that are, to some extent, not determined by deterministic rules.
  • Recent work in social science has attempted to incorporate the insights of complexity theory into social thought. Simply put, the implications, drawing from information interpretations of complexity, suggest that societies contain complex systems, cannot be reduced to simple rules, and yet are not infinitely mutable.
  • Currently, there is no theory of law that robustly incorporates these implications (particularly their significance for epistemology and ontology). The currently dominant schools of jurisprudence cannot adequately explain the phenomena that complexity theories identify, nor are they helpful in creating useful legal technology.
