In my last post on teaching the fundamentals of quantum computation to a toddler, I touched on the subject of computational complexity, noting that this field should not be confused with the more general study of complexity that one may encounter at the Santa Fe Institute, for instance. I’d like to use this post to expand a little on that more general field of complexity; perhaps instead of teaching it to a toddler it will be geared for teenagers this time (who are thus able to make a few of their own deductions and inferences). I’ll illustrate the concepts by analogy using elementary cellular automata, music, and images. The use of cellular automata here was inspired by Stephen Wolfram’s A New Kind of Science (which I will abbreviate here as NKS), and much of the discussion will be guided by subjects addressed in that work. I’ll also borrow several images of these automata from Wolfram’s Atlas of Simple Programs.

Although NKS was not written predominantly about complexity, it was one of my earlier exposures to the concept and partly inspired a reading journey through authors such as Melanie Mitchell, Stuart Kauffman, George Johnson, Murray Gell-Mann, Yaneer Bar-Yam, and others — for those who wish to learn more about the subject these authors are good places to start, as are the MOOCs available from the Santa Fe Institute’s Complexity Explorer.

First, a brief introduction to cellular automata. An elementary cellular automaton is basically a set of rules for determining the progression of values on a 2D grid of binary (black/white) cells which evolves from the top row down. The grid is seeded with an input row (most examples here will have a seed of a single black cell or a row of randomly placed black/white cells), and based on the values of the seed row the rules determine the values of cells in each subsequent row, and similarly on down, with the resulting output images taking some interesting shapes of varying degrees of complexity. For the simple versions we will be discussing here, the rules derive a value for a child cell based only on the three parent cells immediately adjacent in the row above. Since there are 2³ = 8 combinations of parent cells, each requiring a rule, we can derive that there will be 2⁸ = 256 potential variants of elementary cellular automata, which are traditionally identified using a numbering convention of 0–255.
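To make the numbering convention concrete, here is a minimal Python sketch (the function names are my own, not from NKS): the rule number’s eight binary digits give the child value for each of the eight parent patterns, indexed by 4·left + 2·center + 1·right.

```python
def eca_step(row, rule):
    """One step of an elementary cellular automaton.

    Pads the row with white (0) cells so the pattern can grow by one
    cell on each side per step, then looks up each child value in the
    binary digits of the rule number.
    """
    padded = [0, 0] + row + [0, 0]
    return [(rule >> (4 * padded[i - 1] + 2 * padded[i] + padded[i + 1])) & 1
            for i in range(1, len(padded) - 1)]

def evolve(seed, rule, steps):
    """Return the list of rows produced by iterating eca_step."""
    rows = [seed]
    for _ in range(steps):
        rows.append(eca_step(rows[-1], rule))
    return rows

# Rule 30 from a single black cell: 1, then 111, then 11001, ...
for row in evolve([1], 30, 4):
    print(''.join('#' if c else '.' for c in row).center(11))
```

Rendering rows as text characters is already enough to reproduce the triangle images shown throughout this post.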

Shown here are a few highlighted rules out of the 256 that demonstrate the variety of potential output shapes. Most of those shown have a seed of a single black cell in the top row, although a few such as 128 or 136 are shown here with multiple seed cells to demonstrate some particular behavior. Often a single seed cell will result in some kind of triangle-shaped output as the cell’s area of influence expands wider in each subsequent row, although for some rules (where three white parent cells result in a black child cell) a row-wide transformation may be immediate.

The study of cellular automata wouldn’t be very practical without computers to simulate them. This becomes one of the themes of NKS: that computer simulations could open the door to new routes of scientific discovery (aka a “new kind of science”). The elementary cellular automata merit so much attention because even in their very limited space of 256 simple rules there can be found a wide band of potential behaviors and complexity. We can sort the potential behaviors into four classes. Class 1 is the simplest, with initial conditions leading to some uniform final state (examples include rules 0, 4, 255, etc). Class 2 may have many potential final states, but they are all simple uniform or repetitive shapes (examples include rules 1, 50, 190, etc). Class 3 is where things start to get interesting; in these rules the behavior gets more complicated, with apparent randomness and fractal nesting of shapes (examples include rules 22, 30, 90, etc). Class 4 is perhaps the most interesting; it involves a mixture of order and randomness, with local structures that may appear simple but through progression move around and interact with each other in complex ways — this behavior may become more obvious with more elaborate seedings beyond a single cell, and we’ll have some demonstrations to follow (examples include rules 110, 124, 137, etc). It should be noted that the boundaries between these four classes are not firm, and some rules may exhibit properties of multiple classes to some degree. If you remember one thing about elementary cellular automata it should be this: these rules illustrate that even from extremely simple rules it is possible to generate highly complex and irreducible behavior in a system.

The grouping of these cellular automata into four categories based on behavior is itself a kind of measure of their complexity. Researchers have developed more general descriptions of a system’s complexity based on a variety of metrics such as how hard the system is to describe, how hard it is to create, or its degree of organization — a representative list of 42 such descriptions was assembled by Seth Lloyd. Note that Lloyd doesn’t claim the list is exhaustive, and actually calls for submissions of additional metrics, which perhaps implies some ambiguity as to any formal agreement among researchers on the definition of complexity. Melanie Mitchell has helpfully clarified that “As yet, Complexity is not a single unified science; rather, to paraphrase William James, it is still ‘the hope of a science’. I believe that this hope has great promise.”

Just as the toolbox of an industrial engineer is built on probability and statistics, the tools of complexity research are models explored via computer simulation. This post will primarily feature simulations of cellular automata whose rules of evolution are based on the values of adjacent cells; however, there are plenty of other types of systems that could be simulated using more elaborate rules — some examples being crowd dynamics, street traffic patterns, or possibly even electrical grid distribution systems. Through simulations of elementary cellular automata behavior we will attempt to explore a few features of complexity here.

## Class 1 Elementary Cellular Automata — Simple Outputs

Class 1 is again the simplest category, so I won’t spend a lot of time on it. In this class initial conditions lead to some definite uniform final state independent of inputs. Rule 128 exemplifies this behavior. A seed of a single black cell will immediately fade out into a field of white, so to demonstrate how this progresses, shown here is a seed of 7 adjacent black cells, with each subsequent row shrinking toward the final state of a white field.

The behavior here represents an evolution toward low entropy due to its uniform and organized final state, as well as low complexity using the complexity measure of difficulty of description — this final state is very easy to describe. Even though it took a few iterations to reach this final state, the long-term evolution is what dictates the classification.
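A quick sketch confirms the collapse (reusing the stepping logic from earlier, with my own function name): rule 128 keeps a cell black only when all three parents are black, so a band of 7 black cells loses one cell from each edge per row until nothing remains.

```python
def eca_step(row, rule):
    # pad with white cells; the child value is the rule's binary digit
    # indexed by 4*left + 2*center + 1*right
    padded = [0, 0] + row + [0, 0]
    return [(rule >> (4 * padded[i - 1] + 2 * padded[i] + padded[i + 1])) & 1
            for i in range(1, len(padded) - 1)]

# rule 128: only the parent pattern black/black/black produces a black child
row = [1] * 7
widths = [sum(row)]
for _ in range(4):
    row = eca_step(row, 128)
    widths.append(sum(row))
print(widths)  # [7, 5, 3, 1, 0]: the black band shrinks to a white field
```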

## Class 2 Elementary Cellular Automata — Repetition or Uniformity

Class 2 elementary cellular automata may have many potential final states depending on inputs, but they are all simple uniform or repetitive shapes, and thus generate an ordered and predictable regime. Rule 1 exemplifies this behavior.

What these Class 2 cellular automata lack in complexity they make up for in clarity. Stepping outside of cellular automata for a second, it’s worth noting that repetition can serve an important function in music, literature, or imagery by drawing attention to important points. Through the repetition and rhythms of life we slowly reinforce the forces of habit into our subconscious, and in the vernacular of psychologists, complex tasks which were previously the domain of our System 2 thinking are gradually worked into our System 1 through the development of heuristics. Sometimes through the simple repetition of being in someone’s presence or proximity over time we may even grow a certain fondness. The author Jorge Luis Borges once said that “a poet has maybe five or six poems to write and not more than that. He’s trying his hand at rewriting them from different angles and perhaps with different plots and in different ages and different characters, but the poems are essentially and innerly the same.” The same could possibly be said for complexity scientists.

## Class 3 Elementary Cellular Automata — Apparent Randomness

Class 3 is where things start to get interesting; in these rules the behavior gets more complicated, with apparent randomness and fractal nesting of shapes. An important example of this class is rule 30, which was long used by the symbolic mathematical computation program Mathematica to generate random numbers — I believe it has since been replaced by more effective methods. Even from a simple single-cell input, rule 30 generates apparent randomness, with an interesting contrast between the regularity and predictability along the left edge of the triangle and the complexity and irreducibility trending to the right, with apparent randomness found in both the distribution and size of the triangular gaps. With these Class 3 rules the output is more chaotic than orderly, and we find a kind of chaotic sensitivity to initial conditions. Such chaos becomes an amplifier of uncertainty.
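The random numbers came from sampling the center column of rule 30’s evolution. A small sketch of extracting those bits (function names are my own):

```python
def eca_step(row, rule):
    # the child value is the rule's binary digit indexed by the three parents
    padded = [0, 0] + row + [0, 0]
    return [(rule >> (4 * padded[i - 1] + 2 * padded[i] + padded[i + 1])) & 1
            for i in range(1, len(padded) - 1)]

def rule30_center_bits(n):
    """First n bits of rule 30's center column, a classic
    pseudo-random sequence."""
    row, bits = [1], [1]
    for _ in range(n - 1):
        row = eca_step(row, 30)
        bits.append(row[len(row) // 2])  # rows stay odd-length, so this is the center
    return bits

print(rule30_center_bits(5))  # [1, 1, 0, 1, 1]
```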

The contrasting regularity along the left edge disappears when we seed this rule with random inputs — hence you can start to see the usefulness of single-cell seeding as a convention for illustrating properties of the 256 rules.

Although not to the degree of complexity, randomness also has a few measures and metrics based on a system’s observed behavior — NKS offers some discussion of the history of defining the concepts of randomness and complexity in its end notes. (As an aside, NKS is the rare example of extensive end notes that deserve a comprehensive read on their own — not since Infinite Jest have I found such compelling end notes, and even if they don’t have the DFW flair for theatric literary tangents and multi-tiered departures, they make up for it by offering a comprehensive resource of definitions and history for the concepts discussed in the main text, which is presumably geared toward a more general audience.) Perhaps a definition from the main text is more suitable for this discussion though: something is considered random if there are essentially no simple programs that can succeed in detecting regularities in it.

Complex behavior such as rule 30’s illustrates the irreducibility of these simple rules. By irreducibility, I mean that if we wanted to know the value of the nth step, the only way we could determine it would be to run through all n derivation steps. A system’s irreducibility implies that trying to predict its behavior using traditional scientific methods simply will not work, and that such behavior can be generated by such simple rules is one of the more interesting features of elementary cellular automata and hence of complex systems in general.

Many of the rules that fall into Class 3 won’t necessarily illustrate randomness to the degree of rule 30 when seeded with a single cell; for others the single cell derives a pattern of fractal nesting.

There are many examples of this fractal nesting to be found among the 256 elementary rule sets; Rule 18 is just one:

The type of fractal shape generated here is known as a Sierpinski Sieve. Fractals are another example of simple rules deriving complex behavior. The origin of a fractal doesn’t necessarily have to be a cellular automaton; fractals can also be derived by algorithms of subtraction from a form (such as removing progressively smaller triangles) or even a non-linear algorithm such as the famous Mandelbrot set’s z → z² + c.

Fractal images can be detected by their self-similarity at different scales. A traditional measure for describing such creations is the fractal dimension. Although the Rule 18 Sierpinski Sieve is drawn on a 2D grid, when we take into account the gaps we can derive a fractal dimension of less than 2 (which won’t necessarily be an integer). If we allowed an infinitesimal grid size, Rule 18’s effective dimension would fall to log 3 / log 2 ≈ 1.585.
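That dimension can be checked numerically. Rule 90, whose nested pattern is the same Sierpinski Sieve, triples its count of black cells every time the number of rows doubles, and log 3 / log 2 falls out directly (a sketch with my own function names):

```python
import math

def eca_step(row, rule):
    # the child value is the rule's binary digit indexed by the three parents
    padded = [0, 0] + row + [0, 0]
    return [(rule >> (4 * padded[i - 1] + 2 * padded[i] + padded[i + 1])) & 1
            for i in range(1, len(padded) - 1)]

def count_black(rule, rows):
    """Total black cells in the first `rows` rows from a single seed."""
    row, total = [1], 1
    for _ in range(rows - 1):
        row = eca_step(row, rule)
        total += sum(row)
    return total

counts = [count_black(90, 2 ** n) for n in range(1, 6)]
print(counts)  # [3, 9, 27, 81, 243]: doubling the scale triples the cells

# box-counting style estimate of the fractal dimension
dim = math.log(counts[-1]) / math.log(2 ** 5)
print(round(dim, 3))  # 1.585
```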

Note that fractal nesting is traded for a more random distribution when Rule 18 is seeded with random inputs.

Fractal nesting and self-similarity can even be found in music, art, or literature. For example, some have even found a Cantor Set style fractal structure in Bach’s Cello Suite 3, which if you’re reading this post correctly you should be listening to right now.

As an aside, we should take note that the fractal dimensions discussed above only express differences in non topological aspects of a form. In his book The Fractal Geometry of Nature, Benoit Mandelbrot points out that most problems of real interest combine fractal and topological features in increasingly subtle fashion.

## Class 4 Elementary Cellular Automata — Mixture of Order and Randomness

Class 4 is perhaps the most interesting category; it involves a mixture of order and randomness, with local structures that may appear simple but through progression move around and interact with each other in complex ways — this behavior may become more obvious with more elaborate seedings beyond a single cell. In the set of 256 elementary rules, the Class 4 category is the rarest; however, if we were to increase the space of possible rule sets by allowing three shadings of cells (black/white/gray), for instance, the proportion of Class 4 systems grows.
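The growth of that rule space is easy to count: with k cell colors and a three-cell neighborhood there are k³ parent patterns, each needing one of k child values, giving k^(k³) possible rule sets.

```python
def rule_space(colors):
    # colors^3 parent patterns, each mapped to one of `colors` child values
    return colors ** (colors ** 3)

print(rule_space(2))  # 256 elementary rules
print(rule_space(3))  # 7625597484987 three-color rules
```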

In his book At Home in the Universe the complexity researcher Stuart Kauffman offers a description of complex systems as those that fall in the ordered regime near the edge of chaos — a perfect description of the behavior we find in Class 4. For the example of Rule 110 shown above, order can be found both in the background pattern as well as in the structure of the leg protrusions — however, it is in the interactions between the various protrusions that we find a degree of randomness and chaos. This type of complexity is analogous to behavior we find in all kinds of systems in both the natural and artificial world, with some examples being organizational dynamics, ant behavior, or perhaps even electrical grid distribution systems.

This is the first time we’ll consider the complexity of systems beyond the most elementary cellular automata, and although it is quite a leap, we’ll find that a lot of the same principles apply even to systems at the societal scale. Some primary considerations in modeling a system (along with their analogs for cellular automata) include:

- the type of interactions (rules) of agents (cells)
- the extent of interaction with other agents (in elementary cellular automata a child cell is influenced only by three parent cells)
- the uniformity of agents (in elementary cellular automata all grid points are identical 2-state cells)
- the extent of hierarchy in governing rules and agents (not applicable to the elementary rules)
- the impact of randomness or probabilistic behavior in the rules of interaction (elementary cellular automata are strictly deterministic)
- the number of potential behaviors (elementary cellular automata cells have only two states)
- the dimensions of interaction (our automata act on a 2D grid, but because they only evolve in one direction they are considered 1D)
- the parallel or independent nature of agent updates (in our automata grid an entire row updates in parallel)
- the scale of the system

In his book Making Things Work the complexity researcher Yaneer Bar-Yam takes concepts from the study of complexity and applies them to evaluate human-scale systems such as healthcare, education, and organizations. It is a common theme in these systems that scale effects can have a big impact on behavior — there is a reason a startup company has material advantages when competing against the bureaucracy of a Fortune 500 firm. A complexity theorist studying an organization may look at the interactions of participants for sources of complexity, and as we increase the cross-silo interactions of an organization from old-fashioned email to social CRM tools like salesforce.com’s Chatter or modern communication tools like Slack, the potential for organizational complexity increases (a good thing for those engaged in a complex competitive environment). For those looking to operate in an environment of extreme complexity, Bar-Yam cautions that traditional systems engineering approaches to design are not suited to this category (as has been the case in large-scale failures such as the attempted redesign of the US air traffic control system or IRS tax modernizations) and instead suggests the alternative of an evolutionary approach to system development through experimentation, coupled if necessary with redundancy.

The concept of evolution plays an important role in the field of complexity. The reason we are able to find complexity in the real world is that evolution takes systems there. Consider that in an ordered system such as those we found in Classes 1 or 2 there is no potential for mutations or deviations that would seed an evolution, while for the more random systems we found in Class 3 evolution would also not work, as any mutation or deviation would generate a chaotic fluctuation in the resulting state, leaving no room for adaptation. Thus it is only in the cohabitation of order and chaos found in Class 4 that evolution can perform its magic.

## Coarse Graining

As we’ve journeyed from Class 1 to Class 4, we’ve found increasing complexity and more elaborate behavior with each step; however, sometimes it may be desirable to gloss over the finer details of a system and focus on its macro states, what I will refer to as coarse graining. Geoffrey West, a former president of the Santa Fe Institute, has said “We will probably never make detailed predictions of complex systems, but coarse-grained descriptions that lead to quantitative predictions for essential features are within our grasp.”

In his book The Quark and the Jaguar the physicist Murray Gell-Mann uses an analogy of alternate histories of a horse race to describe the concept of coarse graining: such a lens limits focus to the moment in the history of the race being won, considers only the horses in the race (the rest of the universe is ignored), and attends only to the tip of the nose of the winning horse. So for each potential horse’s win there exists a whole set of alternate histories covering every combination of results, but by limiting our focus to these few points we can ignore all of the versions which have some interference (in the quantum sense).

## Cellular Automata as the Basis of Physics

One of the more controversial aspects of NKS is Wolfram’s proposal of a kind of cellular automaton (obviously not the elementary variety) as the underlying basis of physics. We’ve already addressed how the potential space of a system’s behavior can be increased by adjusting parameters of a cellular automata model, but I’ll restate it here for clarity, as repetition can serve a useful function in reinforcing key concepts:

Some primary considerations in modeling a system (along with their analogs for cellular automata) include:

- the type of interactions (rules) of agents (cells)
- the extent of interaction with other agents (in elementary cellular automata a child cell is influenced only by three parent cells)
- the uniformity of agents (in elementary cellular automata all grid points are identical 2-state cells)
- the extent of hierarchy in governing rules and agents (not applicable to the elementary rules)
- the impact of randomness or probabilistic behavior in the rules of interaction (elementary cellular automata are strictly deterministic)
- the number of potential behaviors (elementary cellular automata cells have only two states)
- the dimensions of interaction (our automata act on a 2D grid, but because they only evolve in one direction they are considered 1D)
- the parallel or independent nature of agent updates (in our automata grid an entire row updates in parallel)
- the scale of the system

What NKS proposes here almost has an air of “man with a hammer syndrome”, but I am only an amateur in the field of physics so will keep an open mind. It is possible this passage was written prior to wider acceptance of Edward Witten’s proposal of a mysterious yet magical membrane (M) theory, for instance. To demonstrate just how the universe could be the manifested evolution of an automaton, NKS offers a few examples of simple cellular automata that exhibit behaviors similar to those found in physics, which also happen to make for pretty pictures. For example, this Rule 422R illustrates how a reversible cellular automaton can depict properties consistent with evolution from a singularity to increasing entropy:

The principles of entropy and complexity have an interesting relationship, with possible applicability to the evolution of the universe. When one pours cream into a coffee and allows it to disperse, the complexity of the combination will at first climb as the simple spoonful of cream disperses and forms more elaborate shapes in the coffee; however, as the mixing continues, the randomness of the dissolved mixture starts to dominate and the complexity metric falls, all while the entropy climbs throughout, as the 2nd Law of Thermodynamics dictates.

One of the hopes, if the laws of the universe are derivable from simple rules like cellular automata, is that we can then cycle through the space of potential rules in computer simulation as a new means of discovery. Although this space of possible rules is infinite, we could perhaps use our knowledge of existing laws of physics to narrow the range somewhat. By experimenting with different models and running through a few resulting simulations for each, we can test for known properties such as entropy, fundamental particles and their interactions, Newtonian physics, relativity, quantum mechanics, and strings — perhaps even one day stumbling across the very underlying rules of physics. It’s a cool thought and an entirely new approach to discovery; again, this is the “new kind of science” that inspired the book’s title.

## Emergence

Although we have already addressed some of the behaviors associated with emergence, we haven’t really introduced the concept in full, so we will now turn our attention to this fundamental and important property of complex systems.

As we’ve demonstrated with the elementary cellular automata, it is possible to generate complex and irreducible behavior using only simple rules. Emergence is how we refer to the behavior of a system that couldn’t be inferred simply from study of the individual underlying components. One can’t learn how an ant colony forages for food by simply inspecting a single ant, just as one can’t understand a person’s behavior by inspecting one of their brain cells.

As an illustrative cellular automaton we’ll turn to John Conway’s Game of Life, a 2D automaton because it evolves along both the x and y axes, based on rules that are a function of all 8 surrounding adjacent cells. This is a well-known and well-studied rule set because from its simple rules there emerge properties in its evolution that could almost be considered to resemble “living” units. There are potential patterns in the game that exhibit features such as traveling around the grid, breeding, generating new child patterns, and other curious interactions when patterns meet. If a cellular automaton rule set really is the basis of physics it will certainly need to generate this type of emergent behavior; the Game of Life gives us a clue of what could be possible with even these most simple rules.
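A minimal sketch of the Game of Life (my own function names, storing live cells as a sparse set) makes the emergence concrete: the five-cell glider pattern reassembles itself one cell down and one cell right every four generations, a “traveling” behavior found nowhere in the rules themselves.

```python
from collections import Counter

def life_step(alive):
    """One generation of Conway's Game of Life on an unbounded grid.

    `alive` is a set of (row, col) cells; a dead cell is born with
    exactly 3 live neighbors and a live cell survives with 2 or 3.
    """
    neighbor_counts = Counter(
        (r + dr, c + dc)
        for r, c in alive
        for dr in (-1, 0, 1)
        for dc in (-1, 0, 1)
        if (dr, dc) != (0, 0)
    )
    return {cell for cell, n in neighbor_counts.items()
            if n == 3 or (n == 2 and cell in alive)}

glider = {(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)}
state = glider
for _ in range(4):
    state = life_step(state)
print(state == {(r + 1, c + 1) for r, c in glider})  # True: the glider traveled
```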

For a playful take on the differences between an emergent system and its underlying components I will briefly turn over to a passage excerpted from Douglas Hofstadter’s Gödel Escher Bach. Hofstadter has a unique voice which often includes language games and tricks such as reflections between dialogues and form or allegories with characters representing themes which serve to keep a reader on his toes. In this passage we join the four characters of Achilles, the Tortoise, the Anteater, and a Crab discussing an image found nested in a copy of the sheet music to Johann Sebastian Bach’s The Well-Tempered Clavier. The characters are trying to identify if there is an underlying theme to be found, some suggest we should focus on a reductionism to the individual components, others on the holism of the collection altogether. We find the Anteater argues for reductionism, the Crab argues for holism, Achilles goes nowhere fast, and the Tortoise settles on a mu.

## Principle of Computational Equivalence

Turning back to NKS, as we approach the final chapters we finally see all of the exploration of elementary cellular automata paying off via demonstration of new underlying principles applicable to broader complex systems. Wolfram proposes what he calls the Principle of Computational Equivalence. I will attempt to gloss it here but am sure that I will not do the significance of the concept justice.

The Principle of Computational Equivalence is a broad concept and a new way of looking at the world. In effect it is a way of stating that all processes are a form of computation. Further, the theory postulates that there is just one highest level of computational sophistication for natural systems, which is achieved by all processes with sufficient complexity. In other words, systems with sufficient complexity are capable of achieving Turing equivalence, or the ability to perform universal classical computations.

To demonstrate how this principle applies to even the most basic rules, let’s turn back to the space of elementary cellular automata. You may remember we discussed Rule 110 as an example of Class 4 behavior. Take a second look at the seeding with random inputs:

In the interaction of the various legs and protrusions we can start to imagine setting this system up to achieve a kind of intentional interaction between the various features. Of course we have no control over the progression of the system other than our initialization via the input row. Thus as the computational complexity of our desired calculation increases, we will need a correspondingly wider band of field in play. What is cool about this particular example is that Rule 110 has actually been proven to be universal; not only can Rule 110 simulate a Turing machine, it can hence simulate any other cellular automaton as well. To demonstrate, here is a further illustration from NKS, with a universal automaton simulating the Sierpinski Sieve of rule 90.

Take a second to think about the implications of any system of sufficient complexity having universality, and then consider the extent of these behaviors showing up in the natural world. Here is a seashell with a cellular-automata-style pattern, close to rule 30, on its shell. Does that mean this shell has Turing equivalence? The principle of computational equivalence implies that it does, although in limited fashion due to its narrow field. This is a silly example, but it is also illustrative of what it means when all complex systems can be considered capable of computation.

A more interesting example can be considered in a system we are all familiar with — the human brain. The theory of computational equivalence implies that our neurons’ emergent consciousness is performing computations of the same sophistication as a Turing-complete ant colony, albeit with a wider calculation space due to our billions of neurons and trillions of synapses. Not everyone would agree with this assertion of the brain as a classical computer. In The Emperor’s New Mind Roger Penrose explores the theory of mind as a computer, and concludes that there must be some kind of quantum effects at play to account for the global nature of consciousness; he even looks for specific neural structures in the brain that could account for them. So is our brain a Turing machine or a kind of quantum computer with coherence at room temperature? Just as any internet discussion board, if carried on long enough, will eventually turn to comparisons with Hitler and Nazis, it is perhaps the case that any discussion of complexity will eventually turn to theories of the brain and consciousness. Personally I don’t have a dog in this fight, but I found some wisdom again in the writings of Jorge Luis Borges. In his short story Funes, His Memory, Borges describes a character with photographic memory who sees and experiences all senses in complete fine-grained detail. His narrator then ruminates on the nature of consciousness, claiming that “To think is to ignore (or forget) differences, to generalize, to abstract.” I see in this statement a claim of perhaps a quantum coarse graining in what we experience as consciousness.

> In 1930, the Austrian philosopher Kurt Gödel proved that in any formal system of logic, which includes mathematics and a kind of idealized computer called a Turing machine, there are statements that cannot be proven either true or false… Gödel says you can’t program intelligence as complex as yourself. But you can let it evolve.
>
> — from Free Will: Now You Have It, Now You Don’t by Dennis Overbye

A fitness landscape, such as may be targeted by an optimization algorithm like those discussed in the preceding post, will likely have a degree of complexity about it, especially when you consider cases where it is not stationary or has feedback effects on your subject of optimization. In his book Fire in the Mind the author and columnist George Johnson uses the mountains and hills of New Mexico surrounding the Santa Fe Institute as both analogy and metaphor for fitness landscapes. An algorithm trying to optimize a function is like a Native American on a spirit journey, whose vision of the horizon is always obstructed by the surrounding hills and mountains, but who through random fluctuations in his path, such as in simulated annealing, increases his chances of scaling the right peak to find the wisdom and solace he seeks.
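As a toy sketch of that spirit journey (the landscape and all parameters here are my own invention, not from Johnson’s book), simulated annealing accepts occasional uphill moves early on, while the “temperature” is high, so the wanderer is not trapped behind the nearest hill:

```python
import math
import random

def landscape(x):
    # a rugged 1-D fitness landscape: a local valley near every integer,
    # with the global minimum of 0 at x = 0
    return x * x + 10.0 * (1.0 - math.cos(2.0 * math.pi * x))

def anneal(start, steps=20000, t0=10.0, seed=0):
    """Minimize `landscape` by simulated annealing from `start`."""
    rng = random.Random(seed)
    x, fx = start, landscape(start)
    best_x, best_f = x, fx
    for i in range(steps):
        t = t0 * (1.0 - i / steps) + 1e-12   # linear cooling schedule
        cand = x + rng.gauss(0.0, 0.5)       # a random step on the landscape
        fc = landscape(cand)
        # always accept improvements; accept uphill moves with a
        # probability that shrinks as the temperature cools
        if fc < fx or rng.random() < math.exp((fx - fc) / t):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x, fx
    return best_x, best_f

x_best, f_best = anneal(4.5)
print(f_best < landscape(4.5))  # True: the wanderer found a deeper valley
```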

For further readings please check out my Table of Contents, Book Recommendations, and Music Recommendations.

Books that were referenced here or otherwise inspired this post:

- A New Kind of Science — Stephen Wolfram
- Gödel Escher Bach — Douglas Hofstadter
- M.C. Escher — The Graphic Work
- The Annotated Turing — Alan Turing and Charles Petzold
- Kind of Blue — Miles Davis
- The Fractal Geometry of Nature — Benoit Mandelbrot
- At Home in the Universe — Stuart Kauffman
- Making Things Work — Yaneer Bar-Yam
- The Quark and the Jaguar — Murray Gell-Mann
- The Well-Tempered Clavier — Johann Sebastian Bach
- The Mind’s I — Douglas Hofstadter
- The Emperor’s New Mind — Roger Penrose
- Collected Fictions — Jorge Luis Borges
- Fire in the Mind — George Johnson
- Jazz Giants — K. Abé

*(As an Amazon Associate I earn from qualifying purchases.)*

Albums that were referenced here or otherwise inspired this post:

*I’m just an amateur blogger doing this for fun and hopefully to make a few new connections along the way. If I have violated any copyrights here I would be happy to edit just let me know. I can also be reached on twitter at @_NicT_.*
