Addison Maille
14 min read · Feb 29, 2024

Within the standard understanding of complex systems is the property known as emergence. At its core, emergence refers to the process by which new properties, patterns, or behaviors arise in a system that are not explicitly present in its individual components. Emergence cannot be predicted from studying the parts in isolation. Nothing that comes from studying a single ant would describe the emergent behavior of the ant colony. While we can often glean these emergent behaviors, patterns, and properties by studying similar systems, if that one component was all we had, we would be stymied from making an accurate prediction. But there is a fly in the ointment.

Embedded in the modern understanding of emergence from complex systems is the notion that the order that creates the emergent behavior comes, or can come, from randomness. Put another way, order can spontaneously arise from chaos. In fact, chaos destroys order rather than building it up. Chaos, aka randomness, is literally the antithesis of order. The more complex a system gets, the more ordered it must be. That order is seen in what we perceive as fine tuning.

In my article Why We Are Getting Worse At Understanding Complex Systems, I present what I call the law of fine tuning: the more complexity in a system, the more specialized one or more of its components, inputs, processes, and/or outputs must be to produce a functional system. If one or more of these first principles of systems isn't finely tuned, then the complexity necessary for emergence cannot be achieved.

If you are skeptical, let's look at economies to start with. Economies are artificial ecosystems that serve to increase economic activity, which we might better think of as the exchange of resources. Right off the bat we see that if a given economic system isn't finely tuned, it has difficulty growing. For two hunter-gatherer tribes to become economically intertwined, they must have something of value to trade with each other. If my resources are literally the same as yours, then we have no reason to trade and will likely not bother. But if I have something you don't, or something you have less of than you would like, and vice versa, then we trade and end up with more total economic activity, because each of us maximizes what we are better at producing than we would in isolation. This interaction creates a larger system with a degree of specialization on both sides that leads to greater economic complexity than if each tribe had kept to itself. This would be the first step in the economic fine tuning required for larger economies and civilizations.

The increase in fine tuning allowed for a more complex system. Each tribe started specializing, which led to an increase in complexity. We see this again and again, in system after system. Increasing the complexity of any given system requires an increase in fine tuning. The more economic activity we see in a given economy, the greater the degree of fine tuning of all the different professions, services, products, rules, regulations, norms, and on and on. As the economic system grows in complexity, so must its fine tuning. In a way, fine tuning is actually our response to emergence itself. Phenomena that we couldn't otherwise predict come about, and systems are put into place that are more finely tuned to deal with the emergent behaviors, patterns, and so on.

This same phenomenon can be seen in ecosystems, organisms, and even systems within organisms. As an organism gets more complex, its tissues and organ systems get more finely tuned. Even the cells get more complex and finely tuned as we go from prokaryotic single-celled organisms to eukaryotic multicellular systems. Whether the complexity is artificially created by humans, naturally created by biology, or seemingly written into the fabric of reality via physics and chemistry, the more complexity we find, the higher the degree of fine tuning required to make it work.

And just as this hypothesis would predict, the most complex systems we study, ranging from the freakishly complex supply chains that existed prior to the COVID-19 pandemic to the most intricate economies and ecosystems, are also the most vulnerable to even the slightest loss of that fine tuning. Tweak the system just a little and it starts rapidly breaking down. Tweak it a lot and it gets annihilated. Civil wars destroy economies, urbanization destroys ecosystems, and mass disruptions in resources and sea routes destroy complex supply chains. Any loss in fine tuning from random or chaotic events leads to a massive reduction in the complexity of a system.

Perhaps the finest and most irrefutable example of this is the fine tuning of the universe. Physics appears to be the most fundamental scientific realm we have a decent understanding of. When we look at the various laws of physics such as entropy and inertia, the fundamental forces such as the electromagnetic force and the strong and weak nuclear forces, along with gravity, the cosmological constant, and the rest of astrophysics, the amount of fine tuning required to create the known universe is beyond anything we've ever studied. Protons can't do the work of electrons and vice versa. Each particle and force is freakishly specialized, which creates a much higher degree of complexity.

This also applies to perhaps the most complex system that we all take for granted as being random: evolution. This has become so evident that even atheist evolutionary biologists have started calling for a new theory of evolution. The current theory of evolution has failed every concrete experiment we have thrown at it, because randomness doesn't create complexity. To understand this, we must get a first-principles definition of Neo-Darwinian Evolution.

The most precise definition of the modern theory of evolution is that complex biological systems can be developed through random iteration. To show how this breaks down, we need to take a much closer look at viruses, bacteria, and multicellular organisms. The nice thing about viruses and bacteria is that they reproduce and mutate fast enough that we can track their evolutionary change in real time, rather than having to guess at it via the fossil record. Here's what we see. Viruses mutate into more virulent strains much faster than bacteria, by orders of magnitude. This makes sense. Viruses have genomes that are typically between 2 and 200 genes in length. Equally telling is that the viruses with smaller genomes, such as influenza (10 genes), HIV (9 genes), and SARS-CoV-2 (29 genes), mutate at a fairly fast rate. But much larger viruses, like smallpox (200 genes), are very slow to mutate and appear to be very limited in their mutations as well.

Equally interesting is that the smallest viruses hardly mutate at all, because they can't change the systems they rely on to produce their progeny. Circoviruses typically have only two genes. We can think of these two genes like the three-pronged plug required for a toaster. If we have the entire toaster to work with, we can make quite a few changes to it without much difficulty. But since we can't mutate the outlet in the wall, the plug must stay more or less the same, save the ability to remove the grounding prong. Circoviruses are effectively viruses with genomes so small that they amount to little more than a plug that fits the outlet. If we change it by much at all, we lose the ability to plug it in.

In fact, the rate of change of smallpox was so slow that it allowed the world to use a vaccine and literally wipe out the virus from the wild. Because it couldn't evolve a way around the vaccine, smallpox, once one of the most deadly viruses circulating in the world, literally went extinct. Not only was the virus unable to evolve a way around a vaccine that had been in circulation for well over a century, but even the mutation rate it did have was still much slower than that of small viruses like COVID-19, which escaped the first attempt to vaccinate against it in less than a year.

However slow we may think the evolution of larger viruses like smallpox is, one look at bacteria tells an even slower tale. Bacteria have genomes that range in size from roughly 4,000 to 7,000 genes, with simpler species in the 4,000–5,000 gene range and more complex species in the 6,000–7,000 gene range. Not surprisingly, the mutation rate of the larger-genomed bacteria is slower than that of the smaller-genomed ones. Again, the need for fine tuning means they can't change as fast, because it's harder to hit a useful mutation via random iteration, aka chance. Let's get a better understanding of this by looking at actual timelines.

While certain viruses can evolve over weeks and months, bacteria evolve over years and sometimes even decades or centuries. Viruses can do this due to two factors. RNA viruses (the common cold, the flu, and COVID-19) mutate at a very high rate of roughly 1 in 10,000 to 1 in 1,000,000 replications. Bacteria typically mutate at rates that are 100,000 to 1,000,000 times less frequent than that. This completely cancels out the sometimes faster replication rates of certain bacteria like pneumococcus, which, along with E. coli, is among the most common bacterial infections encountered by humans. Both can replicate as fast as every 20 minutes under ideal conditions. This is far faster than viruses, which take several hours or longer to replicate due to their need to invade cells.
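The replication-versus-mutation tradeoff above can be sketched as a quick back-of-the-envelope calculation. All the figures (replication times and per-replication mutation rates) are ballpark assumptions in the spirit of the ranges quoted above, not measured values:

```python
# Per-lineage mutation supply per day: replications/day x mutations/replication.
# Every number here is an illustrative assumption, not a measurement.

virus_mutation_rate = 1e-5   # ~1 mutation per 100,000 replications
virus_reps_per_day = 4       # assume a ~6-hour viral replication cycle

bact_mutation_rate = 1e-10   # ~100,000x lower than the virus, per the text
bact_reps_per_day = 72       # one bacterial division every 20 minutes

virus_supply = virus_mutation_rate * virus_reps_per_day  # mutations/lineage/day
bact_supply = bact_mutation_rate * bact_reps_per_day

print(f"virus/bacteria mutation supply ratio: {virus_supply / bact_supply:,.0f}")
```

Even with bacteria dividing 18 times more often per day in this sketch, the far lower per-replication mutation rate leaves them thousands of times behind in raw mutation supply.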

Not only do viruses mutate at a much higher rate, but they typically reach a higher load in infected people. A person with a substantial viral infection will typically have billions to trillions of circulating viruses, while a substantial bacterial infection will typically involve millions to billions of bacteria in the host. While both viral and bacterial loads are difficult to measure, the trend is rather obvious: there are far more viruses than bacteria in people who are sick with either one.

Take COVID-19 as an example, which likely infected billions of people in 2021 and 2022. By contrast, pneumococcus, one of the most common bacterial infections in humans and responsible for several diseases like pneumonia and meningitis, only infects tens of millions of people each year. This adds roughly another two orders of magnitude more mutations in viruses than in bacteria infecting humans each and every year.

If you are keeping track, this means that the overall mutation rate, meaning the number of mutations generated each year by highly transmissible and highly mutable viruses like SARS-CoV-2, is roughly 10^11 times greater (an increase of about 100 billion-fold) than that of more mutable bacteria such as pneumococcus. Before anyone loses their shit over my back-of-the-envelope calculations, they are intended to show the general directionality of evolution as we go from simple biological systems, like viruses, to the next form of life on the evolutionary complexity scale, bacteria. So whether the real number is 100 billion or 10 million, that's still an awfully big increase in mutation rate. And, of course, this is just the raw mutation rate. The study of evolution is the study of the successful mutation rate, meaning mutations that confer a benefit to the host, over time.
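The product of those rough ratios can be checked in a few lines. The three ratios are assumptions lifted from the back-of-the-envelope figures above (per-replication mutation rate, load per host, and annual infections), not measurements:

```python
# Orders-of-magnitude product for viral vs. bacterial mutation output per year.
# All three ratios are ballpark assumptions taken from the discussion above.

mutation_rate_ratio = 10**6  # viral vs. bacterial mutations per replication
viral_load_ratio = 10**3     # circulating viruses vs. bacteria per sick host
infection_ratio = 10**2      # infections per year: billions vs. tens of millions

total_ratio = mutation_rate_ratio * viral_load_ratio * infection_ratio
print(f"viruses generate ~10^{len(str(total_ratio)) - 1}x more mutations per year")
```

Shifting any single input by a factor of 10 or 100 moves the final exponent by only one or two, which is why the conclusion survives the roughness of the inputs.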

When we get to multicellular organisms, even simple ones like the roundworm C. elegans with its 20,000 genes, there's no question why we can't actively study their evolutionary change through time. By every mathematical estimate made by biochemists, even what might seem like a modest change to a complex multicellular organism, like going from a three-chambered heart to a four-chambered heart, would take more time than the universe has existed. And ramping up from something as simple as a roundworm to the complexity of a human being is preposterously difficult. No valid hypothetical process has even been proposed that can actually trace how to build up that much complexity, even over the timespan of our universe. But not to worry, it gets far worse.

An equally large problem comes from the carnage that would be wrought in the process of evolving multicellular organisms. The vast, vast majority of organisms exposed to randomized iteration of their genome would die. As the fine tuning required for a beneficial change increases, the odds that a random mutation will make that change decrease. Not only do the odds decrease, but the change itself must become more complex to be beneficial. As fine tuning increases, the size of the change required to add a benefit, or even just maintain functionality, keeps getting larger as the system gets more complex. Let's look at a distinctly non-evolutionary model to better understand this aspect of fine tuning.

Looking at the alphabet, there are a total of 676 possible two-letter combinations, since 676 is the square of 26, which I will write as 26^2. Of those combinations, according to the most updated online English Scrabble dictionaries in 2023, 124 were valid two-letter words. That's about 18.3%, meaning a little less than 1 in 5 random pairings of letters will result in a real word. For simplicity's sake we'll call this a 1 in 5 chance of getting a word.

Now observe what happens to our chance of successfully spelling a word through random iteration as we increase the complexity from 2 letter combinations to 3, 4, 5… up through 9 letter words.

2 letter combinations, 1 in 5

3 letter combinations, 1 in 17

4 letter combinations, 1 in 114

5 letter combinations, 1 in 948

6 letter combinations, 1 in 13,423

7 letter combinations, 1 in 229,358

8 letter combinations, 1 in 2.6 million

9 letter combinations, 1 in 132.4 million
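These odds can be reproduced from word counts alone. Only the 2-letter count (124) comes from the Scrabble-dictionary figure above; the longer-word counts below are assumed, illustrative tallies, not real dictionary numbers:

```python
# Odds that a random letter string of length n is a valid word:
# (all possible strings) / (valid words of that length).
# Only the 2-letter count is from the article; the rest are assumed.
word_counts = {2: 124, 3: 1_065, 4: 4_030}

odds = {n: 26**n / count for n, count in word_counts.items()}
for n, o in odds.items():
    print(f"{n} letter combinations, 1 in {o:,.0f}")
```

The point of the table survives any reasonable choice of counts: the number of possible strings grows by a factor of 26 with every added letter, while the number of valid words grows far more slowly, so the odds collapse exponentially.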

This also gets worse and worse, because the similarity between the various 9-letter and larger words is practically nonexistent. I can easily change the word bad to dad, bid, sad, lad, and so on. A single letter can be changed to create a different word with its own definition. In other words, I can change a single letter and get different functionality out of the word. But if I want to try an evolutionary approach to spelling large 15-letter words, going from uncopyrightable to accomplishments effectively requires a complete rewriting of the entire word. This is exactly what happens as we try to make significant changes to any complex system. Large changes like going from gills to lungs or legs to wings require a massive change to the source code, one so large that small iterative changes wouldn't get us anywhere close.

Just as small iterative changes to 15-letter words would lead to utter gibberish, so too would small iterative changes kill the organism long before we had viable lungs or wings. There's no good way to do it. Even if we had the trillions upon quadrillions, even googols, of years (a googol is a 1 followed by 100 zeros) required to make these large changes through randomized iteration, the organisms would have died from the disruptions to their genomes long before then. Again, chaos and randomness destroy order and complexity; they never build it.

This is why even atheists such as Nobel Prize winner Sir Roger Penrose have stated that the universe does appear to have an extremely high degree of fine tuning. Regardless of whether you believe this is due to God, some other higher power, the multiverse, or mutant unicorns, when and where systems become more complex, they require a greater degree of fine tuning, which is, by all available evidence, the exact opposite of randomness. Functional complexity tells us that whatever we are looking at isn’t random.

So why does any of this matter? Who cares if this or that scientist attributes a given property of biology to adaptation, evolution, or those damn unicorns? The short answer is that how we think about and understand complex systems radically changes how we approach them. Just look at what AI is actually doing to our world at a structural level. AI is in the process of giving us the capacity to change highly complex systems, like human learning, at incredible scale and speed, without the usual understanding that comes with making such changes. Even through the Industrial Revolution, humanity was making changes to its world and the natural world at a pace that was at least moderately understood by humanity itself. It was our innovations and our minds that were engineering those changes. But AI stands to change all that.

To make large changes to complex systems, particularly changes that we don't understand and haven't tried at all, let alone at scale, is rather suicidal. If we magically added 300 horsepower to everyone's car tomorrow, the number of accidents would skyrocket beyond belief. Large complex systems, whether the flow of traffic or our complex supply chains, do not respond well to large unproven changes. And yet we are in a world where the name of the game is making faster and faster changes at ever-increasing scales. And while we can do this to computer systems, thanks to our ability to recreate and simulate them, we aren't anywhere near as good at doing this to reality.

We've long had political leaders who attempted to radically change complex systems in the form of revolutions, often altering governments to put into practice that which was more theoretical than practical. The bloody history of communism is a tragic reminder of this fallacy. Dictators have been playing this horrible game of making massive unproven changes to complex systems in the name of progress for centuries and even millennia. And whenever they decided to make massive changes at a chaotic pace, the systems usually broke down and mass suffering tended to ensue.

What usually saved those civilizations and peoples, if they were saved, was that enough people who had a clue as to what was going on survived to reimplement the old systems and their benefits. This is why it was often said that great powers could endure two or three bad rulers in a row, but rarely more. And if a string of bad rulers came about, which was often, revolution usually wasn't far behind. But now we've introduced the world's first technology that actually removes the need for human learning at all. See my article Will AI Destroy Learning to get a better sense of this.

Whenever a system has given a small group of people the power to make massive adjustments to our world at a scale and speed formerly impossible, with no improvement in humanity's understanding of those systems, disaster has almost always been the result. Chaos doesn't build up complex systems; it breaks them down. Here we all are, four years after the initial breakdown of supply chains, and we are still nowhere near back to the interconnected glory that was 2019 and those freakishly robust supply chains.

I believe that our willingness to chance these massive changes to complex systems, while reducing our understanding of those same systems, stems in no small part from our misunderstanding of the relationship between randomness and complexity. They are literally polar opposites: if we increase one, we get less of the other. Complexity is never random; it comes from a high degree of fine tuning. And that fine tuning requires deep understanding, whether that understanding is innate, in the case of nature, or learned, in the case of humanity. These are literally the only two choices we have for understanding complex systems. Either it's written into the code of the physics and/or biological organisms that keep the system running, or it's learned through the myriad specialties we use to keep our economies and infrastructure running. If we fail to understand this, as so many have throughout history, then we will end up like those who destroyed the very systems they relied upon.

Addison Maille

I am a learning enthusiast who is trying to improve humanity's understanding of how learning works.