The Future Is Chaos

Is there any difference between entropy and complexity?

By MARTIN REZNY

So I decided to read up on the difference between entropy and complexity, because I realized I wasn’t sure what exactly it is. What I ended up finding most interesting was a third concept that connects both: chaos. I found a brilliant article by Michel Baranger of MIT which explains it well to non-physicists, using as few difficult words and as little math as possible. If you want the science in detail, I strongly suggest reading it.

Good soundtrack to listen to when you’re thinking about this

Entropy Is a Brain Failure

What I’d like to write about are some implications of this that I can think of for the nature of the future. But in order to do that, I do have to attempt to briefly summarize what the difference between entropy and complexity actually is. Starting with entropy, what surprised me the most was learning that the 2nd law of thermodynamics actually refers to our own subjective limitations more than to anything that’s, strictly speaking, physical or natural.

Put as simply as I can manage, entropy is disorder. Think about it: what is order? How do you measure order? It turns out that a system counts as less ordered precisely when our ability to predict its future state decreases, an ability based on the information we store and process. Nothing about the material composition of the system changes over time, the total amount of energy is conserved (the 1st law of thermodynamics); it just slips out of our mental control.
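
To pin down that informational reading, here is the standard statistical definition of entropy (the textbook formula, nothing specific to Baranger’s paper):

S = -k_B \sum_i p_i \ln p_i

Here p_i is the probability we can assign to microstate i given the information we have, and k_B is Boltzmann’s constant. When all W microstates are equally likely, this reduces to Boltzmann’s famous S = k_B \ln W. The less our information narrows down which microstate the system is actually in, the higher the entropy, which is exactly the “lowered ability to predict” above.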

As Baranger puts it, a blob that represents the system in phase space, a map of all possible states of which the system occupies only one at a time, turns over time from a manageable blob into fractal insanity. Even though the area of the fractal, its phase-space volume, remains exactly the same as that of the initial blob, which corresponds to a constant physical amount of entropy, at some point we have to draw a larger blob around that fractal just to be able to deal with it in any way, and that larger blob is where the increase in entropy comes from.
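
Baranger shows this with pictures; since I can’t redraw them here, here is a minimal sketch of my own in Python (the Arnold cat map is just a standard area-preserving chaotic map, not the example from the paper). A tiny square blob of points keeps its true area at every step, yet the smallest simple blob you can draw around it quickly grows to cover almost the whole of phase space.

```python
# A minimal sketch (not from Baranger's paper): Arnold's cat map is a
# standard area-preserving chaotic map on the unit square. The blob's
# true area never changes, yet after a few iterations it is stretched
# into filaments, and the smallest axis-aligned box you can draw around
# it covers almost the whole square -- the coarse-grained "larger blob".

def cat_map(x, y):
    """One step of Arnold's cat map (area-preserving, chaotic)."""
    return (2 * x + y) % 1.0, (x + y) % 1.0

# Start with a 0.01 x 0.01 blob of points in one corner of phase space.
n = 50
blob = [(i / n * 0.01, j / n * 0.01) for i in range(n) for j in range(n)]

for step in range(8):
    xs = [p[0] for p in blob]
    ys = [p[1] for p in blob]
    box = (max(xs) - min(xs)) * (max(ys) - min(ys))
    print(f"step {step}: bounding box covers {box:.4f} of the unit square")
    blob = [cat_map(x, y) for x, y in blob]
```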

To me, the real culprit to be concerned about here is chaos itself, the driving force of this process. What that means is that most natural systems evolve in a non-linear fashion and tend to be very sensitive to small differences in initial conditions. Even the slightest difference between two starting points of something like a weather system leads very quickly to exponentially diverging paths that those systems actually end up taking, the proverbial butterfly effect.
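
To see the butterfly effect in actual numbers, here is a minimal sketch using the Lorenz ’63 system, the classic toy model of atmospheric convection (my example, not Baranger’s). Two trajectories start a billionth apart and part ways completely within a few dozen time units.

```python
# A minimal sketch of the butterfly effect (my example, not from Baranger's
# paper): the Lorenz '63 system, a famous toy model of atmospheric
# convection. Two trajectories start a billionth apart in one coordinate
# and end up on completely different paths within a few dozen time units.
from math import sqrt

def lorenz_step(state, dt=0.001, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz system by one small Euler step."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return (x + dx * dt, y + dy * dt, z + dz * dt)

a = (1.0, 1.0, 1.0)          # first trajectory
b = (1.0 + 1e-9, 1.0, 1.0)   # second trajectory, nudged by one billionth

for step in range(40001):
    if step % 5000 == 0:
        dist = sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))
        print(f"t = {step * 0.001:5.1f}  separation = {dist:.3e}")
    a = lorenz_step(a)
    b = lorenz_step(b)
```

Making the initial nudge smaller doesn’t really help either: the separation still blows up, it just takes logarithmically longer to do so, which is roughly why weather forecasts gain so little from ever more precise measurements.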

Butterfly effect, bitches!

Complexity Is Surprisingly Complex

Based on this point of view, what entropy means is that we’re destined to suck at predicting the future. Before I read Baranger’s paper, I would have added “of complex systems”, but it turns out there’s a lot more nuance there. Firstly, simple systems can and often do behave chaotically too, even if you’re just mapping the motion trajectory of a single object, or something as simple as that. But even more surprisingly to me, complex systems, poorly understood as they are, are better characterized as existing at the edge of chaos, which is super cool.
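
As for that first point about simple systems, here is about the simplest example I can think of (again mine, not from the paper): the logistic map, a single line of arithmetic, is fully chaotic at r = 4.

```python
# A dead-simple system behaving chaotically (my illustration, not from
# Baranger's paper): the logistic map x -> r * x * (1 - x). One
# multiplication and one subtraction per step, yet at r = 4 two starting
# points that differ in the tenth decimal place soon disagree completely.
r = 4.0
x, y = 0.2, 0.2 + 1e-10

for step in range(60):
    if step % 10 == 0:
        print(f"step {step:2d}: x = {x:.6f}  y = {y:.6f}  |x - y| = {abs(x - y):.2e}")
    x = r * x * (1 - x)
    y = r * y * (1 - y)
```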

If there’s anything to be said with any certainty about what complexity is, it would be that it is complex, not just one thing. It’s not just ordered, just as it is not just disordered. It’s many interdependent constituent parts, structured on multiple scales, with behaviors emerging on some of those scales without being explainable by any specific interplay of the individual parts on any single level. Complexity is cooperation and competition at once.

If you allow me one more reference to entropy, systems are only doomed to become more and more disordered over time if nothing else acts upon them from the outside, that is, only if they’re isolated systems. By the very nature of complexity, systems among and within systems, the normal state of complex systems is exactly this constant interaction. It sounds to me more and more like complex systems can actually do whatever the hell they want.

To put it again as simply as I can manage, complexity is life. It’s in fact best defined in reference to biological systems. The reason we’re much better able to predict celestial mechanics than human behavior is that celestial mechanics is vastly less complex than what’s happening in your brain right now. As for the connection to the future, the complexity of life keeps creating ever more disorder through its escalating ordering of stuff on ever larger scales.

Somehow, a search for “entropy” literally yielded apocalyptic imagery from Revelation: The Great Day of His Wrath by John Martin

The Future Is Chaos

This to me opens up so many more questions. Is information real? I mean, as real as energy or matter? If it isn’t, if it’s just an artifact of math or some subjective property that only makes sense to our minds, is entropy a real thing at all? And assuming it isn’t, why does the arrow of time arbitrarily aim in its direction? Is it aimed in the direction of chaos, then? But why? Chaos itself is a math thing, or is it? It just describes the behavior of one type of function.

If I try to take a step back from pure physics or math artistry, what it sounds like to me is simply that the further into the future you try to project your idea, the less chance you have of being sure that it will in fact become reality. As time stretches out away from us, there are exponentially more and more possibilities for what the state of reality can be. From this point of view, the attempts at predicting the end of the universe feel laughable.

How can you believe you know what the whole universe will be like in trillions of years, when you can’t predict next month’s weather on almost any scale? Sure, it should be easier on larger scales for probabilistic, statistical reasons, the law of averages and all, but let’s not kid ourselves. We can’t really predict climate either. The current ideas amount to something like imagining an egg getting fried, but what if the future state is not just extremely chaotic, but also insanely complex?

We have no idea why time began, and to me, that sounds like we must have exactly no idea about how it could end. Remember that we don’t know why it’s running in the direction it is, let alone why it’s running at all. Also, if life is a form of rising complexity, it sounds like a rising complication for any predictability of anything, because once it grows big or advanced enough to manipulate celestial objects or even spacetime, not even entropy is assured.

You could demonstrate mathematically that, on the whole, entropy would be increasing overall in larger and larger blobs of the phase space of the universe, but what if the universe itself, or the grand total of reality (let’s not discriminate against multiverse theory), is in fact an infinite fractal, and there’s always a larger frame from which the entire domain of life-space would be acted upon from outside itself? In any part, there could always be order.

Classic “Time Elemental” MtG card illustration by Douglas Shuler, also possibly the future of life

I used to see entropy as ensured doom, but now it actually seems more like a proof that no doom is ever ensurable, given enough time. I think that’s a good note to end this on. Let me know if you’re still wondering about something.

Like what you read? Subscribe to my publication, heart, follow, or…

Make me happy and throw something into my tip jar
