Cosmogony

Bruno Monteiro
Published in Synesism
Oct 16, 2016

How, then, did our physical world come to be in light of these ideas?

The set of all things with which we have some sort of direct contact and that possess measurable properties (i.e., are physical) is called the universe, or more usually the observable universe. Our observable universe is, to the best of our knowledge, a system in the synesistic sense, which means its semonic value must remain constant — or at least approximately constant — over time. It does, however, undergo several phase changes throughout its life cycle, which translates into a great deal of partitioning and regrouping of its constituent elements.

It all started with the Big Bang. The initial stage of the universe was the singularity, when all matter and energy, as well as time and space themselves, were combined in a plasma of effectively infinite density and temperature, a trillionth-trillionth-trillionth of a meter wide. That was the primordial semon, or synod, of everything we see today — but it didn’t last long. An infinitesimal fraction of a second after coming into existence, it expanded violently in what’s known as inflation, and in the process that original semon became two: energy and spacetime. Spacetime grew by several orders of magnitude, diluting the energetic contents of the universe — which in turn were divided into nearly equal amounts of particles and antiparticles, each produced in gigantic numbers and governed by the Standard Model formalism. Those immediately annihilated one another, releasing troves of radiation as a result; thanks to a small imbalance in their proportions, though, the leftover particles of matter (an estimated 10⁸⁰!) populated the infant universe with what would later become all its stars and galaxies.

Both the particles and antiparticles created in the early universe came in two families of three generations each, with their corresponding force carriers. As far as we know, that’s pretty much the end of the story (though we intentionally glossed over some ‘details’, like dark energy and dark matter; in any case, the events described here have of course been grossly simplified), but there are indications it might not be so. Theoretical physicists speculate that there is structure at much smaller scales than those of the so-called fundamental particles. A lower limit exists, however, beyond which the very notion of distance loses its meaning and our familiar physics starts to break down. It’s known as the Planck scale (approximately 10⁻³⁵ meters), and it’s believed to be the very conversion factor between spacetime and matter — the realm of quantum gravity. If that’s true, then every Planck unit is itself a special sort of semon — an anad. Now, if you recall, an anad is a semon ‘locked’ as such, and therefore incapable of partitioning any further; hence, anads are the true atoms of nature, and the smallest possible elements of analysis or composition (indeed, they bear the most resemblance to Leibniz’s original formulation of a semon). A rough estimate puts the number of anads in our observable universe at 10²¹⁸ — an absurdly huge and yet finite number. No subsystem of the Worldtree is properly infinite, because that would imply a cardinality bigger than that of the Worldtree itself (infinity to the power of infinity yields a bigger infinity), and that, conversely, is why every system (= universe) must have anads as its most fundamental constituents.

Since the universe is a closed system, it’s reasonable to say that the number — and hence the overall properties — of its anads, as well as their overall semonic value, has remained the same ever since its inception, and all evidence suggests that will remain the case until its final demise, whatever and whenever that may be.

On the theme of the universe’s structure, let’s now consider the subject of entropy. Arguably one of the most mysterious concepts in science, it relates directly to those of time, information and thermodynamics. Entropy is, roughly, a quantification of the “messiness” in which a system is found; the basic idea is that entropy is at its maximum when all outcomes are equally likely. In practical terms, the simpler the system, the less information you’ll need to describe it, the more usable energy you’ll have available, and the less entropy it will have. The canonical example is that of an egg: it may come in all sorts of shapes and sizes, but overall it’s an incredibly ordered (note that the term is employed here in its commonsensical meaning, not the Worldtree-related one) and simple system — all the yolk goes in the middle, all the calcium goes in the shell, which in turn is found only at the surface, and so on — that is, until you decide to break it. It turns out there’s a practically endless number of configurations a broken (i.e., orderless) egg can take, and it’s easy to see why. Just think of the digits 0 to 9: how many ways are there to arrange them in a way that preserves their order — from lower to higher? Just one. But how many ways are there to arrange them if you forgo any order requirement? That’s 10 factorial!

What that tells us is that all the interesting stuff starts happening the moment we begin to break the rules or loosen the requirements; the arrow of time then becomes simply the route taken from the simpler configuration of a system (a semon) to one of its multiple variants (a higher-order semon). What drives change, and consequently time, is then the very process of semon partitioning. We can extrapolate from that thought and say that the end of the universe will come when it reaches a state where the only remaining sema are anads, becoming once again ‘simple’ (though at the other end of the spectrum this time) and, in the absence of any other repulsive force, collapsing back into a single semon. As above, so below.
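To make the counting at the heart of that argument concrete, here is a quick Python sketch of the digits example from a few lines back (a toy illustration, nothing more):

```python
from itertools import permutations
from math import factorial

digits = list(range(10))  # the digits 0 through 9

# Exactly one arrangement respects the "lower to higher" rule...
ordered = sum(1 for p in permutations(digits) if list(p) == sorted(p))
print(ordered)        # 1

# ...while dropping the ordering requirement leaves 10! possibilities.
print(factorial(10))  # 3628800
```

Loosening a single constraint takes you from one admissible arrangement to over three and a half million of them, which is the whole intuition behind entropy as a count of possible configurations.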

But let’s not get too far ahead of ourselves. Back to entropy: you must have realized it is proportional to the informational content of the system. Since an ordered system is simpler and thus demands less information to be described (1000000000000000 is actually much simpler than 70518 because, despite being a bigger number, its formation rule is extremely simple [just write 10¹⁵], whereas 70518 demands a bigger ‘recipe’ to be made — see Kolmogorov complexity), its entropy will naturally be lower. On the other hand, the second law of thermodynamics states that the entropy of a closed system never decreases, which explains why we don’t see time moving backwards but also, ultimately, allows complex structures — like stars, heavy elements and life forms such as ourselves — to arise. Human beings are extremely complicated creatures, which means we require an awful lot of information to be made. Information, in turn, is nothing more than a measure of deviance. Consider our numeric example: what essentially makes 10¹⁵ a less information-intensive number than 70518 is that it has less diversity — while the former is a 1 followed by 15 repeated zeros and is built from just two prime factors (2 and 5), the latter requires a nontrivial factorization into 5 different primes and results in a rather unremarkable number. The primes are the atoms of arithmetic, precisely because their representation is irreducible (meaning they can’t be decomposed into factors other than themselves and 1), which also makes them the most natural building blocks for the entire number system.
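To see the factorization contrast spelled out, here is a small trial-division sketch in Python (the prime_factors helper is written just for this comparison, not a reference to any standard library):

```python
def prime_factors(n):
    """Return the prime factorization of n as a {prime: exponent} dict."""
    factors, d = {}, 2
    while d * d <= n:
        while n % d == 0:          # divide out each prime factor completely
            factors[d] = factors.get(d, 0) + 1
            n //= d
        d += 1
    if n > 1:                      # whatever remains is itself prime
        factors[n] = factors.get(n, 0) + 1
    return factors

print(prime_factors(10**15))  # {2: 15, 5: 15} -- two primes, a very short 'recipe'
print(prime_factors(70518))   # {2: 1, 3: 1, 7: 1, 23: 1, 73: 1} -- five distinct primes
```

The first number collapses into a compact rule built from just two primes, while the second needs five unrelated ones; that is the sense in which it carries more information.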

Another way of formalizing this ‘less diversity’, or invariance, aspect of the natural world is through the concept of symmetry. In base-10 notation, the number 1000000000000000 obviously has a lot of symmetry; to arrive at it, you simply multiply 10 by itself fifteen times. That’s because almost all its place values are the same in that base, so you could swap one of those digits for another and nothing would change. Of course, that’s only one rather simplistic example of the phenomenon, but the fact is that symmetry (and asymmetry) comes in several varieties, and indeed you can model anything using it as a language. To have asymmetry, though, you first need an underlying symmetry to be broken, the same way you must first have an ordered state to be able to achieve a disordered, higher-entropy one. Symmetry is most importantly manifest in the form of the natural laws that govern our physical world. It has been known for over 100 years, thanks to the groundbreaking work of mathematician Emmy Noether (after whom Noether’s Theorem is named), that every differentiable symmetry of the action of a physical system has a corresponding conservation law. This was an important discovery not only because it made calculations of conserved quantities in physical systems much easier, but mainly because it provided valuable insight into the nature and character of physical law, once held to be a nearly supernatural or given feature of our universe. It also allowed us to push theoretical considerations further than ever before; if quantum gravity researchers are able to pry into the very-high-energy domain where a unified theory may live, that’s only because of symmetry’s incredibly ubiquitous and scalable quality. If mathematics is the language of nature, symmetry is the alphabet in which it’s written.
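For the mathematically inclined, Noether’s result can be stated compactly. The one-dimensional form below is a standard textbook simplification, included here only as a pointer, not as anything specific to Synesism:

```latex
% If the Lagrangian L(q, \dot{q}, t) is left unchanged by a continuous
% transformation q \to q + \epsilon\,\delta q (a differentiable symmetry
% of the action), then the quantity
Q = \frac{\partial L}{\partial \dot{q}}\,\delta q
% is conserved along every physical trajectory:
\frac{\mathrm{d}Q}{\mathrm{d}t} = 0
```

Time-translation symmetry yields conservation of energy, spatial translation yields momentum, and rotational symmetry yields angular momentum; that is the precise sense in which every conservation law ‘comes from’ a symmetry.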

These three main ideas — entropy, information and symmetry — are very intimately linked, and their relation becomes even more explicit once we consider it through the lens of Synesism. As a matter of fact, they can be represented very straightforwardly as follows:

Information ∝ Entropy ∝ Asymmetry

That reads as “information is proportional to entropy, which is proportional to asymmetry”. All these refer to some nontrivial property of a system while, conversely,

Redundancy ∝ Syntropy ∝ Symmetry

refers to some trivial one (remember our exploration of sema).

In this balance between “order” and “chaos”, ones and zeros, bloom the composite beings that populate our universe — and all others. All structure arises from the criticality (see definition) of symmetric and antisymmetric forces in our cosmos; from the most middling atom to the most grandiose galaxy cluster and the most intricate lifeforms, it all boils down to something like Conway’s Game of Life playing out over epic scales: a checkerboard of tiny anads clumping together, tearing apart and moving around the table. In its fundamentals, that is not so different from the atomic theory Democritus first proposed over two millennia ago.

Conway’s Game of Life — the universe on the sub-Planck scale
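For readers who have never played with it, here is a minimal Game of Life sketch in Python (a toy on a small wrapping grid, purely illustrative):

```python
def step(grid):
    """Advance a 2D grid of 0s and 1s by one Game of Life generation."""
    rows, cols = len(grid), len(grid[0])

    def live_neighbors(r, c):
        # Count live cells among the 8 neighbors, wrapping around the edges.
        return sum(grid[(r + dr) % rows][(c + dc) % cols]
                   for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                   if (dr, dc) != (0, 0))

    # A live cell survives with 2 or 3 neighbors; a dead cell is born with exactly 3.
    return [[1 if (grid[r][c] and live_neighbors(r, c) in (2, 3))
                  or (not grid[r][c] and live_neighbors(r, c) == 3) else 0
             for c in range(cols)]
            for r in range(rows)]

# Seed an 8x8 board with a glider and watch it crawl across the grid.
grid = [[0] * 8 for _ in range(8)]
for r, c in [(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)]:
    grid[r][c] = 1
for _ in range(4):
    grid = step(grid)
    print("\n".join("".join("#" if cell else "." for cell in row) for row in grid))
    print()
```

The analogy in the post only goes this far: extremely simple local rules on a discrete lattice can generate enormously rich, ever-shifting structure.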
