Kunal Shah’s ‘Entropic complexity’ tweets explained
I was recently revisiting statistical mechanics to understand social media trends and came across a tweet by Kunal Shah. While most people assumed it was about thermodynamics, it is actually about statistical mechanics.
If you are not a tech or science geek, you can take "Entropy is a measure of randomness or disorder" at face value and jump directly to the 'Explaining the tweets' section, skipping the thermodynamics and physics.
Thermodynamics and Statistical Mechanics
Thermodynamic Entropy: the amount of heat lost when energy is transformed into work. Heat loss implies disorder.
The second law of Thermodynamics
The total entropy of any isolated thermodynamic system tends to increase over time, approaching a maximum value
This is still easy to understand with class IX physics knowledge, and most people are aware of this version of entropy.
Statistical Mechanics: entropy measures the number of microstates that lead to a macrostate. More microstates means more disorder.
What are microstates and macrostates? Let’s flip a coin
This figure shows the relationship between microstates and macrostates when flipping two coins. "H" refers to the head and "T" refers to the tail of the coin. Individual sequences like HH or HT are microstates, while (H, T), (H, H) and (T, T) are macrostates. All microstates are equally probable, but macrostate (H, T) is twice as probable as macrostates (H, H) and (T, T), because two microstates (HT and TH) lead to it.
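The coin-flip counting above can be sketched in a few lines of Python. This is a minimal illustration of the idea, not anything from the original tweets: it enumerates every microstate of two flips, groups them into macrostates by ignoring order, and divides to get each macrostate's probability.

```python
from itertools import product
from collections import Counter

# Every ordered sequence of two coin flips is a microstate.
microstates = list(product("HT", repeat=2))  # HH, HT, TH, TT

# A macrostate ignores order, so we sort each sequence before counting:
# HT and TH both collapse into the macrostate "HT".
macrostates = Counter("".join(sorted(m)) for m in microstates)

# All microstates are equally probable, so a macrostate's probability
# is its microstate count divided by the total number of microstates.
for macro, count in sorted(macrostates.items()):
    print(macro, count, count / len(microstates))
```

Running this prints a count of 1 (probability 0.25) for HH and TT, but 2 (probability 0.5) for HT, matching the claim that the mixed macrostate is twice as probable.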
The second law of thermodynamics (statistical mechanics version)
In an isolated system, the system will always progress to a macrostate that corresponds to the maximum number of microstates.
📌 The entropy S of a macrostate is k times the natural logarithm of the number W of microstates corresponding to that macrostate: S = k ln W
k is called Boltzmann's constant.
The more microstates that give rise to a macrostate, the more probable that macrostate is; thus high entropy means high probability.
In an isolated system, the system will tend to progress to the most probable macrostate.
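The Boltzmann formula S = k ln W can be applied directly to the two-coin example. The sketch below uses the standard CODATA value of Boltzmann's constant; the function name `entropy` is just an illustrative choice:

```python
import math

# Boltzmann's constant in joules per kelvin (CODATA value).
k = 1.380649e-23

def entropy(W):
    """Boltzmann entropy S = k * ln(W) for a macrostate with W microstates."""
    return k * math.log(W)

# Two-coin example: (H, H) and (T, T) each have W = 1, (H, T) has W = 2.
print(entropy(1))  # S = 0: a single-microstate macrostate has zero entropy
print(entropy(2))  # S > 0: the more probable (H, T) macrostate has higher entropy
```

Since ln is monotonic, the macrostate with the most microstates always has the highest entropy, which is exactly why the most probable macrostate and the maximum-entropy macrostate coincide.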
Explaining the tweets
Any entropic complexity will flow from a less probable state to a more probable state, thereby increasing the entropy following the statistical version of the second law of thermodynamics.
In the case of market cap, probability increases for efficient companies with moats and data. But this position cannot be retained unless efficiency is continuously improved: market cap will keep flowing towards higher efficiency, and thus higher probability, increasing the entropy. To retain or grow market cap, companies have to continuously increase their probability, which, as mentioned, comes from asymmetric moats and data. That asymmetry won't last, though, as competitors start doing the same, so companies will have to continuously evolve.
The same is the case with wealth, where probability is higher for more efficient people due to skills and intellect, but it cannot be retained for long unless efficiency is continuously improved.
Left alone, entropy will keep increasing, moving the system from a more ordered state to a more disordered one; work needs to be done to keep things in order.
So which is correct: does entropic complexity spontaneously move to more efficient systems, or does work have to be done to counter the ever-increasing entropy?
Both are correct. Kunal has considered money/wealth as the entropic complexity, with the Earth as the system: wealth will keep moving towards the most probable state.
Akshay has considered the company as the system, where entropy (disorder) will keep increasing, and to keep that in check, work needs to be done to maintain order, which is what leads to asymmetric moats and data.