Information: The Language of Nature and Markets

Sami Al-Suwailem
Published in Gödelian Letters
Nov 25, 2023

The Universe is the ultimate information system. The capacity to communicate is built into every object, from electrons to cells to stars. But information is not possible without order. Order imposes a limit on the speed of information. This leads to the surprising emergence of time, money, and markets.

Image produced by DALL-E-3

“Eyes speak all languages,” wrote Ralph Waldo Emerson (1803–1882). We communicate not only verbally but also nonverbally, and nonverbal signals may be more important in our communication than words. We are not alone in communicating nonverbally; everything around us does.

The systematic analysis of the content of signals, verbal or non-verbal, is the subject of modern theories of information. At root, notes Benjamin Schumacher (2015, p. 239), information is the invariant quality shared by all kinds of signals.

Accordingly, information can be viewed as the language of the universe. It is the universal content carried by signals sent and received every moment by trillions of atoms, molecules, cells, rocks, planets, and stars. We know about the existence of an object from its signals. “It from bit,” John Wheeler conjectured decades ago (Siegfried, 2000, p. 245).

Gregory Chaitin (2023, pp. 41–42) argues that information theory provides an epistemological unification of mathematics and science. The same can be said of social sciences, particularly economics, as we shall see.

In this article, we examine how the very basic structure of information may have profound implications for understanding not only natural phenomena but also market dynamics.

What is Information?

There are many ways to address the question “What is information?” At a very basic level, however, we may describe a piece of information as:

A set of ordered symbols.

Consider the words “eat,” “ate,” and “tea.” They all have exactly the same letters. The only difference is their order. Order creates meaning and thus conveys information.

As another example, consider binary numbers.

The binary representation “1011” encodes the number eleven, “1101” encodes thirteen, and “1110” encodes fourteen. These binary numbers (1011, 1101, and 1110) have the same number of 1s and 0s, but they encode different quantities simply because of order.
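As a quick check of that arithmetic, here is a minimal Python sketch (my illustration, not from the original article): the same multiset of bits yields different numbers purely because of their order.

```python
# Same bits, different order, different numbers: order carries the information.
for bits in ("1011", "1101", "1110"):
    print(bits, "->", int(bits, 2))   # 11, 13, 14
```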

We notice that the symbols in each word or binary number are distinct, separated in both space and time. They are clearly separated in space (on the page or the screen), but they are also separated in time: each word or number must be read from left to right. The direction of reading is crucial in determining the information content of a set of symbols.

“Symbols” in our definition include signals like light or electromagnetic waves. As Shapiro and Varian (1999) point out, anything that can be encoded in bits (0 or 1) is information. Bits are symbols, and hence, signals are included in our definition.

According to Shannon’s theory of information (Stone, 2015), a bit is the amount of information required to choose between two equally probable alternatives. A binary number of, say, 4 digits provides the information needed to choose among 2⁴ = 16 possible alternatives. These 16 alternatives could be the possible answers to 4 yes/no questions (yes = 1, no = 0). However, the questions must be properly ordered; otherwise, the answers may not be informative. Again, without order, there would be no information.
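The counting can be made explicit with a short sketch (an illustration under the article’s definitions; the variable names are mine):

```python
import math

n_questions = 4                        # four properly ordered yes/no questions
alternatives = 2 ** n_questions        # 16 equally probable alternatives
bits_needed = math.log2(alternatives)  # one bit per yes/no answer

print(alternatives, bits_needed)       # 16 4.0
```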

Order is necessary (though not sufficient) for any form of information to be communicated. We tend to overlook this basic fact, but no mechanical or algorithmic procedure to encode or communicate information can work without proper order.

Logical Limit of Information Speed

The communication of ordered symbols requires that the speed of transmission be finite. There must be a maximum limit to the speed of information. Why?

Suppose it were possible to transmit information at an infinite speed. In that case, all symbols would be transmitted instantaneously, and it would not be possible to tell which symbol comes before which. At excessively high speeds, bits become more and more indistinguishable. At an infinite speed, order would be lost. If order is lost, information is lost.

To be preserved, therefore, information must travel at a finite speed. A speed limit is necessary to maintain order and, thus, to maintain information.

With an infinite speed, order will be lost. If order is lost, information is lost

The speed limit of information can explain the speed limit of signals in the universe, or the cosmic speed limit: ~300,000 km/s. This constant, as David Mermin (2005, p. 29) remarks, is “built into the very nature of space and time.” The cosmic speed limit is an upper bound on all signals in the universe. It is a universal constant that enters almost all laws of nature, most notably causality (Giulini, 2005). Causality, in turn, is crucial for the existence of physical objects. This lends support to Wheeler’s hypothesis, “It from bit.”

A speed limit can be viewed as a conservation law of information. According to Leonard Susskind (2013, p. 9), the conservation of information is “the most fundamental law of all physics.” The scarcity of information, as we shall see, is consistent with its conservation.

Time, Space, and Uncertainty

Probably the most significant implication of the speed limit of signals is the emergence of time.

John Wheeler (1990, p. 10) remarked: “Time is nature’s way to keep everything from happening all at once.” If signals traveled instantaneously, events would unfold instantaneously as well. Everything would happen all at once. Without a speed limit, there would be no time.

Speed is the ratio of distance to time. So, if the speed of light is absolute, then time and distance cannot be absolute. Time and space must adjust in an exactly compensating manner so that the speed of light is constant for all observers. This shows that it is the speed of information that determines time and space, not the other way around.
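The compensation can be written out with the standard formulas of special relativity (textbook results quoted here for illustration): time intervals measured for a clock moving at speed v dilate, while lengths along the motion contract by the same factor, so every observer still measures the same c for light.

$$\gamma = \frac{1}{\sqrt{1 - v^2/c^2}}, \qquad \Delta t' = \gamma\, \Delta t, \qquad L' = \frac{L}{\gamma}$$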

The speed limit of information determines time and space, not the other way around

With time comes uncertainty. The fact that information cannot be instantaneous means that there is an unavoidable lag between physical events. This delay introduces an impenetrable uncertainty about incoming information.

Time Asymmetry and Irreversibility

Time has a direction. If the speed limit of information entails the emergence of time, it must have something to do with time asymmetry.

We are familiar with water waves created by throwing a stone in a still pond, but we never see waves spontaneously form at the edge of the pond and then move toward the center of the pond. Similarly, we don’t see a broken egg spontaneously un-break and reorganize itself into a whole egg.

The laws of classical mechanics are intrinsically time-symmetric; they do not distinguish going forward in time from going backward. But the real world is obviously time-asymmetric. This has perplexed scientists for quite some time. “It’s a bit embarrassing,” writes Sean Carroll (2010, p. 43), “that with all of the progress made by modern physics and cosmology, we still don’t have a final answer for why the universe exhibits such a profound asymmetry in time.”

Source: pixabay.com.

The speed limit of information might shed some light on this dilemma. To un-break a broken egg, the millions of molecules scattered all over the place would have to coordinate to reunite. Similarly, for waves to arise at the edge of the pond and move progressively toward the center, the millions of water molecules spread across the pond would have to coordinate to reverse the original wave pattern.

Such coordination, however, requires instant communication between the millions of scattered molecules, which violates the speed limit of information. The same argument applies to dissipative processes in general.

Irreversibility, it seems, is closely linked to the speed limit of information. The speed limit provides perspective on how time asymmetry can arise from time-symmetric laws.

Incompleteness of Information

The finite speed limit of information means that only a certain amount of information can be transmitted within a given period. What if the speed limit were 1% higher? More information could then be transmitted within the same period.

Suppose the speed limit of information is c. At this speed, an amount of information m is transmitted within a given period t. Now, there are infinitely many finite numbers larger than c; each of these numbers could have been a speed limit. At each of these possible higher limits, more information could have potentially been transmitted within t than m.

This means that, for a given speed limit c, far more information is potentially truncated during the period t. In other words, within t, a given amount of information m can be transmitted, but a potentially infinite amount cannot be transmitted because of the speed limit.
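In symbols (a restatement of the paragraph’s counting argument, using its own notation): the amount transmissible in a period t scales with the speed limit, so any higher admissible limit would have carried more,

$$m \propto c\, t, \qquad m' = \frac{c'}{c}\, m > m \quad \text{for every admissible } c' > c,$$

and since there are infinitely many finite c′ larger than c, the information excluded by the actual limit is potentially unbounded.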

The incompleteness of information emerges from the necessity to preserve information

This implies that there is a sort of information incompleteness: Information that is impossible to transmit, not because of technological limits or frictions, but due to the very nature of information.

This incompleteness emerges from the necessity to preserve information by preserving the order of symbols or bits. Conservation of order, it seems, has a price: a potentially infinite amount of information must be truncated for the finite amount to be preserved.

Information Medium

One important consequence of the speed limit of information is that the speed becomes sensitive to the medium of transmission.

If the speed were infinite, it would not matter in what medium the information travels: dividing infinity by 2 still yields infinity. Dividing a finite quantity by 2, however, yields half of that quantity.

For this reason, the speed of light is sensitive to the medium in which it travels. The maximum speed holds in a vacuum; the speed of light is lower in water, glass, or other media.
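Quantitatively (standard optics, added here for illustration), the speed in a medium is the vacuum limit divided by the medium’s refractive index n:

$$v = \frac{c}{n}: \qquad v_{\text{water}} \approx \frac{300{,}000}{1.33} \approx 225{,}000 \ \text{km/s}, \qquad v_{\text{glass}} \approx \frac{300{,}000}{1.5} \approx 200{,}000 \ \text{km/s}$$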

Economic Information

Just as the maximum speed of light differs across media, the speed limit of information differs across domains.

In the economic domain, information will naturally have a different speed limit than that of light in a vacuum, although it might not be obvious how to quantify this limit. More on this later.

The neoclassical theory of economics implicitly assumes an infinite speed of information: Prices instantaneously reflect all economic information, and they are instantly available to all agents at once. Although most economists agree these are unrealistic assumptions, they view them as the ideal benchmark in the absence of transaction costs or frictions.

At an infinite speed, there would be no information and thus, no markets

It was only during the second half of the 20th century that economists started seriously analyzing problems of information (see Stiglitz, 2020).

As we have seen, instant transmission of information is logically not possible. This impossibility is not technological or due to friction or bounded rationality. Rather, it stems from the basic definition of information. At an infinite speed, there would be no information and thus, no markets.

Moreover, if information is scarce, as Kenneth Arrow (1996, p. 120) points out, then it must have a speed limit: an infinite speed would require infinite energy for transmitting and processing information, and hence unlimited resources. With finite resources, the speed of information must be finite, and this speed limit in turn underlies the inherent scarcity of information.

Money as an Information System

As is well known, money has three essential functions:

  1. It serves as a common unit of account or measure of economic value.
  2. It serves as a medium of exchange.
  3. It serves as a store of value.

The most important function is the first one: Only when there is an agreed measure of economic value can people trade using money as a medium of exchange. And only then can money be used as a store of value. In essence, money is a sophisticated record-keeping and communication system (Schumacher, 2015, p. 294).

As a unit of account, money represents the information framework needed to quantify the economic value of goods and services. The price of a commodity summarizes its market information.

Money allows the quantification of the information content of each commodity in the market

We frequently say, “You cannot compare apples with oranges.” But in the market, such comparisons happen all the time. The way to do so is to have a common denominator for apples and oranges. This is the role of money. We find the price of 1 kg of apples and the price of 1 kg of oranges and trade the two accordingly. For example, if the price of 1 kg of apples is $2 while the price of 1 kg of oranges is $3, then 1 kg of oranges is worth 1.5 kg of apples.

The comparison is only possible because we were able to quantify the information content of each commodity using a common language of money.
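A minimal sketch of that common-denominator comparison (the prices are the article’s illustrative figures; the variable names are mine):

```python
price_apples = 2.0   # $ per kg of apples
price_oranges = 3.0  # $ per kg of oranges

# With money as the common unit of account, the two goods become comparable:
# 1 kg of oranges trades for this many kg of apples.
oranges_in_apples = price_oranges / price_apples
print(oranges_in_apples)  # 1.5
```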

Medium of Money

Because the neoclassical theory assumes information travels at an infinite speed, it has no need for an information medium. For that reason, the theory has no place for money (coins, notes, etc.).

Mainstream economists struggled for a long time to explain the role of money in the economy (Martin, 2014). But with the inherent limit of information speed, money emerges naturally.

Currencies of The World. Source: worldatlas.com

Although there are all kinds of information in the economy (news, reports, etc.), ultimately, it is the transactions that determine the prevailing market prices. Hence, the speed of these transactions reflects the speed of economic information.

What is the maximum speed of information in the economy?

Since money is the information benchmark of the economy, the velocity of money can serve as an indicator of the speed limit of information related to the prices of goods and services. This shows why money must be different from other commodities.
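One conventional way to quantify this indicator is the quantity-theory identity (a standard definition, not specific to this article), which measures how many times a unit of money turns over in a period:

$$M V = P Y \quad \Longrightarrow \quad V = \frac{P\, Y}{M},$$

where M is the money stock, P the price level, Y real output, and V the velocity of money.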

The neoclassical theory assumes money is an arbitrary commodity and that all commodities are equally liquid (see Leijonhufvud, 1968, pp. 79–80). This assumption denies the role of money as an information reference for the economy. The liquidity of money is unique because it provides the maximum speed of information.

Uncertainty, Price Rigidity, and Liquidity Preference

With the speed limit, there must be uncertainty arising from the lag of information, as indicated above. Further, irreversibility due to time asymmetry, as discussed earlier, introduces another dimension of uncertainty. Yet another aspect of uncertainty arises from the uncomputability of efficient prices, as we shall see.

Uncertainty, in turn, plays a critical role in the emergence of money and other crucial economic phenomena that puzzled economists for decades (Goodhart, 1989).

One is “inertia,” the lagged response of economic variables. Market prices may become more rigid because of the lag of information. In the presence of uncertainty, price stability is crucial to preserve the informational role of money.

In the presence of uncertainty, price stability is crucial to preserve the informational role of money

Moreover, a preference for liquidity, i.e., holding money (with zero returns), would emerge naturally in the presence of inherent uncertainty about the behavior of the market. With the lag of information, the concept of equilibrium might be ill-defined.

These phenomena are inconceivable in a world with instantaneous information. J.M. Keynes had to expend substantial effort and creativity to explain how such phenomena may arise in a free market (Skidelsky, 2010).

Since time and money emerge as a result of the speed limit of information, it is probably more accurate to say that information, not time, is money!

Algorithmic Information

An algorithm is a finite series of well-defined steps to compute or achieve a well-defined goal. An algorithm, in principle, can be executed on a computer.

Algorithmic information is information that can be encoded in bits (0 or 1) on a computer. Algorithmic Information Theory (AIT) is concerned with the relationship between computation and information.

The theory, developed in the 1960s independently by Andrey Kolmogorov and Gregory Chaitin, studies, inter alia, the conditions for finding the most concise algorithm for generating (compressing) a given data set. By its algorithmic nature, AIT presupposes that information is processed in ordered steps rather than instantaneously.

The length, in bits, of the shortest description of an arbitrary data set is uncomputable

The length, in bits, of the shortest program to generate a given data set is defined as the “Kolmogorov-Chaitin Complexity” of the data. In general, the Kolmogorov-Chaitin Complexity is uncomputable: Given an arbitrary set of data (in bits), there is no general algorithm to identify the shortest possible program that can generate that data (Li & Vitányi, 2019, p. 127). This is the same kind of impossibility behind the Halting Problem.
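A minimal sketch of what this means in practice (my illustration, assuming a general-purpose compressor such as zlib): any real compressor yields only an upper bound on the Kolmogorov-Chaitin complexity; no algorithm can certify that the shortest program has been found.

```python
import os
import zlib

def complexity_upper_bound(data: bytes) -> int:
    """Length of a zlib-compressed encoding of `data`: an upper bound
    on its Kolmogorov-Chaitin complexity, never a guaranteed minimum."""
    return len(zlib.compress(data, 9))

ordered = b"01" * 500            # highly regular: compresses to a few dozen bytes
random_like = os.urandom(1000)   # incompressible with overwhelming probability

print(complexity_upper_bound(ordered))       # small
print(complexity_upper_bound(random_like))   # roughly 1000 bytes or slightly more
```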

The uncomputability of information complexity might be linked to the incompleteness of information discussed earlier. Because of the speed limit, there is potentially an infinite amount of inaccessible information. An uncomputable property is essentially inaccessible information.

According to Gregory Chaitin (2005), understanding is compression. To understand a phenomenon with rich and complex data means to find a relatively simple law that correctly predicts the phenomenon. The uncomputability of the Kolmogorov-Chaitin Complexity means understanding, in principle, is a non-mechanical process that lies at the heart of human ingenuity.

Market Efficiency

Neoclassical economics assumes the market to be efficient. This goes under the name of the “Efficient Market Hypothesis” (EMH). Essentially, it assumes that prices quickly and systematically reflect all available information.

A very important consequence of the theory is that there should be no “bubbles.” Bubbles mean that prices fail, for an extended period, to reflect accurately the economic value of the underlying assets. This was more or less the position of Robert Lucas (2009) and Eugene Fama (Cassidy, 2010), among others, regarding the surge in credit markets prior to the Global Financial Crisis. Proponents of the EMH deny the systematic “irrational exuberance” that behavioral economists, like Robert Shiller (2000), repeatedly emphasize.

Algorithmic Information Theory provides a new perspective on market efficiency. Suppose economic agents were “robots” interacting in a mechanical artificial world, as Robert Lucas (1988) would like to model them. Would the market mechanism in such an environment achieve efficiency?

Define the amount of information of a commodity in terms of the number of bits required to describe it. Hence, we can represent a commodity x as a binary string d containing all the relevant information to describe it fully. The cost of communicating this description is measured by the number of bits in d, that is, by its length (Li & Vitányi, 2019, p. 2).

To minimize communication costs, we focus on the shortest program p that can generate d, instead of d itself. The least cost of transmission of d, therefore, is given by the length of p. Hence, we can define the efficient price q of commodity x as the length of p: L(p) = q. In an efficient market, q would be the most economical way to describe commodity x.
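As a toy illustration of this definition (entirely hypothetical; the function name and the commodity description are mine), one could proxy q by compressing the description d, keeping in mind that this only bounds the true shortest program from above:

```python
import zlib

def efficient_price_proxy(description: str) -> int:
    """Hypothetical proxy for q = L(p): the length in bits of a compressed
    description d. The true Kolmogorov-Chaitin complexity of d remains
    uncomputable in general, so this is only an upper bound."""
    d = description.encode("utf-8")
    return 8 * len(zlib.compress(d, 9))

apples = "red apples, 1 kg, grade A, cold-stored, origin: local orchard"
print(efficient_price_proxy(apples))
```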

In a “robotic market,” bubbles and crashes cannot be automatically ruled out

The price q corresponds to the Kolmogorov-Chaitin complexity of d. As noted earlier, however, the Kolmogorov-Chaitin complexity is generally uncomputable. This suggests that the market mechanism cannot guarantee the achievement of efficient prices even with complete algorithmic information. In a “robotic market,” therefore, bubbles and crashes cannot be automatically ruled out. Human judgment is necessary to achieve desirable stability.

Conclusion

The deep links between information, time, and money are striking. These links emerge from a basic feature of information: order.

Information necessitates order, which imposes a speed limit for transmission. This limit is a fundamental constant in almost all laws of nature.

In markets, the speed limit of economic information brings money into the center of market activities. Money emerges due to logical necessity rather than friction or bounded rationality.

The market mechanism, while essential for wealth creation, cannot systematically achieve price efficiency. The uncomputability of information complexity means market instabilities cannot be automatically ruled out. Wisdom and judgment must complement the market mechanism to get closer to the best of all worlds.

Special thanks to Gregory Chaitin, Francisco Doria, and Benjamin Schumacher for invaluable discussions. The author is solely responsible for the views expressed herein.

References

  • Arrow, K. (1996). The Economics of Information: An Exposition. Empirica, 23, 119–128.
  • Carroll, S. (2010). From Eternity to Here: The Quest for the Ultimate Theory of Time. Dutton.
  • Cassidy, J. (2010). Interview with Eugene Fama. The New Yorker, January 13.
  • Chaitin, G. (2005). Meta Math! The Quest for Omega. Vintage Books.
  • Chaitin, G. (2023). Philosophical Mathematics: Infinity, Incompleteness, Irreducibility. Academia.edu.
  • Giulini, D. (2005). Special Relativity: A First Encounter. Oxford University Press.
  • Goodhart, C.A.E. (1989). Money, Information, and Uncertainty (2nd ed.). Macmillan.
  • Li, M., & Vitányi, P. (2019). An Introduction to Kolmogorov Complexity and Its Applications (4th ed.). Springer.
  • Leijonhufvud, A. (1968). On Keynesian Economics and the Economics of Keynes. Oxford University Press.
  • Lucas, R. (2009). Robert Lucas on the Crisis. The Economist, Aug. 6.
  • Martin, F. (2014). Money: The Unauthorized Biography. Alfred Knopf.
  • Mermin, N.D. (2005). It’s About Time: Understanding Einstein’s Relativity. Princeton University Press.
  • Schumacher, B. (2015). The Science of Information: From Language to Black Holes. The Great Courses.
  • Shapiro, C., & Varian, H. R. (1999). Information Rules: A Strategic Guide to the Network Economy. Harvard Business School Press.
  • Siegfried, T. (2000). The Bit and the Pendulum. John Wiley & Sons.
  • Skidelsky, R. (2010). Keynes: The Return of the Master. PublicAffairs.
  • Stiglitz, J. (2020). The Revolution of Information Economics: The Past and the Future. In K. Basu, D. Rosenblatt, & C. Sepúlveda (Eds.), The State of Economics, the State of the World (Chapter 3, pp. 101–138). MIT Press.
  • Stone, J. (2015). Information Theory: A Tutorial Introduction. Sebtel Press.
  • Susskind, L., & Hrabovsky, G. (2013). The Theoretical Minimum. Basic Books.
  • Wheeler, J. A. (1990). Information, Physics, Quantum: The Search for Links. In W. Zurek (Ed.), Complexity, Entropy, and the Physics of Information. CRC Press.
