Complexity through Iteration: The World of loop(x = f(x))

katoshi · Published in Neo-Cybernetics · May 3, 2024 · 14 min read

From the perspective of a systems engineer, I often ponder life and intelligence. A significant recent insight from my personal research is that sequential iterative processing over time is a crucial characteristic of both life and intelligence.

Focusing on this iterative process allows us to deepen our understanding of the commonalities and characteristics of life and intelligence and how they come to be. In this article, I will define and discuss “complex iteration” as the mechanism through which complex structures like organisms and intelligence emerge.

Through complex iteration, physical repetition evolves into chemical, biological, and eventually cognitive and linguistic processes, and societal systems, forming a hierarchical structure. I model this using what I call the “Sparse Onion.”

This is a long article, but let's get into the main text.

The Nature of Complex Iteration

Consider a function that takes a real number from 0 to 1 and returns a value in the same range. This function can repetitively use its output as the next input.

If this function includes both increasing and decreasing sections, its repetitive application does not converge but continues to change.

If the function is nonlinear — curved rather than made of straight segments — its iteration need not settle into a repeating pattern, allowing a wide variety of changes.

Such chaotic patterns are complex, yet they are deterministic: the same combination of function and initial condition always reproduces the same pattern.

Let’s call this type of repetition “complex iteration.”
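As a concrete illustration, the logistic map is one well-known function of this kind (my choice of example, not something named in the text): it maps [0, 1] to itself, has an increasing and a decreasing section, and for suitable parameters its iteration neither converges nor exactly repeats. A minimal sketch:

```python
# Logistic map: a nonlinear function from [0, 1] to [0, 1] with an
# increasing section (x < 0.5) and a decreasing section (x > 0.5).
def f(x, r=3.9):
    return r * x * (1 - x)

def iterate(x, steps):
    """Apply loop(x = f(x)) for a fixed number of steps."""
    trajectory = [x]
    for _ in range(steps):
        x = f(x)
        trajectory.append(x)
    return trajectory

# Two nearby initial conditions diverge (chaos), yet the same initial
# condition always reproduces the exact same pattern (determinism).
a = iterate(0.2, 30)
b = iterate(0.2000001, 30)
c = iterate(0.2, 30)
assert a == c        # identical inputs reproduce the identical pattern
assert a != b        # a tiny perturbation yields a different trajectory
```

The assertions make the point of this section concrete: complexity from iteration, reproducibility from the function plus the initial condition.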

Formulation of Complex Iteration

Complex iteration can be formulated simply. Since the function's output is passed back as the next input through the same variable, it is expressed not in ordinary mathematical notation but as pseudocode:

loop(x = f(x))

Here, loop() means to repeat the process inside it. The crucial part of this formulation is x = f(x), which indicates that the variable x is given to the function f and the result is assigned back to x.

Variable x can be a single value or a composite of several parameters. Considering the deeper content discussed later, x could even hold another function.
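A composite x fits the same pseudocode unchanged: x simply becomes a tuple of parameters updated together. A sketch (the damped-motion rule is purely illustrative):

```python
# x as a composite state: (position, velocity) updated together.
# The single variable x of loop(x = f(x)) is now a tuple.
def f(x):
    position, velocity = x
    return (position + velocity, velocity * 0.9)  # toy damped motion

x = (0.0, 1.0)
for _ in range(50):   # loop(x = f(x))
    x = f(x)
print(x)              # position approaches 10.0, velocity decays toward 0
```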

Complex Iteration as a Common Principle of Self-Organization

The nature of complex iteration implies that when it is used in evolution or learning, numerous complex patterns can be acquired, stored, and reproduced with minimal parameter memory.

Indeed, biological cells sustain life by generating chains of specific chemical reactions in response to external stimuli and internal states. The possible stimuli and responses are countless, and these response patterns have been acquired over the course of evolution.

The brain likewise runs specific processing patterns based on external information and internal knowledge, enabling the necessary actions, speech, and writing. These processing patterns, keyed to information and internal state, must have been acquired during learning.

Large language models, too, operate through feedback: they produce one token at a time, feeding previous outputs back in as part of the next input, thereby putting pattern learning through iteration to use.
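This autoregressive feedback loop can be sketched in a few lines (the token rule here is a toy stand-in for a learned model, not an actual one):

```python
# Toy autoregressive generation: each output token is appended to the
# context, and the whole context is fed back in to produce the next token.
def next_token(context):          # stand-in for the learned model
    return str(len(context) % 3)  # toy rule instead of a neural network

context = ["<start>"]
for _ in range(6):                # one token per repetition
    context.append(next_token(context))
print(context[1:])                # ['1', '2', '0', '1', '2', '0']
```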

This understanding shows that biological cells, brains, and artificial intelligence acquire appropriate responses and processing patterns to stimuli and situations through complex iteration during evolution and learning.

Thus, complex iteration underlies the evolution of biology, the learning processes of brains, and artificial intelligence, serving as a common principle of self-organization.

Threads

In biology and intelligence, repetition is used to generate appropriate sequences of responses and processing patterns to initial stimuli or states. We will call a coherent series of repetitions that generates these response and processing patterns a “thread.”

States

Responses and processes do not feed directly back into the next iteration. They alter states, and these altered states change the stimuli or situations that feed into the next iteration.

States can be like waves that form quickly, change shape, and dissipate naturally or disappear after the next iteration; or, like catalysts or memories, they can persist for a certain time or number of iterations.

States could be classified in several ways, such as whether they decay over time or whether they are consumed by repetitions. Here, we will classify them by duration: whether they disappear after a moment or a single repetition, or persist through multiple repetitions.

States that disappear after an instant or a single repetition will be called volatile states, and those that persist through multiple repetitions will be called persistent states.

A state formed by a response or process is not intrinsically volatile or persistent at the moment it forms. Its classification depends on its relationship to the repetitions that feed back on it.

States that serve as input for only one repetition within a thread are volatile for that thread. Among persistent states, those that serve as input throughout a thread and disappear when the thread ends can be categorized as short-term states, while those that persist as input across multiple threads can be classified as long-term states.
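The three lifetimes can be sketched relative to a thread (the thread, state names, and toy arithmetic are my illustrative choices):

```python
# Sketch of state lifetimes relative to threads.
long_term = {"learned_pattern": 2}    # persists across threads

def run_thread(stimulus, steps=3):
    short_term = []                   # lives across repetitions, dies with the thread
    for i in range(steps):
        volatile = stimulus + i       # exists for this one repetition only
        response = volatile * long_term["learned_pattern"]
        short_term.append(response)   # fed back into later repetitions
    return sum(short_term)            # only the result outlives the thread

r1 = run_thread(1)   # short_term is rebuilt from scratch each thread
r2 = run_thread(1)
assert r1 == r2      # only the long-term state carries over between threads
```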

Nested Structures Between Threads

By mediating states, multiple threads can take a nested structure.

Consider a thread that repeats over a short duration; call it Thread A. Thread A itself can be repeated several times, and we call that enclosing repetition Thread B. One repetition of Thread B consists of one full run of Thread A. This relationship is similar to function calls in programming, where Function B repeatedly calls Function A internally, producing a nested structure.

In this case, inside Thread A, volatile and short-term states are fed back as the internal repetitions of Thread A proceed. The states formed as a result and passed from Thread A to Thread B are long-term from Thread A's perspective.

Conversely, Thread B may treat the states received from Thread A as volatile, short-term, or long-term. The classification of a state from Thread A's perspective and its role within Thread B need not match. This is similar to how, in programming, the result Function A returns to Function B is treated independently of how it was produced inside Function A.

This structure can be understood by considering the mechanism of a conversational AI system using large language models. In this case, the repetition of input sentences by users and output sentences generated by conversational AI can be called a chat thread, corresponding to Thread B above. And the part where conversational AI generates one output sentence in response to one input sentence corresponds to Thread A, with Thread B repeating Thread A.

Inside Thread A, each output sentence is generated by producing one character or word at a time. There are no volatile states in Thread A: the output of each repetition is held as short-term state within Thread A, and all prior outputs are fed back into the next repetition. Furthermore, when Thread A is called from Thread B, the user's input sentence is given and is input into every repetition within Thread A. The history of previous user inputs and AI outputs within Thread B is also given, and likewise enters every repetition within Thread A.

When Thread A finishes generating the output sentence, the result is passed to Thread B. From Thread B’s perspective, the output sentence from Thread A is treated as a short-term state, used only within the repetitions of Thread B. When Thread B ends, all these states are discarded.
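The nesting described above can be sketched as two loops, inner and outer (the token format and repetition counts are illustrative stand-ins, not how a real chat system tokenizes):

```python
# Thread A: generate one reply, token by token (inner loop).
# Thread B: the chat session, repeating Thread A per user turn (outer loop).

def thread_a(user_input, history):
    tokens = []                            # short-term state inside Thread A
    for _ in range(3):                     # inner repetitions
        # every repetition sees the input, the history, and prior tokens
        tokens.append(f"tok({user_input}|{len(history)}|{len(tokens)})")
    return " ".join(tokens)                # result passed up to Thread B

def thread_b(user_inputs):
    history = []                           # short-term state inside Thread B
    for msg in user_inputs:                # outer repetitions
        reply = thread_a(msg, history)     # one Thread-B step = one run of Thread A
        history.append((msg, reply))       # Thread A's result, now Thread B's state
    return history                         # discarded when the session ends

session = thread_b(["hi", "how are you?"])
print(len(session))   # 2 turns
```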

Similarly, within a cell, the processing repeated in response to external stimuli involves transcribing specific parts of DNA to RNA, translating RNA into proteins, and those proteins triggering further chemical reactions; in this way, the sequence of substances and reactions needed to answer the stimulus is generated. Transcription to RNA and translation to proteins also involve internal repetitive processing, and the reactions driven by proteins may involve internal repetitions as well.

In this way, reactions and processes triggered by stimuli and states can have a complex nested structure mediated by state transfers.

Network Structures Between Threads

By mediating states, multiple threads can also take a network structure. In this case, a state generated by one thread can be picked up by whichever other thread happens to use it, which requires some form of shared medium for state transfer.

For example, if a state is transferred like an item on a factory conveyor belt, it can be delivered precisely from one thread to specific other threads. If, instead, a state generated by one thread is stored in a warehouse-like place, any thread can freely access it; depending on timing and circumstances, which thread receives the state can vary. Furthermore, if a state remains unchanged as other threads use it, like a long-term state, it is like a tool kept in a toolbox to improve the factory's efficiency: it stays available as input to many threads at once.
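The three transfer media map naturally onto familiar data structures; a minimal sketch (the producer/consumer names and the "catalyst" tool are illustrative):

```python
from collections import deque

# Three ways states move between threads:
conveyor = deque()                         # point-to-point: consumed by one thread
warehouse = {}                             # shared store: whoever gets there first
toolbox = {"catalyst": lambda x: x * 2}    # long-term: read by many, consumed by none

def producer_thread():
    conveyor.append("rna")                 # destined for one specific consumer
    warehouse["protein"] = 5               # available to any thread

def consumer_thread():
    item = conveyor.popleft()              # removed: no other thread can use it
    amount = warehouse.pop("protein")      # removed once used
    boost = toolbox["catalyst"](amount)    # read, not removed: reusable
    return item, boost

producer_thread()
print(consumer_thread())                   # ('rna', 10)
assert "catalyst" in toolbox               # the tool remains for other threads
```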

In this way, by mediating states, multiple threads can take a network structure that influences each other.

This maps well onto cells. RNA transcribed from a part of the DNA moves, conveyor-like, to the thread that translates it into protein. The translated proteins, meanwhile, diffuse through the cytoplasm — the fluid filling the cell — like inventory in a warehouse, used in chemical reactions by different threads depending on timing, or persisting as catalysts for repeated reactions, like tools in a toolbox.

Whether in humans or conversational AI, communication through natural language, whether it is conversation, emails, or messages, involves interactions with specific individuals, with both parties considering the exchanged sentences as inputs for the next sentences in the thread network. Additionally, documents, books, and blog articles are like tools in a toolbox, inputs to the threads of many people’s thoughts. Also, humans use the sentences they have seen or thought about in the past, stored in long-term memory, as inputs when they think of other sentences or ponder alone.

Moreover, the process of biological evolution and artificial intelligence learning also involves long threads of repetition forming long-term states of DNA sequences or model parameters. These sequences and parameters are always used as inputs in individual reactions and processes. Therefore, the threads of evolution and learning and the individual threads of reactions and processes, although vastly different in time scale, form a kind of network structure of threads.

In this way, biological cells, brains, and artificial intelligence form complex networks of thread structures through complex iteration, intertwined across time scales and across individuals. Within these networks, the evolution of biological species, the short- and long-term learning of individuals, and the formation of collective intelligence and social systems proceed steadily, accumulating long-term states and advancing progress and prosperity.

Evolution of Thread Structures in Complex Iteration

Long-term states generated in a place like a toolbox not only serve existing complex iteration threads but can also create new complex iterations.

If a long-term state is given as a parameter to a complex iteration function and clearly changes the function’s properties, this can be seen as a change in the existing complex iteration or as an evolution to a new complex iteration. If DNA is considered a toolbox and a new, revolutionary, and useful pattern is recorded there, it can be seen as evolution.

Moreover, if we ask where the toolboxes themselves — DNA, books — came from, they were likely produced during the evolution or learning of another complex iteration. The same applies to the warehouses and conveyor belts that store or transport short-term states: these mechanisms for storing and moving states must also have been generated by the evolution or learning of other complex iterations. This leads to the generation of new complex iterations and the construction of new relationships between threads.

Nested thread structures arise similarly. If a conveyor belt or warehouse emerges that can transfer states between instances of the same small complex iteration thread, then a larger thread that repeats those small threads can emerge as well.

This is similar to a chat AI, where initially a simple question-and-answer interaction could evolve to connect multiple conversations.

Evolution of Driving Structures

Organizing things this way, we can see how new complex iteration threads continuously form, and how structures form between them, through the generation of new long-term states.

To drive this complex iteration, a driving structure and the energy to operate it are needed.

Artificial intelligence operates on computers, so the computer and its programs are the driving structure, and the energy is electricity. For living beings, the driving structure of intelligence is the brain, with glucose and other chemical energy as the source, used by nerve cells to generate and exchange electrical signals and neurotransmitters.

The energy sources for living cells are chemical energy, thermal energy, and light energy. These are used to drive the entire cell structure and perform complex iterations. The internal structure, as explained earlier, is based on generating proteins through RNA from parts of DNA, with these proteins causing chemical reactions.

Living cells form the bodies of multicellular organisms; advanced multicellular organisms have brains; and among them, humans have the most advanced brains. Using those brains, humans have formed advanced societies and invented language, computers, and artificial intelligence.

This spans the evolution from single-celled organisms to humans, the formation of societies, and the development of science and technology. This evolution and development are also supported by complex iteration, and during this process not only the thread structures of complex iteration but also the driving structures evolve.

Also, even before the appearance of cells, the repetition of chemical reactions by inorganic chemicals on Earth led to the emergence of cells, the origin of life. The driving structure here is the structure of Earth itself, with energy sources considered to be sunlight, solar heat, geothermal energy, and chemical energy.

The driving structure of Earth includes not only the local diffusion of chemical substances in a pond but also the convection of the water within it. Water flowing in rivers, evaporating from the sea, forming clouds, and raining down on land is part of a global water cycle. These processes not only move chemical substances but also form repetitive structures from the micro to the macro scale.

In this way, there are numerous places on Earth where repetition occurs, and these can be considered driving structures, allowing chemical evolution to gradually progress over unimaginable years, eventually leading to the birth of cells.

Feedback Beyond Repetition

I have explained how basic complex iteration, repeated, forms multilayered complex iteration structures, network-like thread structures, and new driving structures. A key mechanism is involved in all of this.

This mechanism is positive feedback beyond the structure of repetition: when a newly formed thread structure or driving structure advantageously affects the repetition of subordinate or related threads, that advantage feeds back and reinforces the new structure.

Such positive feedback beyond repetition, forming new thread structures and driving structures, creates a powerful mechanism where the new and existing structures mutually reinforce each other. Based on this mechanism, thread structures and driving structures evolve from simple to complex.

Analysis of Nested Structures in Conversation

The external mechanism of repetition in natural language conversation and thinking may not itself have changed, but language shows a hierarchical evolution in how it was invented and learned.

Initially, communication used single sounds or changes in sound; then came words formed by connecting sounds, sentences formed by connecting words with grammar, and texts formed by connecting sentences with context. This shows the layered evolution of language, and the process of learning an existing language likely also progresses hierarchically.

At each layer there is a corresponding mechanism: one temporarily remembers the sounds so far and applies the rhythm specific to the language to generate the next sound; one temporarily remembers the words so far and applies the language's grammar to generate the next word; one temporarily remembers the meaning of the sentences so far and generates the next sentence following the structure of the text. This extends to generating conversations that follow conversational patterns while remembering the broader context of the exchange.

In this way, by combining states and complex iteration in multiple layers, natural language conversation and thinking become possible. Organized this way, language can be seen to have a hierarchical structure — sounds, characters, words, sentences, texts, conversations — built on the nested thread structure of internal complex iteration. And the state and the repetition function clearly differ at each layer.

In higher layers, repetition progresses while inheriting summarized states like meanings or contexts. And the complex iteration function also learns and uses different patterns at each layer, such as sequences of sounds, grammar, text structure, and conversation structure.
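The layered mechanism above can be sketched as nested iterations, each layer with its own short-term memory and its own pattern function (the rhythm and grammar rules here are toy stand-ins for learned patterns):

```python
# Each layer remembers its own context and applies its own learned pattern.
def next_sound(sounds):              # rhythm-level pattern (toy rule)
    return "ab"[len(sounds) % 2]

def make_word():                     # inner thread: sounds -> word
    sounds = []                      # short-term memory of this layer
    for _ in range(3):
        sounds.append(next_sound(sounds))
    return "".join(sounds)

def make_sentence():                 # outer thread: words -> sentence
    words = []                       # short-term memory of this layer
    for _ in range(4):               # grammar-level pattern (toy: fixed length)
        words.append(make_word())
    return " ".join(words)

print(make_sentence())               # "aba aba aba aba"
```

Each layer's loop feeds its result up as a state for the layer above, just as Thread A's output becomes Thread B's state.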

Conversational AI

The fact that conversational AI based on large language models can converse in natural language like humans means that human language acquisition ability is achievable through the mechanism of neural networks in computers.

It has clearly proved possible to learn and handle natural language through repetition, and to generate conversational sentences through repetition. And the fact that conversational AI already shows logical reasoning and limited creative capabilities suggests that these abilities, too, are acquired through learning by repetition and exercised in the process of generating conversational sentences.

Considering that natural language conversation, as explained earlier, reflects the nested structure of complex iteration threads, it suggests that a similar nested structure of complex iteration threads reflecting this natural language structure could be formed in the neural networks during the learning process of conversational AI.

Neural networks can hold numerous node parameters as long-term states. Using these long-term states, pseudo-toolbox and pseudo-warehouse functions could form, and pseudo-functions corresponding to higher-level complex iteration threads could reasonably form as well.

If long-term states form pseudo-toolbox or warehouse functions within neural networks, forming pseudo-threads is not surprising.

And in the process of forming these pseudo-thread groups, new threads based on existing threads could be formed, creating a Win-Win relationship that positively affects existing threads.

Thus, during the repetitive process of training with a large amount of text, the neural networks of large language models could form complex thread structures while constructing Win-Win relationships, forming pseudo-state storage places and pseudo-threads.

This, I believe, is why a mechanism as simple as a neural network is capable of complex information processing and of intelligent processing like natural language conversation.

And conversational AI, capable of conversing and thinking in natural language like humans, differs from us in being computer processing rather than human processing, but essentially both rest on complex iteration, with complex nested and network-like thread structures formed within them.

Sparse Onion Model

As complex iteration repeats, thread structures and driving structures evolve step by step; structures are strengthened through Win-Win relationships between existing and new threads; and new structures are continuously created, advancing evolution and learning.

On the other hand, new threads and structures can sometimes become independent without relying on existing threads or past structures.

For example, pre-cellular organic structures, which are thought to have existed in abundance on Earth before the birth of cells, cannot be found today. They have likely disappeared because cells, with their strong reproductive capabilities, absorbed them as nutrients or energy sources.

Although no direct evidence of these pre-cellular organic structures remains, the circumstantial evidence — cells exist — suggests they must have existed and that cells evolved on that basis. Yet the fact that life persists even after their disappearance shows that some structures built through complex iteration can indeed vanish without consequence.

I call this the Sparse Onion Model. The way evolution expands diversely on top of previous stages resembles an onion, with layers overlapping outward from the center; and inside this onion, some layers are partially empty (sparse).

Understanding this structure can help us imagine that we and our civilization are not at the peak or innermost part of this structure but merely occupy its outermost layer currently.

And if the structure supporting us inside this Sparse Onion collapses, or if a new structure forms outside us, there is no guarantee that we can continue to exist stably.

Finally: The Outermost Function

In the first half of this article, I formulated complex iteration using pseudocode as follows:

loop(x = f(x))

This formula can be applied to various complex iterations. And each complex iteration exists within an outer complex iteration, called by the outer function f(x). Note that the apparent outer side of this formula and the outer side of the Sparse Onion Model are different concepts.

Each complex iteration is generated while the outer function f repeats, and the loops and functions of these inner complex iterations are contained in x. At some point, a new function is assigned to x as a result of f(x), and from then on that complex iteration executes inside f(x).
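The idea that x itself can hold a function, assigned as a result of f, can be sketched with higher-order iteration (the threshold and the functions are purely illustrative):

```python
# x carries a function; at some point f assigns a new function to x,
# and from then on the new complex iteration runs inside f.
def f(x):
    inner, count = x
    if count == 10:                 # the outer iteration produces a new function
        inner = lambda v: v + 1     # a new complex iteration, assigned into x
    return (inner, count + 1)

x = (lambda v: v, 0)                # the outermost loop(x = f(x))
for _ in range(20):
    x = f(x)
inner, count = x
print(inner(5), count)              # the inner function has been replaced
```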

Considering this nested hierarchical structure, one might wonder what the outermost function f and loop are.

My current answer is that the function f is space and the physical laws therein, and the loop is time.

However, if this is extended to the realm of quantum physics or the cosmic scale of relativity, there may be a different answer.


Software Engineer and System Architect with a Ph.D. I write articles exploring the common nature between life and intelligence from a system perspective.