The Nature of Cultural Collaboration
Part I: The Diffusion of Complexity
Introduction
All-encompassing nature has no borders. We are the ones who create disciplines called physics, mathematics, biology, psychology, sociology, philosophy, history and so on. As our knowledge of nature advances in specialized fields of investigation, it is increasingly necessary to meld diverse information into a more holistic understanding of reality.
I am not a specialist by any stretch of the imagination. Just the opposite, in fact: more of a generalist. I am fascinated by the scientific and philosophical ideas that combine to form our basis for knowledge, so I dig into various fields of study to comprehend the historical sequence of concepts that has led toward our current state of understanding.
Perhaps we need a discipline devoted to this effort. The closest I’ve seen is referred to as ‘Interdisciplinarity’, a process that is frequently attempted with occasional success. Issues arise over how to integrate the deep understanding and focused analysis of the specialist with the elementary grasp and broader perspective of the neophyte in multiple areas. This does not obviate the need to integrate concepts so as to produce a fuller recognition of the consequences of our beliefs. Nor does it negate the more important potential to discover emergent realizations.
Philosophy provides the fundamentals of theoretical thinking, cognition and self-awareness, which are utilized to design and conduct research, to analyze and interpret its outcomes, and to process information about reality. The branches of Philosophy called ‘ontology’ (what exists for people to know about) and ‘epistemology’ (how knowledge is created and what is possible to know) produce generalized perspectives of the world. These form beliefs that guide action and reveal the assumptions behind our investigations and the methodologies undertaken.
We had better evolve in response to the challenge to our capacity for adaptation in a rapidly changing environment of new tools and ideas, through a process of perpetual feedback. The unresolved dilemma remains how we, as individuals and as a culture, shall assimilate the rising flood of information that technology and cognitive creativity produce.
Chaos
It is instructive to review some paths of research over the past 100 years in Math and Physics that exemplify the kinds of science leading to a broader vision of the landscape. Fundamental theoretical research using the tools of non-linear fractal equations and particle accelerators produces insights that are directly related to postmodern issues of information technology and sociology. The question arises whether science precedes or follows cultural recognition; which is the chicken, and which the egg?
20th Century particle physics came out of the ‘relativity’ and ‘quantum’ revolutions. It was about the search for simplicity and continuity. Its principal tool was calculus. Its final expression was field theory. 21st Century theoretical physics is coming out of the ‘chaos’ revolution. It will be about complexity, and its principal tool will be the computer. Its final expression remains to be found.
For a popular introduction to this subject, check out the 1987 publication by James Gleick, Chaos: Making a New Science, which succeeds in delivering the general principles of chaos theory and its history to the broader public. It incorporates spectacular computer-generated graphics produced by the fractal equations that Benoit Mandelbrot published in The Fractal Geometry of Nature, (1982).
The first to encounter this field of investigation was a French mathematician, theoretical physicist, engineer and philosopher of science, working in a time before specialization, when it was still possible to excel in multiple disciplines. A prodigious intellect known for many important advancements around the turn of the century, Henri Poincaré (1854–1912) was working on the ‘three-body problem’: take an initial set of data specifying the positions, masses and velocities of three bodies at some point in time, then determine their motions in accordance with the laws of classical mechanics. He noticed that a slight change in the initial conditions can result in large-scale differences in the outcome.
Decades later, a meteorologist named Edward Lorenz (1917–2008) became skeptical about the appropriateness of the linear statistical models used in weather forecasting when he found that a minuscule decimal rounding error led to a completely divergent prediction. His application of non-linear functions, presented in a 1963 paper, Deterministic Nonperiodic Flow, comprises the foundation of chaos theory. In his talks, Lorenz coined the term ‘butterfly effect’ to portray metaphorically the theoretical example of a hurricane’s formation being contingent on the incidental flap of a distant butterfly’s wing weeks before.
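To make this sensitivity concrete, here is a minimal sketch, not Lorenz’s own code, that integrates his three-variable convection model with the textbook parameters and follows two trajectories whose starting points differ by one part in a million. The step size, duration and starting values are arbitrary choices for illustration.

```python
# A rough Euler integration of the Lorenz system, illustrating sensitive
# dependence on initial conditions.  Parameters, step size and starting
# points are illustrative assumptions, not values from Lorenz's paper.

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz equations by one crude Euler step."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return (x + dx * dt, y + dy * dt, z + dz * dt)

a = (1.0, 1.0, 1.0)          # first trajectory
b = (1.000001, 1.0, 1.0)     # second trajectory, perturbed by one part in a million

for step in range(1, 5001):
    a, b = lorenz_step(a), lorenz_step(b)
    if step % 1000 == 0:
        gap = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
        print(f"t = {step * 0.01:4.0f}  separation = {gap:.6f}")

# The separation typically grows from around a millionth to the full size of
# the attractor: the numerical face of the 'butterfly effect'.
```

Swapping the crude Euler step for a more careful integrator changes the numbers but not the moral: the two forecasts part ways.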
For theoretical physicists, chaos is a purely mathematical concept and an undeniable fact. The enormous success of calculus is in large part responsible for the decidedly reductionist attitude of most contemporary science, the belief in absolute control arising from detailed knowledge. Chaos is the anti-calculus revolution. An object that is chaotic in space is called a ‘fractal’, generally meaning a geometric form that does not become simpler when you analyze it into smaller and smaller parts.
Nature is full of fractals. Mountain ranges, trees and the human body exemplify fractals. A fern leaf, the sky on a partially cloudy day, a pattern of human settlements, the continental coastline, waves on the ocean surface; all are fractals in the sense that they do not become simpler when you examine them with increasing magnification.
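For a taste of the ‘fractal equations’ behind those graphics, the escape-time iteration z → z² + c that generates the Mandelbrot set fits in a few lines. The sketch below prints a coarse character rendering; the resolution, viewing window and iteration limit are arbitrary choices, and real renderings use far finer grids.

```python
# A minimal escape-time sketch of the Mandelbrot set: iterate z -> z*z + c and
# mark the points c whose orbits stay bounded.  Resolution, bounds and the
# iteration cap are arbitrary illustrative choices.

def stays_bounded(c, max_iter=50):
    """Return True if the orbit of 0 under z -> z*z + c has not escaped."""
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2.0:   # once |z| exceeds 2 the orbit must escape
            return False
    return True

for row in range(24):
    y = 1.2 - row * 0.1
    line = ""
    for col in range(64):
        x = -2.0 + col * 0.05
        line += "#" if stays_bounded(complex(x, y)) else " "
    print(line)

# Zooming into the boundary (shrinking the window and the step) reveals ever
# more filigree; the picture never becomes simpler, which is the fractal signature.
```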
Chaos in time is reflected in ‘dynamical systems’, whose equations of motion contain variables that do not remain static. Recognition of sensitivity to initial conditions is the death of reductionism: even if we know the state of a system very precisely now, we cannot predict its future trajectory indefinitely. Throughout the 20th Century, intensive research was conducted using quantum mechanics and relativity, two theories firmly based in calculus, with the intention of uncovering a universal law of physics as exhibited by elementary particles. Chaos puts an end to this reductionist dream, the dream that we have absolute power if only we know enough about the details.
Complexity
‘Complexity’ is a complementary perspective for grappling with systems that extend beyond the scope of chaos alone. Complex systems share some characteristics of chaos, though their complete parameters are not yet clearly defined. It is generally agreed that they possess the following properties, many of which best apply to biological and cultural systems:
1. Complex systems contain many constituents interacting nonlinearly. Recall that nonlinearity is a necessary condition for chaos, but this does not mean that all chaotic systems are complex.
2. The constituents of a complex system are interdependent. For example, removing 10% of the gas from a container does not change the behavior of the remaining gas; removing 10% of the human body, say a leg, will change the behavior of a person quite dramatically. The body is a complex system; the gas is not.
3. A complex system possesses a structure spanning several scales. At every scale we shall find a separate structure.
4. A complex system is capable of emerging behavior. Behavior is emergent if it cannot be understood when studied at one scale alone; each constituent at one scale may itself be a complex system made up of finer scales. Emerging behavior is a new phenomenon specific to the scale considered, and it results from global interactions between that scale’s constituents.
The combination of structure and emergence leads to ‘self-organization’, which happens when emerging behavior has the effect of changing the structure or creating a new structure. The special category of complex systems that accommodates living beings is called ‘complex adaptive systems’. Complexity and chaos have nonlinearity in common, and complexity implies the presence of chaos, but the reverse is not true: chaos is basically pure mathematics and fairly well understood, while complexity is still almost totally unexplored and is not really math at all. Chaos is but a very small sub-trait in the world of complexity. Another significant property defining complex systems is this:
5. Complexity involves interplay between chaos and non-chaos. The ‘edge of chaos’ is a critical region, analogous to the critical point of a phase transition, where complex systems manage to modify their environment and self-regulate to maintain an equilibrium; it is near this edge that self-organization is most likely to occur (a brief numerical sketch follows this list).
There is one more property of Complexity which concerns all social systems, all collections of organisms subject to the laws of evolution. These include our own immune systems, plant and animal populations, human groups of various sizes such as families, tribes or city-states, social or economic classes, international corporations and global nations. In order to evolve and stay alive, in order to remain complex, all must obey one final rule:
6. Complexity involves interplay between cooperation and competition. This is an interplay between scales within the complex system. Wars between nations, for example, are supported by an underlying patriotism of the citizenry. Once this competition-cooperation dichotomy is comprehended, we are a long way from the old cliché ‘survival of the fittest’, which has done so much damage to the understanding of evolution.
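The interplay of order and chaos named in property 5 can be glimpsed in the simplest nonlinear system of all, the logistic map x → r·x·(1 − x). The sketch below, with parameter values chosen only for illustration, shows the same equation settling into a fixed point, then repeating cycles, then chaotic wandering as the control parameter r is pushed toward and past the edge.

```python
# The logistic map x -> r * x * (1 - x): its long-run behaviour moves from
# order to chaos as r increases.  Parameter values, the starting point and
# iteration counts are illustrative choices.

def long_run_values(r, x=0.5, transient=500, keep=8):
    """Iterate past a transient, then return the next few values of the map."""
    for _ in range(transient):
        x = r * x * (1 - x)
    values = []
    for _ in range(keep):
        x = r * x * (1 - x)
        values.append(round(x, 4))
    return values

for r in (2.8, 3.2, 3.5, 3.9):
    print(f"r = {r}: {long_run_values(r)}")

# r = 2.8 settles to a single value, 3.2 to a 2-cycle, 3.5 to a 4-cycle,
# and 3.9 wanders chaotically; the 'edge of chaos' lies in the narrow band
# between the last orderly cycles and fully developed chaos.
```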
If the foregoing views of chaos and complexity are accepted, then the myth of Sisyphus lives on in the many very intelligent people who have worked diligently in diverse scientific fields over the last century, never informed that the incline up the mountain of knowledge is insurmountable, and who have not given up. The extent to which these developments have been progressive or regressive is a subject for philosophical and civil debate that reaches beyond the scope of scientists to presume judgment.
Entropy
The science of complex systems demands a dynamic method of analysis. Statistical mechanics provides this for thermodynamics, the study of disordered energy, of which there is a great deal in complex systems like the weather and the economy. An understanding of energy begins with the fairly intuitive 1st law of thermodynamics, which states that the total sum of all energy, in one form or another, is conserved at all times. The 2nd law is more sublime. It defines ‘entropy’ as a measure of disorder within an isolated system, a property which may theoretically stay constant (but has never been observed to do so) and can never decrease over time. The process is not reversible; entropy must always increase!
Since entropy is a measure of order and disorder in a system, the evolution of systems can likewise be seen as irreversible. Backwards evolution, with things happening in reverse order of time, cannot be possible, because it would imply a decrease in entropy. Statistical mechanics does not recognize this constraint, as its mathematical functions operate in either direction. This leads to a current conundrum, sometimes called ‘the paradox of the arrow of time’. As a complex system nears the ‘edge of chaos’, entropy measures the large number of possible detailed states consistent with what we can actually know about what is taking place. When we say entropy is a measure of disorder in this context, it is a measure of disorder in our minds, in our knowledge of the situation.
Classical mechanics, that is, calculus, creates the conundrum by insufficient approximation, the smoothing process it applies to reality. We summarize statistical conditions in order to cope with the complexity, as Lorenz found with his meteorological rounding error. Every such shortcut is a loss of information, which increases the potential distribution of results, and hence the entropy. Chaos, therefore, explains the paradox. Entropy, which measures our lack of knowledge, is a purely subjective quantity. It has nothing to do with the fundamental laws of particles and their interactions. It has to do with the fact that chaos messes things up: situations that were initially simple and easy to know in detail will eventually become so complicated, thanks to chaos, that we may be forced to give up trying to know them. (ref. Chaos, Complexity, and Entropy, by Michel Baranger, 1927–2014)
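One way to see entropy as disorder ‘in our knowledge’ is through Shannon’s formula H = −Σ p·log₂(p), which counts missing information in bits. The sketch below uses an arbitrary made-up distribution over eight microstates: coarse-graining it, averaging away fine detail in the way a statistical shortcut does, can only leave the entropy unchanged or raise it.

```python
# Entropy as missing information: Shannon's H in bits, before and after a
# coarse-graining (smoothing) of the description.  The distribution is an
# arbitrary made-up example.
from math import log2

def entropy(p):
    """Shannon entropy, in bits, of a probability distribution."""
    return -sum(q * log2(q) for q in p if q > 0)

# A detailed description: sharply uneven probabilities over 8 microstates.
fine = [0.40, 0.02, 0.30, 0.03, 0.10, 0.05, 0.08, 0.02]

# Coarse-grain: replace each pair of neighbouring probabilities by their
# average, smearing away the fine detail.
coarse = []
for i in range(0, len(fine), 2):
    block_avg = (fine[i] + fine[i + 1]) / 2
    coarse += [block_avg, block_avg]

print(f"entropy of detailed description:   {entropy(fine):.3f} bits")
print(f"entropy of coarse-grained version: {entropy(coarse):.3f} bits")

# Smoothing discards information, so the entropy of the description goes up,
# which is the sense in which entropy measures what we no longer know.
```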
Erwin Schrödinger (1887–1961) expressed an interesting take on entropy as it relates to biological systems in a book he wrote in 1944 called What is Life? He noticed that through the evolution of species and the growth of organisms, life creates order from disorder, seemingly contradicting the 2nd law of thermodynamics, according to which entropy can only increase in a ‘closed system’ (such as the universe). The solution to this paradox is that life is not a ‘closed system’. Schrödinger explains that living matter evades entropic decay by drawing on ‘negative entropy’ (a value later associated with ‘information’) in an ‘open system’, thereby redefining the concept of ‘universe’.
“…living matter, while not eluding the laws of physics and chemistry as established up to date, is likely to involve other laws of physics hitherto unknown, which however, once they have been revealed, will form just as integral a part of science as the former.” (Erwin Schrödinger, What is Life?)
The brightest minds struggle to unify scientific theories and to find ways of coping with the disintegration of civic cohesion resulting from diverse sources of uncontrolled and questionable information. It was believed that liberal access to information, of the kind now pervasive through social media and immediately accessible by internet search engines, would enhance our knowledge base to the benefit of all citizens. It is now apparent that it has instead contributed to a disconnected confusion of mythologies with few unifying cross-connections. Random access to a chaotic, non-linear knowledge structure has revealed a complex cognitive reality filled with entropically diffused fragments of information with which the human mind seems incapable of coping.
At the current, rapidly increasing rate of technological evolution, our capacity to direct development, rather than be directed by developments, is questionable. Cultural adaptation may be unable to keep pace without enhanced collaboration between diverse sub-disciplines within science, engineering, and also the humanities. There are ethical considerations to think about, for the good of society, unless we are willing for artificial intelligence to come and save the day!
Symbiosis
‘Symbiosis’ may be the term I have been searching for! It is central to the theme that I aim to communicate in relating the interdisciplinary nature of ideas that distinguish historical paradigm shifts of human culture. Symbiosis is vital to the evolution of new belief systems which result from the interaction of ideas, unique to the eras in which they arise, both deriving from and contributing toward a phase change in human thinking. It is at the heart of the social ‘complex adaptive system’.
The Greek origin of the word simply means ‘living together’. In a psychological context, it applies to a relationship in which people are mutually dependent and receive reinforcement from one another, whether beneficial or detrimental. It is akin to ‘interdependence’ or ‘synergy’, related words which nevertheless fail to encompass the broad scope of implications with which ‘symbiosis’ is imbued.
Among scientists, the meaning of symbiosis is controversial. Some believe it should refer only to narrowly defined types of biological interactions between organisms. Historically, symbiosis has received less attention than ‘predation’ or ‘competition’ but is increasingly recognized as an important selective force behind evolution. The biologist Lynn Margulis (1938–2011), renowned for her theory of endosymbiosis, considered Charles Darwin’s notion of evolution driven by competition to be incomplete and claimed that evolution is strongly based on cooperation, interaction and mutual dependence among organisms. “Life did not take over the globe by combat, but by networking”, she contends.
The relationship between ideas and the words used to communicate them is also symbiotic, just as the comprehension of a reader involves a conceptual melding with the author. Each must recognize an existential basis of understanding in the other that overcomes the individuality of their experience. The receptive reader interacts with transmitted ideas by using what they can assimilate and accommodating any novel realizations they may encounter. At the same time, the individual is a constituent of the larger society, both contributing to and influenced by current group ideologies. Society symbiotically reciprocates by standardizing the environment in which individual comprehension may thrive.
Therefore, it is essential to establish foundational concepts upon which a common understanding can arise. Humanity is a complex organism consisting of symbiotic components interacting on multiple levels. Any esoteric notion perceived is conditioned upon experiential and environmental factors which equip recipients to comprehend it by linkage within their personal cognitive framework. We seek to understand seminal advances in human knowledge by closely studying key individuals and the evolution of definitive ideas. But we must simultaneously step back to see the context of ideas within the forest in which they grow and appreciate the intertwined ecosystem that shapes our cultural inheritance. Recognizing the relationships among contemporary developments in different fields of thought is how we shall hope to progress.
In an age of increasing specialization, fragmentation and pluralism, there is an associated mountain of detail we must explore to discover useful paths and destinations. Regardless of its accessibility, raw data must be given structure to be called ‘information’. Information must be understood before it can be called ‘knowledge’. Knowledge must be combined with insight to become ‘wisdom’.
T.S. Eliot poetically voices our collective concern:
“Where is the life we have lost in living?
Where is the wisdom we have lost in knowledge?
Where is the knowledge we have lost in information?”
This discussion resumes in Part II: Our Quantum Philosophical Inheritance.