The Nature of Cultural Collaboration

Philip Mraz
17 min read · Jul 6, 2024

--

Part II: Our Quantum Philosophical Inheritance

The Assembly of Knowledge

Collaboration of the cognoscenti has served to promote technological advancement since the dawn of the Enlightenment. Despite the proliferation of scientific developments, a unifying consensus of truths and beliefs about knowledge and reality seems to recede into the void, beyond the expanding universe in which we exist. Wisdom, like the end of the rainbow, always remains further afield.

Please see Part I of this essay, “The Diffusion of Complexity”, in which the principles of Chaos, Complexity, Entropy and Symbiosis were discussed. We continue here by characterizing some of the preeminent thinkers associated with 20th Century philosophical movements leading up to the Postmodern Condition we are faced with today.

A review of the connections between historical philosophers and psychologists sheds light on the roles they play in defining, shaping, and enhancing the assimilation of information within the public consciousness. Observing the information exchanges and discourse between contemporary physicists serves to illustrate their metaphysical concerns beyond the immediate consequences of the activities in which they were involved. The social commentary implicit in contemporary schools of thought expresses an understanding about the acquisition of cognitive knowledge and anxiety over the power dynamics of the decision-making process within the evolving cultural context.

Pragmatism

The US Civil War produced a breadth of social disruption, suffering and loss from which few Americans emerged unscathed. The discomfort extended well beyond physical mortality, pain and displacement, into disturbing moral, ethical, psychological and sociological questions that society struggled to resolve during a period of Reconstruction and westward expansion. The multiplicity of divergent attitudes and irreconcilable differences among members of a supposedly united society presented a stumbling block, then as now. This was difficult to overcome within the confines of traditional philosophic approaches that tended to offer universal absolutes which no longer seemed to apply.

In this environment, a group of esteemed gentlemen gathered in Cambridge, Massachusetts on Saturdays in the year 1872. Charles Peirce, William James and Oliver Wendell Holmes Jr. were occasionally joined by influential others in discussions which formed the foundation for a uniquely American, anti-metaphysical philosophy, which eventually became known as Pragmatism.

In 1897, William James (1842–1910) dedicated his first philosophical book, The Will to Believe and Other Essays in Popular Philosophy, to Charles Sanders Peirce, whom he acknowledged as the prime initiator of his ideas. The following year, at the University of California at Berkeley, James delivered a lecture, “Philosophical Conceptions and Practical Results”, which helped to launch pragmatism as a nationwide philosophical movement.

It would seem that anything knowable must be true, but what does it mean to call a proposition or belief ‘true’ from the perspective of pragmatism? This subject of James’ famous sixth lecture begins with a standard dictionary analysis of truth as agreement with reality. Accepting this, he warns that pragmatists and intellectualists will disagree over how to interpret the concepts of ‘agreement’ and ‘reality’, the latter thinking that ideas copy what is fixed and independent of us. By contrast, he advocates a more dynamic and practical interpretation: a true idea or belief is one that we can incorporate into our ways of thinking in such a way that it can be experientially validated.

To say that our truths must agree with such realities pragmatically means that they must lead us to useful consequences. He is a ‘fallibilist’, seeing all existential truths, in theory, as revisable given new experience. They involve a relationship between facts and our ideas or beliefs. “Because the facts, and our experience of them, change we must beware of regarding such truths as absolute, as rationalists tend to do.” (Pragmatism) This relativistic theory generated a firestorm of criticism among mainstream philosophers to which he responded in “The Meaning of Truth”.

Epistemology is but a small segment of the wide-ranging subject matter about which James eloquently verbalized the most progressive ideas of his era. Among his fundamental principles are, first, that our human nature comprises a capacity for an intuitive moral sense which must be developed in a context of values that socially evolve; second, that our basic moral concepts of good and bad or right and wrong are all subject-relative; and third, that when values conflict, those which would seem to satisfy as many personal demands as possible while frustrating the fewest should have priority, regardless of the nature of those demands. This represents a pragmatic form of moral relativism, in which no action can be absolutely good or evil in all conceivable circumstances.

In one of his lectures, “Great Men and Their Environment”, James underlined the significance of a reciprocal impact between Great Men and their surroundings to the extent that they mutually shape one another, just as individual members of animal species do according to Darwinian theory. This refutes the tradition of attributing singular importance to key figures in precipitating historical events, recognizing that the cultural ecosystem contributes as much as, if not more than, any individuals who may serve to define it.

John Dewey (1859–1952) became a confirmed disciple of James after encountering his Principles of Psychology in 1890. He carried forward and supplemented the ideas of Pragmatism as one of the most influential educational reformers and social philosophers of his time. Dewey came to believe that a productive, naturalistic approach to the theory of knowledge must begin with a consideration of the development of knowledge. He advocated for an educational structure that strikes a balance between delivering knowledge and taking into account the interests and experiences of the student. He notes that “the child and the curriculum are simply two limits which define a single process. Just as two points define a straight line, so the present standpoint of the child and the facts and truths of studies define instruction” (Dewey, 1902, p. 16).

Dewey accepted the fallibilism that was characteristic of the school of pragmatism: the view that any proposition accepted as an item of knowledge has this status only provisionally, contingent upon its adequacy in providing a coherent understanding of the world as the basis for human action. This pragmatic theory of truth met with strong opposition among its critics, perhaps most notably from the British logician and philosopher Bertrand Russell. Dewey came to suspect that the issues surrounding the conditions of truth as well as knowledge were hopelessly obscured by the accretion of traditional and, in his view, misguided meanings to the terms, resulting in confusing ambiguity. He eventually abandoned these terms in favor of ‘warranted assertibility’ to describe the distinctive property of ideas that results from successful inquiry.

The central focus of Dewey’s criticism of the tradition of ethical thought is its tendency to seek solutions to moral and social problems in dogmatic principles and simplistic criteria which, in his view, were incapable of dealing effectively with the changing requirements of human events. In Reconstruction in Philosophy and The Quest for Certainty, Dewey located the motivation of traditional dogmatic approaches in philosophy in the forlorn hope for security in an uncertain world. This hope is futile, because the conservatism of these approaches inhibits the intelligent adaptation of human practice to the inexorable changes in the physical and social environment.

Ideals and values must be understood with respect to their social consequences, either as inhibitors or as valuable instruments for social progress. Dewey argues that philosophy, because of the breadth of its concern and its critical approach, plays a crucial role in this evaluation.

Social Psychology

Others seeking to understand the way we think proposed new ideas in the fledgling field of psychology, which was then in the process of splitting off from philosophical speculation (‘psychology’ literally means ‘the study of the soul’) to establish itself as an empirical social science. Carl Jung (1875–1961) shifted and expanded the focus of the cultural ecosystem in which we thrive to encompass a ‘collective unconscious’. He saw motives not so much in the history of the individual as in the history of the entire human race.

His unconscious is a repertory of motives created over the millennia and shared by all humankind. Its ‘archetypes’ spontaneously emerge in all minds. All human brains are wired to create some myths rather than others. Thus, mythology is the key to understanding the human mind, because myths are precisely the keys to unlock those motives. Dreams reflect this collective unconscious, and therefore connect the individual with the rest of humankind and its archaic past. For Jung, the goal of Psychoanalysis is a spiritual renewal through the mystical connection with our primitive ancestors.

Meanwhile, Karl Jaspers (1883–1969) saw cultural constraints as diminishing the potential for individual freedom. In his view, existence implies a contradiction in terms. In theory, humans are free to choose the existence they prefer but in practice it is impossible to transcend the historical and social background. Thus, one is only truly free in accepting one’s destiny. Ultimately, we can only glimpse the essence of our own existence, but we cannot change it.

A French anthropologist and ethnologist, Claude Lévi-Strauss (1908–2009), noticed the mythological undercurrents of our subconscious, stating: “I therefore claim to show not how men think in myths, but how myths operate in men’s minds without their being aware of the fact”. This subtext perpetually exists behind human aspirations toward meaningful accomplishment and the recognition of truth.

Constructivist Theory

The Constructivist paradigm evolving at this time is a philosophical perspective about the nature of accepted knowledge in a society. It maintains that knowledge is cognitively constructed rather than discovered in the world, and that the concepts of science are mental frameworks proposed in order to explain our sensory experience. Another important tenet of Constructivist theory is that there is no single valid methodology in science, but rather a diversity of useful methods.

The human corollary, Cognitive Constructivism, was formalized by the Swiss developmental psychologist, Jean Piaget (1896–1980), who described the mechanism by which knowledge is internalized by learners. Through the processes of ‘accommodation and assimilation’, individuals construct new knowledge from their experiences. When individuals assimilate, they incorporate the new experience into an already existing framework without changing that framework. In contrast, when individuals’ experiences contradict their internal representations, they may change their perceptions of those experiences to fit their internal representations. Accordingly, ‘accommodation’ is the process of reframing one’s mental representation of the external world to fit new experiences.

Others took alternate routes to a similar understanding. Frederic Bartlett (1886–1969) composed a series of short fables (the best known was called The War of the Ghosts), each of which comprised a sequence of events that was ostensibly logical but subtly illogical, containing several discrete non sequiturs. He would recite one of these stories to subjects, then later ask them to recall as much as possible. He learned that most people found it extremely difficult to recall the story exactly, even after repeated readings. Where the elements of the story failed to fit into the ‘schemata’ of the listener, these elements were omitted from recollection or transformed into more familiar fabrications.

Cognitivists of the ‘gestalt’ school, such as Max Wertheimer, Wolfgang Köhler and Karl Lashley, believed we perceive and react to ‘form’ as a whole, not to individual stimuli. We recognize objects not by focusing on the details of each image, but by focusing on the entirety. We solve problems not by breaking them down into more and more minute details, but via sudden insight, often by restructuring the field of perception.

This understanding of cognitive perception for the individual also describes the ‘holy grail’ of ‘interdisciplinarity’ on a global scale. The prime purpose is to gain a clearer view of the bigger picture, the entire ecosystem of the forest made of interdependent trees. The earth cries out for this; international cultural relations beg for it. Citizens could coalesce around a more cohesive perspective on reality.

Particle Physics

In stark contrast to the integrative principles of cognitive learning and comprehension developed in the early 20th Century, the methodology of scientific research was moving in a distinctly more reductionist direction. The interactions of renowned physicists connecting key conceptual advancements during this era are fascinating to follow. Even they seemed hard pressed to convey a unifying comprehension of the explosive discoveries.

In ‘classical mechanics’, a particle has, at every moment, an exact position and an exact momentum. These values change ‘deterministically’ as the particle moves according to Newton’s Laws. In ‘quantum mechanics’, particles do not have exactly determined properties, and when they are measured the result is randomly drawn from a ‘probability distribution’. The ‘Heisenberg uncertainty principle’ is a famous example of quantum indeterminacy. It states that the more precisely a particle’s position is known, the less precisely its momentum is known, and vice versa. The ‘Schrödinger Equation’ predicts what the probability distributions are, but fundamentally cannot predict the exact result of each measurement.
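In standard notation, these two statements take the familiar textbook forms:

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}
\qquad\text{and}\qquad
i\hbar\,\frac{\partial \Psi(x,t)}{\partial t} \;=\; \hat{H}\,\Psi(x,t)
```

The first bounds how sharply position and momentum can be simultaneously defined; the second governs how the probability-bearing wave function Ψ evolves deterministically in time, even though each individual measurement outcome remains random.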

It was a shock to Albert Einstein in 1925 when Werner Heisenberg (1901–1976), then an assistant to Max Born (1882–1970) at Göttingen, introduced matrix equations that removed the Newtonian elements of space and time from any underlying reality. Another shock came a year later, when Born proposed that the mechanics was to be understood as a probability without any causal explanation. Einstein rejected this interpretation and in a letter to Born wrote: “I, at any rate, am convinced that He (God) does not throw dice.” Finally, in late 1927, Heisenberg and Born declared at the Solvay Conference that the revolution was over and nothing further was needed. It was at that last stage that Einstein’s skepticism turned to dismay. He believed that much had been accomplished, but the reasons for the mechanics still needed to be understood.

Niels Bohr (1885–1962) conceived the ‘principle of complementarity’: that phenomena could be separately analyzed as having several contradictory properties. Based on this principle, physicists currently conclude that light behaves either as a wave or a stream of particles, depending upon the experimental framework. These would otherwise appear to be two mutually exclusive properties. The traditional idea of a world experienced through universal physical phenomena suddenly eroded, which precipitated additional philosophical implications.

Bohr enjoyed debating such daring principles with Einstein, who much preferred the determinism of classical physics over the probabilistic new quantum physics. The debates are remembered because of their importance to Philosophy of Science. An account of them has been written by Bohr in an article titled Discussions with Einstein on Epistemological Problems in Atomic Physics. Despite their differences of opinion regarding quantum mechanics, Bohr and Einstein had a lasting mutual admiration.

Leave it to a Hungarian-American wizard to introduce a sense of humanism to the debate. John von Neumann (1903–1957) is generally regarded as one of the greatest mathematicians in modern history. There are numerous testimonies to his prodigious abilities in rapid calculation and photographic recall. He made major theoretical contributions and excelled in many fields: set theory, functional analysis, quantum mechanics, fluid dynamics, economics, game theory, computer science… the list goes on. He asked a seemingly simple question regarding empirical observations, uncertainty principles and probability distributions: “At which point does the collapse occur?”

If a measurement causes Nature to choose one value, and only one, among the many that are allowed by Schrödinger’s equation, when does this occur? In other words, where in the measuring apparatus does this happen? The measurement is performed by having a machine interact with the quantum system and eventually deliver a visual measurement to the human brain. Somewhere in this process a range of possibilities is reduced to one specific value. The quantum world of waves collapses into the classical world of objects.
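The random draw that von Neumann's question turns on can be sketched in a few lines of Python; the two-state system and its amplitudes below are hypothetical values chosen purely for illustration:

```python
import random

# A two-state quantum system, sketched with hypothetical amplitudes
# chosen purely for illustration (any a, b with a^2 + b^2 = 1 works).
a, b = 3/5, 4/5
probs = [a**2, b**2]                  # Born rule: probability = |amplitude|^2
assert abs(sum(probs) - 1.0) < 1e-12  # the state is normalized

def measure():
    """One measurement: the range of possibilities collapses to one value."""
    return random.choices(["up", "down"], weights=probs)[0]

# The Schrödinger equation fixes only the distribution; each individual
# outcome is a random draw from it.
counts = {"up": 0, "down": 0}
for _ in range(10_000):
    counts[measure()] += 1
```

Over many trials the frequencies approach 36% and 64%, yet no single measurement's result can be predicted in advance; that gap between the knowable distribution and the unknowable individual outcome is precisely what the collapse question probes.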

Measurement consists in a chain of interactions between the apparatus and the system, whereby the states of the apparatus become dependent on the states of the system. Eventually, states of the observer’s consciousness are made dependent on states of the system, when the observer ‘knows’ what the value of the observable is. If we proceed backwards, this seems to imply that the ‘collapse’ occurs within the conscious being, and therefore consciousness creates reality (my emphasis).

The challenge of assimilating rapidly evolving conceptions of reality and eventual information overload was anticipated very early in the 20th Century by French philosopher Henri Bergson (1859–1941): “The eye sees only what the mind is prepared to comprehend.” This demonstrates, once again, that there is no clear dividing line between scientific analysis and philosophical interpretation of the results.

The Postmodern Condition

We learned that ‘complex adaptive systems’, like society, involve an environmental interplay. In this case, the evolution of knowledge involves a gradual adaptation of cultural acceptance, which permits creative ‘accommodation’, as Cognitivists used the term. We can see this taking place in the philosophical concepts that began to arise in the 20th century. This is another reminder that historical context is paramount to the comprehension of intertwined ideas.

A clear distinction can be made between two types of knowledge — ‘narrative’ knowledge and ‘scientific’ knowledge. Narrative knowledge is the kind that is prevalent in ‘primitive’ or ‘traditional’ societies, and is based on storytelling, sometimes in the form of ritual, music and dance. The mythology of narrative knowledge becomes as real as the timelessness of the narrative tradition itself — told by people who heard it before to listeners who will repeat it to others. There is no inclination to question traditional tales assumed to be true experience.

According to the self-portrayal of science, however, only knowledge which is legitimated (justified) is legitimate — i.e. is knowledge at all. Scientific knowledge is legitimated by certain scientific criteria — the repeatability of experiments, etc. This legitimating narrative also implies an epic hero of some kind, the scientist as the ‘hero of knowledge’ who discovers scientific truths.

The distinction between narrative and scientific knowledge is a critical issue discussed by Jean-François Lyotard in The Postmodern Condition: A Report on Knowledge, (1979). One of the defining issues of postmodernity is the dominance of scientific knowledge over narrative knowledge, since the pragmatics of scientific knowledge does not allow for the recognition of narrative knowledge as legitimate. He sees danger in this domination, since it has been shown that reality cannot be entirely captured within one genre of discourse or representation of events and that science may miss aspects of events which narrative knowledge might capture. In other words, science is not justified in claiming to be a more legitimate form of knowledge than narrative.

Lyotard explains that the ‘metanarratives’ of science speculate on the eventual totality and unity of all knowledge. Scientific advancement is legitimated by the story that it will one day lead us to that goal and to the emancipation of humanity. Postmodernity is characterized by the end of metanarratives. So, what legitimates science now? Lyotard’s answer is — performativity. This is called the ‘technological criterion’ — the most efficient input/output ratio. The technical and technological changes over the last few decades — as well as the development of capitalism — have caused the production of knowledge to become increasingly influenced by a technological model.

It was during the industrial revolution that knowledge entered into the economic equation and became a force for production, but it is in postmodernity that knowledge is becoming the central force for production. Lyotard believes that knowledge is becoming so important an economic factor, in fact, that one day wars will be waged over the control of information (my emphasis).

The change that has taken place is nothing less than the ‘mercantilization’ of knowledge. In postmodernity, knowledge has become primarily a saleable commodity. Knowledge is produced in order to be sold and is consumed in order to fuel new production. Knowledge in postmodernity has largely lost its truth-value, that is, the production of knowledge is no longer an aspiration to produce truth. Today students no longer ask if something is true, but what use it is to them.

Lyotard believes that computerization and the legitimation of knowledge by the performativity criterion is doing away with the idea that the absorption of knowledge is inseparable from the training of minds. In the near future, he predicts, education will no longer be given ‘en bloc’ to people in their youth as a preparation for life. Rather, it will be an ongoing process of learning updated technical information that will be essential for their functioning in their respective professions.

Most true ‘discoveries,’ Lyotard argues, are discoveries by virtue of the fact that they are so radical that they change the rules of the game — they cannot even be articulated within the rules of the ‘dominant’ game (which is dominant because it draws the consensus of opinions). Many discoveries are not found to have a use until quite some time after they are made; therefore, they seem to be of little value by the performativity criterion. For economic reasons, legitimation by performativity tends to follow the consensus opinion — that which is perceived by the majority of experts to have the most efficient input/output ratio is considered most likely in fact to be most performatively efficient, and hence the safest investment.

Lyotard does not claim that research should be aimed at production of ‘the truth’. Rather, he sees the role of research as the production of ideas. Legitimation of knowledge by performativity terrorizes the production of ideas. What, then, is the alternative? Lyotard proposes that a better form of legitimation would be legitimation by ‘paralogy’. The etymology of this word resides in the Greek words para — beside, past, beyond — and logos in its sense as ‘reason.’ Thus, paralogy is the movement beyond or against reason.

In relation to research, this means the production of new ideas by going against or outside of established norms. Lyotard argues that this is in fact what takes place in scientific research, despite the imposition of the performativity criterion of legitimation. This is particularly evident in ‘postmodern science’ — the search for instabilities. Knowledge is not only the known but also the ‘revelation’ or ‘articulation’ of the unknown. (ref. Internet Encyclopedia of Philosophy — Jean-François Lyotard, 1924–1998)

Information Technology

Computerization is a defining example that raises the issue of power as it relates to the status of knowledge in the world’s most advanced countries. There are two sides to the same question: who decides what knowledge is, and who knows what needs to be decided? With vast amounts of information stored digitally in databases, who decides what is worth storing (what is legitimate knowledge) and who has access to these databases? Who will control the resources of Information Technology such as allocation of broadband and net neutrality permitting passage to the storehouses of information? Now, more than ever, it is determined by governments and multinational corporations.

In the post-industrial era, information has been commoditized through the manipulation of technology and used for exploitative purposes, just as human labor was capitalized in conjunction with the steam engine to produce a previous revolution in industry. It was believed that liberal access to information like that which is increasingly pervasive would enhance our knowledge base to the benefit of all citizens. It is now apparent that it has contributed to the diffusion and fragmentation of knowledge into a multitude of mythologies with insufficient unifying cross-connections.

The potential for Quantum Computing and Nanotechnology to fuse with Artificial Intelligence in producing new capabilities and entities never before seen is an example of interdisciplinary symbiosis already in progress. Solutions to the global environmental impacts of increasing energy consumption and renewable resource utilization are critical needs which ironically may be facilitated by these nascent technologies. Current research will impact related disciplines of neuroscience and social psychology, as well as physics and philosophy. Interdisciplinarity is the necessary process of reconstructing diverse elements of information as required to achieve these complex goals.

The potential impact of evolving developments must somehow be comprehended and assimilated by society so intelligent choices can be made, moving forward. New concepts occasionally combine with technological advances to synthesize a ‘paradigm shift’ — also known as a ‘leap of faith’. Human experience has shown this to be a worrisome phrase.

Resolution

Specialists in diverse fields of science, technology and engineering will continue to productively attend to the details of their niche vocations. This reductionist focus is the means by which beneficial progress has been generated on many levels. At the same time, independent fields of investigation have been peeled away piecemeal from the general study of ‘Natural Philosophy’, as science was previously called, unintentionally reducing the integration of knowledge.

It is an open question whether the volume of information resulting from specialized research might overwhelm our ability to process a fuller understanding of reality in all its complexity, especially since a Renaissance type of polymath ability no longer seems feasible (except perhaps for the likes of John von Neumann). Maybe an emerging artificial intelligence is needed to assist. It appears evident, however, that the current de-emphasis of broad humanistic inquiry — exploring issues, ideas and methods across the range of natural and social sciences, including the arts, history and philosophy — will not enhance our ability to accommodate knowledge.

Civil unrest results from anxiety about the unknown. Better education is the cure, from pre-school parenting to higher education. Specialization and professional training have undeniable value to individuals, but multi-disciplinary breadth of awareness is the best prescription to enable debate and interpretation about both scientific and cultural phenomena. It is demonstrably true, too, that a broad-minded outlook is best able to accommodate differences of opinion, a civil quality so lacking today.

See Part I: The Diffusion of Complexity
