Logical fallacies

A fundamental evil in the philosophy of mind

Wolfgang Stegemann, Dr. phil.

The philosophy of mind deals with fundamental questions about the nature of consciousness, subjectivity, and the relationship between mind and matter. It is a field particularly susceptible to logical fallacies: misleading ways of thinking and erroneous conclusions easily lead to confusion and wrong turns. It is therefore imperative to accurately understand and avoid the most common fallacies in this field. Some even claim that most philosophical problems arise from such fallacies.

The Equivocation Error

This error occurs when the same term is used with different meanings within a single argument. In the context of consciousness, this becomes apparent when “consciousness” is sometimes meant in the sense of subjective experience and sometimes as the ability to process information. Such ambiguities lead to confusion and inconsistencies.

A striking example comes from John Searle in his critique of artificial intelligence:

“We are superhuman experts in pattern recognition. We all make billions or trillions of conclusions per second […] there’s no way a digital computer could ever do that, because it’s just cracking one lousy little instruction after another” (Searle 1980).

Here, Searle uses “conclusions” in two different senses — on the one hand as unconscious pattern-recognition processes in the brain, on the other as arithmetic operations in computers — which results in a misleading equation of the two.

Another example involves the term “consciousness”:

First premise: Every person has consciousness and can make decisions.

Second premise: A computer program can have “consciousness” because it makes decisions based on data.

Here, “consciousness” is defined once as human consciousness and once as the decision-making ability of a program. The fallacy leads to a misleading equation of two different concepts.

This fallacy significantly influences the discussion about artificial intelligence and appears in justifications for the possibility of artificial general intelligence, in which a single notion of “intelligence” is supposed to apply to both humans and machines.
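The equivocation can be made concrete with a toy sketch (hypothetical code; the function and threshold are illustrative assumptions, not anyone’s actual system). The program “makes decisions” only in the computational sense of branching on data:

```python
# A program that "makes decisions" in the computational sense:
# it branches on data. Calling this "consciousness" equivocates
# between rule-following and subjective experience.

def decide(temperature_c: float) -> str:
    """Return an action based on a sensor reading."""
    if temperature_c > 30.0:
        return "turn on fan"
    return "do nothing"

print(decide(35.0))  # the program "decides", but nothing is experienced
```

Both premises above use “decides”, yet only one sense involves experience — which is exactly where the equivocation hides.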

Fallacies of Assumption

These include circular arguments such as the “petitio principii”, in which what is to be proven is already presupposed in the premises. In the mind-body debate, it would be such an error to assume the existence of an immaterial soul as given, so that it no longer needs to be justified.

René Descartes’ famous premise “I think, therefore I am” already presupposes the existence of a thinking subject in order then seemingly to derive it. The philosopher Patricia Churchland has criticized this as circular.

Fallacies of relevance

The “argumentum ad hominem” is a prime example: instead of engaging with the content of an argument, the person presenting it is attacked. In debates about consciousness, the motives or circumstances of a discussant are often used to discredit their position.

An example of a personal attack in place of an argument would be: “Somehow I don’t believe a guy growing tomatoes in the Arizona desert.”

The Category Error

Here, traits or abilities are wrongly attributed to entities to which they do not belong. Thus, the ability of computers to solve problems is often equated with real consciousness — an obvious category error.

The physicist Roger Penrose claims in his book “The Emperor’s New Mind” that consciousness must be based on quantum effects in microstructures of neurons. This has been criticized as a category error, since it mistakenly transfers concepts not only from the micro world to the macro world, but also from physics to psychology and philosophy.

Reductionism

Reductionism attempts to reduce complex phenomena to their basic components while ignoring emergent properties.

Francis Crick argued:

“You, your joys and your sorrows, your memories and your ambitions, your sense of personal identity and free will, are in fact no more than the behavior of a vast assembly of nerve cells and their associated molecules” (Crick 1994).

This reduces the complex phenomenon to its physical components without recognizing emergent properties.

Another reductionist approach is Giulio Tononi’s Integrated Information Theory (IIT). IIT claims that any system with a sufficiently high level of integrated information has some degree of consciousness. This would theoretically mean that non-biological systems, such as a stone, could also be capable of consciousness if they met the appropriate criteria. This idea contradicts the view that consciousness is a property of living organisms with complex nervous systems.
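The intuition behind “integrated information” can be sketched with a toy calculation. This is emphatically not Tononi’s Φ — only a crude stand-in that measures the mutual information between two halves of a two-unit system, to show what “the whole carries information beyond its parts” might mean:

```python
import math

# Crude toy: measure "integration" of a two-unit system as the mutual
# information between its halves. This is NOT Tononi's Phi, only a
# simplified stand-in for the intuition that an integrated whole
# carries information beyond its independent parts.

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_information(joint):
    """joint[a][b] = P(A=a, B=b) for two binary units A and B."""
    p_a = [sum(row) for row in joint]          # marginal of A
    p_b = [sum(col) for col in zip(*joint)]    # marginal of B
    p_ab = [p for row in joint for p in row]   # joint distribution
    return entropy(p_a) + entropy(p_b) - entropy(p_ab)

# Perfectly correlated units: high "integration"
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))   # 1.0 bit
# Independent units (the stone of the example): zero integration
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0 bits
```

On this crude measure a system of independent parts scores zero integration while perfectly correlated units score one bit; IIT’s actual Φ is far more involved, but the reductionist move — reading consciousness off such a number — is the same.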

Anthropomorphism

Anthropomorphism is the mistake of transferring human characteristics to non-human entities.

An example would be the idea that nature or the universe has some kind of “intention” or “purpose,” similar to how humans have intentions and goals. But the idea that all entities have consciousness is also such an anthropomorphism.

Apparent causality

The correlation of neuronal processes with states of consciousness is often misinterpreted as causality.

Since states of consciousness correlate with neuronal processes, it is concluded that they must be caused by them — a typical “cum hoc ergo propter hoc” fallacy in the philosophy of mind. Neuronal processes do not cause consciousness; the two are different perspectives on the same phenomenon.

False analogies

Likewise, concepts from quantum physics or other fields are erroneously transferred to the macro world or to consciousness. In his book, Roger Penrose tried to use quantum effects such as “superposition” to explain consciousness — an example of a false analogy from the micro to the macro domain.

Another example is the analogy of a computer simulation in which our consciousness is, so to speak, “run” like a program, as discussed by David Chalmers in his book “Reality+”. The analogy is conceptually and logically untenable: it is a mistake to compare our subjective world of experience with a video game. Here are some reasons why the simulation hypothesis is not plausible in this sense:

Our consciousness and our neuronal processes are based on extremely complex biochemical processes in the brain. There is no known way in which these could be “emulated” by a computer simulation.

The assumption that our experiences correspond to a kind of “neuronal input-output system” into which information can be fed is a crude simplification of the functioning of the brain.

There is no known mechanism by which an external simulation could interact with and control the neuronal structures and biochemical processes at all.

The analogy to the video game wrongly assumes that our reality is ultimately reducible to software code — an assumption for which there is no evidence.

Misattribution, overgeneralization, and ontological fallacies

Panpsychism, represented by philosophers such as Galen Strawson, ascribes some form of consciousness to literally everything — stones, atoms, etc. — an extreme overgeneralization.

Often, properties such as consciousness are falsely attributed to certain entities. The ontological fallacy is to mistakenly conceive of consciousness as a substance in its own right, rather than as a state or process.

“Consciousness exists as a kind of field that extends through the whole body and the environment.” This mistakenly treats consciousness as a substance rather than a mental state. Examples are the so-called 4 E’s (Embeddedness, Embodiment, Enactivism, Extended Mind).

Another example is the animistic conception of nature of indigenous peoples, who ascribe to nature a kind of conscious “spirit” or “soul”, such as the “mana” of the Polynesians.

The homunculus fallacy

This particularly serious error tries to explain consciousness by the existence of a “small observer” in the brain. However, this only leads to an infinite regress, as it would then have to be explained how this “homunculus” itself can be conscious.

René Descartes’ idea of a separate small “res cogitans” in the brain that perceives the sensory impressions leads to an infinite regress, for how could this small substance itself be conscious?

The explanatory gap fallacy

Here, it is wrongly assumed that physical explanatory approaches cannot capture the subjective qualities of consciousness experiences (qualia) and that there is therefore a fundamental “explanatory gap”.

Joseph Levine argued that there is an “explanatory gap” between physical processes and the subjective quality of experience (qualia), and that physics cannot explain the latter. Nor can it: qualia are not the subject matter of physics, and physics has no categories for describing them. The so-called “hard problem of consciousness” is thus an illusory problem.

Fallacy of the privileged perspective

This fallacy is based on the erroneous assumption that we have a special access to the nature of consciousness through our subjective experience of consciousness and can therefore judge it better than objective theories.

“I experience directly in my consciousness that there must be a non-physical component” — an intuitive but deceptive view often advanced by skeptics of physicalism.

Cognitive Discrimination Fallacy

The mistaken assumption that we can reliably distinguish between conscious and unconscious processes based on certain characteristics such as clarity or distinctness.

“Only contents of consciousness that appear clearly and distinctly in my mind’s eye are really conscious” — a misleading assumption, since many mental processes take place subconsciously.

The moral fallacy

Here, ethical assumptions flow into the analysis of consciousness. For example, it is often claimed that certain entities must have consciousness, otherwise they do not deserve moral consideration.

“Animals must have consciousness, otherwise there would be no ethical concerns about exploiting them” — a typical straw-man objection of animal rights activists to restricting consciousness to humans. This has nothing to do with the empirical observation that animals evidently do have consciousness.

Fallacy of introspection

The idea that the nature of consciousness can be directly grasped by observing one’s own inner mental states is deceptive. Introspection can be deceiving and is often distorted.

“By observing and tracing my inner mental processes, I know what consciousness really is” — an intuitive but deceptive assumption to which many philosophers of mind fall prey.

Perspective fallacy

The key point is that the body and mind should not be seen as two separate, independent entities, but as two different views or perspectives on the same holistic object — the human being.

The mistake lies in making the perspectives of the physical (body) and the psychic (mind) independent and wrongly considering them as substances that exist independently of each other.

In reality, however, body and mind only represent different levels of description and ways of looking at the human being as a holistic bio-psycho-social being.

In this respect, it is a perspective fallacy to treat body and mind as fundamentally separate entities instead of as complementary perspectives on the one human nature.

Associations Fallacy

The Global Workspace Theory (GWT) is one such false association. Developed by Bernard Baars in the late 1980s and later implemented computationally by Stan Franklin, it uses the theatre as a metaphor for human perception and consciousness. According to GWT, the brain is a network of parallel, specialized processes that are largely unconscious. Attention acts as a spotlight that brings some of these unconscious activities onto the main stage of consciousness. The mistake lies in a false association: from a physiological point of view, there is neither a theatre stage nor a spotlight. Both are projections that would themselves have to be explained and whose ontology remains unclear.
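The stage-and-spotlight metaphor can be sketched as a toy program (a hypothetical illustration; the specialists and the salience rule are my own assumptions, not Baars’s actual model). Notably, the sketch contains no literal stage or spotlight — only selection and data passing — which is precisely the objection above:

```python
# Minimal sketch of the Global Workspace metaphor (illustrative only;
# the specialist names and salience values are invented for this toy).
# Many specialist processes run "unconsciously" in parallel; an
# attention "spotlight" selects the most salient output and
# broadcasts it to all specialists.

specialists = {
    "vision": ("red light ahead", 0.9),   # (content, salience)
    "hearing": ("engine hum", 0.3),
    "memory": ("red means stop", 0.6),
}

def spotlight(processes):
    """Select the most salient content for the 'stage'."""
    name, (content, _) = max(processes.items(), key=lambda kv: kv[1][1])
    return name, content

winner, broadcast = spotlight(specialists)
print(f"broadcast from {winner}: {broadcast}")
# All specialists now "receive" the broadcast; nothing in this
# mechanism explains why the broadcast should feel like anything.
```

The “spotlight” here is just an argmax over numbers and the “stage” just a variable — the theatrical vocabulary adds nothing to the mechanism, which is the article’s point.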

In order to arrive at viable insights in the philosophy of mind, it is indispensable to avoid all these manifold logical fallacies. Reasoning at the highest level of rigor and precision is imperative in this field. This is the only way to further clarify one of the greatest mysteries of philosophy step by step.

References

Crick, F. H. C. (1994). The astonishing hypothesis: The scientific search for the soul. Charles Scribner’s Sons.

Penrose, R. (1989). The emperor’s new mind: Concerning computers, minds, and the laws of physics. Oxford University Press.

Searle, J. R. (1980). “Minds, brains, and programs.” Behavioral and Brain Sciences, 3(3), 417–424.
