Hard, Real, and Core Problems of Consciousness
The hard problem of consciousness, as formulated by David Chalmers, has fascinated me for many years. Why are chemical reactions and electrical impulses in my brain associated with qualitative experience? I don’t believe that it feels like anything to be a dishwasher, so why is it different for my brain? Why doesn’t my brain just produce behavior as a function of inputs and internal dynamics, end of story? (Of course, some people might deny the premise of this question.) To me, the hard problem has always been equally intuitive in its framing and utterly mysterious in its implications. In fact, it’s hard for me to conceive of any possible solution in its traditional formulation. It wasn’t until last year, after some conversations with philosophers and physicists I met at a consciousness workshop, that I re-evaluated my understanding of the problem that consciousness poses.
In this article, I want to put the hard problem of consciousness into the context of two alternative problem statements: The real problem of consciousness and the core problem of consciousness. Below I address all three problem statements after a quick introduction to the definition of phenomenal consciousness. (Feel free to skip that part if you are familiar with the term.) My aim is to elucidate the assumptions of the different problem statements by contrasting them with each other. I conclude with my opinion on what the best problem statement is in terms of finding common ground between different scientists and philosophers, while at the same time not losing sight of the mystery at the heart of consciousness.
Disclaimer: This article represents my interpretation of the problem statements put forward by the various authors. If any of the ideas outlined don’t make sense, it is probably me who is to blame. My background is in deep learning and computational neuroscience. I have no formal background in philosophy.
Background: What is Consciousness?
Together with words like ‘god’ or ‘meaning’, this might be one of the contenders for the fuzziest term out there. Ask ten people for a definition, and you might get ten different answers. Moreover, such answers are often circular, such as “to be conscious is to be aware of something.” However, this does not mean that there is nothing worth investigating. The kind of consciousness I am interested in is phenomenal consciousness, or qualitative consciousness. To start, let me go over some common ideas about consciousness that phenomenal consciousness does not imply: a sense of self or identity, a meta-cognitive awareness of what I’m doing, or some level of general intelligence. All of these could be associated with phenomenal consciousness, but they are not identical to it or implied by it. So what’s left? Among people interested in phenomenal consciousness, a very popular essay to cite is Thomas Nagel’s What Is It Like to Be a Bat? [1]. Nagel proposes that, for a being to have subjective experience (i.e., to be phenomenally conscious), there must be something it is like to be that being. In other words, if it feels like anything at all to be a bat, bats are conscious. This is an ostensive definition: it doesn’t give a full account of the phenomenon, but it points to it. Compared to other definitions, this might not be the most satisfying one, but I believe it is one of the best ways we have of using language to describe the idea of phenomenal consciousness. Another way to capture the same intuition is by thinking about subjectivity vs. objectivity. We can study a bat’s brain until we perfectly understand its behavior as a function of external inputs and internal dynamics from an outside, or objective, perspective. However, no matter how well we understand this system from the outside, we (ostensibly) will not know what it is like to be that system from a subjective perspective.
For more information on different definitions and categorizations of consciousness, check out [2].
Within phenomenal consciousness, it is also common to draw a distinction between conscious state and conscious contents, also referred to as global and local states of consciousness [3]. The former points to overall descriptions of a subject’s conscious experience, such as wakefulness, sleep, or potentially something like selflessness (as induced by, say, meditation or psychedelics). The latter refers to specific objects within our conscious experience, such as smells, visual objects, sounds, joint pains, etc. In the following, I will mostly be talking about conscious contents, although the distinction between the two does not seem completely clean-cut to me.
The Hard Problem
The hard problem of consciousness, coined by Chalmers [4], is one of the phrases most commonly thrown around when talking about consciousness. Its aim is to distill the most mysterious aspect of subjective experience from other, more easily explainable phenomena that are often associated with it. Essentially, it separates the idea of phenomenal consciousness from behavioral and cognitive brain function, much like Nagel’s definition, but in a different way: it doesn’t just point to phenomenal consciousness, it frames its explanation as a problem that is seemingly out of the reach of science. Chalmers isolates the hard problem by first defining the easy problems of consciousness: explaining how neural activity gives rise to various cognitive functions and behaviors. These are easy in the sense that, at least in principle, the tools of neuroscience and cognitive science should be well-suited for the task, and we can imagine what solutions might look like. If, for instance, we find some principle of computation which, when implemented in a simulation, produces the same behavior as our neocortex, then we are at least on the right track to solving the easy problems. (Although a satisfying solution might also require explanations at higher levels of abstraction that are consistent with, and reducible to, the lower-level explanations.) Setting aside these easy problems, we’re left with phenomenal consciousness. However, by definition, what we’re left with is not characterizable by its function or behavior. How, then, could we explain it? What would such an explanation even look like? Here is a formulation of the hard problem in the words of Chalmers [4]:
Why is it that when our cognitive systems engage in visual and auditory information-processing, we have visual or auditory experience: the quality of deep blue, the sensation of middle C? How can we explain why there is something it is like to entertain a mental image, or to experience an emotion? It is widely agreed that experience arises from a physical basis, but we have no good explanation of why and how it so arises. Why should physical processing give rise to a rich inner life at all? It seems objectively unreasonable that it should, and yet it does.
The hard problem is therefore a question of why and how physical processes give rise to subjective experience. As things stand, it appears that no scientific experiment could ever get us closer to tackling the hard problem. This is because science requires measurements, and something that is not characterized by function or behavior cannot produce such measurements. Should we therefore give up and acknowledge that phenomenal consciousness lies outside the realm of science? Maybe not. Maybe this formulation of the issue is problematic. To see why, it’s important to recognize that the idea of the hard problem rests on an assumption that is implicit in the above quote: Assumption 1 — We can conceive of all of our brain’s activity and behavior taking place without an associated phenomenal experience, and such a scenario is thus possible. In other words, the question of why physical processes would give rise to consciousness in the first place only makes sense if there were some way the physical processes taking place in our brain could exist without the associated experience. Moreover, it assumes a directionality in the explanation: Assumption 2 — A (scientific) account of consciousness should assume the physical and explain the phenomenal. Most people in the scientific community would take this directionality for granted. However, it is not obvious why it would be true, especially from an epistemological standpoint: the only thing we have certain knowledge of is phenomenal experience; everything else is conjecture. I don’t want to argue for idealism here, I just want to question the assumption of directionality implicit in the hard problem. It is worth mentioning, however, that Chalmers uses this formulation of the hard problem as an argument against this directionality: in the paper introducing the hard problem, he proposes a type of dual-aspect monism, where physics and phenomenal experience stand on equal ontological grounds.
This leaves us with a problem statement that, by design, cannot be addressed by science.
The Real Problem
Given that the hard problem by definition precludes any scientific solution, more pragmatically-minded people have tried to reformulate the problem of explaining phenomenal consciousness. Enter Anil Seth. He defines the real problem of consciousness [5] as
“the primary goals of consciousness science are to explain, predict, and control the phenomenological properties of conscious experience. This means explaining why a particular conscious experience is the way it is — why it has the phenomenological properties that it has — in terms of physical mechanisms and processes in the brain and body. These explanations should enable us to predict when specific subjective experiences will occur, and enable their control through intervening in the underlying mechanisms.”
This brings the problem of consciousness back into the realm of science. After all, if an explanation constitutes a systematic theory that makes predictions about experience based on brain activity, then it is testable.
However, this only works if we have a ground truth on conscious experiences, i.e., we can only validate a theory if we can check whether its predicted conscious contents match reality. The only way to get at these ground truths is by relying on behavior, be it verbal reports or some other motor action of a subject. Thus, within the framework of the hard problem, we are just dealing with easy problems. Chalmers might see the project of solving the real problem as a search for a systematic neural correlate of consciousness [6] rather than a solution to the hard problem. But that’s ok; we don’t have to accept the framing of the hard problem, and given its assumptions, it might not be the right one anyway. Moreover, Chalmers does acknowledge the absurdity of assuming consciousness to be a pure epiphenomenon (that is, behavior being completely divorced from phenomenal consciousness) when discussing the meta-problem of consciousness [7]. Still, let’s spell out Seth’s assumption explicitly: Assumption 1 — We can investigate conscious contents by measuring behavior. The hard problem denies this assumption.
In terms of ontology, we are still assuming a directionality of explanation, as with the hard problem: Assumption 2 — An account of consciousness should assume the physical and explain the phenomenal. In the end, Seth argues, while phenomenal consciousness is real, it might not be what we think it is, and it should be reducible to physical processes. This closely resembles the view that Keith Frankish labels weak illusionism [8]. Chalmers is skeptical about the power of weak illusionist theories to explain phenomenal consciousness [7]. If only physics is fundamental, it has to account for phenomenal consciousness. Under this assumption, the claims of the hard problem about the difficulty of this task are fair, I think. If phenomenal consciousness is seen as real but not fundamental, we have to give an account of how its phenomenal properties arise beyond function and behavior.
The Core Problem
While more optimistic about our ability to infer phenomenal states from behavior, the real problem is arguably also more parsimonious in its metaphysical assumptions than the hard problem (which relies on the conceivability argument). Could we make it even leaner in terms of its presuppositions, while leaving open the possibility of scientific inquiry? Possibly. One such idea, called the core problem of consciousness, has been put forward by Robert Chis-Ciure and Francesco Ellia [9]. The authors describe the hard problem as layered, in the sense that it can be decomposed into two parts (paraphrased by me, therefore possibly inaccurate):
- There is phenomenal consciousness, which needs to be explained.
- Conceivability arguments of non-conscious functional (and structural) equivalents are coherent. As a result, physical explanations cannot account for phenomenal consciousness.
They suggest that we can do away with the second layer, while holding on to the most basic idea of relating consciousness to physical matter. They thus define the core problem of consciousness as
We need to explain how the fact that there is something it is like to be us relates to physical matter.
It seems to me that this does away with both Assumption 1 and Assumption 2 outlined when describing the hard problem: we don’t assume any conceivability of non-conscious functional equivalents, and we don’t assign any ontological primacy to physics. In a sense, this is a much leaner version of the hard problem that still captures much of the mystery the hard problem addresses. The only assumptions we’re making (loosely speaking) are that both phenomenal consciousness and physics exist, and that they are somehow related. This fundamental assumption is very hard to deny, and it is implied by both the real problem and the hard problem. In that sense, the core problem represents a weaker form of both the hard problem and the real problem; it thus functions as an intersection between the two frameworks.
Conclusion
Combining the metaphysical leanness of the core problem with the pragmatism of the real problem (i.e., assuming we can at least partially measure conscious contents via behavior), we arrive at a problem statement that might be interesting to a broader range of people. Whether you’re a realist or a weak illusionist [8], the project of characterizing the relation between processes in the brain and consciousness, without assuming any ontological primacy of physics or consciousness, should be of interest to you. This endeavor subsumes both the scientific aspect of finding a systematic neural correlate of consciousness and the question of what this relation means on a metaphysical level.
The epiphenomenal nature of consciousness suggested by the hard problem can alienate scientifically-minded people, while the reductionist prior of the real problem might turn away people who reject the assumption of physical primacy. The terminology of the core problem, on the other hand, could help scientists and philosophers who disagree in some of their assumptions find common ground. To me, an epistemologically optimistic (with respect to conscious contents, from an objective perspective) version of the core problem might be the best framework for the scientific study of phenomenal consciousness.
Bibliography
[1] Nagel, Thomas. “What Is It Like to Be a Bat?” The Philosophical Review 83.4 (1974): 435–450.
[2] Van Gulick, Robert. “Consciousness.” The Stanford Encyclopedia of Philosophy (Winter 2022 Edition), Edward N. Zalta & Uri Nodelman (eds.), https://plato.stanford.edu/archives/win2022/entries/consciousness/.
[3] Seth, Anil K., and Tim Bayne. “Theories of Consciousness.” Nature Reviews Neuroscience (2022): 1–14.
[4] Chalmers, David J. “Facing Up to the Problem of Consciousness.” Journal of Consciousness Studies 2.3 (1995): 200–219.
[5] Seth, Anil. Being You: A New Science of Consciousness. Penguin, 2021.
[6] Chalmers, David J. “What Is a Neural Correlate of Consciousness?” Neural Correlates of Consciousness: Empirical and Conceptual Questions (2000): 17–39.
[7] Chalmers, David. “The Meta-Problem of Consciousness.” Journal of Consciousness Studies 25.9–10 (2018).
[8] Frankish, Keith. “Illusionism as a Theory of Consciousness.” Journal of Consciousness Studies 23.11–12 (2016): 11–39.
[9] Chis-Ciure, Robert, and Francesco Ellia. “Facing Up to the Hard Problem of Consciousness as an Integrated Information Theorist.” Foundations of Science (2021): 1–17.