I don’t think consciousness is a byproduct

I’m not an expert and you shouldn’t particularly care what I think, but I’ll tell you anyway.

Physicalists (aka materialists) are often presented with the challenge of the so-called hard problem of consciousness. That is, essentially, how and why does subjective experience arise from physical brains? In general, I have the impression that most physicalists remain skeptical of the proposed solutions rather than strongly defending any particular one. One popular explanation is that consciousness is a byproduct of physical processes in the brain. I used to lean towards this kind of view, but in recent years my perspective has changed.

First off, what is consciousness? Like many important concepts, it can’t really be pinned down, but we can describe it incompletely. Consciousness is a kind of mode of operation of a brain, one in which the brain prioritizes interaction with the outside world. It is a state of wakefulness and awareness. The distinguishing feature of consciousness is sensory perception, which includes everything you consciously experience. Individual pieces of sensory perception (such as “the experience of seeing the color red”) are often called qualia. A related concept in philosophy is phenomena (appearances), as contrasted with noumena (things in themselves).

I find these concepts useful, but I think it’s a mistake to treat consciousness as something wholly separate from physical reality. This separation between subjective and objective is really the reason we have a “hard problem” at all; it’s how Chalmers can get away with claiming that P-zombies are conceivable. In my opinion, humans just have a hard time understanding their own minds.

So, then, what’s the deal with consciousness? Let’s start with solipsism, the philosophical position that the person holding the position is the only actual consciousness in the universe. There are variations on this; the “brain in a vat” thought experiment, for example, can be described as a kind of solipsism. (Interestingly, The Matrix avoids solipsism by having the robotic overlords genuinely put humans together in the same simulation.) Anyway, virtually no one is a true believer in solipsism. Like the brain in a vat, it’s more of a thought experiment. Because we don’t have access to others’ consciousnesses, we can’t know for sure that they really have them. But if we genuinely don’t have enough evidence to prove that other people have their own subjective experiences, then why is no one a solipsist? Well, because the premise is clearly false; we do have the evidence. That, and for sort of “historical” reasons. Our pre-human ancestors almost certainly recognized consciousness in conspecifics, and probably in some other animals as well. In fact, there are most likely non-human animals alive today that do the same, though it’s hard to tell exactly how they conceptualize something like that. These are organisms with no knowledge of how a brain works, potentially not even of what the brain does at a high level. Recognizing consciousness in others is obviously adaptive for social animals, and by the time human culture got started, it would already have been a collective belief.

My point is that, up until very recently, observations of behavior were the sole determining factor in deciding whether another organism is conscious. While we don’t have robots that look like humans yet, we do have AI that passes the Turing Test. You can imagine that if ancient people could communicate with a modern AI pretending to be human, they would come to the obvious conclusion that they were talking to a conscious human, even though the AI is not conscious. In other words, we’ve proven with our own technology that observable behavior alone is not enough. Fortunately, we have also advanced in our understanding of human physiology and neuroscience. If you suspected a person of being a robot, you would want to know whether they have a human body, including a brain.

To be more precise, one could compare oneself to others by observing similarities in brain structure and activity using tools like fMRI. More generally, scientists can also dissect and analyze brains to see how similar they are, and they have. All of this new information has, to my knowledge, never caused anyone to doubt whether other humans are conscious in the same way they are. It has only given us more evidence that all human brains function in a very similar way, and from this we can infer that all humans are conscious in the same way.

But wait, why can we make that inference? Recall that the hard problem of consciousness is part of the broader mind-body problem: how are the mind and body related, and how do they interact? A core piece of this problem is how changes in the body and changes in the mind are correlated, and nothing except the physical body appears to be correlated with the mind at all. As for causation, we can identify only one direction empirically: changes in the body can cause changes in the mind. There are countless examples, but to name a few: traumatic brain injuries, brain tumors, brain surgeries, alcohol, narcotics, psychedelics, psychoactive medications, and so on. We have even identified certain areas of the brain that are related to certain aspects of consciousness.

The brain is clearly doing something to produce consciousness. I think that the thing the brain is doing is consciousness. There’s no “movie theater of the mind” where consciousness is being projected; rather, consciousness is just how the brain processes information. Thinking about our own consciousness feels confusing, and that makes it seem like something separate from reality (especially because it doesn’t always align with reality). Well, it is somewhat separate from the world outside your body, but not from physical reality as a whole.

Part of my reason for thinking this way is evolution. An unoriginal insight, to be sure, but: a consciousness that doesn’t functionally do anything would not evolve, in my opinion not even as a byproduct. It’s just too energy intensive. Some may think that consciousness is functional because it’s related to free will, but I don’t believe free will can have causal effects. There’s no evidence of the mind influencing the body, and in fact it has been observed that changes in the brain precede conscious awareness of an event. Anyway, this isn’t a post about whether physicalism or dualism is true.

I was inspired to write this post after seeing an argument about free will being an “illusion”. I used to agree with this characterization, but I don’t really anymore. Illusions are misperceptions, and the experience of freely making choices is not (always) a misperception. There are plenty of cases where an explanation of why a certain choice was made is pure post-hoc rationalization, but I’m thinking more of deliberate decisions, the kind one would think about for a long time before making. The absence of free will doesn’t render that deliberation meaningless: the experience of thinking about the choice and making a decision just is how your brain made the decision. In practice it’s probably always more complicated than that, with both conscious and unconscious factors influencing a decision.

Frankly, sensory perception is a better example than decision making. When you observe objects within your visual field, your experience of observing them just is how your brain processes and integrates information from your eyes. This is why I disagree that consciousness is an illusion.


