You Are Not an Algorithm
Some claim algorithms will soon know us better than we know ourselves. But the truth about being human is not so simple.
What is it like to be a bat?
The philosopher Thomas Nagel asked that question in a now-legendary paper first published in The Philosophical Review in 1974. He wasn’t really asking his readers to imagine a life spent using echolocation to navigate through dark caves. Instead, he wanted them to think about the nature of mind.
Specifically, Nagel wanted to highlight the private, subjective, what it is like-ness of conscious experience, and how it seemingly cannot be reduced to a description of physical brain states.
Nagel’s point was as follows. We could in principle describe exactly the physical structure of a bat brain. We could describe the firing of neurons inside that brain and the consequent passage of electrical signals through it. But after all that, there will still be something about the operations of a bat brain that we would be no closer to knowing. That is, what it is like to be a bat.
People used to think that human beings have souls.
Now our understanding of ourselves is different. We can see that the brain is a highly complex, rule-bound information processing structure. And our best bet is that mind is simply the result of that information processing. There is no spirit-mind, as we once believed. No ghost in the machine. No soul.
If mind is simply a product of the physical brain, then we have to sacrifice the idea that humans have free will as traditionally understood. The brain is a physical object, subject to the laws of nature. And if our minds — our thoughts, decisions, feelings and so on — are simply the product of law-abiding physical processes, that means they are not chosen but determined.
The brain receives inputs, processes them in accordance with the laws of physics, and then produces outputs, some of which take the form of thoughts, decisions, feelings, and all the rest of what we call mind. There is no observer sitting somehow apart from these processes and directing us to choose, ‘I’ll have the tuna sandwich, not the cheese one.’ No magical free will.
Input, process, output: that’s all.
Human minds may well be the product of information processing in human brains (though it’s worth saying that there are serious thinkers on the subject who dispute this). But there is still a vast gap in our understanding of our brains and what they do.
Sure, seen from outside, the property that we call mind seems to be nothing more than information processing. But seen from within, there is something else going on. Something mysterious, irreducible and incredibly rich. A private, subjective conscious experience that we identify as our authentic self. That is, the feeling of looking out at the world through your own eyes that each of us would hold up as the answer to the question, ‘what is it like to be you?’
The existence of this subjective experience — or what philosophers call qualia — remains a mystery untouched by our advancing physical descriptions of the brain. We simply don’t know how the roughly 1.4kg of wet organic matter inside our skulls produces the experience of the colour blue, pre-exam nerves, or eating a slice of pineapple.
We can’t explain subjective conscious experience. But each of us knows that it is there.
Why is this important?
An idea is gaining traction in our culture right now: that advances in biotechnology and algorithmic intelligence will soon allow algorithms to model activity in the human brain and predict human thoughts and behaviours with unprecedented, near-total accuracy.
Humans, so runs this line of thinking, are hackable animals. We should prepare for a future in which algorithms know us better than we know ourselves. And in which brands, governments and assorted bad actors are able to use those algorithms to get us to think, feel and do whatever they want.
Underlying that idea is another: that organisms — including human beings — are essentially algorithmic. That even the human mind is itself only a super-complex algorithm, and in order to model it perfectly all we need to do is build our own algorithm of sufficient sophistication.
That idea seems to accord perfectly with our current conception of mind as the result of information processing in the brain.
But it ignores the existence of private, subjective experience. The what it is like-ness that is at the heart of any meaningful understanding of being human. And within that mystery at the heart of human experience, we can open up a space of resistance to the algorithmically-determined future that is being planned for us.
We humans are not merely information-processing zombies. Instead, we are host to a private, subjective experience that remains fundamentally mysterious to us. And inaccessible to any algorithm.
It may be objected that algorithms will not need to know what it is like to be us in order to predict and manipulate our behaviours. After all, the YouTube algorithm has no access to your qualia, but it still gets you to watch videos for three hours instead of filing your tax returns. Similarly, we have no idea what it is like to be a sheep, but we’ve still been predicting and manipulating sheep behaviour for thousands of years.
There’s a good deal of truth in all that. But if private, subjective experience is the only non-algorithmic, non-hackable product of human brains, then could it serve as a kind of basecamp from which we build a resistance to the idea that algorithmic overlords will soon rule over us?
The broader truth about the human brain seems to be that while it is a physical object whose operations are governed by the same rules that govern all objects, it is one of incredible complexity: 100 billion neurons connected by 100 trillion synapses. So while the outputs of the brain may be formally determined by physical laws and therefore predictable in principle, in practice the physical operations taking place are so complex that they are not always predictable. Meanwhile, those information-processing operations are not taking place in the abstract; they are taking place on the specific organic substrate that is your human body. And this substrate, just as much as the information processing, helps produce the subjective experience we all have of being ourselves.
Put all that together, and the idea that algorithms will soon ‘know us better than we know ourselves’ starts to look like something of an over-reach. If private, subjective experience remains outside the scope of algorithmic analysis, then it is entirely possible that certain aspects of human thought, feeling and behaviour will also remain forever opaque to algorithms.
History is full of movements that claimed to have a comprehensive analysis of the motives and behaviours of human beings. In time, the algorithmic determinism currently so fashionable will almost certainly be revealed to be flawed, just as those claims were.
In the meantime, we must resist the idea, pushed on us by a Silicon Valley techno-elite and their ideologues, that there is no difference in kind between us and the algorithms they want to use to bend our thinking and behaviour to their advantage.
There is something mysterious, irreducible and fundamentally non-algorithmic about being a human being (or a bat). And it’s precisely that part of human experience that is the most precious. As ever-more sophisticated algorithms and A.I. gather around us, it will pay to remember that.
This new weekly column, Another World, examines our shared future in the 21st-century.
David Mattin is Global Head of Trends & Insights at TrendWatching. He sits on the World Economic Forum’s Global Future Council on Consumption.