Can we trust our conclusions?
If your brain was lying to you, how could you tell?
In Essays #23a and #23b we discussed ways of obtaining information about the universe. In Essay #23c we discussed ways of putting the information we obtain together to draw conclusions. In this essay it is time to review this procedure and make a quick sanity check: having obtained information, and having put the bits together to draw a conclusion, do we have any reason to believe that our conclusions are reliable?
A scientific answer
Naturalistic science — often touted as that most-rational of pursuits — provides an answer as to the reliability of our own scientific conclusions. The life sciences and the physical sciences both independently provide insight into the expected reliability (or otherwise) of our thinking in a naturalistic world. And (spoiler alert) it is not encouraging.
An evolutionary perspective
Let us start our reflections by considering my hamster. Her name is Chubby Cheeks. I am very fond of her. I go to great lengths to care for her and make sure she is well looked after. Always the same, day after day: love, care, food, water, love, care, food, water. Despite this, whenever I go to feed her, she runs away. Whenever I change her water, she cowers in the corner. She has no conception of the fact that I am not going to eat her.
There is a very simple and sensible reason for this. Imagine for a moment, eleven million years ago, that there were two groups of hamsters in the deserts of North Africa. One group cared about the truth. If they saw something new — be it a rock, or a root, or a raptor — they went to investigate it: How does it feel? What does it taste like? How does it behave? Does it have teeth? How fast can it move? The other group of hamsters did not care for the truth. If they saw something new — be it a rock, a root, or a raptor — they ran and hid. Of these two groups, one group made it past the second generation.
The evolutionary pressure to care about the truth is incredibly weak. There are some situations where learning the truth can help a hamster survive. But if a hamster’s brain has a choice between processing a thought with regard for truth and processing a thought with regard for survival, it will always do the latter. As a result, while it is true that I would never eat my hamster, my hamster’s brain is incapable of knowing that. She simply does not have a mental category for “big animal that won’t eat me.”
Having recognised that millions of years of evolutionary pressure have rendered hamsters incapable of recognising the truth staring them in the face, let us now turn to humans: Is your mind able to recognise the truth?
A knee-jerk response is, obviously, “yes.” You are a reasonable person. Your cognitive abilities are far superior to a hamster’s. Through science, humans can understand the grandeur of distant galaxies and the strangeness of subatomic particles. We have developed maths, and logic, and aeroplanes, and mobile phones. How could we have done all of that if we could not grasp the plain truth of the universe? 
It may seem hard to get round such an argument. It seems pretty watertight. But it rapidly springs a leak if we ask, “Who says that I understand maths? How do I know aeroplanes work?” These are all ideas that you hold in your brain. And what is it that tells you your brain is reliable? It is your brain!
You may object: “But I have seen planes fly! With my own eyes!”
Except, when you think (!) about it, the account of “seeing” a plane flying (as we understand it) is that photons go from the plane to your eyes; your eyes focus those photons onto your retina; an electrical signal is sent to your brain; and your brain processes all of this and tells you you have seen a plane. You did not see the plane. Your brain tells you you saw a plane. If there was no plane, no photons, no eyes, just your brain telling you that you had seen a plane, how would your experience be any different?
You may object: “But I can check this independently! Bob saw the plane too!”
Except, how do you know Bob saw it? Did he tell you? Did sound go from Bob’s mouth to your ears? Did your brain process the resulting electrical signals? Did your brain tell you that you had heard Bob tell you? Certainly, the last step happened. It is hard to tell whether any of the other steps took place.
If you went into a court of law and the only alibi or character witness the defendant was able to call was the defendant himself, would that seem strange?
“Yes, Your Honour, I am a tremendously trustworthy person. I say so myself. And you can believe me when I say it, because I am tremendously trustworthy.”
You wouldn’t accept that! Circular logic is one thing. But that is a very tight circle indeed.
So much for observation. Maybe we must accept that we cannot trust the evidence of our own eyes. Maybe our brain misleads us regarding our experiences, just as my hamster’s brain misleads her regarding hers. Hey, at a push, maybe we are living in the Matrix and we don’t know it. But surely (we think to ourselves) cold hard reason is still reliable. Even in the Matrix, 1+1=2. There is no way around hard logic.
Here, though, is our problem: who tells us that there is no way around hard logic? Our own rationality! Our own brains, which apparently evolved over long periods of time to keep us alive. If I were able to get around cold hard reason, but doing so would be detrimental to humanity’s survival, my brain would not tell me about it! My brain would work very hard to make sure I never knew I could circumvent reason.
Thus far we have established that, on the view of naturalistic evolution, if our brain tells us that it is reliable, we have reason to doubt that our brain is telling us the truth. That seems bad. Unfortunately, our situation is even worse than this. Because even our own brain doesn’t always tell us it is reliable.
Our brains tell us that evolutionary pressures in hamsters have given rise to brains that lie. And our brains tell us that they may well have been created by the same evolutionary process that created hamsters’ brains. So our brains tell us that we should expect our brains to lie to us. Sometimes. But we do not always know when.
This is like a defendant standing up as the only witness in their own defence and saying, “Your Honour, I am innocent. Probably. But, by my own admission, I do sometimes make stuff up. So, yeah… what can I say? You can trust me if you want. Good luck.”
Naturalistic evolution therefore presents a serious epistemic problem: if Richard Dawkins is right, and if our brains arose by a naturalistic process of natural selection, then we have reason to doubt everything we think we know. This includes having reason to doubt Richard Dawkins’s claims about natural selection.
Of course, this does not mean that our theories of evolution are wrong. Maybe evolutionary theory is one of the bits that our brains got right. However, any naturalistic, materialistic evolutionary theory which optimises for survival excludes the possibility of epistemic optimism. We should expect that our brains have a propensity to utterly fail to grasp significant aspects of the truth of the universe around us.
Naturalistic scientists, far from crowing about how evolution has furnished us with a brain that lets us grasp the wonders of the universe, should be deeply concerned that evolution would most likely furnish us with a mind that lies to us about the basic nature of the world.
A physical-science perspective
In facing this problem, it would be unfair to single out evolution specifically, or even the life sciences in general, as uniquely problematic. Naturalistic physics and naturalistic chemistry have the same problems. If anything, their problems are worse.
If the atoms in our brain are just atoms — following rules — why should we expect the rules they follow to produce an organ that has anything to do with presenting true pictures of the universe?
I can set up a computer so that in answer to the input “2+2=” it always returns the output “17”. Or I could create an image-processing system which filters out all orange objects and replaces them with pictures of cats. Or I could set up an AI computer system which writes wonderful fiction. Each of these systems would follow the rules of physics. And they would follow the rules of the system. But there is no reason why the rules of physics or the rules of the system must require outputs which correspond to reality.
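The first of these rule-following systems can be sketched in a few lines of Python. This is a hypothetical illustration of my own; the function name and its behaviour are not from the essay, only the “2+2=17” scenario is:

```python
# A deterministic "calculator" that faithfully follows its own rules,
# yet whose outputs bear no relation to arithmetic truth.
def broken_calculator(expression: str) -> str:
    # The sole rule of this system: whatever the input, answer "17".
    return "17"

# The system is perfectly lawful and perfectly consistent...
print(broken_calculator("2+2="))  # prints "17"
print(broken_calculator("3+3="))  # prints "17"
# ...but lawfulness alone does not make its outputs true.
```

Nothing in the laws of physics, and nothing in the rules of this little program, forces its outputs to track reality; rule-following and truth-tracking come apart.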
So, too, with our brains. If our faculty for processing information is simply a collection of atoms following rules, we have no reason to think that it is doing anything related to processing and presenting information about the world around us. This is true even if we incorrigibly have the impression that it is doing just that.
What if, one day, a philosopher or an information theorist provided a rigorous proof that our brains, following the laws of physics, are assured of generating experiences which connect to reality? Even this would not seal the argument. Such a proof simply begs the question: how do we know that what seems like a watertight proof really is a watertight proof, without first assuming that our brains are trustworthy?
However we slice it, within a naturalistic world view there seems little reason for epistemic optimism. We might get the impression that logic works, or that direct observation provides us with information about the world. We might think that aeroplanes fly, or that we are living our lives in a way that is not really, really stupid. But we have no reason to believe that we have any better grasp of the world than a hamster who runs away from the very person who is trying to look after her.
A religious answer
Naturalistic science — lauded by some as a bastion of rationality and certainty — inexorably leads us to be deeply skeptical of our own conclusions. Does religion — derided by some as a haven for ignorance and inconsistency — have anything to say? More than that, does it have anything better to offer?
It turns out that it may.
If your brain arose via a process that prioritises survival over truth telling, then you cannot trust your brain. By contrast, if your brain arose by a process that prioritised truth telling over survival, then you may be able to trust your brain.
If, for example, your brain was created by a benevolent creator who wanted you to know the truth about the world around you, to know Him, and to experience Him; and if He designed your brain with that purpose in mind, then you might expect to reliably know things about the world around you. You may be able to trust your brain.
This is not to say that your brain was made by a benevolent creator. It is simply to say that if your brain was made by a benevolent creator, it might be reasonable to believe that your brain is more-or-less trustworthy. By contrast, if our brains arose by naturalistic processes, it would be tremendously surprising for them to be trustworthy.
This leads to a necessary reversal of the way that science and religion are often thought to relate:
Often, people wonder if science can offer us any justification for believing religious claims. But now we see that naturalistic science cannot even offer any justification for believing scientific claims. Instead, it is religion that comes to the rescue: religion may offer us a justification for believing scientific claims.
Qualified religious optimism
This religious argument does not, however, offer us a blank cheque. We cannot simply say, “Alright, we have reason to believe that our brains are reliable. Let’s go off and do science again, just like we did before.”
While a benevolent creator opens up the possibility that your brain may be reliable, it does not guarantee it. This can be illustrated with the example of Christianity.
Within a Christian worldview, our senses and rationality were initially trustworthy, but became corrupted by the Fall, when Adam and Eve sinned. Just as there were mortal consequences of rebelling against God who is the source of life, and moral consequences of rebelling against God who is the source of goodness, there were epistemic consequences of rebelling against God who is the source of knowledge.
Within this view, although humans are capable of generosity, they have an inherent tilt towards selfishness, and even their generous acts are tinged with their own interests. Similarly, although humans are capable of knowledge, they have an inherent tilt towards ignorance, and even our knowledge is tinged with wilful neglect of the truth.
Consider an example of this in practice:
If I am faced with two theories — a theory that I came up with and a theory that my colleague / competitor came up with — it is obvious to me that, of the two, my theory is better. My theory is elegant and flawless. The theory which my competitor (whom I shall call Bob) came up with is a cobbled-together mess full of obvious holes. When I point out to Bob the obvious problems with his theory, he cannot see them, even though they are as clear as day. This demonstrates to me that Bob is not very clever. Worse still, Bob insists that there are holes in my theory, because he cannot understand the important yet subtle connections I have made. This demonstrates to me that Bob is not very clever at all.
Psychologists call this a cognitive bias. Christians call this pride.
Knowing that I struggle with pride, I can mitigate its epistemic effects by nurturing humility. However much it hurts to admit it, it is possible that I am wrong, and it is possible that my colleagues are right. If I have a result that makes me look good, I need to be extra cautious before accepting it. If I have a result that makes me look bad, I need to be extra cautious before rejecting it. By following such strictures, I become a better scientist.
Religion does not therefore give us a free ride to reliable thought. It gives us something better: it gives us a way to face our own epistemic unreliability, and ways to address it.
From a Christian perspective, I can approach science and say that my brain can be expected to be somewhat reliable. I can also expect my thinking to have certain key blind spots, and I can say what I expect some of those blind spots to be. Best of all, I can propose and implement practical steps by which the problems caused by those blind spots can be mitigated.
A scientist with a Christian worldview who trusts their own brain to at least some degree does so rationally. A scientist with a naturalistic worldview who trusts their own mind to any degree at all does so irrationally. The beauty of this irony is apparently lost on Richard Dawkins, which is a pity.
Different religious traditions take different positions on the reliability of human reason and experience. Nonetheless, the extent to which our brains are — or can be — trustworthy, and the ways in which our brains might deceive us, while of clear relevance to science, is an appropriate theological question. It is also a question for which theology may offer a more optimistic answer, which is more conducive to science, than the answer offered by naturalism.
If you have been following the series, you may remember that Essay #13 argued that we do not have a reliable grasp of noumenological truth, even though we can build aeroplanes. In this essay we make the stronger claim that we do not even have a reliable grasp of phenomenological truth, even though we can apparently build aeroplanes.