A Postscript on Hitchens’ Razor
The sciences occupy a strange place in the 21st-century psyche: in the same breath, a spokesperson — say, at a TED Talk, the medicine-tent show of the internet age — can make them appear simultaneously an endangered species and an indomitable powerhouse when it comes to producing valuable insights into the universe and the human situation. For all their contributions to the modern world in the form of medicine, incredible technologies, and probes into the inner workings of nature, scientists are in a way still employees — often contracted to do the bidding of corporations and increase the capital of the wealthy. In other respects, the research that can be done is restricted by the policies of governments appearing to have little interest in the advancement of scientific knowledge. Few scientists would dispute this reality, and it has been the cause of much distress within the community for many years. Research costs money no matter the scale at which it is conducted; scientists need research grants or employment in laboratory firms in order to continue their work. And for the loftiest contributions to knowledge — like CERN’s Large Hadron Collider — the effort to put the funds together can be international in extent. This necessarily puts pressure on those who propose new areas of research to justify them thoroughly.
There’s simply no disputing that for the kinds of questions science deals with, it’s the most potent instrument available. But the line where “scientific questions” end and other disciplines can take over — disciplines which may operate under very different assumptions about the world than a physicist or neurologist might — is becoming much more difficult to draw than it was in previous centuries. Some would say the scientific method is adequate to tackle or dispel any problem, given enough time — after all, it wasn’t until the 16th century that it was established that the Earth revolves around the Sun, rather than the other way around. It wasn’t until 1953 that, after much refinement and building on previous data, the structure of DNA was finally described. These discoveries don’t fall out of the sky; they sometimes require many generations of scientists putting together and pulling apart theories — descriptions of nature which, if accurate, can aid the project of human knowledge or, if inadequate, can just as well be thrown in the trash and forgotten. Under this assumption, such questions as human consciousness, the origin of the universe and of life on Earth, the laws which bind time and space at every scale — even an empirical basis for morality — can be given rigorous, realistic explanations, if only governments and economic marketplaces would let scientists do what they do for however much time is needed, and creationists would stop interfering in school curricula.
Given the strength of this method of discovery, it might be worth asking what makes a theory or idea scientific. One answer comes from the mid-20th-century philosopher of science Karl Popper, who held that a position is scientific (not, as some misunderstand, true or false — though for reasons we will discuss later this misunderstanding is probably inevitable) if it can, in principle, be disproven. This is called falsifiability: if you can falsify — prove wrong — my position, I’m showing scientific rigor in my assertions. I’m leaving it open to endless testing and reformulation and, no matter how much it pains me, to being rejected and forgotten so that some better position may take its place. Falsifiability remains for some people an adequate marker between science and what may well be called pseudoscience — claims which appear scientific but show no trust in the rigors of testing. It sets scientific knowledge apart from the competition.
The problem with this is that it ignores the history of how discoveries are made and how theories actually come to be accepted. Even if we were to make falsifiability the standard for the rest of scientific history, it would do nothing more than restrict the mental athletics scientists need in order to make new discoveries. This case is made by Thomas Kuhn in The Structure of Scientific Revolutions, and by Paul Feyerabend in Against Method.
Kuhn argues that the history of science has not been an onward progression from bad science (like geocentrism) to ever-better science (like heliocentrism), but a series of radical shifts in what he calls “paradigms” — status-quo social understandings of what constitutes knowledge:
[Paradigms are what] I take to be universally recognized scientific achievements that for a time provide model problems and solutions to a community of practitioners. … [The] successive transition from one paradigm to another via revolution is the usual developmental pattern of mature science. (Kuhn, viii & 12)
When scientific findings build up a sufficient case for a particular point of view (such as that the Earth orbits the Sun), previous understandings — which may require more leaps of faith in order to work — are abandoned. It may be, as in the case of Galileo, that the initial findings are met with hostility and rejection, but as more and more explanations of nature are reached through them, they attain the status of a new paradigm, or scientific standard. This is not a case of a new theory conclusively falsifying the theories that came before; the new one is simply more adequate, more sophisticated, or more efficient.
Feyerabend, while sharing much of Kuhn’s view, goes further. He concludes that falsifiability — and even the notion of a “scientific method” — ignores the reality of how science does its business, and that it doesn’t really provide any grounds for asserting that Claim A is “more scientific” than Claim B. We can only say Claim A conforms more closely to a set of accepted observations than Claim B does — which does not recommend science as an enterprise for legislating truth, but for creating consensus realities based on a certain set of assumptions.
One summary explains how he reaches this conclusion:
… taking examples from the history of science … he claims that scientists frequently depart completely from the scientific method when they use ad-hoc ideas to explain observations that are only later justified by theory. To Feyerabend, ad-hoc hypotheses play a central role; they temporarily make a new theory compatible with facts until the theory to be defended can be supported by other theories. (via Antimatter)
He argues from here that external political and social pressures mold the sciences into “special interest groups who use their own narrow values (and of course, their power) to confer unreality on any problem that might be perceived or even solved within a different approach” (Feyerabend, “Science and Ideology” 156). Additionally, since the 18th and 19th centuries science has taken an increasingly materialist, rationalist view of the universe: there are no souls, no supernatural phenomena, no gods to complicate the picture. There is only matter and energy — things which can be observed, understood using certain methods, and explained. Everything else, including the mind and human emotions, is relegated to the status of illusion (so-called “secondary qualities”). Belief in materialism and rationalism achieves certain goals: it simplifies the nature being studied by reducing the variables to be considered, and this is unquestionably useful. It’s only when these pictures of reality (what Feyerabend calls “fairy tales”) are used to legislate societies at large that they become dangerous.
Colonized peoples have certainly felt the brunt of this chauvinism: when their worldviews are not being completely wiped out, they are conferred with the marginalizing unreality Feyerabend describes. Anthropologists observing indigenous cultures emphasize “the psychological meaning, the social functions” of “oracles, raindances, [the culture’s] treatment of mind and body” as expressing the social norms of the community, without granting the community any “accompanying knowledge of distant events, rain, mind, body” (Feyerabend, Science in a Free Society 77). To grant this would be to accept “magical thinking” into the picture of a purely physical universe — a picture which excludes the possibility that “non-scientific” cultures, by virtue of their acceptance of paranormal phenomena, hold knowledge of the workings of the real world every bit as valid as a quantum physicist’s. Viewing knowledge and human culture as culminating in Western science and capitalist democracies cuts out the work for those who want to legislate reality pretty clearly: conquer superstitions like shamanism and animism, and receive “undeveloped” cultures into the globalized free market so humanity can soldier forth together towards truth and beauty.
This has always been the attitude of colonizers, as Linda Tuhiwai Smith notes in Decolonizing Methodologies: the European Enlightenment, which served as fertile ground for the advancement and formulation of
“the modern state, of science, of ideas and of the ‘modern’ human person” still influencing capitalist democracies today, “reaffirms the West’s view of itself as the center of legitimate knowledge” through global imperialism, through corporate free markets and the historical marginalization of other cultures (Tuhiwai Smith, 23 & 66).
Because the inheritors of the Enlightenment tradition of rationality and materialist science are unconscious of the imperial component of these ideas’ spread — Western medicine didn’t win a debate with any given indigenous healing practice; the latter was simply suppressed along with the rest of its culture — the colonial outlook reinforces itself and extinguishes the traces of its competition. The products of the Enlightenment, science included, are carried on the crest of the free-market capitalist wave and exported globally, and through brute economic force they take the place of cultures deemed illegitimate and, in some cases, unworthy of survival. Anyone doubting that this continues today can inform themselves on the land-rights struggles surrounding the Lower Churchill project at Muskrat Falls, Labrador, and the Dakota Access Pipeline.
So, according to this line of reasoning, science is still in need of justification for its supposed authority. One solution coming out of this conclusion is that, according to Feyerabend, “anything goes” (or, pulling back a little, “most things go”) when it comes to scientific discovery: any method, any apparently disproven theory can be invoked to solve a problem. Kuhn echoes this, comparing scientific problems to “puzzles”, with the intellectual merits needed to solve one transferable to the other. Imagination, ingenuity, and independent thought are always virtues, in other words. To try to restrict these natural and quite human attributes in the name of purifying the limits of scientific questions will, as both authors conclude, only set the progress of human understanding backwards. One advantage of this view is that it opens the vista of discovery to a less rigidly professionalized class of researchers — one not so dependent on large research grants, and whose failures wouldn’t cost them their livelihood.
Still, falsifiability and similar theories thrive as yardsticks by which the scientific merits of a claim can be measured. One which came to prominence in the 21st century, amid post-9/11 suspicion of religious fundamentalism, is Hitchens’ Razor, named after the late Christopher Hitchens — critic of religion and champion of scientific inquiry, falsifiability included.
Hitchens’ Razor — like Occam’s Razor, which states that the best explanation is usually the simplest — is a tool used to navigate the stormy seas of nonsense which can pass for genuine knowledge. One popular formulation, better known from Carl Sagan, states “Extraordinary claims require extraordinary evidence” — to which we can simply ask what makes a claim “extraordinary”, and to whom these claims are considered extraordinary, without even touching on what constitutes “extraordinary” evidence. But to be more faithful to Hitchens’ meaning, the razor is better stated as “The burden of proof lies on the person making the claim”: everyone else is free to dismiss that person’s ideas outright if the burden is not met with convincing evidence. In this sense it isn’t a new idea (“What is freely asserted is freely dismissed” is apparently a proverb with roots at least as old as the Roman empire).
It also doesn’t make things any simpler. Within science, of course, it can prove very useful, as all scientific proofs are held more or less to the same standard. Hitchens wasn’t worried about intra-science conflicts, which can be resolved through such safeguards as peer review and testing. Science is a notably self-correcting enterprise, and whatever pitfalls may accompany this process, it’s not in our interest to deny it. The conflicts he’s really worried about are between science — which represents “real” knowledge — and fundamentalist religious groups, or, we might add, corporations forging evidence to deny climate change, for example. The razor presupposes that scientific knowledge, however flawed, is still better than the dogmas of religion, which suppress free thought. In his 2007 book God Is Not Great he claims as much:
Religion has run out of justifications. Thanks to the telescope and the microscope, it no longer offers an explanation of anything important. Where once it used to be able, by its total command of a worldview, to prevent the emergence of rivals, it can now only impede and retard — or try to turn back — the measurable advances that we have made. Sometimes, true, it will artfully concede them. But this is to offer itself the choice between irrelevance and obstruction, impotence or outright reaction, and, given this choice, it is programmed to select the worse of the two. Meanwhile, confronted with undreamed-of vistas inside our own evolving cortex, in the farthest reaches of the known universe, and in the proteins and acids which constitute our nature, religion offers either annihilation in the name of god, or else the false promise that if we take a knife to our foreskins, or pray in the right direction, or ingest pieces of wafer, we shall be “saved” (Hitchens, 96).
Politically there’s no question that religions can and do pose genuine threats, and should be shown no more mercy than any other terrorist organization — the anti-birth-control policies sponsored by missionaries in the Third World serve no purpose other than to worsen the socioeconomic climate, and the aggressive policies towards reproductive rights in North America have a no less inhumane motive. In my opinion that shouldn’t be debated. It’s reasonable to agree with Feyerabend that religious beliefs, and even atheism, represent reductive pictures of reality, but this is not to say they’re wrong or that people shouldn’t be granted the right to believe what they believe. No worldview is so potent and infallible that it can decide what can and can’t pass for truth. The atrocities committed by human institutions are not enough to recommend a wholesale abandonment or suppression of religion for those who need it; enforcing this flies in the face of the self-determination of individuals and societies. To use a scientific standard of empirical evidence — something that can be sensed by the body or detected by scientific instruments — to discuss the “reality” of God or the soul or, by extension, the reality of individual emotions and states of mind, is to commit a crude mistake; it provides no meaningful metric. To grant that religions or political beliefs provide kinds of knowledge, or ways of understanding the world, which have their own worth costs science nothing. Nothing is at stake if someone disagrees that atoms are real, unless they intend to kill someone over it. People have been killed over adultery. People will probably, unfortunately, continue killing one another over small differences for a long time. This would happen regardless of whether the entire world population were scientific materialists or the entire world resided in the Amazon basin.
It can be no more proven that scientific methods encourage free thought than that genuine religious belief suppresses it; both depend on individuals and communities deciding to live a certain way. Whatever stands in the way of that freedom isn’t worth saving — “those who look at the rich material provided by history, and who are not intent on impoverishing it in order to please their lower instincts, their craving for intellectual security in the form of clarity, precision, ‘objectivity’ [or] ‘truth’” (Feyerabend, Against Method 18) are surely better off than those wanting to legislate reality for the rest of us poor creatures.
As for drawing the line between science and non-science, we might recommend, in place of all these formulas, a pair of redundant statements:
i. Science produces scientific knowledge if the scientific community agrees it meets their standards,
ii. any tradition or body of knowledge not claiming to be scientific should not be regarded as an imposter or an enemy of human progress until it has manifestly proven itself to be so.
Then you’re free to assert and abandon theories as you please.