The Obvious and the Way of the Scoundrel — and How You Can Be One, Too, Even if You Don’t Want To Be.
One word that I had always dreaded is “obvious.”
“Obvious” is something that people “know” without needing an explanation. “Teaching,” in its manifold forms, requires explaining something. People will not respond to an explanation when something is “obvious.” Therefore, “obvious” is something about which people cannot be taught.
The reason people do not respond to an explanation when the problem is “obvious” is that they already think they know all the answers they need. Often, in fact, that is true, but not always: we suffer from all manner of selection biases when we observe the universe. We are creatures of our environment. We do not see gravity operating on different planets. We do not observe a vacuum. We do not experience a non-oxygen atmosphere. We have not been on the receiving end of a comet impact. And these are just examples drawn from the natural sciences. Of course, the existence of these unusual phenomena still does not make our understanding of them “necessary.” We will never be, as far as we know, on Mars — at least, not “us” ourselves. So do we need to actually understand Martian gravity or atmosphere? Without much data, which will be very difficult to get in the first place, achieving an actual understanding will be both costly and difficult.
It still astounds me to no end that astronomers have been able to convince people to pay the considerable price to learn about Mars and Jupiter without an actual “need.” I have always suspected that one important reason so many big projects in the natural sciences find support among the general public is that they are not usually accompanied by a set of obvious answers accounting for the “why not” — not the rhetorical “why not,” but actual reasons why supporting the projects would not be a good idea. While understanding Mars might be expensive, achieving it does not directly affect one’s life adversely, and people are generally curious enough to pay the price.
If so, however, the relative ease of convincing the public to pay for understanding may well have spoiled the science advocates. (I use this term to distinguish them from “scientists.” These folks may be scientists, but when they have an agenda they are pushing, they are not doing “science” but “advocacy,” even if it is advocacy for “science.”) The topics in the natural sciences that do generate controversy are those that come with an obvious “why not,” even if these “obviouses” do not directly concern the substance of the topic. Climate change is one such topic. Evolution is another. I spent a good deal of my time in academia researching creationists, and it always astonished me how flexible and creative creationists actually are: the variations of “intelligent design” that they come up with are remarkable in how close they actually come to an evolutionary argument. For example, when sufficiently informed, they do not deny that bacteria evolve over mere years, and so do not reject vaccines as unnecessary (well, some people do have issues with vaccines — but not because they think microorganisms don’t evolve). One of my former colleagues snidely described this sort of approach as an indicator that they don’t really have a set belief in a particular explanation. But, in a sense, that is the point: they are not so fixed in their belief against evolution that they will rigidly hold on to an anti-evolutionary explanation against empirically observed evidence. This is actually a rather enlightened and scientific attitude, at least in context.
What they do know as “obvious,” and, to be fair, they are reminded of this often enough by their enemies among the science advocates themselves, is that many advocates of evolutionary theory are their enemies, who hold them and their way of life in deep contempt. So they reject the broader structure in which the sciences place the empirically observed evidence, opting instead for a logical structure that they are familiar with. The catch, of course, is that as the empirical evidence accumulates and the explanation becomes more elaborate, the “theory” concocted by informed creationists and the evolutionary theory will necessarily converge, at least in the broad sense, for they have to account for the same observed evidence. In the absence of complete knowledge about the universe, the convergence may never be perfect, but at that point, when the limits of feasible theory building, conditional on the body of data available, are reached, there is no telling that one is any more “right” than the other. Their starting positions may be wildly different. The paths that they take may differ. But how they proceed along those different paths, if done right, can be the same: science, the real one.
I think the central problem, and the path of the unwitting scoundrel, is that people whose set of “obvious” we do not understand at an intuitive level are judged on the basis of the (misleading) “obvious” rather than the actual facts. The broad logical structure that creationists rely on necessarily has to have a place for God, in some sense, in some capacity, although they themselves often don’t know exactly what the place for the Divine is. (I exclude the Biblical literalists from consideration — they have a precise, potentially empirically testable place for God, and they do reject empirical evidence. But even among creationists, complete Biblical literalism is uncommon.) There may be no empirical reason to believe that there is a God, but none that there isn’t one either — especially when the “theory” of God is as inchoate and untestable as it is. In the end, however inchoate, empirically untestable, and thus not-wrong the basic structure is, the idea that there is a God with certain religiously attributed characteristics is “so obviously wrong” to many among the science advocates that the entire group is shunned without their modus operandi being understood.
The persistence of the steady-state theory of the universe far beyond its explanatory power provides something of an equivalent tale from the opposite side. Fr. Lemaître, the original proponent of the Big Bang theory, was a Catholic priest in addition to an astrophysicist, and in the theory many physicists saw a Trojan horse, a creationist theory trying to enter the realm of science through the back door. That Fr. Lemaître insistently kept God out of his explanation did not matter: he was a priest, so he “obviously” must have had a religious agenda. After all, the Big Bang is only a theory, even if a theory that happens to be consistent with all existing theories of physics and all observed phenomena. A lot of contortion went into maintaining the steady-state theory when the Big Bang was clearly the more logical explanation.
To be fair, trying to explain how evolution works to people who believe that God has to be part of the answer can be frustrating. A lot of extra baggage might need to be thrown in, and additional empirical evidence demonstrated, that would not be necessary had the audience not been insistent on seeing God in the explanation. But the attempt to explain an idea to people who don’t take the “obvious” but not necessarily justified axioms for granted is potentially a rewarding experience. Even if you thought X was obvious, having to explain it to people who don’t see it as so obvious forces you to understand the intervening steps: why X is true (generally), but also why X might not be true (sometimes). This is not just a problem with creationism or science but, I think, with all walks of life — especially when the aim is to spread an understanding of social, political, cultural, and/or economic phenomena, where the realm of the “obvious” taken for granted by many is both huge and often unjustified. Speaking from personal experience, I found that the biggest difference between people who “like” politics and those who don’t is that the former have a long litany of things that they “know” because they are “obvious,” based on the experiences that they have had. We also know, empirically, that the former are often wrong, even more often than those who do not like politics as much, precisely because the information that they take to be “obvious” is not as “obvious” as they think it is. Having to explain how the “obvious” works, especially the attendant addendum about why the “obvious” may not be as obvious as it might seem — and when and how and why — may be a tedious and time-consuming exercise, but it is also the critical step in promulgating an actual “science.”
Two of the maxims for applied statisticians, among those listed by Frank Harrell here, are to “Give the client what she needs, not what she wants” (obvious) and to “Teach the client to want what she needs” (less obvious). The former implies the latter, but with a few steps in between. The client wants what she wants because she is experienced in what she does, just not in statistics. She has accumulated a body of information that tells her that what she wants is “obviously” what she needs. To successfully and convincingly enlighten the client, you need not only to understand the problem from the limited perspective that you have, but also to understand the problem as a whole and how it has been approached and understood from the client’s perspective — and what is wrong with that approach under what circumstances, why that is relevant, and why the perspective you are offering is better. That is a lot of information and, potentially, a lot of stepping over boundaries. I’ve had experiences where my attempt at making sense of the big picture was met with hostility: “We told you what we want and we gave you the data. Go plug them into a formula or something, and don’t ask us too many questions.”
In a sense, this is not an illogical viewpoint. Time is lacking. Statistical work is only a small portion of what needs to be done, and the statistician is a small cog in the machinery. They don’t want to reinvent the wheel, even if the wheel is square — unless they can be shown why the square wheel is a bad idea, but that won’t be possible unless the wheel is first reinvented to be hexagonal, and a hexagonal wheel is not that much better than a square one. (We won’t be able to make the leap to a round wheel that quickly.) In a sense, this is analogous to teaching creationists how to approach evolution. A lot of time is “wasted” on “obvious” points that need to be thought through and explained. Far easier to point to the textbook and say, “this is the Truth because it is so.” If you don’t believe it, then tough. Equivalently: we do what we do because that is what we do; we just need you to crunch the numbers so that we can keep doing what we’ve always done, not to tell us what we need to do differently. And it bugs me to no end that, in truth, I’ve been disillusioned enough that I can’t disagree with this.
But the counterpoint to this tendentious approach, of understanding the problem(s) from all angles and placing them in their proper places, is the quick and “obvious” approach, in which, rather than an actual explanation of how things work (and don’t), a non-explanation is offered as a series of analogies to the “obvious”: what the “clients” already “know” and “understand,” or so they think. “If you explain, you lose,” as the old saying in political campaigns goes. The idea is that people already have plenty of information accumulated from life, and they tend to understand politics (or, really, any number of different things) through analogies to what they already know and believe. A successful politician does not tell the audience anything they didn’t understand, but merely reminds them that it’s all like something they already know intimately. All of life is a box of chocolates, as the saying might go. Now, a “dumb” politician, or a scientist trying to play politics (badly), might ask, “what’s a box of chocolates?” and that will go over even worse than trying to explain all of life. This is the fundamental problem: we are all supposed to have an intuitive understanding of “a box of chocolates” without anyone defining or explaining anything about it, but there is no reason to believe that is so. A box of chocolates, in reality, has nothing to do with “all of life.” Trying to make sense of “a box of chocolates” is truly a waste of time. However, it is necessary for making sense of how the politician successfully “persuaded” his audiences, even if not for making sense of “all of life.” To the degree that “a box of chocolates” “obviously” means different things to different people, the politician sold different bills of goods to different groups, who, if he is successful, will accept its meaning as whatever is “obvious” to them. And if they are mistaken, well then, that would be unfortunate.
Or, in other words, through the skillful use of ambiguous, unexplained, misleading, but still very “obvious” analogies, the politician played the role of a scoundrel magnificently. The audience is left unenlightened but satisfied. Nothing is learned and no improvement is made. This seems problematic to me.
In the absence of “science,” the “scoundrels” win. In fact, a lot of alleged “science” is sold in the manner of the “scoundrels” rather than of science, through appeals to the “obvious” rather than through explanations and investigations. This is a problem. If we don’t have time to think in depth, if we are too busy trying to answer the problems that we want answered rather than those that we need answered, and if we can’t tell whether what we need and what we want might be different things (this is where the “obvious” problem comes in), this is a dangerous situation indeed. When information becomes cheaper in general, bad information (the kind that is cheap to produce) becomes even cheaper than the good kind. If the two cannot be easily evaluated and distinguished, the way of the scoundrel is far more readily enabled by technology than the way of “science.”
PS. Thinking about the problems of educational assessment (among others) pointed out by Howard Johnson made me wonder about the statistical linkage between the “obvious” and selection bias. The “obvious,” in statistics, is perfect correlation: A <-> B is ALWAYS true. If there is a perfect correlation, knowing both A and B provides no more information than knowing one of them. Know one, and you know the other. ALWAYS. If, in a test, the same 10% of the test takers always get the right answer to every question (and the remaining 90% always get every question wrong), all but one of the test questions are a waste of time, because a single question provides all the information that the entire test provides. But having a lot of questions that produce the same result does accomplish one thing: it makes the point loud and clear — the 10% of the kids are (obviously!) smart and 90% are dumb, and they can be perfectly ranked, and all we want are the answers, i.e. how to rank them exactly, right? Well, this would be nice if the test instruments were not contrived to produce this answer, which I cannot even comment on without getting into insulting snark about data naivete. For data with multiple observations to be informative, we need some subset of the data that behaves differently, even if it has to be introduced with some artificial effort. We want to identify the cases where A <-> B does not hold, or the whole shtick with conditional effects: something like A → B all the time, but B → not A if it rains, or whatever. Can we replicate this just by sprinkling water on B? By putting it in the shade with no water, or what? If we have a test where 10% of the students get the right answer on every question, but it’s a different 10% for each question, then we have a lot of information, a lot of different conditional effects, even if we can no longer rank the students who took the test easily.
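The point about test questions can be sketched numerically. The toy data below is my own illustration, not from any assessment cited here: when every question is perfectly correlated with every other, the whole test can only sort students into two groups, while a test where a different student gets each question right distinguishes many more response patterns.

```python
def distinct_patterns(responses):
    """Count distinct per-student answer patterns.

    The more distinct patterns a test produces, the more
    groups of students it can actually tell apart.
    """
    return len(set(map(tuple, responses)))

n_students, n_questions = 10, 5

# Test 1: the same student (student 0) gets every question right,
# so every question carries identical information (A <-> B always).
test1 = [[1 if s == 0 else 0 for _ in range(n_questions)]
         for s in range(n_students)]

# Test 2: a different single student gets each question right,
# so each question carries information the others do not.
test2 = [[1 if s == q else 0 for q in range(n_questions)]
         for s in range(n_students)]

print(distinct_patterns(test1))  # 2: "all right" vs. "all wrong"
print(distinct_patterns(test2))  # 6: five one-question patterns plus "all wrong"
```

Five perfectly correlated questions are thus no more informative than one question, while five decorrelated questions partition the same students six ways, at the cost of the easy ranking.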
We no longer have the simple “right answer,” but we now have a good “hard question” to think through. I’ve always thought a good “hard question” beats a simpleminded “right answer.”