Skepticism.

Henry Kim
4 min read · Aug 19, 2017


I wanted to draw attention to this article by Nathan Robinson, whose writing I generally enjoy, not only because this fits my idea of how to teach (notwithstanding my less than successful attempt at implementing it in social science realms) but also because I felt David Ng and Howard Johnson, whose writings I value immensely, would enjoy this as well.

Reading a pair of insightful articles by Ethan Zuckerman and Danah Boyd this morning, however, made me think a bit more deeply about the problems of teaching and epistemology, specifically their “social” dimensions, even though my basic views did not change. Ultimately, the point (relevant to this issue, at least) implicitly raised by Zuckerman and Boyd is that a culture of unguided skepticism, coupled with individualism, feeds conspiratorial thinking: it feeds a self-reinforcing loop of distrust fueled by motivated reasoning and selection bias, or, as Zuckerman puts it, triangulation of the truth via Google search. This takes place because many, perhaps most, people don’t know the truth even when they see it. The prerequisite for empirical thinking, in a sense, is the ability to evaluate evidence; in the absence of that ability, rampant and motivated skepticism runs wild.

I am not particularly sympathetic to this argument. Robinson is right: if we skimp on teaching people how to think and reason through evidence, just to get to the “right answer,” we are simply placing scientists on a pedestal while taking down science. If you trust Richard Dawkins, that’s fine, I suppose, but not when you have legitimate reasons for distrusting him that are independent of “science” (and the knowledge that people like Dawkins are contemptuous of and hostile to your people, friends, family, and neighbors counts as a good reason). This is particularly rampant on social and cultural matters, I suspect: most people don’t have the expertise to say with much clarity whether vaccines cause autism or whether the climate is changing, and they know that much. They are merely suspicious that “experts” have agendas that, rightly or wrongly, they believe come at their expense. Insisting that experts should be trusted simply because they are experts, especially when the insistence comes from people whom these folks also distrust, feeds the cycle. On social and cultural matters, people have lived enough of their lives that they feel they know what the outlines of the right answer are. Being told that they need to tear down things they “know” to be true is not likely to go well; I know this from firsthand experience. Teaching people, especially people who think they know “science” well, how to think “scientifically” is difficult. But it is a laudable goal that we should put more effort into.

The caveat, of course, is that there is so much demand for knowing the “right answers,” and so much cutthroat competition, that skipping steps and dropping explanations of what is “intuitively obvious” (obvious to whom, I’d ask, except that people would look at me funny for thinking that’s even a worthwhile question) becomes all too easy. One interesting trend in various kinds of coalition politics is that partisan appeal, where the coalition-builder seeks to assemble only “Democrats” or “Republicans” on the basis of partisan and ideological codewords, has increasingly become the norm, while making “side deals,” or “side appeals,” on the basis of local interests and community ideals independent of uniform partisanship and ideology has become the exception. The idea that weak opponents might, if you change the topic, turn into supporters has been lost from the political arts. This applies not just to political coalition building but also to applied epistemology and pedagogy: how do you know, without knowing how X arrived at his current beliefs, how committed he is to beliefs that are at odds with yours? The “right” answer is that X disagrees with you, and apparently that is enough of an answer to require no further “wasteful” investigation; this strikes me as a practically criminal kind of intellectual laziness.

The best experience I had while teaching social sciences was that I was able to get students who did not care about politics or social issues (sadly, only a minority) to think about them carefully and critically, regardless of what conclusions they reached. My greatest regret is that I could not get the many students, of various political stripes, who were too interested in “politics,” in changing the world in their image (unfortunately, far too many) to think carefully and critically about how the world actually works. I think this is the problem: the trouble with political science, if you will, is that there is too much “political” relative to the science. Sadly, even the natural sciences all too often want to be “political” rather than scientific.
