How to Live With Fake (Science) News

Ehud Lamm
Sep 29, 2020


Everybody is worried about fake news these days. Clearly, no one wants to be taken for a fool. But at the same time, people have become increasingly skeptical of expert authority, whether of physicians, suspected of being captured by big-pharma, or of other academics. These worries are two sides of the same coin: trying to critically engage with a media landscape filled with marketing, algorithmically placed ads, sock-puppets, and astroturf unsurprisingly leads people to become skeptical of other sources of information as well. And who can you turn to for help in navigating the conflicting information? Politicians? Government institutions that are increasingly politicized?

Photo by CDC on Unsplash

Many YouTube videos, magazine essays, and even books try to give advice and tips on how to be critical where appropriate while still being able to recognize bullshit that does not merit our time or attention. But trying to recognize bullshit is often not going to work. Each of us, as an individual, does not have the knowledge, time, resources, or skill to determine the truth value of every important claim. Even Newton, Einstein, or Darwin wouldn’t be able to know whether a vaccine they heard about on Facebook was effective or safe. How could they? The data is not available; if it is available, who knows how it was produced; and even if you trust everyone involved, these three genius scientists did not know any modern immunology.

It seems that a good rule of thumb is to try to recognize when you are being lied to or manipulated. The hope is that this will at least take care of the most egregious cases. The problem is that this will not really work. This is partly because you are probably more worried about biases and commercial capture than simple lies, and partly because the real experts on how people detect bullshit and lies are the liars and manipulators. And they, be they tobacco peddlers, climate change deniers, or wannabe gurus, know how to overcome these tips and to come off as sincere and thoughtful. At least some of them do. Now, don’t get me wrong — it is very useful to know how to detect bullshit. But it is not enough, and even if you know all the good tips, applying them takes time and effort. It is much easier to retweet and hope for the best. You do that while reassuring yourself that you are not buying the claim you have just echoed. The problem is that by the third time you see the same claim — reported by others thinking precisely the same thing — you tend to assign it more credibility than before. Eventually, bad ideas will crowd out good old-fashioned common sense.

So what can one do? Instead of looking for tips for distinguishing between truth and falsehood, serious claims and hash, the trick is to learn how to navigate an uncertain world. No shortcut can replace serious, deep, time-consuming analysis, but there are simply not enough hours in the day to wade through all the bullshit out there, whether it comes from various partisans or is cynically produced to make a profit. If you do try to determine the credibility of numerous claims (do vitamins work? do probiotics?), keep in mind that you have just as much chance of being wrong as anyone else, maybe more if you are not an expert, so be wary of falling in love with your own ideas and conclusions; this is a road many smart people inadvertently end up taking. And if you are smart enough to remain uncertain, remember that this makes you an ideal mark for the next conman who comes around. So while engaging deeply with the science is the best option, very few are in a position to take it unless they are right now choosing their major in college. And at best we can do it in some areas and some of the time. Instead, here are three tips that each of us can use in our daily life when we are confronted with dubious claims.

Photo by United Nations COVID-19 Response on Unsplash

The first tip is to realize that the real issue at stake is not what you believe but rather what you do. Someone can go around thinking that the earth is flat without causing too much harm to himself or anyone else, except for being mildly annoying. More important are cases in which the goal is to get you to do something or, even more often these days, not to do something. Not to wear a mask to stop the virus from spreading. Not to take chemotherapy, because homeopathy is just as effective (it isn’t). Not to vaccinate your child, because the MMR vaccine causes autism (it doesn’t). Proponents of these ideas emphasize uncertainty; they embrace it as one of the tools of the trade. You are not sure about vaccines? That is exactly the right attitude. Be skeptical, no one can guarantee 100% certainty. And do you want to risk your child on something that is not 100% safe? What kind of mother would? This all makes sense as far as it goes. (Really, how many of us can assign a probability to the safety of vaccines? That is what experts are for.)

But things become clearer when we put aside what we believe and concentrate on what we do. We are making a decision about our own child, our community. It is a one-time, irrevocable action, not an exercise in probabilities. It is easy to see why even a small doubt would favor inaction. That is, until we start to consider that not doing something is itself an action, a decision. So what favors this action, beyond doubt and uncertainty? What are the costs (very likely severe illness and possibly death of the child if you live in a place with measles going around, a danger that increases with every parent who does not vaccinate) compared to the benefits (at most a very small decrease in the risk of autism, zero if you believe what scientists say about the relation between the MMR vaccine and autism)? The choice is between these two courses of action; it is not about whether you believe arrogant scientists or heartbroken mothers, big-pharma or an association of affected families. Two people may have the same opinion about the safety of vaccines but decide to act in opposite ways because of differences in their knowledge of measles or because they differ in how concerned they are about the potential consequences.

Most of the time manipulators, con artists, and even well-meaning friends and acquaintances are interested in affecting what you do, not what you think. Consider the action and its potential consequences. Ask yourself why your action is of interest to them (are they selling something? trying to feel better by convincing you? getting ad revenue from YouTube based on the number of views?). Try agreeing with them and see if they are still interested in what you do. Consider whether the action has consequences for you or others. Taking a low dose of vitamin C from a reliable manufacturer is probably not going to harm you even if it does not help you (but do ask your doctor). People waste money on worse things. But taking vitamin C instead of seeing the doctor when you suspect cancer can be the difference between life and death. You have to be very sure it works to take that chance.

A second tip for avoiding bullshit is to pay attention to your emotions. This suggestion may be surprising. The truth of climate change has nothing to do with how we feel about it. Indeed, the very first thing we are taught about science is that it is objective, impersonal. Facts are distinct from opinions and values. People who forget that should just grow up, right? While facts and values are much closer than this picture suggests, that is not my point here. The important thing is to realize that emotions affect us: they affect who we believe, when we are open to changing our minds, and the extent to which we are willing to move from beliefs to extreme behaviors and actions. We are willing to do many things, even stupid things, to feel good about ourselves. Things that make us feel inspired and hopeful seem to always be worth a try (no one ever lost money publishing a self-help book). When people are confused, vulnerable, or isolated, they are easy marks. When we are angry, we do not want to criticize our own side. Every beginning con artist, guru, or authoritarian leader knows all this. So pay attention to your emotions: are you now in a good place for making decisions? Are your emotions being manipulated? Are you concerned about the evidence or emotionally invested in the conclusion? Is one view going to make you feel better about yourself? Will it bring you more in line with people you like or admire, or allow you to fit in? Is one view favored by people who share your politics? If so — beware! Ask yourself whether an alien, who has nothing to gain if one side or the other is right, would come to the same conclusion as you. It is not that your emotions can tell you what is true; rather, they can tell you when you are in a better or worse state to assess ideas and evidence. The less optimal your emotional state, the more wary you should be before deciding what to actually do (see tip 1).

These two tips are the most important. The last tip is a bonus recommendation. Find a few people, not too many, that you think you can more or less trust. Ideally, they should be smarter than you, more knowledgeable than you in some areas, more emotionally stable and mature, and they should themselves have a network of people smarter and more knowledgeable than they are. Their primary source of income and social success should not be tied to you believing them. Cull the list mercilessly whenever you think you are being manipulated or when they are not on the level, but never remove someone from the list because you disagree with them (that goes against the whole point of trusting them!). Once you have this list of people you trust, instead of trying to decide what is true in each case, something we already concluded is a fool’s errand, blindly trust these people. Especially if they all agree about something. When you have the time, skills, inclination, and tranquility, then by all means do the digging yourself. But assuming you have to earn a living, raise a family, have hobbies, and whatnot, you should not try to rationally decide what the evidence says about things you are not an expert on.

To conclude: don’t try to judge the truth of complex things you lack direct knowledge of. Instead:

  1. Think about what you should do, rather than about what you should believe.
  2. Pay attention to your emotions.
  3. Maintain a list of up to five people you think are worthy of your trust, and be vigilant about revising it.

Book Recommendations:

  1. The best explanation I know of the distinction between what the evidence says about what you should believe and what it says about what you should do is found in chapter one of Elliott Sober’s book Evidence and Evolution.
  2. Two very different books anyone interested in the emotions should read are Martha Nussbaum’s Upheavals of Thought: The Intelligence of Emotions and Robert Frank’s Passions Within Reason: The Strategic Role of the Emotions.
  3. The social aspect of reasoning is beautifully explored in Hugo Mercier and Dan Sperber’s The Enigma of Reason.
  4. If you want to get into the weeds of pseudoscience and fake news and how to fight them, these are good starting points: Massimo Pigliucci’s Nonsense on Stilts: How to Tell Science from Bunk; Naomi Oreskes and Erik Conway’s Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming; Carl Bergstrom and Jevin West’s Calling Bullshit: The Art of Skepticism in a Data-Driven World.


Ehud Lamm

Philosopher and Historian of Biology at Tel Aviv University. Follow me on Twitter @ehud