There’s a wonderful ride that we go on every few months. It’s astonishingly predictable, and it tells a neat little story about how we misunderstand scientific research.
To understand this story, you must understand the first basic rule of science. I’m not talking about one of Newton’s laws or Galileo’s formulae. I’m not even talking about the principles of basic research.
The first basic rule of science is this: Most research progress is achieved through tiny steps. Research is slow and ponderous.
But most of all, it’s pretty boring.
This is something that, fundamentally, we don’t want to be true. How can research be boring when it has filled our lives with wonderful gadgets and medical science that keeps us all ticking? Research has made our lives infinitely more interesting.
Research moves forward in tiny steps. There’s no giant leap forward; instead, it’s a tentative crawl with thousands of missteps. For every “breakthrough” headline, there are hundreds of esoteric studies that are meaningless to most people.
And yet, every few weeks, we see a new headline that goes something like this:
“New Study Proves Common Thing Is Actually Killing You.”
If science is slow and methodical, why do we see a new, outrageous discovery threatening us all every second week?
The Big Scary Study
Every “That Common Thing That Is Killing You” story starts in the same place: someone, somewhere publishes a Big Scary Study.
The Big Scary Study is characterized by a few things. First, it’s big. The research involves tens of thousands of people and was conducted at a well-known institution that even we plebs will recognize. The research question is, most likely, quite broad. And most important of all, it’s the kind of boring health research that gets consistently misinterpreted by the media.
Take the soft drink study mentioned earlier. This was a huge, well-done study that looked at the connection between diet soft drinks and the risks of stroke and dementia. There was existing evidence of this connection, but the risk was low and more likely due to other factors — in essence, people who drink diet soda are trying to lose weight, so the negative effects are already there.
When the study was published, that’s basically what the researchers reported: that there may be a minor association between diet drinks and stroke, but it’s better explained by other factors.
And yet every news headline screamed that you should stay away from diet sodas.
Because the media doesn’t understand the difference between correlation and causation.
Ice Cream Doesn’t Cause Drownings
I remember one of the first lectures I attended as a young university student. I was taking Psychology 101. A lecturer was standing in front of 400 students with a graph that, he told us, proved ice cream caused drownings.
Sure, at first glance the graph made it look like the more people purchase ice cream, the more they drown. But it’s obvious that something else is going on here. There’s no plausible way that ice cream could cause you to drown, short of some tragicomedy involving a sea of ice cream and a snorkel.
So what’s really happening here?
Basically, there’s an underlying cause that is associated with both drownings and ice cream. It’s something that causes both ice cream consumption to go up and people to drown more.
If you hadn’t guessed, it’s hot weather.
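The confounding at work here is easy to demonstrate with a toy simulation (all numbers hypothetical, plain Python): if hot weather drives both ice cream sales and drownings, the two will correlate strongly even though neither causes the other.

```python
import random

random.seed(0)

# Simulate 365 days. Hot weather drives BOTH ice cream sales and swimming
# (and therefore drownings). Ice cream never causes a drowning directly.
days = 365
temperature = [random.gauss(20, 8) for _ in range(days)]
ice_cream_sales = [50 + 3 * t + random.gauss(0, 10) for t in temperature]
drownings = [max(0.0, 0.5 + 0.1 * t + random.gauss(0, 1)) for t in temperature]

def correlation(xs, ys):
    """Pearson correlation coefficient, computed by hand."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# A clear positive correlation appears, even though we built the simulation
# so that neither variable influences the other.
print(correlation(ice_cream_sales, drownings))
```

The correlation shows up purely because temperature sits upstream of both variables — exactly the pattern the lecturer’s graph was illustrating.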
There are hundreds of these spurious correlations. One of my favorites is that the length of spelling bee words is well correlated with the number of people who are killed in the United States by venomous spiders. There’s no reason why these two things should be causally related — long words don’t cause spider bites — but there happens to be a correlation between them when you compare the two numbers.
It’s amazing how many news stories are written without understanding this simple concept.
So how does this relate to the Big Scary Study?
Scary Studies And Correlations
Determining cause and effect is one of the biggest problems when we do medical research. Hundreds or even thousands of factors can affect most diseases.
For example, let’s say we are studying 100 people to test a drug for heart disease. In this case, we simply pick who takes what: 50 take the drug, 50 don’t. After six months, we find that the 50 people taking the drug have less heart disease than the 50 who don’t.
Sounds like the drug is effective, right?
But looking closer, we realize that the people taking the drug are also, on average, slimmer than the people who aren’t taking it. We know that being heavier is associated with more heart disease, so it makes sense that the fatter people (no drug) will be unhealthier than the thinner people (taking drug).
This happens all the time in research. There appears to be an effect, but the effect can be better explained by background differences between study populations than by the actual thing being tested. A common problem happens when researchers don’t take into account the wealth of their study groups; rich people are usually healthier than poor people.
To combat this problem, scientists have developed a method to make the groups more equal. We know that if we pick which people go in each group, they will end up with differences. But if we randomly assign people to groups, so that chance alone decides which treatment each person gets, the differences tend to disappear.
When researchers do this, it’s called a randomized controlled trial, or RCT. It’s considered the best way to determine if one thing causes another, at least for medical research.
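A toy sketch of why randomization helps (hypothetical patients and numbers, plain Python): when lighter people self-select into the drug group, the groups start out unequal on weight; a coin flip erases that gap.

```python
import random

random.seed(42)

def mean(xs):
    return sum(xs) / len(xs)

def assign(weights, randomize):
    """Split patients into drug/control groups.

    With randomize=False, lighter patients tend to opt into the drug group
    (self-selection). With randomize=True, a coin flip decides.
    """
    drug, control = [], []
    for w in weights:
        if randomize:
            p = 0.5
        else:
            p = 0.8 if w < 80 else 0.2  # lighter people pick the drug more often
        (drug if random.random() < p else control).append(w)
    return drug, control

# 10,000 hypothetical patients; body weight is the lurking confounder.
weights = [random.gauss(80, 15) for _ in range(10_000)]

d1, c1 = assign(weights, randomize=False)
d2, c2 = assign(weights, randomize=True)

print("self-selected weight gap:", round(mean(c1) - mean(d1), 1))  # large imbalance
print("randomized weight gap:   ", round(mean(c2) - mean(d2), 1))  # near zero
```

In the self-selected version, any apparent drug benefit is tangled up with the weight difference between groups; after randomization, a difference in outcomes is much more plausibly caused by the drug itself.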
Sadly, most Big Scary Studies are not RCTs. They are the first type of study I described, what’s known as an observational study.
I’m going to tell you something vital that you really need to listen to.
Here we go.
Observational research is incredibly useful.
Nowhere in this article will I say that you should throw out observational studies just because they aren’t RCTs. Different types of research are useful for different things. In fact, much of the research that I do is observational — I work in diabetes research, and it’s often hard to randomize people into groups for a variety of reasons. For example, you can’t force people to smoke, so smoking in diabetes is studied almost exclusively through observational research.
But there is something that I definitely can say.
Observational research doesn’t imply causation.
An observational study is, at its heart, basically the same as that graph about spiders and long words. It shows that there’s a connection between two variables, say less heart disease and a drug, but it doesn’t prove that the drug is causing the reduction in heart disease. If we do enough observational research, taking into account enough factors, we can eventually conclude that one thing is probably causing another, but it takes years, or even decades, to get to that point.
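One way to “take a factor into account” is stratification: compare the two variables only among cases where the suspected confounder is held roughly constant. A toy illustration using the ice-cream example from earlier (hypothetical numbers, plain Python) — the raw correlation is strong, but within a narrow temperature band it collapses toward zero.

```python
import random

random.seed(0)

def correlation(xs, ys):
    """Pearson correlation coefficient, computed by hand."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Same toy setup as before: hot weather drives both variables.
temps = [random.gauss(20, 8) for _ in range(5000)]
sales = [50 + 3 * t + random.gauss(0, 10) for t in temps]
drown = [0.5 + 0.1 * t + random.gauss(0, 1) for t in temps]

# Raw correlation across all days is strong...
print(round(correlation(sales, drown), 2))

# ...but restricted to days with near-identical temperature (stratifying on
# the confounder), it collapses toward zero.
band = [(s, d) for t, s, d in zip(temps, sales, drown) if 19 <= t <= 21]
s_band = [s for s, d in band]
d_band = [d for s, d in band]
print(round(correlation(s_band, d_band), 2))
```

Real epidemiology adjusts for dozens of factors at once with more sophisticated models, but the principle is the same — and it only works for the confounders you thought to measure, which is why observational conclusions take so long to solidify.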
A good example here is smoking. For a number of reasons (mostly because it’s unethical), it’s hard to randomize people into one group that smokes and another that doesn’t. So when researchers in the 1940s and ’50s started seeing correlations in their observational studies between smoking and lung disease, they were cautious. They looked at other factors. One early theory was that all this disease was caused by World War I. Another was that the asphalt dust from all the new roads built in the early 1900s was causing it. Even though scientists had noticed the increase in lung disease from the late 1800s, we didn’t fully understand the link between smoking and lung cancer until the middle of the 20th century. Almost 50 years.
It’s not that observational research is useless. It’s just that it takes a lot of carefully done observational research to come up with a solid answer to any question.
And almost all Big Scary Studies are observational.
Spotting the Scary
So how can you tell if you’re looking at research that is 1) sound, and 2) likely to make a difference in your life?
My advice: Try to find the study. If you can’t do that, find the press release. See what the scientists said about their own research. If it is observational, they will almost always say so. And they’ll probably mention other explanations for the effects they noticed. If it’s an RCT, they’ll tell you that, too. Press releases are often a great way to get a plain-language understanding of what a piece of research means to you.
If a big scary headline goes against established knowledge — like if it says broccoli is killing you — try to find the study. Chances are the news is telling you something that isn’t quite true.