The dangers of insecticides, poor statistics and over-enthusiastic press offices

A paper published today in Occupational & Environmental Medicine was accompanied by a press release titled “Exposure to certain insecticides linked to childhood behavioural difficulties”, which made the bold claim that “children with the highest levels of metabolites in their urine were around three times as likely to display abnormal behaviour.” But these findings are quite plausibly due to chance alone.

Head louse image courtesy Gilles San Martin, Wikimedia Commons

The experimental part of the study itself was carefully conducted. The researchers looked at metabolites of pyrethroids, a class of pesticides present in household products such as head lice treatments and mosquito repellents, in the urine of 571 pregnant women. Six years later, they recontacted the women, and 287 allowed the researchers both to measure these same pesticide metabolite levels in their children and to assess any behavioural problems the children suffered from.

This is where the statistical analysis becomes important. Five metabolites, measured in both mothers and children, were each divided into three levels and correlated with three behavioural outcomes; comparing the two upper levels of each metabolite against its baseline level makes 60 statistical tests. This presents a well-known scenario, taught to science students the world over: if you carry out a lot of statistical tests, then just by chance you expect some ‘false positive’ results (this could be called the ‘green jelly bean’ issue, after a famous xkcd cartoon). In this case, if there were no association whatsoever between metabolites and behaviour, we would still expect about 3 of the 60 tests (1 in 20) to show an apparently ‘statistically significant’ association (at ‘P < 0.05’). Rather remarkably, in this study only one association turned out to be ‘significant’ — fewer than expected by chance alone. So the researchers should have been wary of drawing any positive conclusions at all from these results.
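The arithmetic behind the ‘green jelly bean’ issue is easy to check by simulation. This is a minimal sketch (not from the paper): under the null hypothesis every p-value is uniform on (0, 1), so among 60 independent tests at the 5% level we expect about 3 false positives, and the chance of at least one is roughly 95%.

```python
import random

random.seed(42)

# Under the null, a test is 'significant' at p < 0.05 with probability 0.05,
# so the expected number of false positives in 60 tests is 60 * 0.05 = 3.
n_tests, alpha, n_sims = 60, 0.05, 10_000

false_positives = [
    sum(random.random() < alpha for _ in range(n_tests))
    for _ in range(n_sims)
]

mean_fp = sum(false_positives) / n_sims
at_least_one = sum(fp >= 1 for fp in false_positives) / n_sims

print(f"mean false positives per study: {mean_fp:.2f}")   # close to 3
print(f"chance of at least one:         {at_least_one:.2f}")  # close to 0.95
```

The analytic answer for “at least one” is 1 − 0.95⁶⁰ ≈ 0.95, so a single ‘significant’ hit in such a battery of tests is almost guaranteed even when nothing is going on.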

They then applied a more sensitive statistical procedure in which metabolite levels did not have to be divided into three groups. The resulting 30 tests showed one significant positive association, one borderline positive, and one significant negative link (that is, more of that metabolite was associated with better behaviour). This should surely have rung alarm bells with the researchers: there was essentially nothing, statistically, to suggest an association between the metabolites and behavioural problems.
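How surprising is one significant result in 30 tests if the null holds everywhere? Treating the tests as independent (an illustrative simplification, not a claim about the paper’s data), the count of significant results follows a Binomial(30, 0.05) distribution, and getting one or fewer is entirely unremarkable:

```python
from math import comb

def binom_pmf(k: int, n: int, p: float) -> float:
    """Probability of exactly k 'significant' results among n null tests."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

n, alpha = 30, 0.05
p_one_or_fewer = sum(binom_pmf(k, n, alpha) for k in range(2))
print(f"P(0 or 1 significant in 30 null tests) = {p_one_or_fewer:.2f}")  # about 0.55
```

In other words, a study with no real effects at all would produce a result at least this ‘positive’ more often than not.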

What about that three-fold increase in bad behaviour identified in the press release? The researchers themselves didn’t even mention it in the paper — it turns out to be for a single outcome and a specific metabolite, and not at the highest level as stated in the press release.

This was an interesting study, but one which ended up with extremely weak evidence for any positive association. Personally, I consider it unacceptable that papers are still published with abstracts which report only positive findings and do not reflect the totality of statistical tests conducted.

Worse, the selective reporting of the paper itself was enhanced — and even inaccurately represented — by a press release that could be considered misleading.

This kind of promotion is hardly a new issue, but sadly it does not reflect well on the scientists, the journal, or the press office. Fortunately the story did not receive much publicity (I might have had some influence on this, through critical comments being distributed by the Science Media Centre). The Daily Mail’s coverage was fairly uncritical. But a commentator in the Sydney Morning Herald pointed out the possibility of any link being due to ‘reverse causation’: could children with behavioural difficulties tend to get more exposed to head lice treatments?
