16. Opinions in research, confirmation bias, and policies not based on data.

Kuba Pilch
Published in Kuba reads · Jun 11, 2020

Research papers, scientific articles, journal publications — all these terms carry a somewhat dry feeling of pure facts and lack of emotion. After all, science is supposed to tell us what is, not what someone thinks there is. But such an approach is almost by definition wrong — science is a process of iteration towards ultimate truth, and being led by emotional and opinionated creatures, it is bound to inspire lots of opinions. While in standard research articles we can often spot statements that represent opinions (“we think”, “we believe”, etc.), there are many kinds of scientific content intended, in principle, to share the authors’ perspectives. It is critical to maintain common sense and a logical approach when digesting such texts, and to be especially mindful of our confirmation bias.

Reducing transmission of SARS-CoV-2 — very easy read. In this peer-reviewed perspective, Prather et al. make statements I personally agree with — people need to wear masks and we have to increase testing capacity to effectively fight the current pandemic. Many of the sentences are also based on common sense — if a coughing person wears a mask and we wear a mask, our exposure is smaller than if either or both of us did not. And while the read is overall interesting and I agree with its message, it does convey some controversial statements as well.

For example, the authors compare aerosol spread to cigarette smoke. Their sentiment may be correct (and, again, I agree with it), but we do not actually know whether aerosol that has travelled far can carry a viral load capable of causing infection. Similarly, the authors write “Masks can also protect uninfected individuals from SARS-CoV-2 aerosols and droplets (13, 14)”. However, this statement is not backed by their references! I wrote about ref. 13 in post #5, and it concerns masks as a means of protecting others. Reference 14 concerns disease spread in Syrian golden hamsters — a step in modeling, but not enough to support the authors’ statement. A non-scientific reader could reasonably feel cheated if they discovered that the claim had no basis in the cited research, which could in turn undermine their trust in science. This is exactly why precision is important, even in papers that are not strictly research-oriented.

Finally, it would be easy for an informed reader to skim through and simply trust the peer-reviewed text because of confirmation bias. We may have already heard a similar statement and just register that this one must be true as well. And while that would often be fine, there are always those instances where “trust, but verify” proves to be an invaluable approach.

Sometimes, however, common sense may betray us. If we see a leak in our ceiling and there is a storm outside, we may just assume our roof is damaged and never check the pipes. This, of course, is a silly example, but there are cases where policies are implemented based on such common sense, without research, basically hoping for the right result. Often such laws bring no change, or even worsen the situation (see: the “war on drugs”, which has not led to a reduction in drug use or deaths; if anything, it has done the opposite). I had a chance to find a great example of such a scenario.

Impact of a Half Dome Cable Permitting Process on Search and Rescue Activity, Hiker Mortality Rates, and Operational Costs Above Little Yosemite Valley — easy read. Spano et al. analyzed whether the policy barring hikers without a permit from climbing the famous Yosemite Half Dome cables brought the desired effects: reducing deaths and overall rescue missions. The motivation behind the rules was common sense. During the season there could be tens or even hundreds of hikers on the final cable section at the same time, and regulators and rangers looking at the pictures perceived increased danger when more than 70 people were on the cables at once. So, by reducing the number of people attempting the trip, the desired results should follow.

However, the results show that the overall number of incidents has not significantly decreased. First, through analysis of rescue reports, the authors noted that only 20% of the deaths before the restrictions happened on the restricted cables. Further analysis of the data showed that the causes of distress, both before and after the restrictions, ranged from heart problems and symptoms of oxygen deprivation to rattlesnake bites and fear of falling. Arguably, it is harder to see the “common sense” behind the restrictions after seeing this breakdown. The discussion in the article also presents many interesting insights and possible reasons for the observed changes.

As we can see, certain policies should be based on research from the start, or at least revised after such research is conducted. Otherwise our lawmakers are stumbling in the dark, with minimal to no chance of success. And because policies do have the potential to impact lives, such diligence is critical.
