Scientific fact: Neither sacred nor free

This essay first appeared in the collection Don’t Dream It’s Over: Reimagining Journalism in Aotearoa New Zealand. Published by Freerange Press, the book contains a series of essays by people much smarter than me, who know an awful lot more about journalism… but this is my naïve take on the science end of things. There’s a review here which gives more detail and, basically, makes the case for buying a copy.

1. A paradigm shift in science communication

Climate change. GMOs. Animals in medical research. Ethics in the age of big data. Claims of health benefits of [insert food here]. Vaccination. Wifi and cancer; diet and cancer; vitamins and cancer; the concept of a cure for cancer. Biodiversity. Conservation. De-extinction. The psychology of unconscious biases, and the social science of economic inequality. Water quality. Gravitational waves.

Effective communication of scientific research matters. There’s cause to believe that this is a well-accepted truth: if nothing else, the recent establishment of high-profile prizes for science communication makes this clear. (In New Zealand the Prime Minister’s Prize for Science Communication, with a hefty monetary award, was established in 2009; the Royal Society of New Zealand awarded its inaugural Callaghan Medal, named after the well-known physicist Sir Paul Callaghan, in 2011.)

Scientists — in particular early career researchers — are encouraged to put time and effort into communicating their work. This happens in structural ways that impact on researchers’ careers. As part of the New Zealand National Science Challenges, the Nation of Curious Minds–He Whenua Hihiri i te Mahara project was created with a deliberate dual focus: to enhance the public’s engagement with science and, equally, the ways in which scientists engage with the public.

The outdated paradigm whose shortcomings justify the need for this dual focus is referred to as the deficit model, or sometimes as the information or knowledge deficit model. Scientists who assume that members of the public — whether parents who refuse to vaccinate their children, or politicians who refuse to act on climate change — are simply in need of more information are falling into a well-known fallacy, repeatedly demonstrated in studies of education and science communication: sometimes the facts are not enough. Belief is not always based on evidence.

The idea that information alone does not enable good decision-making — at least in the sense of acting in agreement with what scientific evidence would suggest is rational — is useful motivation for scientists to get out and learn what their research actually means to people. There’s no doubt, however, that science communication through engagement with communities is much more costly for scientific researchers than simply putting themselves and their work out there in the form of traditional public talks. Likewise, traditional forms of science journalism have the clear benefit of having broad reach without significant demands on the researcher’s time, but are potentially limited by an imagination gap: perceived public interest drives the choice of topic, resulting in a bias towards the familiar and the canonically important. At the same time, the need to publish peer-reviewed scientific articles has only become more critical to the success of a scientific career; there is therefore a real sense in which the expectation that scientists should do public-focused science communication is increasing the workload for scientists, which comes at a cost to the time that they have in which to do research.

There must be a motivation for scientists to put the time in, in other words. Increasingly, this motivation is framed in terms of benefits to the individual scientist. Some of these motivations are altruistic — such as the desire to contribute to better decision-making, in environmental or health policy, for example, which is often a natural reflection of the drivers that spur people to work in these fields in the first place. Some of these motivations might be considered selfish — the desire to have more evidence of the impact of your research, for example, which can be used to leverage more research funding — but only selfish in the sense that any effort for self-promotion in any industry can be considered selfish.

The concerning step is that at which the desire for self-promotion — what would be a normal desire for professional success in most industries — comes at a cost to the quality of the science itself. More important still, depending on the implications of the claims being made — in human health, for example — is the effect of the communication itself: claims relayed unquestioned by the media risk creating a false perception of truth.

So when does science communication become a public relations exercise? And when is that — and when is that not — ok?

These are the questions that the scientific community is asking itself.

In early 2016, the University of Maryland put out a press release claiming that chocolate milk helped football players to recover from concussion. A great news story — as reported by Health News Review — only, the research was not peer reviewed, not published, and — ouch — not even complete.

There’s been a lot written about the tendency of media reports to get science stories wrong. Many scientists are wary of journalists, resenting the perceived need to sacrifice scientific accuracy to make a story both palatable and digestible for the public. There’s no doubt that misrepresentation of scientific studies happens, and that the combination of a weird-sounding result with the ability to say ‘Scientists say…’ is a simple recipe for clickbait. It’s also clear that careers in journalism have been under serious pressure for some time now, and it is tempting to write off some of the more egregious examples of journalists getting it wrong — such as mistaking the anatomical measurement of ‘heel length’, from ankle to Achilles tendon, for the height of a stiletto — as simply due to this journalistic attrition.

I’m not convinced that that’s all that’s going on here, though. While the chocolate milk story might be an extreme example of the extent to which even our educational institutions are willing to undermine the scientific process, it is by no means an isolated incident. In 2014 a meta-analysis of hyperbole in university press releases, and in subsequent reporting, came to the unequivocal conclusion that when media reports on scientific studies are inaccurate, the hyperbole was generally already there in the university press release.

Science, we have a problem.

2. Freedom and responsibility

‘Comment is free, but facts are sacred.’

But are they?

It’s a curious quote in the context of science communication. The original intent of C.P. Scott, in his editorial in the Manchester Guardian in 1921, was to underline the commitment of journalism to the truth — to honesty in both content and presentation: ‘Neither in what it gives, nor in what it does not give, nor in the mode of presentation must the unclouded face of truth suffer wrong.’ From the point of view of science, however, the sanctity of facts themselves is, and should be, an uneasy concept. Science is a process, by which we discover how the world works: thus any fact, no matter how long it has been considered to be true, must be able to be abandoned in the light of new evidence.

Facts are not free. Nor are they sacred.

The need to question, to interrogate science (and, in effect, scientists), sits uncomfortably with the idea of science communication as a public relations exercise. If the motivation presented to scientists for their science communication is that sharing their knowledge with the public is an unquestioned good, because science itself is unquestionably good, then criticism or questioning of a scientific story becomes problematic — indeed, almost disrespectful.

There is something else missing in this model for science communication, and what I think is missing is the role of the media itself. The journalist. The person who asks questions, with an eye trained to detect hyperbole, and an understanding of objectivity that can contend with scientific data and their interpretation. People trained in the art of critique.

This is the issue — not with science communication itself, but with the framing of it as something which defies critique — that I think is in need of reimagining. Just as frameworks for science communication based around the deficit model ignored the needs of the public, the current emphasis on two-way communication continues to centre the scientist, as the actor who is both speaking and listening to the public, thereby ignoring the imbalance of power inherent in the relationship between the two parties.

If we look to centre the role of the media in science communication, the relationship between scientists and the public becomes much clearer, and much more balanced. The public becomes better defined — no longer an amorphous mass defined in opposition to the scientific community, but readers of newspapers and blogs, parents in need of information about vaccination, or taxpayers with a sceptical eye for government expenditure. It is precisely on those issues where the scientific community should not be expected to look at itself with a dispassionate eye that building trust requires the engagement of an objective third party: the media.

For it is generally true that scientists care about their work and cannot always be dispassionate observers, despite the central role of the concept of an objective observer in the stories that we tell ourselves about how science works. But in perpetuating this myth — by concealing the fact that real people do science, and for a range of motivations — is it possible that we in fact do damage to the trust the public has in science?

A recent study of the interactions between climate change scientists and the media shines a little light on these questions. As described by Post in the journal Public Understanding of Science, scientists’ interactions with the media — and their concerns about communicating the uncertainties in current scientific understanding — change according to the political or social context. In particular, climate scientists are more willing to discuss uncertainties in the data or their interpretation when they are less concerned about the possibility of misrepresentation by interest groups. It is also considered more problematic to discuss results suggesting that climate change might proceed more slowly than previously expected than to discuss results suggesting it might occur faster. The context within which the media itself is operating no doubt complicates these concerns further.

This makes some sense in terms of the way that science actually proceeds. If news is made on the basis of a single study — as it inevitably is when there is a ‘new’, and hence newsworthy, result — then this study has to be evaluated in the context of all previous research in the field. A new result that suggests that the status quo is more problematic than previously thought is nonetheless more consistent with the direction of previously published literature (in terms of the actions which it would suggest and support) than is a single result that tends towards moderating, or reversing, the scientific consensus. Not all risks are equal, and not all scientific studies are equal — and taking this into account requires a complicated evaluation of evidence and context.

It is in many ways unfortunate that there is a good deal of scepticism about the role of advocacy in science communication, and mistrust of the idea that a scientist might be more or less passionate about communicating a scientific result to the public, depending on what that result is. That mistrust is naïve. The meaning of any individual scientific study depends crucially on its context within the scientific literature. This judgement — itself somewhat subjective, in the sense that not all results are equal — needs to be understood in any discussion of the role of objectivity in science. Objectivity is something that we aspire to as scientists — and if a sufficient number of experiments are done, at different times and in different places, and by different people, we can converge on something that looks like an objective truth.

3. Objectivity

There’s another way of making this case, about the role of journalism and the media in science, which almost turns the relationship between scientists and journalists on its head. Journalists can be thought of as being poised between scientists and the public, and this is a useful framing. But we can equally consider the ways in which the boundary between science and journalism can be blurred so that we can see better what we have in common.

The concept of objectivity is a natural connection. That’s not to say that we mean exactly the same thing by the word: of course we don’t. But then again, ‘objectivity’, in different branches of science, and at different times in history, has referred to different things. My favourite example is that the current meanings of the words ‘subjective’ and ‘objective’ date only to Immanuel Kant, in the eighteenth century; before him, they meant almost exactly the opposite of what they do now.

In modern science, an objective representation of a plant species may need to be a composite of many different, and variable, examples of the species. Mechanical objectivity, on the other hand, might refer to machinery used in physical experiments to capture data independently of human judgment; a classic example would be photography. Yet there is always more than one way to take a photo.

In journalism the concept of objectivity is used to convey the importance of fairness and truth in reporting, and — similarly to its usage in physics, where a metaphorical observer is often placed in a thought experiment to keep track of what is going on — a kind of disinterestedness that amounts to trying to remove the journalist from the interview, from the data collection, and from the analysis. In a similar vein to the discussion in the scientific community, the journalist is expected to be merely an ‘honest broker’ of the facts, and never an advocate for one view or another.

But is this really the way that objectivity works? In science, one serious concern is that the pressures of PR and self-promotion create a selection bias such that only positive scientific results are reported. Possibly worse is the prospect that scientific research on subjects likely to attract positive media attention is more likely to be carried out in the first place: in the New Zealand context, for example, the preference that both the public and funders have for research on our iconic birds over native invertebrates.

The discovery of CRISPR — clustered regularly interspaced short palindromic repeats — has been called a game-changer in biology, as a new way of allowing genes to be edited with precision. Such scientific discoveries — the ones that change the way scientific research is done, opening up new avenues of investigation — are immediately recognisable candidates for Nobel Prizes. In a strikingly clear example of self-promotion, Eric Lander — founding director of the Broad Institute, where some of the key work on CRISPR was done — wrote a history of the discovery for the journal Cell, in which he highlighted the work done by researchers at his own institute, and glossed over the work done by other scientists, most notably researchers Jennifer Doudna and Emmanuelle Charpentier, with whom his institute is currently fighting a costly, high-stakes battle over patent rights.

First-hand accounts of the ways in which science progresses are a singularly valuable form of science communication, and they can naturally never be without their own biases: but transparency in recognising those biases is necessary. Is it possible that the scientific ideal of objectivity is itself a hindrance to open and full disclosure of conflicts of interest, because of the perception that scientists need to be objective, and act in an unbiased fashion?

There are definitely cultural factors at work that need to be recognised. The tendency to self-promote, and the extent to which your community will accept it, varies between scientific disciplines and cultures. Accordingly, the balance between historical background and recent results in a scientific presentation needs to be adjusted to the intended audience: to a European audience, not fully crediting the major work done by others on a topic is evidence of either ignorance or self-importance, while to an American audience, dwelling too long on the past is a fatal underselling of the work you have done, tantamount to an admission of a lack of novelty. It is not always an easy balance to strike.

4. Transparency

In 2014 I conducted, on behalf of the New Zealand Association of Scientists (NZAS), a brief survey of scientists in New Zealand who were concerned enough about issues of science communication to respond to a few short questions about why scientists do — or do not — communicate directly to the public. This was based on a suggestion — made in the Nation of Curious Minds proposal document, in 2014 — that the Royal Society of New Zealand should compose a Code of Public Engagement, which would outline the ways in which scientists should — or should not — engage with the public. This code has been subsequently downgraded in status to a set of guidelines, in response to serious concerns raised within the scientific community that the implementation of a code — or perhaps, the mere suggestion that a code was needed — might actually discourage scientists from public communication, rather than enabling it.

Discussion of the barriers to science communication — in particular, around issues of the public good — opens cans of worms. Our Crown Research Institutes (CRIs) are publicly owned, and they are therefore required to do science for the good of New Zealand — but they are also required to be financially profitable, and they achieve this by bringing in significant amounts of funding from private, commercial sources: roughly a third of total revenue, though this varies greatly between CRIs. This is not so different from research in our universities: both the contestable grants systems managed by our Ministry of Business, Innovation and Employment, and the Performance-Based Research Fund system of baseline university research funding, have been adjusted since 2010 to increasingly incentivise, and financially reward, the support of research programmes by industry.

There is nothing wrong with industry funding research in itself. However, it is fair to query whether the ways in which government funding is allocated provide sufficient counterbalance to the natural pressures of industry funding.

More germane to the problem of effective science communication, is whether the source of funding, and its context, affect what a scientist is able to talk about publicly. One of the survey responses we received called it ‘amusingly naïve’ to assume that science communication is universally seen as a good thing: ‘How as a CRI scientist can I ever speak out against an industry that my CRI serves? I just cannot.’

These concerns are at the far end of the conflicts of interest that exist, naturally, whenever a scientist starts down the path of talking to the public about their research. The question that worries me is whether the current structures that exist within the scientific community are adequate to support these scientists to manage those conflicts of interest.

5. The importance of critique

Kevin Folta is a well-recognised plant scientist, based at the University of Florida, who has been engaged in science communication on the topic of genetically modified organisms (GMOs) for several years. Brooke Borel is a well-established science journalist who has written about the topic extensively, and generally favourably, such as in a ten-point debunking of myths about GMOs in 2014.

So how did these two people — the scientist and the journalist — come to clash so publicly in late 2015 that Folta claimed to have been the subject of a ‘personal takedown’?

The root cause seems to have been a lack of transparency. The reason for Borel’s initial interest in Folta was an undeclared relationship that he had with Monsanto, which was funding his science communication work on GMOs. This was uncovered by the anti-GMO group US Right to Know, through a Freedom of Information Act request (the US equivalent of New Zealand’s Official Information Act). It turned out that not only had Folta repeatedly denied any involvement with Monsanto, but he had emailed the company to thank it for a US$25,000 contribution to his communication work, promising ‘a solid return on investment’.

The story became more bizarre when Borel discovered that Folta had also been running a science podcast for several years under the name Vern Blazek, even going so far as to interview himself without disclosing his identity. This repeated lack of transparency stimulated a great deal of online discussion about the extent to which scientists should be wary of relationships with industry — or, more pragmatically, about the extent to which industry relationships are unavoidable, so that the only thing to do is to insist on complete transparency.

The biggest concern that comes through in online discussions of the issue is that, by being less than transparent about both the podcast and his relationship with Monsanto, Folta had made it possible for his opponents to justify their attacks on his intellectual freedom, and by implication, on that of other researchers. It is the system-wide consequences of his actions that lead to necessary questions about how we build and maintain trust.

Yet there is a very clear message that comes through in his writing online about his motivation — that science communication is good, because science, and scientific progress, are good. In the introductory post on his blog dating back to 2008, he says:

‘It would allow us to transcend useless tradition in favor of informed, reasonable decisions. It would allow society to separate lies, distortions and fraud from fact. We were going to embrace science and continue to use technology to speed our world into a brighter new millennium.’

This unquestioning view of the place of science in human society, as the source of all progress, is endemic in scientific culture — and that’s not completely a bad thing. It’s only human to want to think that your work makes a positive difference to the world, after all.

However, there are also serious dangers in science communication that becomes PR, even at its most impersonal and (apparently) unconflicted. Advocating for science on the basis that it is a Good Thing is fine up to a point. That point can be reached very quickly, however — whenever there is a suggestion that scientists know best, and can be trusted to act in the interests of the public good. Often such a suggestion is followed by the implication that they will necessarily do the Right Thing, simply because they are scientists. This is why we need a critical media: any discussion of science as progress that presents such progress as linear and inevitable, and neglects to mention the cul-de-sacs and backtracking that have occurred along the way — the Manhattan Project, the story of Henrietta Lacks, or Mengele’s research at Auschwitz come to mind — is flawed. Should the public trust us if we are not aware of — or if we are not willing to discuss — the skeletons in our scientific cupboards?

6. The public good

There is no question that there is a public good aspect to scientific research. It is however appropriate to query the extent to which the public good is balanced against other interests: those of the researchers themselves, their employers, and, in some cases, the interests of the broader scientific community. These last interests often appear as concerns about science being brought into disrepute; it seems that the idea of science heroes making great discoveries is so omnipresent that to suggest that individuals are fallible is seen as undermining the scientific method itself.

In New Zealand, the idea that researchers should have a role as ‘critic and conscience’ of society is embedded in the Education Act 1989. On the other hand, the fact that CRI researchers are not covered by the Education Act does lead to conflicts. As one person put it in the 2014 NZAS survey:

If you think scientists are both free thinkers and free talkers in New Zealand you’re about fifteen years out of date. In a nutshell science is a political arena in New Zealand, in a way that education is not. University staff may be free to talk but that privilege is now almost unique to them.

It is not at all clear, however, that intellectual freedom is entirely a privilege: there are times when it is much more clearly a responsibility. To take another international example, the lead poisoning of the water supply in the town of Flint, Michigan, in the United States, exposed by civil engineer Marc Edwards, has been an example of a scientist stepping up to do right by the community where the scientists responsible for the problem have not:

‘The agencies paid to protect these people weren’t solving the problem … they were the problem’.

In an interview with The Chronicle of Higher Education, Edwards discusses very directly his concerns about the connection between funding incentives and the ability of research scientists to prioritise the public good:

I am very concerned about the culture of academia in this country and the perverse incentives that are given to young faculty. The pressures to get funding are just extraordinary… the idea of science as a public good is being lost.

A New Zealand example that will be familiar to many is the case of Mike Joy. Although in a position to speak publicly, in his role as a university researcher, he has not been immune from criticism. In 2012, Joy commented to the New York Times about the 100% Pure brand used by Tourism New Zealand, stating that it was misleading given the current state of our environmental standards. This led to an editorial in the New Zealand Herald, criticising Joy’s ‘exaggerations’, but also — and more seriously — attacking his timing, suggesting that by speaking out a week before the launch of Peter Jackson’s The Hobbit: An Unexpected Journey, Joy’s comments might deter ‘big-spending American tourists’ from coming to New Zealand.

In the background to the Herald editorial was the shadow of politics. Prime Minister John Key had been interviewed on the BBC current affairs show Hardtalk over a year earlier, and had been asked at that time to comment on Mike Joy’s criticisms of the 100% Pure brand. John Key said of Mike Joy then: ‘He’s one academic, and like lawyers, I can provide you with another one that will give you a counterview.’

The dismissal of scientific expertise on the grounds that the scientists themselves are conflicted is problematic. Yes, to some extent, any scientist will come to an interview with a preformed point of view. The problem arises only when that point of view is allowed to pass unacknowledged and unchallenged. Like other human beings, scientists can and should be allowed to advocate for policy on the basis of their beliefs.

I’ve had far too much to say about issues in the science system for a book focused on the idea of reimagining the media. But the structural issues in science — which encourage individual researchers to reach out and communicate, yet provide little support or training for them when they do — require some explanation. There is indeed a deficit in science communication — not one of knowledge, I think, but of critique. Even more: a deficit of appreciation for critique.

Critique — especially in the form of journalism — is precisely the missing ingredient needed to build trust between scientists and the broader public. It certainly isn’t a substitute for the human engagement elements of science communication, which will always require that scientists get out and talk about their work — acting as role models, representing the reality that science is a human endeavour. However, it does achieve a related goal, and one that is no less important: giving the scientist space for advocacy and opinion, without undermining the ideal of an objective truth embedded in the scientific results being discussed.

I’d like us to reimagine the role that the media play in science. Journalists are too often seen as some sort of support crew, rather than as independent actors with their own role to play, who support public understanding of the scientific method by challenging and contextualising the interpretation of new results. I laid out earlier my concern that the concept of objectivity undermines the scientific acceptance of critique, but perhaps it is equally problematic to expect idealised objectivity from journalism, too. Just as funding both drives and limits scientific discovery, the prospect of a truly effective and objective media is restricted by the realities of who pays the bills.

Sadly, the reality of scientific research in an environment where much of our work is still supported by the taxpayer suggests that public funding alone is no panacea. However, there are aspects of what public funding can enable that we might like to examine, within the analogy between public good journalism and public good science. Public institutions — both funding bodies and professional societies — play an important role in defining the boundaries within which we operate as scientists. When individuals come under political or financial pressure, the sense of identity that comes with belonging to a defined community matters; for both scientists and journalists, this may mean that objectivity is only truly possible when it is an acknowledged, communal aspiration. Funding matters, but its publicly stated purpose matters even more.

Comment is free, as any glance below the line online shows, but neither journalism nor scientific fact is, or can be. The argument for publicly funded institutions is thus this: their contributions to the public sphere are priceless.