Alex Freeman
Sep 26, 2018

Have journalism and science communication crossed a line?

For a long time I worked in science communication, and we thought it was fairly easy to know when we’d done our job well. At the BBC, we had a whole raft of audience feedback for every television programme we made. How many people watched it? What was the average ‘appreciation index’ score that they gave it? We would also follow social media comments, and get emails and comments direct to our inboxes from viewers. Officially we were supposed to ‘inform, educate, entertain’ and we thought we knew how to do all three. Now, though, I am beginning to question what exactly science communication is doing.

Nearly two years ago, I left the BBC to join the new Winton Centre for Risk & Evidence Communication in Cambridge. The Centre is founded on the basis of being there to help ‘inform and not persuade’. When I first heard that phrase, it made complete sense and didn’t seem controversial.

At the Winton Centre we want to help present information (particularly numbers) to people in order to help them form their own opinion about a subject, and make their own decisions. For example, when you go to the doctor and have to choose a medical treatment, how can your doctor present the pros and cons of each option in such a way that you can choose the one that would be best for you? Some people may just want their doctor to decide for them, but others want to make their own choice — particularly when the treatments have big side-effects or can be dangerous, and the benefits might be slim.

Legally, this is also the basis of ‘informed consent’. Reinforced by a 2015 Supreme Court ruling known as the Montgomery judgement, doctors HAVE to ensure that their patients have understood the options that face them, understood the potential outcomes of each, and have chosen to accept the risks that come with their selected treatment. After all, if you have not consented to someone cutting you open with a knife, then whoever is doing it — regardless of whether they are a qualified surgeon or an assailant in a backstreet — is committing a serious offence.

These same principles apply in politics. In order to form an opinion on a policy, we need to know its pros and cons — and not just for us, but for everyone it might affect, and across many different domains (financial, environmental, health, and so on). Presenting this wider-ranging information is a far bigger ask than presenting information for a medical decision, but ethically (if not legally) it has to be tackled.

Many people working in journalism or science communication will think that this ‘informing’ is what they are trying to do. It’s what most of us at the BBC assumed we were trying to do. But I’ve come to see things slightly differently. The clue to a person’s intentions when they communicate is what they consider a measure of success.

For example, are you interested in what people do or think after they read or hear the information you provide? If you want people to change their behaviour or attitudes — to do something differently after hearing your message — then I would argue that you are primarily trying to persuade. Information might be part of your persuasive message: if you want people to stop taking an ineffective medicine, or care more about climate change, you could (and probably should) tell them about the evidence in each case. But research consistently shows that if you want people to change their behaviour, simply giving them information is not the way to do it — you need to appeal to their emotions and give them direct action points that they can act on easily and immediately.

These are, of course, exactly the sorts of techniques I became used to using as a documentary maker. We dealt in the world of emotions. We tried to get people’s attention and then to make them laugh, to make them cry, to give them clear ‘take home’ points — and it was a successful formula. It made for popular television, and these sorts of techniques have also become the mainstay of teaching scientists ‘how to communicate’.

But that is not impartial evidence communication. As journalists, script-writers and directors we knew what message we were trying to put across. We always had a carefully crafted ‘story’. Evidence presented purely to inform — balanced and not trying to persuade a person to agree or disagree — does not make for entertainment, almost by definition.

This is clearly an important difference. At the Winton Centre, we are designing communication for people who already want to know, who have an important decision to make, and want information about it presented clearly and without bias or ulterior motives. We can afford to live in the world of facts. We do not need to entertain, and we don’t care what decisions people make after considering the evidence. We just want those decisions to be informed and ‘good’.

Out in the competitive media or news environment, or trying to ‘raise awareness’ with a politician, attracting and holding attention is the first priority. Giving information and supporting a good decision is secondary. But how do the two interact?

This is where the two worlds — of facts and emotion — combine in an interesting way. Because in order to make a good decision, researchers believe that we first need to make ourselves imagine more than one potential future scenario. We need to open our minds to the possibility that things could turn out well or badly. We need emotion.

There are hints about the importance of this in some classic psychology experiments. You can change people’s perception of a risk by turning it from a bland number into an imagined scenario.

For instance, professional forensic psychiatrists asked to assess the chance of a patient in a psychiatric hospital harming someone on release classify more patients as ‘high risk’ when the risk is expressed in the format ‘20 out of 100 patients like this would likely harm someone’ than when it is expressed in the format ‘this patient has a 20% chance of harming someone’. A 20% chance of one patient harming someone sounds quite low. Twenty dangerous patients out of 100 conjures up a more vivid image of potentially disastrous consequences. The imagination of different potential outcomes has been set aflame.
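The two formats describe exactly the same underlying number; only the framing differs. As a minimal sketch (my own illustration, not code from the study), here is the same probability rendered in each format:

```python
def as_percentage(p):
    """Express a probability as a single-patient percentage statement."""
    return f"this patient has a {p:.0%} chance of harming someone"

def as_frequency(p, n=100):
    """Express the same probability as a natural frequency ('X out of N')."""
    count = round(p * n)
    return f"{count} out of {n} patients like this would likely harm someone"

p = 0.20
print(as_percentage(p))   # this patient has a 20% chance of harming someone
print(as_frequency(p))    # 20 out of 100 patients like this would likely harm someone
```

Mathematically the statements are interchangeable; psychologically, the frequency version invites the reader to picture twenty concrete people, which is exactly the imaginative leap the experiment exploited.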

But as this example also illustrates, unleashing imagination and emotion is dangerous when trying to make a good decision. Once you’ve imagined the incredibly rare chance of releasing an undetected serial killer from a psychiatric hospital, it’s difficult to shake off that strong emotion.

And this, for me, raises an issue.

We all have a right to be ‘informed and not persuaded’ — an ethical right and often a legal right. And yet training in communication focuses almost entirely on how to grab attention and how to manipulate emotions to tell a story — how to be persuasive. There is very little training on how to use emotions more subtly, in a way that opens minds to possibilities constructively, but is not designed to persuade (I’m certainly still learning how it might be done).

Of course in a competitive environment it is important to get your voice heard, and I often hear people say their objective is to persuade people to ‘do the right thing’. But who is defining ‘right’? If you as the communicator are — if you are defining the story — then you are not simply informing.

When people are making really important decisions — decisions about their health, their finances, about policies that will affect millions, or about someone’s guilt or innocence — I would now argue that a different kind of communication is needed: the skill to engage, to be clear, and to allow the audience to form their own story from the information and make their own decision at the end as to how to react to it.

This is clearly a kind of communication that requires at least as much skill, and whose success is much harder to judge. But I would argue that it needs to be recognised, and taught alongside persuasiveness, to those who work in areas where communicating facts is key. That includes science communication, political communication, expert witnesses in court and news reporting.


The Winton Centre for Risk and Evidence Communication is hosted within the Department of Pure Mathematics and Mathematical Statistics in the University of Cambridge. Transparent evidence designed to inform, not to persuade.
