A Spoonful of Sugar

Abel Gustafson, PhD
Aug 8, 2017 · 11 min read

Two guys are at a bar.
One asks: ‘What’s the biggest problem in the world today: Ignorance or Apathy?’
The other guy grumbles: ‘I don’t know, and I don’t care.’

Photo art by Kaleigh M

What is the shelf life of a box of Twinkies? Take a guess.
Contrary to popular belief, it is only about 45 days.

Also, all parts of your tongue taste all of the tastes.
Stress is not the cause of ulcers.
Buddha was not fat.

If we compared expert knowledge to public opinion on every topic, we would find that most people are mostly wrong about most things.

Most of the time, this doesn’t really matter.
I doubt that Buddha is eternally offended. And Twinkies are straight garbage, no matter when you eat them.

But it turns out that we are also often wrong about things that really do matter. On many issues that have significant implications for our current and future quality of life (think: food, energy, medicine, and the Earth’s ability to continue to support human society), there are huge discrepancies between public opinion and the best available science.
These large gaps are no small problem.

Because of this problem, the last few decades have seen many scientists, educators, politicians, and entertainers place a great deal of focus on efforts to a) change public opinion about science topics, and b) change public opinion about scientists and science itself.

It has not gone particularly well.

The scientific community is remarkably bad at science communication.

And it’s a special kind of bad.

It’s bad in the very same way that I’m a bad dancer: Well-meaning, but oblivious to the fact that I am alienating the very same 50% of the population that I am trying to attract.


In March of 2017, we scientists and science advocates were collectively in the spotlight for one weekend. Like many opportunities with great potential for benefit, this opportunity also had great potential for harm.

We marched through the streets of the world in support of Science, and we were broadcast into every corner of radio, television, and social media. We were inescapable.
Many of us were also intolerable.

The cornucopia of signage at the March for Science was a microcosm of the most common strategies of science communication.
Two of these strategies are both especially common and especially ineffective. We all do these (myself included) and we do them far too often.

Strategy One.

Many of us operate under the assumption that “If the public just wasn’t so ignorant, they’d agree with us! Here, let’s give them another bucket of facts.”

For decades, the focus of science communication has largely been on educating an ignorant public. The assumption is that people will believe X if we just tell them that the evidence and experts support it.

Strategy Two.

Many science evangelists succumb to baser, elitist instincts, unleashing their frustration by resorting to derogatory attacks, condescension, and B-grade jokes.

I mean, who doesn’t love being the butt of a good inside joke?

One observer of the March for Science Facebook page lamented “Science needs better PR, not more trolls… Idk, this thread just feels like a physicist version of the comma police.”

The experts say that both of these strategies are inherently ineffective.

There is a good bit of irony here. When someone uses these two strategies to wrangle others into obeying the basic recommendations of science, they are themselves demonstrating blatant disregard for the basic recommendations of the scientific research on science communication.

Here are 3 things relevant to science communication that scientists, of all people, should understand:

1. Asking humans to care about science is asking a lot.

Humans have a finite amount of cognitive capacity, so we use it sparingly — allocating it when and where it is most necessary. Usually, “most necessary” equates to “most important for getting our genes into the next generation” — such as pursuing attractive mates, prioritizing high-calorie foods, and running away from that rustle in the bushes behind you.

We are simply not wired to prioritize invisible, slow-moving threats like climate change. Even if there are mountains of supporting evidence, we don’t have much inherent concern, because it will most severely affect people who are a) thousands of miles away and b) not born yet. Those dusty, dry reports of climate change data are in competition for your limited cognitive resources with that TV ad for juicy glistening burgers held by juicy glistening people.

Conversely, we are much more likely to be concerned about visceral, imminent threats like the supposed dangers that lurk in vaccines, GMOs, and people who eat, pray, and love differently than us. Even if there is zero supporting evidence, the slightest suggestion of immediate threat to the immediate health of ourselves and our cubs can ignite our most fundamental fears.

Applied to science communication: In the battle for public opinion, scientific evidence is not the silver bullet, or the golden gun.
Quite to the contrary — facts, figures, findings, and logic are quite often the proverbial knives at the gun fight.
This is why Strategy One (“Pay attention to these facts!”) fails so frequently.

If Team A wields the epistemological superiority of facts, logic, evidence, and expert consensus — while Team B counters with the practical superiority of tribal instincts and primal priorities — guess who wins? One might say Team B has the trump card.

2. Humans reject things that appear to contradict their existing beliefs, values, social norms, behaviors, etc.

One of humans’ most reliable psychological traits is our proclivity to engage in “motivated reasoning” and “confirmation bias”: to interpret the world around us in a way that is most favorable to the ideologies and beliefs that we (and our tribe) already value.

Let’s say that you are watching your favorite team competing in a tight playoff game. A referee calls a penalty. Your opinion about the accuracy of that penalty call is not primarily driven by your IQ. It is not even driven by your expertise in the rules of the game.
Instead, the best predictor of your opinion is whether the call was for or against the team you are rooting for.

We also are prone to disproportionately look for, notice, remember, and trust information that confirms the opinions we already have.
When they show the replay of the penalty, the details you notice — and your interpretation of them — will most likely be skewed to justify your initial opinion.
If two sports commentators disagree on whether the penalty was justified, you are likely to conclude that the one that agreed with you is just more knowledgeable and trustworthy (unbiased, even) than the other.
On the flip side, we tend to ignore, reject, or discredit data (or people) that blatantly contradict the beliefs or values we already have.

Further, we have a natural knee-jerk reaction against being told we must — or must not — do something. This is called reactance.
Near my childhood home, there is a stretch of sidewalk that is forever imprinted with the tread of my bicycle tires. I blame it on the sign that told me that it was forbidden to touch the wet concrete.

Mix these together and the result is a cocktail that tastes like this: We have a natural reflex to push back against people telling us what to do or think, especially when it’s presented as being contrary to our existing beliefs, values, or actions. And especially when it’s coming from someone who makes it clear that they are not one of us.

This is precisely how people respond when scientific evidence is presented in an elitist, domineering, Science vs Public, condescending style. This is why Strategy Two fails so frequently.

Applied to science communication: It is always tempting to tell people to just shut up and listen to the experts.
Like an exasperated bartender saying: “Look — I’m the expert here, so I’ll be choosing your drink today. You’re getting a gin martini. Hendricks, cucumber, and basil. Because if I serve you that Bud Light, we’ll both look like colossal idiots.”

But when people detect tones of Us vs Them, the natural reflex is to batten down the hatches — to barricade their mind against strange ideas from strange outsiders.
This exacerbates the politicization of science, the distrust of the scientific community, and the alienation of the public.

We need to emphasize camaraderie, not condescension.
Commonalities, not commands.
It sounds cliche, but it works.

It works because if we first establish a little bit of trust — if we first establish that we are on the same team — then the subsequent information about science (or whatever) is processed through a lens of positive motivated reasoning.
Think about that. Hacking our primitive biases.

As a great pharmacological thought-leader once said: “a spoonful of sugar helps the medicine go down.”

3. Almost everybody rejects science.

Opposition to (and ignorance of) science is not restricted to any political affiliation, education level, or IQ score.

It may be the case that the stereotypical conservative denies climate change and fears Muslim refugees.
But the stereotypical liberal is so afraid of GMOs that the next horror flick coming out of Hollywood will likely be about a group of reckless teens vacationing at a remote cabin in the woods, slaughtered by a large, pest-resistant strawberry.

For every right-leaning person who ignores the experts and evidence (e.g., claims the death penalty is a crime deterrent, calls for trickle-down economics, and thinks that guns keep you safer), there is a left-leaning person who ignores the experts and evidence (e.g., peddles essential oils on Facebook, attributes their personality to being a Sagittarius, and makes liberal use of the word ‘toxins’).

In one recent experiment, people were shown some “new” (made up) scientific findings. Conservatives and liberals were roughly equally supportive of the scientific evidence when it was pitched as corroborating their political beliefs. When the findings were pitched as contradicting the person’s ideology, liberals and conservatives rejected the science in roughly equal measure.

Dan Kahan, the principal investigator of this study, notes that a majority of recent clashes between science and politics have been ones where conservative politics landed on the wrong side of science.
But, as the experiment showed, the rejection of science through motivated reasoning is not exclusive to conservatives.

Even more interesting, agreement with science has surprisingly little to do with intelligence or education. Just like how your opinion of a penalty in sports is not primarily driven by your intelligence or knowledge of the game.

Some very interesting research has found that people who are more scientifically literate and better with numbers respond to new science info with more political bias.
Did you catch that?
Higher cognitive ability and science knowledge do not necessarily make people converge upon scientific opinion. Instead, super-educated, super-smart people ingrain themselves into their biased beliefs about science topics even more stubbornly than the average bear.
Some researchers argue that this is because they are capable of rationalizing away any contradictory evidence that arises. If you are especially intelligent, that can make you especially adept at motivated reasoning.

Disagree? Before you violently object to this point, just stop and think about the irony of rejecting the research and experts here…

“Just as I thought: Extremely stubborn and suspicious.” — Mary Poppins

Liberals do not side with science on climate change because they are more rational or more enlightened. They do so because the evidence on this issue happens to fit their ideology (collaborative communal protection, collective action) and because they see other liberals supporting it (and/or see conservatives opposing it).

Similarly, conservatives side with science on GMOs because it fits their ideology (minimal regulation, individual responsibility) and because they see other conservatives supporting it (and/or see liberals opposing it).

So if you happen to have opinions that align with science on a particular set of topics, it is most likely NOT because you have escaped the cognitive shackles that still encumber your knuckle-dragging peers.
Instead, it is much more likely that you — purely by chance — were influenced by a social circle that valued the scientific opinion on that select assortment of topics, and your confirmation bias and motivated reasoning have taken over the reins from there.

Applied to science communication: Get down off your high horse. Speak with humility and respect. Recognize that disagreement with science about a particular topic is not a sign of cognitive deficiency — it is mostly an effect of social context.
Don’t do Strategy Two.


My point is not that it is bad to give people the data they need to make good decisions. And there is a time and a place to tell someone very firmly that science says they are wrong.
Just like in a good friendship, these things are often necessary — and they can even signify the importance and strength of that relationship.

Rather, my point is that these things should not be the focus of most conversations between two friends, and should not be the premise or defining feature of the relationship. That’s just not healthy.

We can learn from the few successes we have had, simply by observing their style and the relationship they create.
Carl Sagan is your spiritual mentor, Bill Nye is your favorite quirky teacher from grade school, NDGT is your cool grandfather.
They connect with us on a human dimension — speaking to our highest dreams and deepest fears, our strongest ethics and weakest faults.

Being the cleverest troll on the internet is quick and easy fun. I get it.
Being the Mr. Rogers of science is much more difficult, and much less fun.

But science says that persuading people to change deep-seated beliefs and behaviors requires a little tenderness, a lot of time, and a ton of trust.
Not lectures and snarky inside jokes.


Abel Gustafson is a PhD candidate at the University of California-Santa Barbara. His research optimizes strategic communication in science.
