The series “Guidance Systems” discusses technologies and techniques that seem to improve our lives by offering us new choices, while in fact shaping or removing our ability to decide things for ourselves. Here, we begin to look at systems of persuasion and manipulation known to psychologists and decision scientists that may have positive uses as well.
A few nights ago I was asked to appear on a news network long after dark, and I drove about half an hour to a studio near where I was staying. I’ve been doing these sorts of appearances for a few years, either as a guest or as an employee, so the ritual of it is becoming familiar to me: recite a handful of data points in an effort to memorize them as I drive into a somewhat forlorn part of town where the rents are low and the spaces are large, pass through a remotely operated security gate, shake hands with the camera operator who got stuck with the late-night guest, put on some powder, check myself in the mirror, sit down, plug in an earpiece and attach a microphone.
But the portion of the experience I’m not yet used to is the next part, when we’re settled and in place and waiting for someone to come over the line and tell us when I’ll be on. That uncertain stretch of time with the camera operator tends to generate a strange sort of conversation: polite, friendly, a bit commiserative, and always conscious that we can both be heard by the network on the microphone I’m wearing. In this case it was the typical exchange: a friendly, breezy chat, drawn from the news, filling time that would otherwise involve him staring at me and me staring at him, or both of us staring at our phones. We talked about climate change.
There are certain viewpoints that sit so entirely opposite mine I don’t have any idea how to address them. Climate change skepticism is one of them. I have a very difficult time keeping my head clear as I talk with people who believe global warming is a hoax. I find it disorienting — it’s as if we occupy separate realities. In the reality of people who dismiss climate change as a cynical political creation, the scientific community is a self-serving club of hacks who will do anything to steer attention and money their own way, and the few researchers brave and principled enough to buck the trend are being shouted down. In my reality, scientists are almost temperamentally incapable of agreeing with one another, so the fact that nearly all of them agree on the origins and threat of climate change is as close to scientific certainty as we’ll ever see in my lifetime, and anyone claiming to have a viable alternate theory is most likely either incorrect, looking for attention, or a combination of the two.
It’s hard to have that conversation.
But have it we did. He asked how I felt about climate change, while warily volunteering that the whole thing must be a bit overblown, right? The world is too big and mysterious for scientists to be able to accurately predict this sort of thing, right? And I went right ahead and plowed through my typical talking points on the subject, until a voice announced in both our earpieces that we’d be live in two minutes. We straightened up and quieted down and it was clear that neither of us had dented the other’s opinion, and that was that.
In an age when social media walls us off from what psychologists call “inconsistent information” — information that contradicts our beliefs — it can feel as if debate is pointless. But a new persuasion technique may be able to bring people together across enormous political and intellectual divides. It’s unorthodox and counterintuitive, it may be dangerous in the wrong hands, and it came into being not through typical behavioral research, but through a concert.
When the singer and composer Leonard Cohen left the stage, soaked in sweat, at the end of a 2009 concert in Tel Aviv, he couldn’t have known he’d just funded the means not only of better understanding deeply held beliefs, but perhaps of helping to rewrite them. He’d named the concert, attended by 47,000 people in the middle of widespread calls for boycotts of Israel following its three-week Gaza war, “A Concert for Reconciliation, Tolerance and Peace.” And he gave the approximately $1.5 million in ticket sales to a newly formed charity run by a board of Israelis and Palestinians, so that they could pursue projects that promoted coexistence. Cohen died in 2016.
The money funded, in part, work by a pair of researchers, Daniel Bar-Tal, who teaches at the school of education at Tel Aviv University, and Eran Halperin, a conflict specialist and psychology professor at the Interdisciplinary Center in Herzliya. Their mandate from Cohen’s organization was clear: Pursue peace along paths that haven’t been walked before. So, hoping to find new ideas, they approached Israeli advertising agencies, and asked for suggestions on how to break through seemingly impossible disagreements.
One freelance creative strategist, Atara Bieler, suggested exaggerating someone’s views rather than trying to argue against them. It’s a technique that professional debaters call reductio ad absurdum — blowing an opponent’s argument out of proportion, carrying it to its most logical but cartoonish extreme, and thereby reducing it to ridiculousness.
It turned out that her suggestion had a documented basis in psychology. The technical term is paradoxical thinking, and it’s best-known in the clinical world as a tactic used by addiction counselors for helping cigarette smokers to quit.
Say the patient, a lifelong smoker, remarks to you, his counselor, that doctors don’t know so much. Smoking isn’t as bad as they say! You, having read the literature, want to leap up and shout at him that his habit has given him a one-in-two chance of lung cancer, that smoking is on a par with knife-fighting for bodily risk. But your training stops you, and instead you engage in a type of paradoxical thinking called amplified reflection.
“You’re right,” you tell him with enthusiasm. “In fact, I make it a policy not to believe anything a doctor says. Lung cancer has nothing to do with cigarettes. It just happens or it doesn’t.”
You repeat that notion for a month or more of treatment, and pretty soon, the patient starts to push back. “Well,” he says slowly. “I don’t know. That seems extreme. I mean, doctors know at least a few things, right?” And soon, the patient is beginning to rethink his own feelings about a doctor’s advice and about the threat posed by cigarettes, and soon, if all goes well, he finds a new motivation to hear the evidence, follow a doctor’s advice, and kick the habit.
This is the rhetorical trick the Israeli researchers decided to employ in pursuit of what Cohen had asked for.
Boaz Hameiri grew up in a violent environment—the town of Netanya, a beach town roughly 20 minutes north of Tel Aviv by bus. His childhood was the time of the second intifada, and suicide bombers occasionally made their way into town. “Explosions used to happen near the city center, and I used to live near there, and so I heard a lot of explosions,” he remembers.
“I think I didn’t realize at that point that it was so unusual to experience that sort of violence. I only realized it when I was a little bit older, 18 or 20. So that made an impact. And when I grew up, I wanted to do something about it.”
Hameiri eventually joined Eran Halperin’s team as a PhD student in social psychology, and led, with his fellow student Roni Porat, a project that subjected an entire town to a grand experiment in paradoxical thinking.
Hameiri and his colleagues had already found success with online tests of paradoxical thinking, but they wanted to find an enclave of very strong political beliefs — a place where compromise seemed impossible. The researchers have kept the name of the town a secret.
It’s a deeply conservative place. In elections just before the experiment, “most of the people who live in the city voted for the religious Zionist right-wing party,” says Hameiri. “It’s mostly rightist and religious. This was important to us. We wanted to give it a severe test. The most tough test we can think of.”
In a 2016 study revealing the results, Hameiri and his colleagues say they built the campaign of paradoxical thinking for this city on morality, a strong aspect of national self-image in Israel.
“We focused on the perception of Palestinian responsibility for the continuation of the conflict, which engulfs several ethos of conflict beliefs: that is, that Israelis are peace loving and always reached out for peace and that the Palestinians are not real partners for peace because they prefer violence over negotiation,” they wrote.
To set off the paradox, Hameiri says, they blasted the town with billboards, t-shirts, buttons, and YouTube pre-roll advertisements that talked about the enormous morality of Israeli soldiers. “We show them videos of Israeli soldiers petting a cat, or helping a Palestinian child, or an older person. It’s all portraying Israeli soldiers as being very moral,” Hameiri says. “And then the tag line is ‘in order to be moral, we need the conflict.’ It’s not that we’re moral in spite of the conflict, like we manage to maintain our morality in spite of this really complicated situation. But perhaps we actually need it to be moral. So it’s kind of like an absurd twist on this belief.”
The results were dramatic. After months of exposure to these messages, self-proclaimed rightists reported being 30% more amenable to reconsidering their positions. And after a year, the changed attitude in those rightists persisted.
“We believe that the results show unequivocally that the paradoxical thinking intervention is effective,” the authors wrote.
In an October 2017 paper, “Paradoxical Thinking as a New Avenue of Intervention to Promote Peace,” Hameiri and his co-authors disassembled paradoxical thinking to identify the three components which make it work: “a perceived threat to the individuals’ identities, their surprised reaction, and general disagreement with the paradoxical thinking messages.” If those conditions are met, the authors write, it’s possible to be far more effective with paradoxical thinking than one can by introducing unwelcome and unfamiliar information to win an argument.
I ask Hameiri whether he’s thrilled by his findings. After all, he’s found a method to correct irrational beliefs! Perhaps in an age when the President of the United States is retweeting anti-Muslim messages, when self-avowed Nazis are marching in the streets, when people cannot agree on the facts of almost any matter, maybe this is the answer! But his answer is very measured.
First, he points out, “you really have to know what you’re doing.” The wrong messaging can not only leave people unconvinced; it can reinforce their beliefs. “We found in one study that wasn’t published that the messages we developed weren’t in essence exaggerated or absurd or extreme enough,” he says. “So actually what we did was making those who were pretty confident in their attitudes more confident in their attitudes. So it can really backfire.”
And second, the signs are that this doesn’t just work on people of a single political persuasion. Studies of Americans, Palestinians, Germans, and others have all shown that nearly anyone, liberal or conservative, who meets the conditions of identity threat, surprise, and general disagreement can be moved back from the edges to the center by this technique.
“I’m hopeful as a person, but with paradoxical thinking I’m more fearful than hopeful,” Hameiri says. “It’s quite a powerful tool.” He says he worries it could be used for marketing, for political manipulation, for any number of things. And he says he hopes that by studying the technique, identifying its necessary conditions and its real-world limits, he can help us all to protect ourselves against manipulation.
Perhaps the fact that people of any political leaning can be pulled back to an upright, central place — a middle zone of agreement — is a good thing, I suggest to Hameiri. After all, conflict resolution experts like Lord Alderdice say that the trick of successfully navigating a disagreement is to assume that no one is right, that everyone, in their way, is wrong. And finding our way to agreement involves turning away from something we were absolutely convinced we were right about. If there’s enough of this technique going around, and people are pushed by identity threat, surprise, and disagreement to reconsider their own beliefs, maybe we can all find our way to a much more sustainable place. Maybe we can all agree on what’s right?
Hameiri laughs. “As one of my professors used to say, ‘that’s an empirical question.’”