There’s a scene toward the end of the documentary Holy Hell where Snow Patrol’s “Chasing Cars” plays in the background as the camera cuts to close-up shots of various Buddhafield cult members. The cult members are wearing flowing fabrics, and a few have flower crowns on their heads. In some shots, the members sit cross-legged or wave their arms above their heads, as if their limbs are made of something more fluid than bone. When they look into the camera, their gazes are direct, blissed out, and a little disconcerting.

When I played this clip for my research seminar, more than a few people were slightly disturbed. The footage has an otherworldly feel; when you watch it, it's hard to put yourself in the shoes of the people in the video. Maybe it's the expressions on their faces, or the corniness of it all, or even the music, but it feels especially culty. And these aren't actors: they're real people who thought they were making a propaganda video for the cult but later became part of a documentary exposé.

In an interview with The Wrap following Holy Hell’s premiere, documentarian and former Buddhafield member Will Allen explains that he tries not to use the word “cult” to describe his experience, because “it’s hard to get out from under it.” He describes the “innocent beginnings” of a spiritual group that evolved into a cult. “I ended up getting stuck in it,” Allen said. “I would use that word. It wasn’t my plan.”

It’s tempting to buy into the fantasy that getting drawn into a cult is something that happens to a “different” kind of person and would never happen to us. But interestingly, research suggests that circumstances may be more significant than personality when it comes to susceptibility to cults. Cult members are typically psychologically healthy, often particularly idealistic, and may be going through “normal life blips,” such as a breakup or another period of upheaval or transition, that make them more amenable to a cult’s messaging.

“These people are often idealists. They think they will make a difference to humanity, or that they will best serve their god or their ideals within the group. Individuals who are recruited are also often young adults in the middle or upper class,” Robert Pardon, director of the New England Institute of Religious Research, told IBTimes UK.

The combination of idealism — which may lead an individual to reject certain societal conventions — and emotional upheaval often means that the vulnerable individual is actively searching for “solutions” to his or her problems. Cults provide the “structure, authority, and close social contacts” that can feel like both solution and purpose.

Once an individual joins a cult and conforms to the cult leader's dictums on how to think and how to live, psychologists describe a process of "snapping," a sudden shift in personality. This shift helps account for the difficulty that friends and families of victims can have in getting through to their loved ones: they often feel like the person they're trying to reach has become a stranger.

In a video explaining the Asch conformity experiment, Philip Zimbardo, a psychologist and former Stanford professor best known for the Stanford prison experiment, demonstrates how easy it is for individuals to yield to group influence.

In the study, subjects are given a standard line and three comparison lines and asked to choose which of the comparisons is the same length as the standard. The choice should be an easy one; the lines aren't meant to present any sort of optical illusion. But planted subjects, who are instructed to select the wrong comparison line, go first. In Solomon Asch's original conformity study, this was enough to make subjects yield to group influence on 37 percent of the critical trials.

“As long as there are three or more people who agree that reality is not the way you see it, in many cases, you give in to see the world in their way,” Zimbardo says.

It’s fascinating to watch footage from the original Asch conformity experiment and wonder at the subject who delivers what is clearly an incorrect answer, solely based on pressure from the group. As with joining a cult, we may believe — or want to believe — that we would perform differently, choose differently, in similar circumstances. Of course, that’s also the temptation with Zimbardo’s Stanford prison experiment. When viewing it as a detached observer, it’s all too easy to make the assumption that it would never happen to us.

In the Asch conformity experiment video, a narrator explains how distortion can happen at the level of either perception or response. In the case of informational conformity, an individual goes along with the group because its arguments convince him or her that the group is right (a perception-level distortion). With normative conformity, an individual goes along with the group because he/she is apprehensive that the group will disapprove (a response-level distortion).

Something interesting happens, however, when the subject is no longer alone in defying the group consensus. When subjects were given a "partner" who answered correctly before they did, yielding fell from 37 percent to only 5 percent. "The partnership variation shows that much of the power of the group came not merely from its numbers, but from the unanimity of its opposition," explains the narrator. "When that unanimity is punctured, the group's power is greatly reduced."

In Holy Hell, Allen describes the domino effect of members leaving the Buddhafield once the Teacher's hypocrisies and exploitations are exposed. No longer a unanimous unit, many members break off and drift away. Yet that wasn't the end of the road for the Teacher: as of the documentary's release in 2016, he was living in Hawaii with dozens of followers. The film's ending suggests that as long as there are people seeking answers to life's biggest questions, there will be receptive audiences for cult leaders who seem to provide those answers.