Prisons Without Bars

“The greatest enemy of knowledge is not ignorance; it is the illusion of knowledge.” — Stephen Hawking

Russ Isaacs-Wade
Dialogue & Discourse
Jun 12, 2024


Cult Examples: Jonestown, Branch Davidians, Heaven’s Gate, Unification Church, Church of Scientology, Trump (graphic assembled by author from multiple public sources)

Over the last few years I have noticed increasing bewilderment at the fact that otherwise intelligent people genuinely believe various absurd explanations of reality. For example:

  • Believing a woman named Eve was created from the rib of a man named Adam — Fundamentalist Christians.
  • Believing the Earth and all it contains was created in six twenty-four-hour periods — Fundamentalist Christians.
  • Believing an evil cult is ruling the planet — QAnon and the Illuminati.
  • Believing people are extraterrestrial beings called thetans taking up residence in human bodies — Scientology.
  • Believing followers could transform themselves into immortal extraterrestrial beings — Heaven’s Gate.
  • Believing in the imminent return of Jesus and the onset of Armageddon at that time — Branch Davidians, Jehovah’s Witnesses, and numerous others.

Et cetera.

Of course, living in a free country means people are free to believe whatever they want... right? But what if they’re not? What if a person’s freedom of thought has been usurped, literally taken without the person even being aware of the theft?

Each American is vulnerable to snapping, even if he or she has never considered participating in a religious cult or mass therapy. The techniques used by America’s cults and therapies permeate every level of American society, from government and business to our daily social interactions. Yet most of us have little understanding of the extent to which we ourselves — not only our beliefs and opinions but our individual personalities — may be shaped and changed by those around us and by things we experience every day. (Conway and Siegelman) ¹

What is a cult?

If asked, most people would probably say they know what a cult is; however, if pressed for an easily understandable and workable definition, they are unlikely to produce one. The problem is that no settled agreement on what constitutes a cult exists, although this is not due to a lack of trying:

It is in most contexts a pejorative term, also used for a new religious movement or other social group which is defined by its unusual religious, spiritual, or philosophical beliefs and rituals, or its common interest in a particular person, object, or goal. This sense of the term is weakly defined — having divergent definitions both in popular culture and academia — and has also been an ongoing source of contention among scholars across several fields of study. ²

Because a working definition of a cult has proven to be elusive, I recommend the following definition:

A cult is any group of people who share an unfalsifiable, hence unprovable, core belief promoted as the only true representation of reality and upon which belief the group’s existence depends.

It follows from this definition that all religions and the sects which derive from them, as well as many non-religious and quasi-religious groups, are cults.

However, while pushing an unprovable belief as truth justifies a healthy skepticism, the pejorative application of the term cult in “most contexts” suggests something more sinister is at work than the mere naïveté of a group’s members.

Likewise, although one might deplore that the majority of humanity subscribes to unprovable beliefs (for example, the Abrahamic religions’ unprovable belief in the existence of God, Buddhism’s and Hinduism’s unprovable belief in reincarnation, and, for that matter, any atheistic group’s unprovable belief that God doesn’t exist), I suspect most people would not consider these major belief systems as sinister as the pejorative use of cult implies.

Nevertheless, it may be informative to study the history of a group, even a group of long standing, to determine whether the pejorative sense of cult is warranted. For example, Catholicism is generally accepted as a legitimate religion nowadays despite having a significant history of persecuting both believers and non-believers. In this context, the numerous inquisitions come to mind.

It is also curious how many of the truly malevolent cults (that is, cults harmful to their members and society) utilize religious ideas. For example:

Cult leaders utilize religious ideas to persuade (graphic composed by author, references noted).

Perhaps, because cult leaders are master manipulators, they realize how malleable the minds of people willing to believe explanations of reality without proof can be. Perhaps these manipulators realize people will obediently follow their demands if those people can first be convinced the cult leader believes as they do.

Trump holding up Bible ⁶

But the question remains: why do smart people join cults and come to believe the truly ridiculous dogma such cults propagate?

What is a malevolent cult?

I submit that a cult is malevolent when its primary reason for existing is to replace your conception of reality with its own, including destroying (literally) your ability to conceive of a reality that differs from its dogma or to question its legitimacy. Regardless of how cult members describe their experience of membership (awesome, wonderful, eye-opening, etc.), they are in fact in an intellectual prison.

A generic overview of the structure, organization, and strategic elements of malevolent cults is presented in the following schema:

Typical Cult Organization and Control Functions (graphic created by author)

The motivation of a cult’s leadership is obtaining power by growing and maintaining the cult’s membership. Being the leader of a large group of people promotes the impression, both inside and outside the cult and in the leader’s own mind, that he is special.

The leader’s belief that he is special and far superior to other human beings is confirmed by the cult membership’s undivided devotion to the leader’s belief system (a good example of this type of fawning devotion can be seen on a video here ⁷).

In my opinion, it is the cult’s membership control strategy that reveals whether a cult is malevolent. Some or all of the control strategies engaged in by malevolent cults are as follows:

  • A psychological information barrier is established between cult members and nonmembers in the form of ‘us versus them’ beliefs. Equivalent meanings might be, ‘members are privileged, non-members are evil’, ‘members are saved, non-members are lost’, or even ‘members are superior, non-members are inferior’.
  • A physical information barrier exists where members are encouraged to distance themselves from all previous non-member relationships, such as, family, friends, and romantic relationships.
  • Independent thought is discouraged, sometimes abusively so, through ridicule and/or silent disregard. Only the ideas of the cult’s leadership are valued and promoted.
  • Only the activities and dogma of the cult are considered worthy of pursuing. The primary activities expected of malevolent cult members are recruiting new members, contributing financial resources, and attending all member meetings. The time constraints and financial limitations of individual members are of no importance to the cult’s leadership.
  • Membership meetings consist exclusively of repetitive lectures about the obvious wisdom and truth of the cult’s dogma. Members are encouraged to shout repetitive statements supporting the dogma, for example, when members of certain evangelical Christian cults shout ‘Jesus is Lord!’ or ‘We agree, Jesus is Lord!’. The meetings of some Buddhist cults consist entirely of highly repetitive group chanting, even though members may not know the meaning of what they are chanting.
  • The loss of the cult’s approbation is constantly threatened in subtle ways. Even a disapproving look from the cult’s leadership can induce a member’s deeper commitment to the cult’s beliefs and agenda.
  • Fear about the dire consequences of leaving or disparaging the cult is frequently built into the cult’s dogma. For example, ex-cult members may be presented as occupying a position even worse than nonmembers, having voluntarily rejected the only path available to receiving ‘eternal life’, achieving ‘nirvana’, or avoiding harmful life events. One Buddhist sect teaches that people who leave the cult always come to some horrific demise.
  • And worst of all, the strategies above are used to disable a member’s ability to think and reason apart from the cult’s dogma.

Attempting to learn from members why they have joined a cult is a fruitless task. Cult members will invariably explain their behavior from the content of their beliefs rather than the context of their beliefs.

We talked with dozens of individuals in this state of mind: cult members, recent est graduates, Born Again Christians, and even some Transcendental Meditators. After a while, it seemed very much like dancing to a broken record. We would ask a question, and the individual would spin round and round in a circle of dogma. If we tried to interrupt, he or she would simply pick right up again or go back to the beginning and start over. (p. 54) ¹

Besides, cult members believe joining the cult was a choice they freely made and may be truly unaware they were manipulated into making that choice.

Non-member susceptibility factors.

Two sets of conditions will increase the likelihood that a person who is not a cult member will be converted: (1) the manipulation skills of the recruiting cult member applying the cult’s induction techniques, and (2) the current psychological state of the targeted non-member.

The cult recruiter’s perspective.

One very important aspect to remember about the recruiting cult member is that, no matter what this person’s objectives are perceived or stated to be toward the targeted non-member (for example, conveying concern for the target’s welfare), there is really only one objective in play: to induce the target to join his or her cult.

Just as growing the cult’s membership validates the cult leader’s belief that he is special and more important than other human beings, so getting someone to join the cult validates a member’s belief in the cult’s dogma and, consequently, his or her own reasons for joining the cult.

The target’s perspective.

In an interview hosted by Rebecca Morris with Flo Conway and Jim Siegelman for the Seattle Times ⁸, the subject of “sudden personality change” was discussed in the context of uncharacteristic and aberrant behaviors exhibited by accomplished professionals, for example, a physician who blew up his Manhattan townhouse, an elementary school principal who exposed himself to a woman in a car, and an astronaut accused of attempted murder.

While these three professionals, and others discussed, manifested aberrant behavior in response to different precipitating events (divorce, job stress, and a rival suitor), they shared similar lifestyles and personality characteristics that predisposed them to act the way they did, and these lifestyles and characteristics may be extrapolated to people who manifest the aberrant behavior of joining a cult. Conway and Siegelman described these people as:

  • Smart and are expected to achieve a high level of success in both their work and personal life.
  • Have intense professional demands.
  • Have jobs that require they hide their emotions — perhaps police, firefighters, members of the military, astronauts.
  • May be isolated in their work.
  • Are disconnected, with few social connections and no one to talk to. ⁸

For potential cult inductees, I believe the lack of a social support system and the associated factor of loneliness play a significant role in making someone susceptible to conversion. Additionally, I believe cult inductees have sustained injury to their self-esteem, resulting in lost confidence and a diminished ability to accurately judge events and situations. Such people will be susceptible to a confident cult recruiter who seems to have all the answers.

Cult induction scenarios

From the cult member’s perspective, there are three general scenarios in which induction techniques can be applied: informal, quasi-formal, and formal. Informal scenarios occur as unplanned events when a person engages with a cult member in a casual way, for example, sitting next to someone while riding the bus or meeting a cult member through a mutual acquaintance. Regardless of who initiates the conversation, the cult member will be biding their time until the optimum moment to begin proselytizing their beliefs (recruiting the target).

The informal scenario can be the most effective way to recruit a potential target into a cult, as the target believes the engagement occurred through mutual participation. In other words, the target will be unaware that the cult member has been preconditioned to always be alert for opportunities to recruit new members.

Quasi-formal scenarios consist of various methods or activities used to spark the curiosity and interest of potential cult members. For example, the Hare Krishna sect of Hinduism ⁹ uses jubilant dancing, repetitive percussion, and chanting for this purpose. Likewise, Nichiren Shōshū, a sect of Japanese Nichiren Buddhism ¹⁰, uses chanting and the giving of small altar gifts for recruitment purposes. And the American-contrived cult of Scientology ¹¹ offers “free” personality testing to pique the curiosity of potential recruits.

Formal scenarios are events planned by cult members specifically for the purpose of recruiting new members. Groups such as the Jehovah’s Witnesses, Mormons ¹², and Evangelical Christian sects frequently canvass residential areas and population centers looking for people who might respond positively to their good-news message and be open to joining their cult.

It is within the formal induction scenario that the model for inducing someone to join a cult is easily discernible, although some form of the model is used to a greater or lesser degree by all cults in all induction scenarios.

The cult induction model, or how I can get the right person to believe everything I say is true.

If I am a cult member (which I can assure you, dear reader, I am not), my goal is to get the right target (an elderly person, someone who is lonely without good social support, a young person or adult without good logic skills, someone who has suffered a blow to their self-esteem, such as an unwanted divorce, a job firing, or rejection by a clique, etc.) to accept that what I will tell them is a real possibility.

The tools I will use to achieve my goal are not mysterious or esoteric, such as hypnosis, although they could be considered specialized. The tools are all conversationally based, and it is likely I would have learned how to apply them from other cult members. It is unlikely I will know the tools I am using are derived from the social sciences.

Upon first meeting my potential target, after introducing myself, I will ask a question that is personal, that is intended to provoke some anxiety, and that most people are likely to agree with. This induction method is based on the “foot in the door” technique, described as follows:

The foot-in-the-door technique is a compliance tactic that assumes agreeing to a small request increases the likelihood of agreeing to a second, larger request. ¹³

An example from real life occurred when I answered a knock on my door and found a woman and a younger woman (probably a cult trainee) standing there. After introducing themselves, the woman asked this question: “Wouldn’t you agree that facing the death of a loved relative is one of the hardest things a person faces in life?”

My agreeing to answer her question, no matter my answer, would have been complying with the woman’s first, small request, thus making it more likely I would comply with a second, larger request.

Although I abruptly ended her ploy, it was easy enough to guess what her second, larger request would be. She would no doubt pose something like the following: ‘We have wonderful news about how God will very soon establish his kingdom on Earth, which we would love to share with you.’

In other words, and in starkest terms, her larger request is, ‘We want to take up more of your time convincing you that what we will convey is true, get you to join our cult, and preferably be invited into your home to do so.’

Now, even if the target would rather not spend the next half-hour, or hour, or more engaging in a religious conversation (a very one-sided conversation), and depending on the extent of his or her loneliness and bruised self-esteem, the target will experience at least a small amount of cognitive dissonance:

In the field of psychology, cognitive dissonance is described as the mental discomfort people feel when their beliefs and actions are inconsistent and contradictory, ultimately encouraging some change (often either in their beliefs or actions) to align better and reduce this dissonance. ¹⁴

That is, because the target complied with answering the woman’s first question, he or she will likely agree to the second request in order to make his or her behavior consistent and remove the uncomfortable feeling of cognitive dissonance. Besides, what harm can simply listening do?

Once inside the target’s home, the induction process will increase in intensity. The target will be inundated with numerous questions intended to guide the person to one single conclusion: ‘This group must be right; otherwise, why would I have answered all those questions as I did?’ Likewise, there will be a heavy dose of false flattery, such as, ‘You are obviously a thinking person,’ or ‘You have an impressive understanding of God’s plan.’

Finally, the cult recruiter’s last request will come in the form of an invitation to a larger cult meeting where further attention and peer pressure will be heaped upon the target inductee.

Conway and Siegelman described the induction and conversion process as follows:

At any time during these early stages of recruitment — and throughout participation in the cult or group — the individual’s actions and responses may be artfully controlled without the use of physiological stress or any physical means whatsoever. In lieu of coercion or hypnosis, cult and group leaders use an altogether different class of strategies: they may misrepresent their identities and intentions; they may lie about their own relationships to their organizations; they may display false affection for the potential member; they may radiate spiritual fulfillment and happiness to the point where it has a profound impact on the individual they are confronting; or they may provoke discussion and debate, creating what Sargant calls “emotionally charged mental conflicts needing urgent resolution.” (p. 91) ¹

If the cult inductee cannot find the inner strength to opt out of the above process, which becomes harder to do at each step, the individual will have given up his or her independence of thought and entered the prison without bars.

References

  1. SNAPPING: America’s Epidemic of Sudden Personality Change, © 1978, 1979 by Flo Conway and Jim Siegelman, Dell Publishing Co., Inc., ISBN 0-440-57970-8
  2. Cult, May 3, 2024, In Wikipedia
  3. Communism, Marxism, and Socialism: Radical Politics and Jim Jones, by Catherine Abbott, © 2024 Alternative Considerations of Jonestown & Peoples Temple
  4. Mug shot of David Koresh taken by the McLennan County Sheriff’s Office, Nov. 3, 1987, In Wikipedia
  5. Sun Myung Moon, May 10, 2024, In Wikipedia. Photo downloaded from NationalTurk, Leader Of Moonies Sect Dies
  6. Photo downloaded from Waving The Bible, ThePreachersWord
  7. Watch Donald Trump’s Entire Cabinet Compete to See Who Can Flatter Him the Most, June 12, 2017, by Jay Willis, GQ
  8. Just what does it take to make someone snap?, April 16, 2009, by Rebecca Morris, The Seattle Times
  9. Hare Krishna, May 23, 2018, by E. B. Rochford, encyclopedia.com
  10. Nichiren Buddhism, April 28, 2024, In Wikipedia
  11. Scientology, June 1, 2024, In Wikipedia
  12. Mormonism, June 7, 2024, In Wikipedia
  13. Techniques of Compliance in Psychology, June 14, 2023, by Saul McLeod, PhD, simplypsychology.org
  14. Cognitive dissonance, June 10, 2024, In Wikipedia
