Christopher A. Kelley
Jan 29, 2022

What do we mean when we talk about mind-altering substances?

Dose

Hey — it’s the way he sees it, anyway. And what’s wrong with that?

“I also don’t believe in drugs […] In my city we would keep the traffic in the dark people — the colored. They’re animals anyway, so let them lose their souls.” — “Don” Giuseppe Zaluchi (of Detroit), from The Godfather

Context: History and Culture

As one infamous and cynical quotation has it, “Perception is reality.” There is no similar maxim analogizing the passive, vigilant phenomenalism warranted by acculturation in a country notorious for its unfriendliness to foreigners, but “conception confers actuality” sounds good; it neatly summarizes how imputing truth to mere ideas can keep you from thinking too hard about things you probably don’t need to worry yourself about, anyway…capisce? This proposed adage can also explain successful marketing campaigns for products our direct experience of which may leave us unexpectedly disappointed. Though we may be aware of and even condone the commercial use of appealingly sculpted realities, we can give much of the rest of the world negligent attention that has us oblivious of how much we’ve been led to misunderstand it — sometimes to tragic effect. Imagine, for example, that you are the head of an entrepreneurial immigrant European family and that the norms of the larger society, along with what little education you’ve had of your new country’s history, almost require your belief in the subhuman nature of other people with whom your own marginalized ethnic community has been cautioned to have as little contact as possible. Now, as well as you can determine, the conventional wisdom regarding the outgroup is corroborated by your merely looking at how they choose to live. Inclinations you might have to form your own understanding — by, say, giving the outgroup the benefit of the doubt and developing friendships with a few of its members — would be set against desires to manifest your approval of the dominant societal regime, to gain its favor, to fit in, and to make money. Would you, in respecting moral principle, fight temptations to engage in a business enterprise you know will have a terrible impact on that community, or would you pragmatically acquiesce to the shared beliefs — the ideology — of the dominant culture? My advice: Don’t be like Don Zaluchi.

Unless you’re really lucky, by the time you get to my age you’ve encountered and maybe even been victimized by the most common and least refined species of ideologue: The gossip. They’re thought of as people given to talking about other people, often maliciously, and if only that were the end of it. Most of them also like judiciously sharing “information” as a way to influence how other people are viewed. Their distribution in populations is related to culture and context, and surely fewer of them are found among happier, well-adjusted groups. Their attraction to this manner of achieving and exercising social power probably arises from insecurity in the best case, and from aggrievement and deprivation in the worst. In my experience, they seem most fulfilled when they’ve caused a change in someone’s estimation of a third party, notching their greatest successes when the understanding they induce totally conforms to what they want someone to believe. Take note of that: Their primary aim is to convince; one can’t be sure of how much stock they actually have in the tales they tell. In this way, they illustrate the raison d’être of most ideology: The arbitration of reality. Knowing this, we should anticipate the tenuousness and conditionality typifying the ideologue’s allegiance to truth.

We are each different, but we are alike in that what we know and what we think we understand of reality is found between our ears. Once our individuated, speculative models of the world have acquired the weight of confirmation — like ballast allowing our minds’ smooth sailing from one situation to the next — we interpret our ostensibly faultless navigations as post hoc evidence of our conceptions’ accuracy, and our surmise of reality can slide into a thoughtless manner of finding what we expect or want to see, anyway. There is an anticipated reality, an encountered reality, and at times, an imposed reality, and there are many instances wherein we’ve been lax in making proper distinctions among them. We’ve been remiss, too, in not exploring and minimizing the possibility that we as individuals, but especially as groups, are able to coax or help force into being disordered temporal conditions which can successfully masquerade as truth.

These simulacra sometimes endure for centuries if only because once people have believed a thing for long enough it can attain practical veracity even when its divergence from fact is straightforwardly demonstrable. Consider the theory of “spontaneous generation”; it was widely assumed to be true until the results of Louis Pasteur’s acclaimed and easily reproducible experiments were properly disseminated, and after that anyone calling themselves a scientist couldn’t be known to support the idea without automatically incurring professional and social penalties. It is the world’s continued good fortune that Pasteur was one among many, and we may take somewhat for granted our wealth of “scientists” who diligently and successfully apply their interests in wresting deeper truths from what may not even be directly observable. I’m particularly impressed by their ability to marshal uncommon gifts of curiosity, critical thinking, and native empiricism, and to bring them to bear on matters whose depths can’t be plumbed by a less demanding phenomenalism or from an ideological perspective. If only all of us who aren’t so gifted had half their objectivity: With the facts surrounding such diverse matters as public health guidelines, “The 1619 Project,” and climate change being the objects of regular, vehement, and out-of-hand rejection, an expectation that most of us routinely submit plain reality to methodical critique may be too optimistic.

Many who’d bristle at being labeled ideologues will resent nonetheless that broadly-shared truth defends the cognitive territory they mean to secure. Theirs is an exaggeration of our usual ambivalence toward reality because of its indifference to our benefit or disfavor, no matter how some regard the nature of our fortunes as the universe taking sides. In any event, reality seldom outshines our personal preferences. I recall an article I read decades ago in one of the national news weeklies in which the author listed ten rules of life he thought his children should observe, his lean and solid explications following each boldfaced principle; I’m not surprised to remember some of them. One of his precepts is grimly pertinent to our present circumstances and deserves thundering pronouncement on every media platform: Believing something harder will not make it true. Covid-19 demands mindfulness of the obverse: Resolute disbelief of a fact will not diminish its reality. Irrespective of some people’s attempts to shield themselves with epistemological sophistry, the substance of reality exists beyond perceptions, wishes, and fears, the universe is unaffected by magical thinking, and the essence of the eternal now of Divine observation, “the truth,” can’t be changed, no matter its scale, and no matter how others want us to see it.

Our response to endemic untruth leaves too few of us appropriately distressed or ashamed, and with much work left undone. When we aren’t just sweeping things under the rug, we acknowledge societal mendacity only insofar as it may provoke our consternation and a pledge to resist being pulled into its decaying orbit. That’s not enough. We can act as if the world’s brazen hostility to truth has no effect on us as long as the fallout from that conflict is “not in our backyard.” Again, that is not enough. At best, strategies and habits like these betray laziness, but more likely, they result from initiative and reasoning weakened by way of ideological principle. As history has shown us time and again, ideology is a commonly accepted way to cut corners, to excuse avoidance of the hardest but most rewarding work of being human: Aiding in the creation of a world wherein all of us might live our best lives. But that would be a moral undertaking, wouldn’t it? Ay, there’s the rub.

To some, “morality” consists of arbitrary behavioral standards cribbed from second-hand ideas. We may more accurately consider it an elaboration of the impulse to harmoniously consolidate our nature, our experience, and our reason in a manner consistent with aspirational notions of ourselves and what we’d most like to see in the rest of humanity. The failure to respect that difference has permitted quasi-moralistic principles to be wrung from our inurement to bad ideas. For instance, it is insistently alleged that unfairness is natural, that it will always be with us. Though this idea is one we may rationally accept, albeit grudgingly, we can’t conclude logically that we ought to disbelieve in a countervailing moral principle obligating us to mitigate unfairness, but this is precisely the kind of trap we’ve been led into. With Covid-19 representing a providentially concrete analogue to the suggestion that “Injustice anywhere is a threat to justice everywhere,” we should note that this quotation from the last century’s preeminent social ethicist is almost sixty years old. What ideas have we espoused or embraced in the meantime?

Almost hidden in plain sight here is a simple rule of affective and mental habit not given due emphasis: Ideological perspectives unavoidably diminish our capacity for authentic moral action by stunting our imagination — after all, the desire to be moral is founded on abilities to entertain counterfactuals, to empathize, to formulate intention with regard to outcomes and their repercussions. These deliberative modes are of scant use when aspects of reality we are habituated to viewing through ideological lenses acquire such crystalline resolution we don’t feel it necessary to examine them more closely. Only when we know the truth of things can we respond to them appropriately.

Were we to dully list unhelpful and widely-shared ideologies, I expect most of them would relate in some way to the exasperatingly avoidable but not unanticipated academic drama playing out here in the West, and it’s about damn time. To be candid, some of us are intensely gratified that the foundations and ramifications of the tortured socio-anthropological episteme of our post-Columbian world are receiving thorough scrutiny by scholars equipped to articulate and substantiate their corrective findings. Other scholars prefer those examinations cease as soon as can be managed; they may even think (but never say): We’ve lived with these untruths for so long, by now they have certainly attained a firmness of substance at least tantamount to truth. No, they haven’t — make that “no” times infinity. As I and others have argued elsewhere, much of the world’s present-day suffering is thoroughly and profoundly entangled with a heedless, zombified misconstrual of human ontology. The mistake of it was blindingly apparent from the beginning: It was intellectually irresponsible to permit that an unfairly exclusive class of “enlightened” men, however much their concepts intrigued the academy, be assumed to have elucidated the proper nature of human life. It should be no surprise, then, that the pedagogical armada that formed behind them is proving not as invulnerable as previously thought; it may never have even been that seaworthy. The great thinkers may only have been as ingenious at developing and marketing their ideologies as people were eager to find ones suited to their own speculative models. The sheer recklessness of this gambit was almost certainly overshadowed by the anticipation of increased cultural and material standing — distinct sociological advantages to be enjoyed by (need I say it again?) an unfairly exclusive class. By any measure, this represents the second-most prominent and consequential instantiation of ideology — that is, the subjectively informed arbitration of reality — in the history of mankind. In any event, the ensuing wide acceptance of theories about how civilizations evolve and should be structured can be credited to men who had accrued neither the wisdom nor the authority to make the fruits of their ruminations worthy of such wide and deep applications. As well as that worked out for them and their reputations, any welcome reassessments we make of their judgements won’t hurt their feelings; they’re long dead. It is we who are now forced to live in a world constrained by the limits of their thinking.

Context: Being Individual

Here’s an ideology I unashamedly embrace: God made me to know Him, to love Him, to serve Him in this world, and to be happy with Him in the next. This is a categorically false and debilitating idea to some, even if a few may find winsome elegance in its unassuming juxtaposition of such bland succinctness against myriad and profound implications. Even so, the assertion’s epistemological status is not thereby improved; however earnestly I declare it essential to my understanding of the world, it remains merely an article of faith regardless of its significance in the expansive firmament of religious thought. I first encountered it when the now quasi-obsolescent Baltimore Catechism posited it as the answer to the question, Why did God make me?, an inquiry I and my fellows in the second-grade class of Cleveland’s Epiphany Elementary School made in preparation for our First Communion. What sounds like a helping of sweet theological fluff seems rightly discovered in a lesson plan for pre-adolescents; children so young needn’t and probably shouldn’t be prematurely introduced to more demandingly turbid existential investigations. Note, though, that the simplicity of this interrogation belies the immeasurable worth of making it at all; it manifests a clue that beneath our every thought and action there is — at barest minimum — an inchoate suspicion that not one of us exists by accident. The universal experience of this attests to our distinction from all other observed creation: We are equipped to examine the bare fact of our being.

We aren’t born that way; I, myself, “got the memo,” so to speak, courtesy of family tradition, being brought into the Church as an infant, and wasn’t really much older when I was in second grade. As I matured, basic human curiosity would lead me to initiate my own line of investigation, and I’m actually glad to report this exercise of philosophical independence didn’t change a thing. There are people who will just have to accept — with some measure of condescension, I assume — that I never developed honest skepticism of the idea I continue to embrace: The Answer, if you will. They should also resist criticizing how I never resented the earliness and manner of what they might unflatteringly call my “indoctrination.” It doesn’t take a brilliant mathematician to figure out and appreciate that society receives at least indirect benefit from my conformity to an ideology enjoining the pursuit of wisdom and commitment to virtues like prudence, compassion, justice, and charity, among others, and which sometimes can be difficult for me to fulfill. Also, as some social theorists may not have fittingly taken into account, The Answer inspires a plainly better kind of enlightened self-interest than is found in other quarters.

Still, we shouldn’t ignore that there are those who purport to share my viewpoint who do what they can to effect improvements to the world, but for only themselves and, perhaps, their accidental and chosen cohorts — and often to the conspicuous detriment of those outside of them. In this way, they put the lie to their claim about the reason for their own existence and are complicit in the degradation of a world which — in accordance with its own laws and consequent to the various activities of all who live in it — undergoes continuous creation. Worse yet is how what they do invites others’ interpretation of various responses to the question Why did God make me? to be rendered volatile and contingent — pointless, even, as if the mere possibility of there being “an answer” were not itself immense and transcendently obdurate, after all. And although an ideological status leaves it vulnerable to the rhetorical adulteration of those who insincerely offer it upon query, The Answer is a thing one truly has or has not. But even if it were more pervasively and faithfully realized it would be among only a handful of examples one might use to defend holding unverifiable viewpoints on the proper ordering of reality.

My early education didn’t predict how I came to abstractly “believe” in God as a matter of principle or definition, not too unlike how I believe numerical sums exist whether anyone computes or even thinks of them, so I’m comfortable with the simple fact of God, and I’ve only been amused by the suggestion that my faith is unscientific. In my opinion, such a criticism is more properly leveled against disbelief, but that can be discussed elsewhere. The point I want to make is this: If God is the source of all being, and religion is generally purposed to gain appreciation for the fact of and reason for human existence, then one should expect that religious teaching accord with rationally accepted facts of the universe inasmuch as the world contains human beings possessing the nature and capacity to gain better understanding of it. This is consistent with the Catholic tradition of integrating faith and reason, and does much to explain why Catholics are at pains to avoid positions on Church teaching that are framed ideologically as opposed to clearly cohering with Scripture, Tradition, and the Church’s Mission of Salvation. Right now, though, one easily discovers ideologies creeping into Catholic spaces, and those responsible for these incursions work hard to draw firm lines linking Church teaching and the socio-political stances they try to disguise with spiritually barren moralism. But the nature of their aims is revealed in their positions’ disconnections from scientific, social, and historical realities, in which case their objectives are, by definition, unreasonable. Of course, not everyone is Catholic or even believes, and I was older than I might care to admit when I discovered this.

Believers and nonbelievers alike can have distorted perspectives on just about every dimension of human nature and enterprise, and will sometimes embrace their views with fervid religiosity in the face of reasonable contentions against them. It was this finding that most motivated me to contemplate why such varied representations of the world are found in people’s minds. A question demanded to be asked: How is it that people can look at something, see it very differently from the way others do, have their vision subjected to tenable dispute, yet hold so tightly to a faulty reality? The acrimony stirred by some political and sociological debates I’ve seen hints at stakes having less to do with examinations of reality and more with an establishment of dominion over its appearance — through framing strategies, nomenclature, and so on. For some, the better reality is one somehow bolstered by stolid maintenance of their own convictions, and they anticipate little benefit in seriously attempting to dissect opposing arguments — forget about their humbly appropriating elements from them which might cast light on universal truths. The incredible power of bad ideas, I realized, expends itself in great part on those who hold them.

A grasp of what’s really going on has always seemed required for me to make my way in the world — to survive, in other words, so I was an empiricist before I knew what that was. Along the way, it became obvious that this isn’t everyone’s approach to life, that people can sensitize themselves to ideas in a way that discredits their own observations. But willfully approaching the world in ways to affirm preferred theories of it is at least unwittingly dishonest, and one’s survival is jeopardized by comparative indifference to whole swaths of reality. It may be that one’s concept of “survival” is really the differentiating variable in that a desired permanence of oneself isn’t necessarily attached to physical being, that one can and will survive as long as the ideas subsuming one’s identity do. This plausibly explains why beliefs harden even when acting on them invites a cascade of negative consequences.

We now have people running up the proverbial hills upon which paeans of the future will locate their final acts of heroism. Our compounded misfortune is that the American tradition demands being comfortable with societal values biased toward an idea of individual (?) liberty which can be exercised without regard for the proven needs of the community even though such values will also moderate provisions for a common good made vulnerable to individuals’ unsteady commitment to its achievement. We shouldn’t be surprised to discover, then, that the plain desirability of the “common good” has become, well, not so plain, since the good of the many can’t help but invite diminished respect for the one who may feel it important to preempt offenses to him- or herself.

And so it’s unlikely that “grace” has inspired anyone to display the frank recalcitrance which, when combined with a misguided sense of piety, creates an irksome ethical tangle I’ve almost tripped over upon silently witnessing the disintegrity of fellow Catholics who don’t wear masks during Mass. Somehow, they can worship facing an altar behind which is a representation of the crucified Savior without any apparent sense of contradiction in their own unwillingness to sacrifice (!) an hour’s discomfort (or even better, an hour’s prescribed humility) to preserve the safety of their fellow congregants. Perhaps they believe God approves of the choice they’ve made to accentuate the importance of culture wars’ semiotics, but I doubt they’ve thought it through to that extent. In my estimation, they’re just selfishly making a statement in service to a political idea upon which hangs some integral aspect of their “survival,” a thing accomplished with their overt and insensitive self-assertion.

Context: Pathology

Despite having seen the gravest consequences of another’s having done so, humans can irrationally choose to do what they know better than to do. For example, too many of us have fallen into the habit of ingesting, injecting, and inhaling compounds of sketchy provenance, an activity sometimes leading to death on the floor of some public restroom. Given the current supply of poisoned illegal drugs, one’s undiscerning investment in prejudicial and, moreover, disprovable beliefs seems less risky than substance abuse, and normally it would be. Right now, though, the estimation of hazard should be appreciably higher. Not only have we been struggling through a graciously rare event wherein any one of us can easily transmit and contract a potentially debilitating and fatal disease, we’ve done so as a fractured community with solid socio-political lines circumscribing uneven distributions of the most ordinary willingness to make sacrifices for the sake of personal and communal safety. The well-meaning experts responsible for gathering and interpreting epidemiological data have had to belabor the fact that a witless pathogen literally couldn’t care less about political opinions, and they have not only to correctly determine the interests of public health, but also to waste time and effort countering worthless ideas.

Stingy confidence in the motives of government and in the health care “industry” can’t account for all the people whose great distrust of advisory institutions permits their own and others’ needless and avoidable exposure to the virus. To compound the problem, the doubters often peddle an alternate reality wherein refusing to do what needs to be done is the right thing to do, the foolishness of which is confirmed in their preference for blatant quackery to reliable scientific principle, and, tragically, the consequences many go on to face: They lie in hospital beds of their own making and in graves they might as well have dug themselves. The unambiguous chronicling of these results hasn’t dissuaded a literal multitude who still refuse the vaccine, resist wearing masks, and undermine the efforts of governments, school boards, private enterprises, etc., to mandate those things. Again, doubtful regard of public officials and doctors does not explain their self-immolation.

I think there’s something we’ve been missing, and the meagerness of our sympathy for these folks — which, to be candid, is probably contempt by another name — may be a bit too much like that of the social drinker or recreational pot user who is self-affirmingly convinced some other person simply lacks the moral fortitude or grit to kick their drug habit. We should consider the possibility that an insidious pleasure akin to the thrall of substance-use disorder afflicts those who seem irrationally determined to cultivate their own and others’ perception of reality into congruity with some chimerical alternate. Having, and, importantly, sharing an ideological outlook may feel just that damn good.

As impediments to human flourishing, widely-held ideologies seem surprisingly underestimated in spite of how they have so often resulted in catastrophe. While many relatively harmless ideas can be traced to everyday customs and traditions, some really toxic systems of thought have originated among not only the powerful, but the educated, too, and this can be puzzling considering that many of these beliefs are intellectually repellent — ostentatiously lacking nuance and selectively disdainful of plain empiricism. In short, they often go out of their way to antagonize reason, and instead arrogate blunt insistence on the way the world actually does or should be made to work. We do well to also note the often merely semantical difference between having an ideological outlook and the holding of superstition. On top of that, a dimension of egoism is revealed in the way people so obviously employ ideology with ambitions to prove just how much smarter they are than the rest of us, regardless of the communal costs incurred by their failed demonstrations.

One is led to conclude that ready assimilation of certain kinds of ideology merely reflects the sheer number of those held captive by the ad populum fallacy — a phenomenon which fosters growth in the consequences of its own existence, and whose contributions are implied by the accelerated pervasiveness of outlandish conspiracy theories. Centuries’ worth of evidence should have helped us avoid past and current ideologically-charged socio-political moments, but the same history discloses the world’s tolerance of obtuseness, too, and the past several years have yielded faithful assemblages immune to the ominous clangor of some of history’s more dreadful rhymes. Though some in these groups appear mindful of loud and pervasive warnings and criticisms, it’s only insofar as they are wont to efficiently neutralize their detractors — and wouldn’t you know it: Ideology comes in handy for that, too.

We don’t have reliable, fast-acting antidotes for people’s overindulgence in tendentiously curated traditions, fake news, mendacious pedagogy, and intellectually stupefying echo chambers, all of which more unfailingly proliferate in this highly technological age and aid in the creation of the preferred realities upon which people come to betray their psychological dependence. Consider that many won’t hesitate to say, “I don’t believe that,” or, “That’s not what I think,” when they reject facts beyond reasonable dispute. They often demonstrate a raw intolerance of others’ opinions, an outsized reaction that probably corresponds to the massive and unhealthy emotional stake they’ve made. My more charitable view is that they know their hearts will be broken if they attempt to accommodate a reality critically different from the one in which they have found personal validation, and although I think it’s right we all try to sympathize with that feeling, we need to remember: Their sense of self has become integrated with the story they’ve embraced and with the feeling of having been embraced by those sharing their beliefs. They are at home. Hence, as to the societal consequences of their intransigence, the word “polarization” can’t adequately describe a mushrooming crisis of reason and morality happening while both the manner and the very instruments of human thought are undergoing changes that may be permanent.

People’s tardy recognition of their diminished self-understanding — especially with regard to its inescapable entanglement with actualities untouched by any world they have coerced their own minds to devise (or have allowed others to erect in them) — can make for grandly risible farce or for soul-scorching tragedy. Think of how the narrative arts would be appreciably drained of intrigue and entertainment value if not for this well-observed human predilection. And in the actual event, when living people are forced to face challenging facts about themselves and their world, they are prone to suffer acute cognitive dissonance, anxiety, and perhaps debilitating self-doubt. In short, kicking the ideological habit seems to come at very high cost, and so we shouldn’t be surprised most ideologues never appear to recover, and it’s their pursuit of just one more dose of vindicating and obviously fictitious information that gives warning. In this way, some behavioral components of modern-day ideology clearly mimic presentations of substance addiction, and isn’t that prima facie evidence of disorder? I would add that the disturbance is easily distinguished from the transient emotional agitation accompanying trauma — the deep, overwrought anguish in response to the unexpected death of a loved one, for example. In contradistinction, the impairment of ideology persists in a fashion demanding clinical remediation. One might seriously predict that the DSM used by mental health professionals will one day describe a condition of primarily exogenous etiology characterized by technologically-facilitated embrace of discredited news, history, and analysis, which tendency is agonistic to pronounced irrational dissension to information from more creditable and generally accepted sources. It could be called Dissociative Denialism Syndrome. Irrespective of its eventually being so recognized, right now it clearly manifests as a maladaptive manner of taking refuge from reality.

It’s reasonable to speculate that curtailment of people’s cognitive adaptivity is a threat to mental health in that, at the very least, the habit-cum-reflex to avoid certain thought pathways eventually allows the occlusion of access to otherwise reachable and potentially helpful options, i.e., the interior corners of “the box” people are congratulated for thinking outside of. Such constrainment of the normal (or at least more commonly found) intellect necessarily attenuates the oftentimes welcome variability with which perception and experience can be processed. At worst, what we can call cognitive obstinacy may actually both result from and encourage disuse and atrophic changes to important regions of the cerebral cortex, and maybe even the agonized death of brain cells — a vicious cycle evinced by the precipitous deterioration of one’s conception of the real world that comes to resemble psychosis. Okay; this faux-scientific and uninformed bit of alarmism is a stretch, and may sound too much like, “If you keep making that face, it’s going to get stuck that way,” but after witnessing the chronic ineptitude of a few ideologues, it’s what my modest knowledge and reasoning have offered to explain why people who are widely held to be “ideological” rather than “delusional” are yet so similarly unreliable in their abilities to correctly interpret reality. To be fair, though, I admit my theory could reflect uncharitable and self-aggrandizing motives I’ve hidden from myself, and it’s just a happy coincidence my awareness of that possibility highlights my desire to be “objective.”

Context: Recovery

In any case, we do well to optimistically assume the prolonged holding of an ideology isn’t neurodegenerative and won’t result in permanent cognitive defects, and that a person can be rehabilitated to the semblance of disinterest we accept as a substitute for true objectivity. Moreover, that with time, discomfiting and inconvenient truths can be assimilated regardless of how staunchly someone may previously have disputed them. This would be an excellent finding; humans’ interaction with their world, when grounded in reality, assures the engagement will more likely be advantageous. On that note, let me point out that it isn’t realistic to expect that all of the conceptually disabled will be able to independently manage deliberate escape from their unreality. Just as much as they “need to want to get better,” they should be charitably reminded that rationality involves overcoming the fear of unwelcome truths, a fact we become acquainted with before we even learn to talk.

From our earliest years, and regardless of emotional disposition, we learn that some interactions with the truth of things can be downright painful. Among the first teachers of this fact are hot stoves and sharp utensils, and — good for us — we don’t each subsequently lay secret plans to somehow rid the world of culinary equipment. We go on to face challenges of greater practical complexity and emotional moment than could ever have been resolved by carefully avoiding risk of physical or emotional injury: learning how to drive, and to establish friendships, e.g. Our accomplishments instill an awareness that as much as fear is an important adjunct to the instinct of self-preservation, it is best moderated by reasoned aspiration. It’s also true that besides being a hindrance to doing what may be immediately necessary (one may someday be called upon to prepare supper, after all) fear can account for regrettable self-limitation: A child’s burned finger should not preclude her someday being a great chef; rather, one assumes her healthful cooperation with experience will inevitably serve to amend an unduly narrow “idea” of the nature of stoves. In a similar way, the evolution of human society depends to a great degree on everyone’s shared commitment to overcoming the fears preventing our preparation of a banquet that satisfies and nurtures everyone.

If this is what we really want, why haven’t we achieved it? If we find ourselves complacent in not having it, what has sapped our initiative? I blame the usual personal and collective shortcomings, the list of which can be long and varied, but whose particulars are rooted in that tragic dyad of the human state: materialism and relativism, the foundations of iniquity. The great many of us who don’t suffer cripplingly disordered ethical deficiencies still have a bad habit of according excess sopoec [“sopoec” is a personal coinage: a contraction of social, political, and economic.] power to those who do, and too few of these anointed demonstrate compensatory affinity for common virtues. It’s easy to conclude that either we believe virtues to be of limited sopoec utility, or that powerful people and saints shouldn’t be made of the same stuff. One result is that regardless of their placement on any ethical continuum, far too many of our leaders end up sharing the luxury of being able to publicly laud unquestionably desirable virtues like justice and charity while the tedious, self-abnegating work they require is left for those engaged in social activism.

The 1957 integration of Little Rock’s public schools offers an example of this. President Eisenhower handled his end with integrity and crackerjack panache by flipping the script on the reprobate Arkansas governor and providing a strong response to the situation’s practical difficulties. Beyond that, the involvement of the president and the military could do nothing to contravene a hateful ideology or even to acknowledge the cognitive dimensions of the agitation which afflicted the town. The spectacularly unedifying experience for the people of Little Rock was of meddlesome guys from the government who were there to help…and to help the wrong people, just as similar guys had done during a previous generation’s fatally abbreviated Reconstruction. Big, bad government ended up reprising its role as the principal villain in the drama of the “Lost Cause” — and as the unqualified, undesirable mediator of “freedom.” Even more importantly, it’s easy to see that the intervening generations had produced no one of sufficiently convincing and appropriate sopoec stature to help improve the collective thinking of the populace. A mind is a difficult thing to change.

Context: Reality

Ideology is used by social recidivists to denigrate the beliefs of their critics and to promote the appeal of a world whose sopoec stratifications are inevitable, if not ordained. Recidivists are by definition anti-egalitarian, will cheat with impunity, and are lenient in their judgment of violence committed in furtherance of their interests. Their immorality is excusable since it’s in the service of the greater good which can be achieved only if they are in charge. These simple facts should make it horrifyingly clear that the fundamental philosophical differences between America’s deviant right and certain autocratic regimes pertain more to matters of degree and less to principle. Of course, the deviant right’s having refrained from actual mass murder is too forgiving a criterion to indicate their fitness for leadership — mind you, though, we can attribute to politicians restraint not found in their supporters: Among the rightist rank and file are those who wouldn’t hesitate to set up .50 cal. guns along the southern border and gleefully defend it. The moral certainty of that is one reason I’ll take it as a huge net positive that realization of recidivist goals is valiantly impeded by progressives.

I can’t ignore, however, that one of the salient shortcomings of leftist ideology is how it mistakenly insists that a species of humanistic utilitarianism can supplant any ethical system grounded in the natural law, which the left generally view as merely an intellectual construct. It is further supposed that purposeful secularism is morally anodyne and more firmly demarcates the interests of church and state. Besides that, religious — oh, wait — metaphysical indifferentism conforms to Constitutional principle, and as I have previously argued, the reasoning of the Framers themselves allows that anyone can seriously suggest this. Of course, during the ongoing and lively debate about the aptness of progressivism, the left are unavoidably ceding ground to the recidivists who are quick to claim, however falsely, greater understanding of and fidelity to “true” morality. What the left need to do is reframe their message in a manner undeniably expressing their belief in human dignity; you see, having embraced this idea, however tacitly, they have a leg up on the competition.

Many in these two movements are viscerally hostile to well-reasoned and indisputable contention, as if just listening to each other would entail discarding their ideas of what society should be like in the first place, and more importantly, who they are in it — a reckoning of their identity. We can therefore infer that their experienced self-awareness is accompanied by suffering of a kind, particularly in the context of unwanted and grudging interactions with those who offer them reasonable disagreement — note the qualifier: reasonable. This is the individuated phenomenon we know to be the inexorable outgrowth of tribalism, and surely, no one actually wants that, right? Alas, regardless of the possibility of disastrous results, we politely decline to ostracize even those energized by the frisson of conflict. Don’t expect me to make some disingenuously unbiased appraisal here, though, as if the principals on all sides of these arguments are on equal philosophical footing when it’s clear that they aren’t. The glaring truth is that, as much as the right use the term “the radical left,” it is they who reliably demonstrate the kind of substantive and thorough radicalism which can elevate obviously problematic ideas at the expense of reality.

Ideas can nonetheless thrive in a marketplace regardless of their conforming to truth, which is why I alternatively propose that intellectual endeavors actually occur within an ecosystem of diverse interrelated suppositional models, especially since their accuracy may be revealed only after spans of multiple generations. The exigencies of the present don’t allow the time for our verifications to reach the customarily detailed resolution, though. While many of us have been relaxing in the elaborate architecture of our houses of self-worship, or cowering in the ramshackle shelter of self-designed realities and comforting beliefs, people have literally been dying because we have tolerated the promulgation of bad ideas. We should at least more insistently share good ones. For instance: Humans alone are capable of independent, altruistic, and, paradoxically, disinterested compassion; as much as this felicitous concept may be widely embraced, ideologues past and present have undermined our simple appreciation of it, but in spite of that, it will forever rest in gracefully implacable opposition to the most fundamentally destructive idea of all: “Humanity” is a quantifiable trait subject to arbitration — and this is the squalid heart of the most pernicious ideologies. I invite those inclined to dispute this finding to make their own inquiry of history, to test and see the meanness of humankind. This idea’s muddy boot prints are seen leading to and from the sad and enduringly observable fact that people are adept at developing convenient “reasons” to minimize and nullify the humanity of others. On account of this, people continue to be exploited and killed, are given scant regard when beset by misfortune, are made homeless, are ridiculed, are allowed to die and if not given that blessed relief are made to suffer outrages of all kinds. Take note that in every case, there is always, always, a material benefit attached to either passive or performative disbelief of others’ full and true humanity, even if it’s only a transient feeling of power, a fleeting sensation of superiority not unlike the immediacy and warmth of a timely dose of heroin.

© 2022 CHRISTOPHER A. KELLEY