‘The White Negro’ Revisited: The Demise of the Indispensable Hipster

Tracy Dahlby
Apr 13

Originally published in The Mailer Review, Volume 5, Number 1, 2011.

Introduction: A Case for Reincarnating the “Psychic Outlaw”

You wouldn’t necessarily guess that in a country undergoing a revolution, albeit a digital one, conformity would be on such a roll. Yet what strikes me as odd in America today is our seeming inability to produce authentic nonconformists in any significant numbers. Folks that show up locked and loaded for Tea Party rallies — Christian Lorentzen called the type “the radical square”[1] — don’t count, since the point is to question groupthink, not enforce it. What I’ve got in mind is Norman Mailer’s definition of that quintessential American existentialist: the hipster.

In his 1957 essay, “The White Negro: Superficial Reflections on the Hipster,” Mailer defined his hero as “a philosophical psychopath” whose role it was to test the fear-soaked conformity of the time by rejecting “that domain of experience where security is boredom and therefore sickness.” Five years before novelist Ken Kesey gave us One Flew Over the Cuckoo’s Nest, Mailer laid down the theoretical groove: individuals who would challenge society’s norms had better work up the guts, like Kesey’s Chief Bromden, to smash the asylum windows and make a run for it. Not for nothing did the expression “Crazy, man!” become a hallmark of hipster lingo.

Mailer’s views were hot stuff for 1957. Critics accused him of condoning violence as a form of existential expression,[1] when in fact what he championed was a sort of psychic jujitsu that turned society’s straitjacket inside out by exploring what he called “the rebellious imperatives of the self.” Pursuing that “uncharted journey,” the hipster (Mailer guessed there were 100,000 of them working America’s fringes) would help liberate the rest of us TV-numbed squares into the bargain.[2] He overstated the case in claiming “there is not the hipster alive who is not absorbed in his own tumultuous hypotheses.”[2] And his assertion that his self-selected adventurer “had absorbed the existentialist synapses of the Negro,” thus becoming a “white Negro,” makes one wince today.

Yet Mailer put a righteous new spin on Thoreau’s old dictum: “in Wildness is the preservation of the world.”[3] After all, it takes some wildness for a society to fight off cultural rigor mortis. Which, as I say, is the problem: we don’t produce many wild ones today, and the hipster is the butt of Internet jokes. According to a post on the PBS NewsHour Art Beat blog, “[T]he hipster is usually seen as a clown — not so much a trendsetter or truth-teller, but a young and moneyed member of a cultural niche or the uber-fringe.”[4] Hip 20-somethings in Austin (the self-proclaimed “Live Music Capital of the World” is chockablock with self-identifying hipsters) minced fewer words when I asked them in April 2011 what “hipster” brought to mind. “Stands for nothing,” they said. “Wears tight pants.” “Think they’re better than you.”

How did the hipster sink so low? There are at least three reasons: For starters, existential terror isn’t what it used to be; second, the triumph of economic convention over truth in our society has made it hard to talk about the scope of what really matters in life; and lastly, today’s online revolution is helping blunt the very sensibilities we’ll need to strike some kind of humane balance. Yin follows Yang, and maybe we’ll pull out of this existential blind alley, but meanwhile where are Mailer’s psychic outlaws when we need them?

Scared Sideways: The Waning Value of Menace

It is hard to outdo the Fifties, with its raw memories of Nanjing, Dresden, Hiroshima and the Holocaust, for inducing bone-chilling dread. As Mailer noted:

“The Second World War presented a mirror to the human condition which blinded anyone who looked into it. For if tens of millions were killed in concentration camps out of the inexorable agonies and contractions of super-states . . . one was then obliged also to see that no matter how crippled and perverted an image of man was the society he had created, it was nonetheless his… collective creation… and if society was so murderous, then who could ignore the most hideous of questions about his own nature?”[5]

If self-reflection wasn’t your bag back then, the chronic threat of annihilation in a nuclear showdown with the Russians could stir doubts about life in a civilization that, as Mailer wrote, “could mean… that we might still be doomed to die as a cipher in some vast statistical operation in which our teeth would be counted, and our hair would be saved, but our death itself would be unknown, unhonored, and unremarked . . . a death by deus ex machina in a gas chamber or a radioactive city.”[6]

Today, we live less existentially encumbered lives. Our society is vastly more inclusive and solicitous of individual preferences than the exclusionary, quasi-militaristic fifties (even if we’re not as open as we generally like to think), and that’s progress you can take to the bank. In fact, we have; sixty years of muscular economic growth has, until recently, so improved the material quality of American life that we now routinely confuse our jamboree of consumer-oriented diversions, with its incessant Internet distractions, with life itself. But we’ve paid a price: Somewhere along the line we’ve lost an actionable grip on the phobias that, if properly explored, were once thought to bring out the best in us. To wit, Mailer’s authentic hipster is

“the man who knows that if our collective condition is to live with instant death by atomic war… or with a slow death by conformity with every creative and rebellious instinct stifled… why then the only life-giving answer is to accept the terms of death, to live with death as immediate danger, to divorce oneself from society, to exist without roots . . .” [7]

Yes, we’ve had our moments. The September 11 attacks on New York and Washington were terrible shocks, and viscerally so for those of us living in a place like Manhattan at the time. Knocked off our bearings, the country let its maximum political leaders talk us into a dubious bargain. Ours wasn’t to dig in against our attackers with the unflagging, see-you-in-Hell focus that was typical of World War II (in part because we had such difficulty in identifying who or what we were fighting) — or to make the sacrifices necessary to wean ourselves from Middle East oil and secure greater energy independence — but to get back to business as usual.

As Andrew J. Bacevich later observed in The Washington Post, “from the very outset” President George Bush described the “war on terror” as a vast undertaking of paramount importance. But he simultaneously urged Americans to carry on as if there were no war. “Get down to Disney World in Florida,” he urged just over two weeks after 9/11. “Take your families and enjoy life, the way we want it to be enjoyed.” Bush certainly wanted citizens to support his war — he just wasn’t going to require them actually to do anything.[8]

In that horrible season, I remember receiving a mass e-mail from one big-box retailer that expressed grief over the tragic events while taking the opportunity to remind customers of an upcoming fall sales event, and I remember wondering if it weren’t a sick joke. Victims’ families mourned, our soldiers went off to fight, and the rest of us were left to define patriotic duty as flexing our plastic.

Ten autumns later, it is easier to see the bankruptcy of our moral math. The utility of existential fear for taking collective action is basically a numbers game. In the years since the 9/11 attacks, how many of us have been directly exposed to terrorism’s terrors on a sustained basis or subjected to its material deprivations? In truth, not that many. Instead, the Pentagon has prosecuted a war by proxy, using our tax dollars to do hard things at the farthest of psychic removes from the average American, while at home our fears have been absorbed in a mishmash of media-speak and online canoodling and kvetching.

What a difference a half-century makes. “No matter what its horrors,” Mailer wrote in “The White Negro,” “the Twentieth Century is a vastly exciting century for its tendency is to reduce all of life to its ultimate alternatives”[9] — totalitarianism v. freedom, boredom v. excitement, life v. death. Such stark philosophical opposites gave the Beats, the literary arm of the hipster movement, its traction. In Howl, Allen Ginsberg could famously tell us what needed rebelling against: a society built on “Robot apartments! invisible suburbs! skeleton treasuries! blind capitals! demonic industries! spectral nations! invincible mad houses! granite cocks! monstrous bombs!”[10] In On the Road, Jack Kerouac told us exactly who would do the rebelling, “the mad ones, the ones who are mad to live, mad to talk, mad to be saved, desirous of everything at the same time . . .”[11]

True, most Americans didn’t want to hear such crazy talk. Howl went to court on obscenity charges, and the mass media did its damnedest to mock the budding counterculture by inventing the put-down term “beatnik” to describe zany, misguided youth devoted to lives of “berets, bongos and bedbugs.”[12] Yet contradictions percolating beneath the surface of the archly placid fifties — the lack of civil rights, sexual freedom and gender equality, to name a few — allowed the Beats to bolster, as Mailer put it, “the isolated courage of isolated people.”[13] Eventually new ideas helped persuade society at large that the outsiders were some of the sanest people among us.

Today, as New York Times columnist Roger Cohen pointed out, Americans are once again looking for a “sense of direction” but are stuck by circumstances in which “‘banksters,’ salvaged by tax dollars, get richer. Ordinary folk get poorer . . . Corporations sit on their cash piles. Algorithms drive Americans to the news that comforts their prejudices and stokes their anger.”[14] Meanwhile, our primary allegiance, no matter what they say down at yoga class, is not mainly to our deeper selves but to economic imperatives that encourage core conformity and support our collective “comfort zone” — a now-thicker and more insidious version of public inertness Mailer once called “the wad.”[15]

Could that be in part because Americans are locked into behaving primarily as economic actors?

The Rise of the Economic “Orthodoxy”

According to Mailer’s hipster formula, a society in which deep-dish sacrifice is discouraged by circumstance rather than by direct political fiat is unlikely to produce many committed rebels. What was true then is truer now. In “Dehumanized,” his 2009 essay in Harper’s Magazine, Mark Slouka argues that our submission to economic “orthodoxy” is nearly complete:

“Popular culture fetishizes it, our entertainments salaam to it . . . our artists are ranked by and revered for it. There is no institution wholly apart. Everything submits; everything must, sooner or later, pay fealty to the market . . .” [16]

Slouka marvels: “You have to admire the skill with which we’ve been outmaneuvered; there’s something almost chess-like in the way the other side has narrowed the field, neutralized lines of attack, co-opted the terms of battle.”[16]

Yes, our greedy tycoons have helped bring out the worst in the American character, but it’s also true the rest of us have mostly gone along for the ride. Social cocooning, with its emphasis on physical comfort and unquestioned self-image, has become our lodestar, even among the self-styled hip. Interviewed in the Art Beat blog post cited earlier, John Leland, author of Hip: The History, said the original hipsters “had to make a choice and make sacrifices to live the . . . bohemian lives that they did.” Today, he said, “you do not need to make those same kinds of sacrifices. You can have a wonderful job, a wonderful apartment, your parents can be so proud of you in this bohemian hipster role.” [17]

Even the September 11 attacks didn’t much rattle America’s collective amour propre. By then the American public had fallen for a seductive narrative line. Trumpeted by politicians, the news media, and popular misinterpretation of academics like Francis Fukuyama, author of the controversial 1989 essay, “The End of History,” it went like this: As victors of the Cold War, Americans had little to fear. The Soviet bogeyman was kaput and with it the bilateral death sweepstakes that had included such white-knuckle moments as the Cuban Missile Crisis. American-style market capitalism had won, hands down, as the champion fructifier of the human nut. It would be up to others, in China or Djibouti, say, to figure out how the formula could be fine-tuned for local use.[a]

It followed logically in 2001, then, that no bunch of ragtag terrorists, however bloodthirsty, could do real damage to our bulwark of wealth, inventiveness and military power. And to an extent we were right. Although we were on our way to spending our second trillion in Iraq and Afghanistan by the time we plunged into the financial crisis of 2008, it was mainly our own wayward bankers and financial wizards who ruined our economy and slammed millions of Americans into joblessness and foreclosure.

Now comes the reckoning. As David Leonhardt recently explained it, “the economic version of the law of gravity is reasserting itself. We are feeling the deferred pain from 25 years of excess,” in which both ordinary Americans and their government spent money — lots of it — that didn’t belong to us.[18] Chronic unemployment is the highest it has been since the Great Depression. Household debt has sent consumer spending into historic decline.[19] China is rising, our infrastructure is crumbling, and while jobs continue to migrate offshore, jobless college grads are moving back home to reunite with mom and dad. We might hope against hope for a return to the old spend-and-grow magic but we know something has fundamentally changed.

And so, we arrive at today’s existential rub. Having been generally conditioned not to look for much in our lives beyond our role as economic beings, it’s hard for Americans to recalibrate and think of ourselves in fundamentally different ways. We appear to have forgotten, as Mailer pointed out, that true self-centeredness is not ultimately about selfishness. It is an act requiring courage and sacrifice, one that ultimately opens the existential envelope for everyone:

“The only Hip morality (but of course it is an ever-present morality) is . . . to be engaged in one primal battle: to open the limits of the possible for oneself, for oneself alone because that is one’s need. Yet in widening the arena of the possible, one widens it reciprocally for others as well, so that the nihilistic fulfillment of each man’s desire contains its antithesis of human co-operation.” [20]

Today, our nihilism excludes such optimism. Still ensorcelled by the gods of consumption, we’re stuck in a vast holding operation of the psyche. No wonder we’re fascinated by dark and handsome Don Draper, the womanizing, corner-cutting protagonist of AMC’s hit TV series, Mad Men, about ad execs at bay in the early sixties. Draper is our televangelical Odysseus, “the man of twists and turns driven time and again off course,” who we suspect is headed for a moral train wreck.[21] But who’s to say Draper’s charm and luck won’t see him through? We know the feeling: We’re hoping for a little luck ourselves. Unlike the retro Draper, though, we’ve got the Internet, where we can still fly through the air with the greatest of ease. Maybe social media will help us get back on track?

Quietly Smothered by Mother Technology

The Internet is our millennial magic carpet ride. Facebook, Twitter and YouTube take us into places like Cairo’s Tahrir Square where, earlier this year, we witnessed historic events, in real time, and met people literally changing the world. And there will be a lot more Internet-inspired problem solving where that came from, we’re told. Crowdsourcing guru Clay Shirky argues, for example, that the time we save in our computer-networked world by no longer watching as much TV creates “cognitive surplus” that we can invest in socially redeeming pursuits — like-minded people gathering on the Web to talk, collaborate and act to benefit the commonweal in ways small or big. It’s convergence, not conformity, in Shirky’s view, because people will do things increasingly for love, not money. In his book, Cognitive Surplus, he observes, “Expanding our focus to include producing and sharing doesn’t even require making big shifts in individual behavior to create enormous changes in outcome.”[22] Thus, like Mailer’s hipster of old, the techno-hip enable a new human order, this time one in which the love of sharing eventually trumps the desire to consume. It is Shirky, by the way, who is widely credited with popularizing the phrase “the Internet runs on love.”[23]

Those are lovely ideas, but for now the real question is, How deep is the love? Wired cooperation has proven effective in collecting micro-donations for the recent natural disasters in Haiti and Japan, for example, or zeroing in on petty crooks or shoddy public service. No small achievements. So far, however, it has worked less well in fostering fundamental change. In an essay entitled “Small Change: Why the Revolution Will Not Be Tweeted,” New Yorker writer Malcolm Gladwell noted that student protests in Tehran in 2009 were robustly tweeted but the role of social media was overhyped; most of the micro-blogging was done by people outside Iran and in English.[24] In Cairo earlier this year, getting out the word through micro-blogging did help out on the ground, putting protestors in Tahrir Square, but it’s the-people-in-the-square part, and the physical risks and hardships they shared, that was key — not bloggers re-tweeting developments from the safe distance of New York or Madrid.

How does this relate to America in hipster times? Gladwell argues that America’s Civil Rights Movement, which was gaining momentum around the time Mailer published “The White Negro,” could not have been successful without the life-and-death commitments forged between and among its organizers in real time and real places. As Gladwell observed: “Activism that challenges the status quo — that attacks deeply rooted problems — is not for the faint of heart.”[25] Movement veteran, and now U.S. Congressman, John Lewis confirmed the point for the PBS NewsHour, when he said: “[T]here comes a time when you believe in something that is so right, so good and so necessary that you’re prepared to stand up and be willing to die for it.”[26]

Online organization doesn’t often lead to that degree of commitment or sacrifice — at least not yet. Rather, as Gladwell argues, social media encourages low-risk participation and “weak ties” among participants.[27] This “form of organizing,” he wrote, “makes it easier for activists to express themselves, and harder for that expression to have any impact. The instruments of social media are well suited to making the existing social order more efficient. They are not a natural enemy of the status quo.”[28]

Neither are many of the personal communication habits we now pick up online. “Anonymous blog comments, vapid video pranks, and lightweight mashups,” wrote computer programming pioneer Jaron Lanier in his 2010 anti-programming manifesto, You Are Not a Gadget, “may seem trivial and harmless, but as a whole, this widespread practice of fragmentary, impersonal communication has demeaned interpersonal interaction.” Lanier continued: “Communication is now often experienced as a super-human phenomenon that towers above individuals. A new generation has come of age with a reduced expectation of what a person can be, and of whom each person might become.” [29]

In “The White Negro,” Mailer had a different order of person-to-person communication in mind: To be hip, he wrote, “is to swing” and to swing is “to communicate, is to convey the rhythms of one’s own being to a lover, a friend, or an audience, and — equally necessary — be able to feel the rhythms of their response.”[30] Instinctively, we know that such human intimacy isn’t possible when we’re wedded to a screen-based existence that separates us from the world of blood, fiber, touch and smell. Observed Lanier:

“I know quite a few people, mostly young adults but not all, who are proud to say that they have accumulated thousands of friends on Facebook. Obviously, this statement can only be true if the idea of friendship is reduced. A real friendship ought to introduce each person to unexpected weirdness in the other. Each acquaintance is an alien, a well of unexplored difference in the experience of life that cannot be imagined or accessed in any way but through genuine interaction.” [31]

Mailer would be down with that. To him, genuine interaction meant to be “with it,” in hipster parlance, and that meant

“to have grace… to be closer to the secrets of that inner unconscious life which will nourish you if you can hear it, for you are then nearer to that God which every hipster believes is located in the senses of his body, that trapped, mutilated and nonetheless megalomaniacal God who is It, who is energy, life, sex, force… not the God of the churches but the unachievable whisper of mystery within the sex, the paradise of limitless energy and perception just beyond the next wave of the next orgasm… To which a cool cat might reply, ‘Crazy, man!’” [32]

That does sound a little crazy in today’s world where the Internet cognoscenti talk about life as the “survival of the busiest”[33] and people generally squirm at the prospect of failing our new technological gods. Younger friends have told me that they suspect our growing addiction to social media may be robbing us of what it means to be human but fear it would be antisocial to put up too much of a fight. Intrepid souls declare e-mail “sabbaticals” or even “bankruptcies,” but it’s crazy to think any of that will turn the tables. When it comes to fighting techno-conformity, to paraphrase the old-time boxing commentators, we’ve got a punch that wouldn’t crack a potato chip.

Conclusion: Ready to “Beat” Once More?

In a long interview for The Fifties, a documentary series I helped create some years ago, Allen Ginsberg said the whole purpose of the Beat movement was to “define some kind of new vision of America.” Kerouac, he explained,

“began picking up on the word ‘beat’ drawn from the common vocabulary of Times Square, meaning in those days exhausted, worn out, maybe homeless, maybe up all night, not wanting to be hassled. It didn’t mean ‘beaten.’ It didn’t mean the beat of drums. It didn’t mean hearing the beat, man, snap, snap, snap — all that hippy dippy stereotype you get on the mass media. It just meant emotionally and intellectually exhausted and wide open, and maybe receptive to some other awareness, some more deeper perception.” [12]

Mailer knew that however great the conformity, things won’t conform forever when worked on by the steady drip of iconoclastic ideas. In “The White Negro,” he caught a sharp sense of where America was headed in the sixties, toward a generational rebellion that could “bring into the air such animosities, antipathies, and new conflicts of interest that the mean empty hypocrisies of mass conformity will no longer work. A time of violence, new hysteria, confusion and rebellion will then be likely to replace the time of conformity.” [34]

I can’t compete with prophecy, so let me just mention an idea that’s less premonition than possibility. Today, with a record number of ordinary folks out of work, thus weakening our hold on that American article of faith that, whatever else happens, our kids’ lives will be materially better than ours; with declining economic fortunes of middle and working-class Americans cutting into our genius for supporting stability and flux at one and the same time — with all that, can we ask the question: Are we “beat” enough as a society, have enough of us had a taste of what Kerouac memorably called that “oldtime lowdown,”[35] to force new perceptions — or new political ideas? New action?

Not for the moment, it appears. For now, “the radical square” still has the upper hand in endorsing the false hopes of what we might call “liberation conformity.” Meanwhile, the potentially hippest lack conviction. Somehow, we’ve diminished what is at once daring and difficult in ourselves: our urge to go strongly against the grain in the service of exploring, more completely, what being human entails. What Mailer wrote in 1957 speaks to us in a way that we no longer speak to ourselves:

“And in being so controlled, denied, and starved into the attrition of conformity, indeed the hipster may come to see that his condition is no more than an exaggeration of the human condition, and if he would be free, then everyone must be free. Yes, this is possible too, for the heart of Hip is its emphasis upon courage at the moment of crisis, and it is pleasant to think that courage contains within itself (as the explanation of its existence) some glimpse of the necessity of life to become more than it has been.” [36]

In other words, generating genuine nonconformists is essential to the future of our republic. They comfort the economically afflicted and afflict the materially self-satisfied, and may help keep the rest of us computer-networked potatoes at least mildly aggrieved in our thinking. If life’s wisdom is to commit ourselves to, and to sacrifice for, achieving some kind of balance among competing values, it seems only sane to turn to our philosophical psychopaths to shake things up and keep the energy flowing. And so, I repeat: Where are the hipsters when we need them?

Endnote

  1. In Fukuyama’s original essay, which appeared in the summer 1989 edition of The National Interest, the author argued: “The end of history will be a very sad time. The struggle for recognition, the willingness to risk one’s life for a purely abstract goal, the worldwide ideological struggle that called forth daring, courage, imagination, and idealism, will be replaced by economic calculation, the endless solving of technical problems, environmental concerns, and the satisfaction of sophisticated consumer demands. In the post-historical period there will be neither art nor philosophy, just the perpetual caretaking of the museum of human history.” Fukuyama developed his ideas, and answered his critics, with the publication of his book: The End of History and the Last Man. New York: Penguin, 1992.

Sources: For footnotes and works cited, click through to Project Mailer.

Tracy Dahlby is a career journalist, former foreign correspondent, and professor at the University of Texas at Austin.
