How We Learned to Stop Worrying and Forget the Bomb
At 8 p.m. on Sunday, November 20th, 1983, just over half of the adult population of the United States collectively tuned into ABC television to witness Armageddon. Nearly 100 million people forwent Thanksgiving preparations to confront a harrowing, two-hour dramatization of the ultimate — and seemingly inevitable — denouement of the atomic age: nuclear holocaust in the silo-rich American heartland. Unlike previous inept attempts to visualize global nuclear exchange via mismatched stock footage of fires and conventional warfare, The Day After (1983) invested the fodder of Nuclear Freeze Campaign pamphlets with a $7 million production value and a pathological attention to detail. Mushroom clouds looming high over the Midwest; instantaneous vaporization; full-body third-degree burns; slow, agonizing starvation; mass, late-stage radiation sickness; inescapable societal degeneration. Viewed today, it is difficult to deny the B-movie elements of the production — victims of the initial fiery blast zap off the screen in a montage of low-tech, yet disturbing “x-ray” silhouettes — but as a primer on the horror of thermonuclear war, this was a graphic rendering of the abyss.
However, despite the careful research and horrific scenarios that the movie presented, public hand-wringing surrounding The Day After highlighted the paradoxes that characterize American dogma about nuclear war to this day. The film proved too grim for the war hawks who believed in the prospect of limited, contained nuclear war, and too sanitized for the activist scientists who forecast nothing but corpses and cockroaches for the next world war. Like most congressional studies of the time, the film conceived of nuclear war as a matter of a couple thousand bombs all deployed within a few days — yet individuals outside the military establishment postulated that a nuclear war could drag on for months or years, and involve tens of thousands of warheads. That same year, a coalition of scientists helmed by Carl Sagan had warned the public that, were such a war to occur, smoke from countless burning landscapes could cast the planet into “nuclear winter.” In a war involving no more than one-third of the American and Soviet arsenals, one billion people would die immediately and another billion would be critically injured. The rest, the scientists reported, would be smothered under nuclear clouds in an earthly hell of darkness and cold and death. Even in the middle of July, temperatures would descend to 40 degrees below zero.
Such a scenario is arguably much more difficult, if not impossible, to portray with the necessary gravitas in a narrative format. If a story allows human agency to triumph, and propagates the illusion that the bomb can be rendered toothless, it risks trivializing the nuclear danger. But if that story is faithful to the reality of nuclear warfare as an existential threat to human survival, the author has to present a series of events so hopeless that it assails the agency of the protagonists and the viewers who identify with them. After all, what sort of gripping tale could you tell about the aftermath of nuclear attack if no one survives? The human imagination cannot help but stagger under the moral weight of two billion lives.
This weight registers in the slump of my grandmother’s shoulders when I ask her whether she remembers seeing The Day After when it originally aired. She doesn’t recall the film, but its mushroom cloud fables cast a long shadow over her formative memories. She quietly tells me about the 50s — how she dreaded her father’s turns standing sentinel on Lake Michigan, scouring the night skies for Russian bombers. She tells me about the 80s — the bomb shelter rusting in her backyard, and the years of canned food gathering dust in her medical school’s basement. These preparations seem vaguely absurd and naive in retrospect, but at the time they symbolized the reassurance that life as she knew it could continue against all odds, in the bleakest of nuclear winters. Over decades-long cycles of activism and apathy, coping with the Bomb shaped the American psyche the way deep-sea mountain chains contour ocean currents and weather patterns — in all kinds of hidden ways.
Even for those with an excellent intellectual grasp of the damage done by a nuclear weapon, its destructiveness is so psychologically unreal that it barely registers a blip on our emotional radars. As the survivors of Hiroshima and Nagasaki dwindle, so does the salience of their testimonies — artifacts of the only instances of nuclear bombing in human history. No living American has seen an American city reduced to rubble by modern warfare. Nuclear weapons in distant countries, poised to annihilate one’s city, lie outside the realm of immediate, individual experience. They are invisible to the senses, controlled by bloodless nation-states from thousands of miles away. Though this temporal and spatial distance grew as the Cold War wore on, its psychological effects on successive generations were evident from very early on. As far back as the 1960s, a college teacher noted a distinct progression toward greater and greater ambivalence when he asked his students each semester about their feelings of nuclear fear. Those who had entered adolescence prior to the bombing of Japan “frankly admitted anxiety, but the next generation did not,” despite what the teacher determined to be an acute awareness of the ever-present danger. In the absence of any tangible reference point from which these students could imagine a nuclear attack, nuclear war’s emotional impact dimmed year after year.
When the horrors of nuclear weaponry sporadically do penetrate the public consciousness — as in the autumn of 1983 — a common psychological mechanism for coping with the resultant anxiety is denial. According to psychologist Jerome Frank, denial refers to “the exclusion from awareness of certain aspects of reality which, if allowed to enter consciousness, would create strong anxiety or other painful emotions.” This denial is bolstered by other psychological processes, particularly habituation. Humans, like all living creatures, cease attending to stimuli that persist unchanged over a certain period of time. We simply don’t have the intellectual processing capacity to consciously deal with everything hitting us at once. No matter how objectively threatening, continuing stimuli will blend into the environmental milieu given time. The first atomic bomb dropped on Hiroshima generated a shock wave that galvanized global efforts to ban nuclear weapons, as did atmospheric testing and the leap from fission to fusion technology in the 1950s, and the Cuban Missile Crisis in the 1960s. But each time, the images of war stagnated in the abstract, increasingly removed from tangible actions. The end of the Cold War inaugurated a new age for nuclear affairs with multiple hopeful shocks: first the disintegration of the Soviet Union, then détente between capitalist America and Communist China, and finally a flurry of new treaties that promised to slash nuclear stockpiles from 70,000 warheads in 1986 to just 4,000 by 2012. The Bulletin of the Atomic Scientists relaxed its famous “Doomsday Clock” to seventeen minutes from midnight. Americans, for the first time in four decades, could step out of the cold shadow of imminent nuclear annihilation and breathe a sigh of relief. With the nuclear threat dimmed, we now seem to be living in a different reality, one that no longer puts human existence on a hair trigger.
It takes no feat of imagination, then, to see that nuclear imagery and terminology have lost their fearful resonance in contemporary American culture. Like most others my age, my first encounter with the atomic bomb took place in a middle school classroom. Grainy footage of incinerated trees and mushroom clouds, mingled in memory with “Bert the Turtle” instructing 1950s children to “duck and cover,” rooted my conception of nuclear war in the black-and-white tedium of history. My generation inherited a world in which talk of nuclear weapons, radiation, and reactors seldom showed up in media coverage or in serious private conversations. Doomsday seemed to be showing up on our doorstep from all directions at once — global climate change, post-9/11 terrorism, widespread epidemics and superbugs, cyber-warfare — and yet not at all. We witnessed no Day After-like national controversies about the ethics of depicting war between nuclear powers precisely because that brand of apocalyptic scenario was passé, presented mostly in 1960s and 80s pastiche and cheesy science-fiction fare. If the box office is anything to go by, nukes are so last century. By 2014, the concept of a nuclear holocaust was anachronistic enough to spur “adaptation” on the part of the Terminator franchise to “current cultural anxieties.” The filmmakers opted to turn Skynet into a hulked-out NSA doppelgänger, with the emphasis on its terrifying surveillance network rather than its plan to bomb humanity back to the Stone Age. Mad Max: Fury Road (2015) and Rise of the Planet of the Apes (2011), both reboots of Cold War-era franchises, took similar revisionist tacks — swapping their mushroom clouds for melting ice caps.
When contemporary media does depict nuclear devastation, it’s rarely divorced from Cold War contexts. The proliferation of post-apocalyptic video games like Duke Nukem (1991), Metal Gear Solid (1998), and most notably the Fallout franchise established a new, irreverent tone by the early 2000s. Set in an alternate world in which nuclear technologies were prioritized over miniaturization of electronics, Fallout is a retro-futuristic role-playing game that allows players to wander the post-nuclear wastelands of various American metropolises. The third installment (2008) opens by spawning the player in Vault 101, a gigantic bomb shelter replete with icons of 1950s culture. Your “Vault Dweller” celebrates their tenth birthday in what resembles a classic American diner, complete with vinyl-upholstered booths and bar stools and a Wurlitzer 1015 jukebox. Juvenile delinquents, called “Tunnel Snakes,” sport leather jackets and slicked-back hair in true greaser fashion. The game’s mascot, “Vault Boy,” issues grave warnings about nuclear blast effects in propaganda posters and the manual, all the while maintaining his cheery grin in the tradition of “Bert the Turtle.” Fallout plunders this Cold War kitsch to comment on the ironies of the early atomic age — apocalyptic violence stewing beneath a white picket fence. This forces a critical distance between the player and the scenario presented. Historian William Knoblauch writes of Fallout 3 that “the game’s reliance on 1950s imagery suggests that nuclear war was only ever really possible during the early Cold War. Put simply, Fallout 3’s apocalypse is born of a distant, but culturally familiar, 1950s era.” Its satirical approach also discourages emotional engagement with in-game characters affected by the fallout. Rather than empathizing with the Vault Dwellers, we are goaded to scoff at the past’s naïveté and take comfort in our cynical knowingness. We think: Look at those idealistic schmucks. Of course their faith in the atom was their downfall. But we know better. We ran the gauntlet of the atomic age and came out wiser for it.
It is important to recognize, within this cultural conception of nuclear war as “a thing of the past,” the impulses — unconscious or otherwise — to revise the nuclear age and contain the total event. By confining the nuclear age to the temporal extent of the Cold War, people who suffered the hideously prolonged expectation of disaster, who suffered the false alarms of the Cuban Missile Crisis and mid-80s escalation, are able to assign those traumas a past tense. This tendency is exacerbated by the predominance of nostalgic signifiers in popular culture, but also by the archiving and memorialization of deteriorating nuclear monuments. New York Times journalist George Johnson recently wrote a piece ruminating on his visit to the “Russian Woodpecker” in Ukraine, a “gargantuan steel structure” lined with radio towers built to give the Soviet Union early warning if the U.S. launched a nuclear onslaught. He describes his penchant for collecting memorabilia like graphite from the first nuclear reactor and rocks from the Trinity Site in New Mexico as a way to “make the abstractions of nuclear fission and nuclear politics feel more real,” as though grasping for anchors in the present brought him nothing but rust.
One can easily find analogues in modern-day Los Angeles: about 225 Civil Defense Sirens decay atop poles throughout the city, often stripped of their paint and hidden among palm trees and skyscrapers. Though their last test was in the late 1980s, self-described siren hunters and Cold War history buffs now seek and photograph them like rare birds. Rediscovery and exhibition of nuclear icons as sites of cultural heritage often cultivate their “chaotech” aesthetic of corrosion and decay. It is as though the reclamation of the Cold War apparatus by natural forces in some ways retrospectively subverts the climate of immense, Sisyphean conflict that suffocated Cold War culture. Although this process of embalming does enable historical reflection, it also perpetuates the notion that the specter of nuclear attack has been exorcised for good. The American people would do well to remember that, while nuclear symbols may be rooted in the 20th century, nuclear weapons are far from obsolete.
Now, more than two decades since the end of the Cold War, around 14,900 nuclear weapons — most of them orders of magnitude more powerful than Fat Man and Little Boy — continue to pose an escalating threat to humanity and the biosphere. Globally, these nukes are divvied up among nine nations: the U.S., Russia, the U.K., France, China, India, Pakistan, Israel, and North Korea. Recent studies by atmospheric scientists show that even a “contained,” regional nuclear conflict between two of these countries, India and Pakistan, involving just half of their current arsenals could produce climate repercussions even more severe than those predicted in 1980s Doomsday models. And let’s not discount the recent resurgence of far-right, authoritarian strongmen in most, if not all, nuclear-armed countries. To quote the editors of literary magazine n+1, “the installation of an extralegal and extrajudicial personality into the presidency — an office that has been expanded, through Republican and Democratic administrations, decade after decade, to dangerous excesses of power” — has surrendered more than 7,000 warheads to the whims of a thin-skinned real estate mogul. There is no failsafe. The president has the sole legal authority to conduct unilateral and arbitrary nuclear war. The only barrier is a web of norms, taboos, and phobias — feeble psychological deterrents for a figure who has advocated for an arms race with Russia and thinks the military should be more “unpredictable” with nuclear weapons.
To historians familiar with the bomb’s psychological impact in the Cold War years, the possibility of sleepwalking into nuclear annihilation comes with a powerful sense of déjà vu. During the first wave of nuclear awareness in the late 1940s, images of mass destruction were anticipatory, summoning visions of crumbling civilization that would not become possible for another two decades. By contrast, the holocaust scenarios of the 1980s were conservative to a fault, continually outpaced by an ever-shifting playing field. But we have adapted in kind. In 2017, as the Doomsday Clock approaches midnight, our would-be nuclear prophets shout scientific near-certainties at elected officials, press, and public alike, only to hear their warnings returned in mocking echo.