Credit: Joshua K. Jackson, Unsplash

The Tales We Have Told Ourselves

David · Published in Algo Contar · 36 min read · Sep 14, 2015

1

Just recently I wrapped up Everything and More, which author David Foster Wallace insists is a mere booklet on the history and general observance/analysis of infinity. Sidestepping rather than expanding upon that particular topical crevasse, we’ll be focusing on the highly apt introduction by sci-fi writer Neal Stephenson, which honestly might have impressed me just as much as the booklet itself. The general gist of Stephenson’s introduction is not to critique the forthcoming material, nor to simply explain its history, but rather to quickly condition readers to the general objective of a three-hundred-page dissertation on transfinite mathematics and other abstruse bullshit. It’s a quick but remarkably comprehensive guide to reading the actual booklet. In Stephenson’s humble-but-well-articulated opinion, the seemingly impregnable and opaquely didactic quality of Wallace’s prose is owed to his birthplace: the American Midwest.

A land unmistakable from the air, made patchwork by ramrod roads and well-rendered farms, beset upon by a fierce intersection of Northern winds from the Great Lakes and Western gales from the Rockies, the Midwest boasts an abundance of well-meaning academic havens, namely the dozens upon dozens of universities, colleges, seminaries, and every manner of such in-between. Wallace grew up in Champaign-Urbana, Illinois, and attended university out East, at Amherst College in Massachusetts. His father taught English at the University of Illinois Urbana-Champaign (or Champaign-Urbana, no one can seem to agree on which side of the hyphen each title belongs), and DFW would eventually find himself back in Illinois, at the State University, teaching English. Sorry — this is all just some very superfluous yet slightly mission-relevant backstory on Illinois, in case all you happen to know about the state concerns timeless musicals and Sufjan Stevens.

Stephenson posits that certain individuals from this very planar, over-educated part of the world have adopted a rather bizarre tic — a commitment to internally unpretentious, genuine, and unabashedly highbrow academic discussion — a propensity for talking about the Higgs boson and Schrödinger’s cat the same way you or I might discuss that fucking bullshit call during the Pats season opener, or whether the family that just moved in across the street is Taiwanese or Chinese, or whether Deez Nuts actually has a chance at winning the presidency… (the answer is no, but more on this later).

Stephenson sums up this genial Midwestern edification with some neat-o verbal alacrity:

To me Everything and More reads…as a discourse from a green, gridded prairie heaven, where irony-free people who’ve been educated to a turn in those prairie schoolhouses and great-but-unpretentious universities sit around their dinner tables buttering sweet corn, drinking iced tea, and patiently trying to explain even the most recondite mysteries of the universe, out of a conviction that the world must be amenable to human understanding and that if you can understand something, you can explain it in words: fancy words if that helps, plain words if possible. (Stephenson, Everything and More xxx)

The whole motive behind this approach has to do with narrative, or rather a very roundabout way of addressing the fundamental dichotomy between narrative and the spread of information. This dispersion of facts can come in many forms: traditional education, journalism, politics, law, et cetera. Everything and More is not a storybook — it’s one part history text, one part philosophical guide, and one part informal textbook. It strives to remain distinct from those sensationalized, sexy texts that might impose some narrative structure upon content that truly does transcend the over-simplification of a plot. In other words, the general business of mucking about with heuristic phenomena in the world is usually one that should be kept separate from the glitz and glamour of narrative, i.e. storytelling.

In Everything and More, David Foster Wallace occasionally inserts these little jibes at pop-historians and sensationalist journalists who claim the rather tragic emergence of madness in the minds of many great mathematicians is somehow owed to their gazing too deeply into the proverbial abyss, opening Pandora’s Box, meddling with the heady metaphysical abstractions that our parents all warned us about when we were little. [1] He asserts that these sorts of imposed narratives distract from the meaty substance of mathematics, from the true contributions made by towering giants such as Weierstrass, Cantor, Dedekind, Bolzano, Fourier, Cauchy, & cetera & cetera & cetera. Wallace says that the rampant “Hollywood-ing” of pivotal scientific and philosophical discovery[2] results in a gross over-simplification of nuanced ideas, and predictably condescends to the potential recipients of all this information, assuming the general population would rather have a mythic, archetypal narrative spoon-fed to them than sit through a fastidious lecture.

I mean, think about how insulting it seems — the average studio exec or rock-star NY Times contributor purports that a topic as complex as the discovery of a new realm of mathematics[3] or the invention of artificial intelligence[4] or even one a little closer to home — let’s say the war on terror or police brutality or a presidential election — can be reduced to a simple dichotomy — an easily digestible conflict. In point of fact, it is the very same conflict that has been erected as a tentpole for those puzzling conundrums that have had human beings throwing their hands up in surrender since before Aristotle ever wrote his Poetics — the narrative.

The hyperpolarization of information has made it very easy for the self-appointed authors of these narratives to spin them in whichever desired direction. Adam Curtis’s documentary Bitter Lake (2015) tracks the absurdly bungled history of Afghanistan post-World War II, and how the combination of Western meddling and regional extremism crafted this perfect storm of political, cultural, and religious turmoil that would ultimately inspire destructive forces all over the world, from the Taliban to ISIS to the British Invasion Force. Curtis chastises those titans of political narration — Ronald Reagan and Margaret Thatcher and the Soviet Union — who believed the problems in the world could be easily typified into neat little battles between good and evil.

To a metapolitical extent, Curtis rests the blame for Middle Eastern volatility squarely on the shoulders of those politicians and newscasters who thought it apt to convince their constituents that the enemy could be easily identified, and were beyond a reasonable doubt guilty of their horrible crimes, and deserving of capital justice. It’s the same sort of jingoism that inspired post-9/11 slogans — the mid-aughts’ “United We Stand” and “Never Forget” that galvanized a confused and angry nation into an equally confused and angry war, and inspired the sort of recalcitrant hatred that resisted all notions of analysis, contemplation, strategy, understanding, and ultimately forgiveness.

Ronald Reagan told His Nation to support and arm the brow-beaten Mujahideen of a desert nation against the Evil King and His Empire, and the nation fell into step, utterly convinced a regime they had helped put into power some twenty-five years earlier[5] was guilty of some hellish crime worthy of retribution. Steve Coll, author of Ghost Wars: The Secret History of the CIA, Afghanistan, and Bin Laden, said during an interview with Amy Goodman: “The idea that Afghanistan was a messy place filled with complexity and ethnicity and tribal structures and all of the rest of what we now understand about Afghanistan was generally not part of American public discourse.”[6]

Ultimately, Curtis uses Afghanistan as an example — not necessarily an example of the horrors brought about by Western influence, but an example of a far more pervasive, sinister phenomenon: the penchant for narrativization, for making stories of that which defies storytelling, of taking that which should be debated and discussed and instead molding it into something to be experienced, to be fought, to be rooted for and against. Adapting real-life into tight little screenplays. According to an article by Aaron Stewart-Ahn, Curtis said on the subject of his documentary: “…television is really one long construction of a giant story out of fragments of recorded reality from all over the world that is constantly added to every day.”

It’s not an isolated incident, and it certainly doesn’t stop with foreign wars and polemic politics. For just as cholera spread West with the wagon trails and the bubonic plague engulfed the shipping lanes of Medieval Europe, so too has the mass-media age found its own commensurate epidemic: the disease of narrative. The epidemic of the story. The last dastardly blow dealt by crumbling film industries and far-flung political scandals — the bastardized legacy of Upton Sinclair and John Updike and William Randolph Hearst and Truman Capote and Kennedy and Reagan and McCarthy and Malcolm X and TMZ and Bill O’Reilly and Rachel Maddow and Ta-Nehisi Coates and Kanye West — the story. The fucking story. The be-all-end-all-get-it-at-all-costs story.

2

Before I realized the inherently sinister, exclusionary, elitist, eerily ‘yacht-clubby’ depths to which my University’s Honors Program could sink, I was admittedly, and now shamefacedly, excited about the program’s introductory lecture series. The first of these was delivered by an atypically bearish yet reassuringly bookish art history professor (let’s call him “Professor N”) and packed a hefty amount of arcane socio-cultural historiography[7] into the span of 1.5 hours and into an orientation-weary freshman’s attention span. We’ll spare ourselves the gory details, but suffice it to say Prof N treated his lecture as any self-respecting Catholic priest would his homily — plenty of anecdotal, historical, textual, and academic evidence built sturdily around a gooey thesis at the center.

The gist ad rem: all of art history has functioned and presumably will function as a sine curve of extremes, cresting and troughing at moments alternately minimalist and ostentatious, encapsulating or oftentimes contradicting the general mien of Ages and Epochs.

The whole point was as follows: Art history, and therefore all history, might be calculated via any basic trig-Wave Function, with the tops of the curve representing those moments in history when art was opulent and flashy, and the bottoms signifying those poignant periods of barebones beauty, when realists and modernists focused more on the medium than on the technique.

To wit — baroque art emerged in response to the austere facades and reliefs present within most post-Reformation churches, and the Church (with a capital C) struck back against the weight of Martin Luther’s legacy by erecting palatial theo-Domes worthy of High Byzantium. You might recall the Council of Trent from high school history, which essentially served as a Monday-morning quarterbacking session for the heel-rocked papacy in Rome after their devastating Sunday loss against the 1517 Reformation champions, the Protestants, led by all-time Edict of Worms MVP Martin Luther and his 95 theses /end clumsy sports-analogy and corroborative verbiage. The Church’s reasoning asserted that those crowds that had flocked to Luther’s pasture might be persuaded to return home to roost if tantalized by gaudy and magniloquent religious tableaus, most of which were arranged in horribly awkward places to reach, e.g. the ceiling of the Sistine Chapel and those little alcoves near confessionals that are uncomfortable to squeeze into even on a good day. Though bear in mind, this movement blossomed a full half-century after the Italian High Renaissance we all know and love — Michelangelo and Raphael and Da Vinci were all long retired or dead.

The nascent baroque erupted with a triumphant endurance — sure enough, religious and political arts’ patrons kept the cash flowing from the pockets of the people, up through the Berninis and the Caravaggios[8] all the way back to the pontiffs. To further support the posited sine-curve notion, luminaries of the Dutch Golden Age such as Vermeer and Rembrandt emerged shortly after the birth of the baroque period, in the mid-1600s. Although considered contemporaries of Italian baroque painters and architects, the Dutch quickly toned down the glitz and swank, preferring to use the Italians’ chiaroscuro and tenebrism to depict bucolic pleasantries and pretty young Dutch girls in simple robes and pearl earrings. This return to less grandiose subject matter prompted a stylistic conservatism that would bring the sine curve barreling back down to the lower limit, at least for a short while. Within a hundred years, Classicism and Neoclassicism had all but replaced the aesthetic austerity of the Dutch Golden Age, returning to the typified beauty of Greco-Roman sculpture and art, thereby bringing the Wave Function thundering back up to a gaudy crest.

There are some seriously trivia-ticklish and historically juicy proofs for the whole “Art Sine Curve” theory dated before the Reformation[9], but it’s after the Counter-Reformation that the trend really starts to become almost retroactively apparent, as if simply sitting in the lecture hall listening to Prof N ramble could bring back spurious memories of one learning all of this during those pre-pubescent history lessons back in x-grade. You can trace the curve up through Rococo (sometimes called “Late Baroque”)[10], Neoclassicism, and Romanticism, and down through Realism, and then the speed between limits on the curve quickens — one might blaze through naturalism, impressionism, constructivism, surrealism, modernism, etc…etc…all the way up through whatever epoch we might place ourselves in currently, when “artists” photoshop Kyle MacLachlan’s face onto a Douglas Fir and call it “arboreal dysmorphia.”

Prof N’s methodological posturing aside, the historiography behind his theory was actually fairly well-codified within his proposed parameters, and one must admit — the theory certainly doesn’t reek of interpolation or impredicative fluff. In other words, it’s not the mirthless academic equivalent of an internet fan theory. But the only real nugget here, at least for our purposes today, concerns the trigonometrical abstraction of historical progress. The slow wind of time has never been some linear, planar momentum — it’s volatile, meaty, rich, oftentimes incorrigible — but in terms of absolute abstract trends, it’s been admittedly rather predictable.

The Time Wave Function crests and troughs, alternatively, at the two furthest extremes:

conservatism / minimalism / austerity / censure / conformity

— vs —

liberalism / stylism / grandeur / permissiveness / individuality,

what have you, feel free to substitute your own conspiratorial adjectives.

3

So where is this going…well, I’d wager something as meta-scholastic as a Cultural Sine-Curve might apply to more than just art. In fact, it might just apply to all human endeavors — social interaction, governance, democracy, family life, sexual education, religion…and, of course, entertainment. And if the Curve is to be believed, then surely at this juncture we must be living at the apex of a Crest. The Age of High Entertainment, as it were.

A forecast on Statista predicted that U.S. media and entertainment industries would spend over $6 billion on digital advertising in 2015. Another forecast valued the United States 2015 entertainment and media market at approximately $594 billion — though this is still only 50% of what the US would need to pay off the debt it owes to China alone.

Derek Turner of freepress.net wrote an exposé on broadcast media consolidation back in 2013. Most of it reads as predictably anti-establishment rage against an ostensibly corrupt FCC, dumping a whole lot of blame for the monopolization of broadcast television on the FCC and the NAB (National Association of Broadcasters). But Turner makes an interesting observation:

However, the primary factor driving this current wave of consolidation is FCC policy. The FCC ushered this trend in by signaling to the market in a 2011 decision that it has no intention of enforcing its own broadcast ownership rules. The FCC’s actions (or more precisely, inaction) ignited an explosion in the use of so-called outsourcing agreements, which allow one entity to control multiple stations in a single market. This “covert consolidation” has resulted in one owner controlling most — or in some cases, all — of the broadcast television news production in markets across the country. (Turner, 11).

Now, no one is arguing television is still the primary medium for news, especially not for most Americans. But it’s certainly still a viable option. And the consolidation of broadcast media doesn’t end with local stations and mid-level companies, such as Sinclair and Gannett. The slow consolidation of broadcast power that has been happening since the early 1980s has come to a head. According to this loosely researched infographic from 2011, 6 major companies own pretty much all media Americans see, hear, and read on a daily basis. The chart sort of espouses that anti-corporate fear-mongering that has come to dominate the hypocritical American middle class, viz. those members of the 18–45 age group who prance around preaching suspicion of corporate media yet don’t stop to think about who’s sponsoring all those AP News tweets. And a lot of the information on it is wrong — Comcast has replaced GE as Company #6, and Time Warner has nothing to do with Huffington Post. But the point here isn’t to wax paranoid about the extent of corporate control, but rather to illustrate a point — there are a select few…let’s call them authors for the sake of our premise…authors who dictate the media your average American citizen might consume on any given day. This includes film, television, radio, newspapers, social media, blogs, message-boards…it seems only Google News and Reddit are exempt from Big Media ownership. Also, to preface a minor diatribe against BuzzFeed we’ll find in the text below: BuzzFeed had remained independently owned since its inception…at least until August 2015, when NBCUniversal dumped a $200 million equity investment into the scrappy little internet upstart.

Also, an excerpt from the LA Times article linked above:

What’s now a trend began modestly two years ago when Jeffrey Katzenberg’s DreamWorks Animation paid $33 million for the teen-focused YouTube multi-channel network AwesomenessTV. The following year Disney Co. bought Internet video firm Maker Studios for $500 million. Then A&E Networks snapped up a 10% stake in the hipster flagship property Vice Media for $250 million (Vice has also been appearing on HBO since 2013 and is set to expand its offerings with a nightly newscast). (David Pierson, LA Times, 2015).

Just in case you thought online media represented some Robin Hood/Martin Luther-y attempt to bring journalism back to the people and away from the fiscal oligarchy: remember, if it’s media you consume, odds are it’s being authored by someone with a serious financial stake in your reception.

According to a Pew Research study done in the middle of last year, 30% of US adults use Facebook to get their news. That might not sound like much, but when you consider a solid 10% of the country uses YouTube for news, and another 8% use Twitter, and another 2% use Reddit…suddenly you have a distinct plurality relying heavily on online mediums for their information intake.

Two common threads linking these online platforms are accessibility and simplification. Social media and online news must be, by their very nature, pithy. They must evoke enough emotion and interest within a highly limited number of characters to encourage audiences to click “Follow” or “Like” or what have you, in the hopes that news disseminated in serialized little chunks might provide enough of a narrative cliffhanger to keep customers flocking back through the fiber-optic doors. Also notice how 73% of that aforementioned 30% who use Facebook for news list “entertainment” as their most commonly viewed topic. Effectively, this means about 22% of American adults prefer their news online, with brevity, and not for discourse, but rather for entertainment. And nothing sells entertainment better than narrative.
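(For the arithmetically suspicious, here’s the back-of-the-envelope math behind that figure — a minimal sketch; the 30% and 73% inputs are the Pew numbers cited above, and the multiplication assumes the 73% applies uniformly to the Facebook-news group.)

```python
# Rough estimate: share of US adults who get news on Facebook
# AND list "entertainment" as their most commonly viewed topic.
facebook_news_share = 0.30   # Pew: 30% of US adults use Facebook for news
entertainment_share = 0.73   # Pew: 73% of that group most often view entertainment

combined = facebook_news_share * entertainment_share
print(f"{combined:.1%} of US adults")  # → 21.9% of US adults
```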

Some less-empirical more-intuitive yet nonetheless prevalent examples: notice how every clickbait article boasts some pseudo-individualized title, as if some invisible freelance typist somewhere is reaching out through the fiber optics and plucking gently on your sensibilities, assuring you these are “10 Pictures of Cats That Are Sure To Make You Look at 10 Pictures of Cats” and utterly convince you that “This Area Woman Performing An Action is Performing an Action of Some Importance and You Should Be Interested.” It’s no coincidence that the language in clickbait articles eerily mimics those mid-1990’s infomercials that played at 10am during kids’ cartoons and cooking shows — just as advertising gurus back then set a laser-focus on the bored housewives who might still be stuck at home with their puking toddlers, so too do the self-assured geniuses of online marketing attempt to personally connect with screen-addled millennials by narrativizing all their content. It’s all supposed to sound sexy, relatable, entertaining. BuzzFeed and Upworthy are just “trying to get you, man.”

Now brace yourselves, for now we come to, arguably, the most cosmically horrifying little nugget of all:

The Interactive Advertising Bureau (IAB), in cooperation with Edelman Berland, did a study in July of 2014 on some consumers’ general responses to digital advertising techniques, specifically those sponsored internal ads that run along the bottom of blogs and news sites.[11] The indicated purpose for the study: “Ultimately, the goal is to enable publishers, marketers, advertisers, and agencies to ensure that sponsored content meets consumer expectations and succeeds.” (Sherrill Mane & Steve Rubel, Getting In-Feed Sponsored Content Right: The Consumer View, 2014).

The survey employed two tried-and-true methodologies — focus groups and online surveys. The online study sampled 5,000 “representative consumers” who accessed their news online at least a few times a week. The survey pool was divided into three sub-groups of approximately 1,650 each; those who used news media for entertainment, for business, and for general information, respectively. The parties involved exposed the sample groups to vertically integrated sponsored content on desktop news sites, eschewing social media and mobile news for the sake of variable consistency. The study lists some key takeaways. All of them read as subtly sinister generalizations attributed to a putatively vulnerable population, one that seems to mean as much to the advertising execs and survey purveyors studying them as gluttonous, genetically modified cattle do to industrial agriculturists.

But there is one result of the iab/Edelman Berland study that not only supports our objective today, but might as well be a zeitgeist for the whole phenomenon: “The best in-feed sponsored content tells a story and fulfills the human need for a compelling narrative.”

If there is indeed a “human need for a compelling narrative,” then who is to say the manipulation of this hunger might be limited to sponsored ads, sensationalized “Top 10” lists, or even network news stories? Though the following extrapolation might read as more reductive than inductive, it also holds some water: this “human need” for stories becomes the “demand” in your run-of-the-mill transaction. And so long as we’re viewing this in terms of supply and demand, narratives become a sort of commodity — an abstract good that might be traded and sold to a starving audience, often processed and pre-packaged and mass-produced. Content itself — not any particular type, but just the general phenomenon of content as a whole — becomes a valuable export. Journalism sells this product. Politics sells this product. We’ve seen that entertainment obviously sells this product. Mass-advertising sells this product. It seems that no matter what’s being sold, you can bet that the vendor is packaging a neat little story with your purchase, all for just the low-low cost of your undivided attention, and your continued patronage.

It becomes a rabbit-hole question. Or perhaps, to examine it from a mathematical perspective, a question of proof. If we can prove n is the narrativization of advertising for the purposes of easy digestion, and if we can prove n + h is the same, where h is another type of phenomenon, e.g. presidential polling results or foreign policy, and if we can then prove that n + k is true, where k is any and all types of news and/or information, then surely we can prove that over-storytelling in the modern world is more than an isolated incident.

It goes without saying that the aforesaid penchant for entertainment links directly to our whole narrativization dilemma, and one need only examine the anecdotal quality of most major news stories in the last ten years to see evidence of this trend. Only look at the Vice News approach to journalism, which consists primarily of snarky extrapolation and subjective first-person perspective as opposed to an objective account. Yes, one might argue this generational development is mostly engineered by trend-happy pollsters who have embraced Millennial ego-centrism as a profitable venture, content to cater to the faux-intellectual’s sense of holier-than-thou progressive enlightenment and produce highly niche content concerning “alternative” news. Really, it seems as if one of the founders of Vice News did actually read A People’s History of the United States and clumsily attempt to transpose that text into an editorialized format, not really understanding that the whole success of Zinn’s book was owed to its being a retrospective, done with all the 20/20 hindsight that those at Vice News must venture into the field without when they attempt to cover 10-year-old arms dealers in Papua New Guinea or whatever. But anyway, the whole point here is that Vice News prides itself on being the proto-hipster’s watershed, and ensures all of its content fulfills that same narrative pattern that keeps the Man-Buns and Yeezy Boosts crowding up their bandwidth.[12]

To feed this loop back into the old Cultural Sine Curve, it seems abundantly clear that this High Entertainment Crest has stretched on for too long. It’s time for a metaphorical fiscal bubble to pop — for the dependency on hyperbolic and vitriolic storytelling to wane with the ages and go the way of baroque art and big-band pop music from the 1950s.[13] But just as empires need some internal corruption before they topple, so too does this peak in the Sine Curve require a little push before it comes tumbling down, and we can all go about the business of forgetting the entertainment value of information before we pursue a topic, and we can at last relegate stories to those realms where they’re most apropos instead of shoehorning the Narrative into every little punctilious facet of our lives.

So the question becomes: What is the final straw? What is the last battle cry, the last guttural yawp of the fast-dying High Entertainment age? Well, it’s probably not such an easily solved riddle. Doubtless there will be a whole myriad of concentric and layered events that ultimately topple us off the Crest and back into a Trough on the Cultural Sine Curve, hopefully one that lasts long enough to banish the last vestigial traces of High Entertainment from living memory. But again, this transition will likely occur very gradually, and be beset by a whole slew of abiding factors that will likely transcend cultural divisions and societal sectors. In other words, there is no easy answer to the Narrative Dilemma.

But I like to think that it’s Donald Trump.

4

Donald J. Trump, known in some circles as “hey this joke isn’t funny anymore” or “Beat The Crap: Celebrity Apprentice Edition,” is arguably the most controversial cross-discipline figure since Ronald Reagan. And Reagan only achieved most of his controversy posthumously: Trump is doing it live, loping right up to the corn-fed denizens of rural Iowa on his Bond-villain chopper, hurling obscenities and disingenuous verbal non sequiturs down at the crowd with bored bemusement and barely shaking a hair as his rabid constituency gyrates accordingly to his oratorical stylings. He’s the pinnacle of distraction, ushering in an unprecedented age of political chicanery and false narrativization.

The gross over-simplification of Trump’s “Make America Great Again” spiel would be universally derided at any other point in history. Even during the Reagan years, when the President co-opted the title of history’s greatest B-Movie to describe a missile defense system, as if somehow correlating the intricate realities of thermonuclear war with a beat-for-beat rendition of the Hero’s Journey would assuage American fears about the utter inaccessibility of Cold War politics. Yes, even then, I find it hard to believe the American public wouldn’t see through the inanity of such a reductive and one-dimensional campaign platform.

But we’re not at any other point in history. This is the Age of High Entertainment, where being distracted and amused is priority numero uno, and any and all thoughts given to longevity and the droll necessities of survival are dismissed as being too rarefied for the layman, instead assigned, as sort of an afterthought, to cultish genius demagogues — the Elon Musks and Sergey Brins of the world. Today, our heroes are the “straight-shooters,” those who “tell it like it is.” If it can’t be pigeon-holed into a neat little narrative of good vs. evil, underdog vs. champion, Rebels vs. The Empire, then it’s not worth our time. And Donald Trump has brought this socio-cultural perversion to such a triumphant apotheosis that it’s almost a wonder he isn’t leading by more than six fucking percent in the Iowa Caucus.

According to some of the latest polls out of Quinnipiac University, Trump leads the GOP pack with 28%. The other side of the sword on that one, however: Trump holds a 26% absolute naysay, meaning 26% of those polled, when asked which candidate they would undoubtedly never vote for, responded “Trump.” Ben Carson trails a distant second at 12% endorsement, with Jeb Bush as the runner-up in the “no way” category: 18% say they would, effectively, rather not vote than vote for Bush. The Quinnipiac polls appear fairly average for preliminary presidentials. Approximately 1,550 voters nationwide were questioned, mostly via phone interviews, with 666 identifying as Republican and 647 standing with the Democrats. There’s some 2.5% sampling error overall, with that number being slightly higher if viewing Democrats or Republicans specifically. In other words, if the numbers lie, they’re not lying any more than all past polls this summer: those CNN polls that placed Donald Trump at 20% back in July, or the latest poll on the other end of the fence that puts Bernie Sanders ahead of Hillary Clinton by 1 point. Again, judgment day for the Iowa caucus is fast approaching, and it would be utterly moot at this point to read too much into national polls when the election is still more than a year away and Americans are endearingly fickle when it comes to electing their leaders.

It’s easy to see how D. Trump himself might be taking this narrativization business a little too far — anyone who seriously considers repealing the 14th amendment likely approaches national leadership from the same standpoint a five-year old boy approaches playing House. “Now the good ones…we can expedite it,” says Trump in a CNN interview, “we can expedite it where they come back in,” connoting a degree of ignorant sensationalist racism that I wouldn’t touch with a flagpole. You don’t need me to sit here and preach to you how Donald Trump is part of the problem — that bit should come off as exceedingly obvious. The real meat here has nothing to do with his instigation, but rather with the national reaction.

For the remainder of the Donald Trump portion of tonight’s program, we’ll be periodically referring back to this superb piece from The Guardian. If you wouldn’t mind popping over to the linked page and giving it a quick read, that would really speed things along.[14]

“So begins one 24-hour episode of the Donald Trump Show,” Paul Lewis writes, “a political satire about a billionaire celebrity who runs for president, breaks every rule in the manual, and becomes the frontrunner in the Republican race for the White House, beating all the senators and governors running for the Republican nomination by double digits in the polls.” You can read as much in the article, but Roger Stone — longtime friend and recently sacked adviser[15] to Trump — said of his former employer’s resounding success in the pre-primary season: “The voters don’t distinguish between reality TV and politics.”

Mr. Paul Lewis goes on to paint an endearingly ironic picture of Trump-hungry Iowa state fair-goers as they gaze slackjawed up at the Trumpcopter zooming wildly into view over the horizon, the massive or and sable TRUMP logo emblazoned shamelessly against the pitched sheen of the chassis, presumably accompanied by Wagner’s Ride of the Valkyries.

Later on, Lewis outlines the sublime, farcical interview conducted by Meet the Press anchor Chuck Todd aboard Trump’s private plane. Some neat gems come out of this sure-to-be-infamous interview — that earlier nugget about repealing the 14th amendment, the cockamamie “wall across the Mexican border” scheme, et cetera…but at some point Todd clearly must have gotten fed up:

“Are we all part of a show?” Todd asks him. “You know some of the criticisms. We all feel like we are in a reality show.”

Trump smiles. “No,” he says. “This is the real deal.”

The best part of the article sort of creeps up on you, embedded casually among all the other mad political performance art and false magnanimity. It’s when Lewis recounts an exchange between Trump and a reporter at the Iowa state fair — the reporter asks Trump if he ever gives a thought to conveying his policies to the people, or whether or not he even has any true policy to convey. And Trump responds with that pejorative, smug, unfailingly unperturbed chuckle: “I know the press wants it… I don’t think the people care.”

The whole royal pastiche should inspire laughter, or perhaps the adrenaline-fueled rancour of a seemingly entranced crowd, who, deep down, really do understand that this all has to be some sort of gag. Except it’s not. Not anymore. It doesn’t even matter that Trump is all but guaranteed to lose the presidential election. It doesn’t matter that preliminary polls are historically unreliable and at this point in the 1992 presidential race the frontrunners were Pat Buchanan and Tom Harkin. It doesn’t matter that we’re more than a year away from an election and Trump’s success can probably be explained rather easily by a third-year sociology student at Brown. None of that is relevant because it’s all obvious and will likely play out exactly as everyone imagines in the coming months, and we’ll all merrily go on to have a droll presidential election filled with youthful ennui and senior bemusement.

At the present moment, the very immediate and very immutable concern is not one of a country run by Donald Trump, but of a country influenced by those who broadcast his face out to the nation every day — the nation filled with those who will still tune in to watch.

Of course the article’s titillating climax occurs on board the Trumpcopter, when Trump offers three lucky young fair-goers the opportunity to rise above the Midwestern cacophony and look sovereignly down at the world — to see America from The Donald’s eyes. William Bowman, a small Iowan boy with a GoPro strapped to his head, mewls a question out from between his candy-glazed lips: “Mr. Trump? Are you Batman?”

And Trump replies: “I am Batman.”

Let’s humor this one. A billionaire entrepreneur / filial prodigy who decides the world does not acquiesce to his High Moral Standard re-allocates massive chunks of his wealth to a vigilante crusade against the corrupt orthodoxy and morphs into a powerful, polarizing, almost symbolic force… he’s missing the cape and the laryngitis, but “Donald Trump” might as well be a nom de plume for Bruce Wayne.

And what makes a better story than Batman? Nothing sells out comic books and ballot boxes like a caped (read: toupeed) crusader. It’s the sort of universal narrative the politically jaded population can really vibe with. Pundits love making a big stink about how voters are tired of the “script” in Washington. But it’s exactly the opposite. Voters are tired of the non-narrative format. Trump’s constituency is sick of being expected to understand complex issues. They want the truth — simplified and easily regurgitated and “shot straight.” The interwoven, multifarious implications of fixing climate change and widespread homelessness and immigration reform and overpriced health insurance are too mentally exhausting for the average American. It’s been more than thirty years since Reagan hit the scene, and the world has finally reconstituted after the 1980s complexity purge. It’s time to re-open the storybook and start reading another chapter.

Come on people, there are bad guys with guns in Syria blowing up monuments, we have to bomb them! There are illegal immigrants committing crimes in our streets, we have to expel them! There are liberal heathens fouling up our Congress with discourse, we have to silence them!

Bibliography:

  1. Trump, Donald. Let’s Make America Great Again. United States: The American People. 2015. Everywhere.

5

Iowa has recently assumed this ley-line-level of national importance, housing points of socio-political intrigue that seem almost compelled to occur there, in the corn-dependent heartland of the Western World. I recently flew over Iowa on my way to the Rockies. It really does boast all those Midwestern attributes that Neal Stephenson discussed in his introduction to Everything and More. It’s puzzling but somehow cosmically reassuring to consider that David Foster Wallace, who inadvertently provided a very lengthy and sinuous epigraph for this entire piece, grew up not three hundred miles due northwest of my flight path, and that the Iowa state fair and all assorted caucus-relevant material stretched out slovenly somewhere below our polymer wings. Aloft on these winds of metaphysical guidance, we come to the edgy post-teen angst paragon in our twisted exploration of hyper-narrativization in the modern world: Deez Nuts.

Deez Nuts[16] was born Brady Olson, now aged 15. He lives in rural Iowa with his parents and his younger brother Tyson, who planted the notion of running for president in his older brother’s head when he commented one day, “I bet you could do a better job than these guys.” The name either originated from the popular internet meme or the Dr. Dre song, take your pick, but remember we are talking about two Generation Z teenagers from rural Iowa here. That being said, Olson has commented to multiple publications, including Rolling Stone, that if he were old enough to vote, his candidates of choice would be Democratic upstart Bernie Sanders or long-time Libertarian Gary Johnson. According to an article by Kyle Munson at the Des Moines Register, Brady’s mother Teresa is astonished by her son’s knowledge of the electoral process. Brady, predictably, offers some honey-oats justification for his candidacy by bashing the two-party system and probably prides himself on throwing a monkey wrench into the whole political minstrel-show by “punking” the internet. Kyle Munson serves up some fairly observant speculation as to the nature of Deez Nuts’ popularity:

It’s the sort of media lightning that’s impossible to bottle: Brady picked the right meme at the right time. There was the incongruity of Deez Nuts originating with a seemingly innocent rural Iowa boy. Late summer is a good time for oddball stories. And news headlines already were primed for just this sort of political satire — thanks to the mainstream presidential candidates already flooding into Iowa to gobble pork chops and dispense stump speeches at the Iowa State Fair.

But as of mid-August Deez Nuts, according to PPP (Public Policy Polling), boasts a 9% approval rating in North Carolina and an 8% showing in his home state of Iowa. Feeding back into a statistic dropped earlier, we must recall that over 30% of the US adult population gets their news from social media — a well-known breeding ground for memes. So certainly more than just 8% of Iowa has heard of Deez Nuts, and you have to figure that enough of those same jaded internet-jockeys might prove fertile ground for the satirical machinations of a 15-year-old kid from Iowa with a campaign platform as simple as, “I’m tired of all the bullshit.”

Yes, Deez Nuts is a hero, poised to usurp the bureaucratic oligarchy in Washington and present a refreshing, self-reflexive look at the state of American politics. He may seem woefully underqualified and even duplicitous, but that’s just the knee-jerk reaction to his desperate message. Truth be told, this is the only time in history Deez Nuts could have gotten as far as he has — the political climate is hackneyed and weary, old and cumbersome — it’s only because the American people are sick and tired of their ineffectual government that they’d even consider supporting, however mockingly, such a felonious candidate. This young boy may just be a practical jokester today, but just you wait — the faux-libertarian ideals of a Holden Caulfield wannabe are exactly what this country needs to get back into the swing of things. And even if he doesn’t win (which he legally can’t), perhaps he’ll inspire a return to some less despoiled American values…now wait a minute. This all sounds familiar.

Postulate: Deez Nuts is the Superman to Donald Trump’s Batman.[17]

Remember, a Trigonometric Wave Function is an oscillation between two extremes, with two variables…it’s always two, some hyper-abstract realization of the power of pairs.

The rise and fall of history’s Function is always owed to the establishment of a pattern, that which cannot exist unless predicated by the existence of two members in its sequence. Socrates and Plato. Hamilton and Jefferson. Robespierre and Napoleon. MLK and Malcolm X. Francis Bacon and John Locke. Martin Luther and Johannes Gutenberg. Steven Spielberg and Martin Scorsese. Donald Trump and Deez Nuts. Batman and Superman.

It’s that archetypal dynamic duo as old as Enkidu and Gilgamesh. The blue-eyed Boy Scout with naive dreams of salvation, and the world-weary self-made cynic with the money and the means to instill in the world his own brand of change. Maybe neither of them fully grasps the sea change they’re both about to unleash, but even if they aren’t the final spark upon the match they’re certainly another link in the catalyst chain. Rome built its biggest temples before it crumbled. Artists paint their best work before they die. Pride comes before the fall and all that — we’ve reached the maximum saturation point in the age of High Entertainment.

Self-parody is the death-knell of all authors. And Deez Nuts and Donald Trump both reek of self-parody — not just self-parody of their political beliefs or their moral ideals (although both men do indeed pose as absurd exaggerations of their prescribed type), but self-parody of the cultural condition. Global authorship of some omnipresent narrative has devolved into self-effacing kitsch, and whether most of us recognize it or not, we’re fast hurtling toward the next Trough in the Cultural Sine Curve, where the spurious sheen of vapid entertainment and derivative narration will be replaced by something a little more genuine, and likely endlessly more boring.

The last ten years have been dedicated to the formation of a global infrastructure, one that held promises of mutually-beneficial globalization and convalescent co-independence. But the last ten years have also narrowed a very particular sort of lens — one that has us all cataloguing and ranking our experiences according to their entertainment value alone. We feed off the cookie cutter narratives of comic-book movies and scripted reality, never expecting to sluice some universal truth from these distractions but instead hoping they fulfill those escapist fantasies we were guaranteed at the turn of the 21st century. We chose to disappear inside our glowing screens and reach out with wiry tendrils toward the rest of the world, using all our societal wherewithal to craft, pell-mell, a pleasing story out of all the fragmented little narratives zipping through our field of view.

It’s no wonder the Age of High Entertainment chose Donald Trump as the last great addition to its oeuvre. The man is a cartoon character, a comic book incarnate, the karmic comeuppance for our having paid more than $1 billion to watch The Avengers. He is as alien as he is relatable, as inane as he is tempered, as voluble as he is stoic…he’s a walking caricature of all those political ideas we’ve spent the last fifteen years trying so hard to abridge.

Superman is a selfish being. Superman built a fortress in the ice and snow but Superman still works in an office. Superman has all the powers of God and deigns to employ them at his discretion. Superman has the means to carry out his holy mission twenty-four hours a day, seven days a week. Superman could spend every waking minute of every day saving the world. Superman will never grow tired, never grow old, never feel pain. But Superman chooses to live among us Mortals, donning a pair of prescient hipster glasses and picking up a day job just like the rest of us drones, simply because he wants to belong. Because he wants to flirt and fuck and eat and laugh and be a part of the big ol’ mundane. Superman could invent amazing Stories every day. Superman could Entertain us all until the end of time. But Superman chooses to hide his cape beneath an Oxford button-down and some neatly-pressed slacks — like the sobering man leaving the bar, going home to his wife, rolling down his sleeves to hide the scar he was so keen to show off to his friends. Deep down, Superman knows life cannot pass with all the exultant resplendence of a comic book. When the final page is flipped, Superman will go back to being Clark Kent, and he will be truly happy.

And after his 10th email interview today, Deez Nuts will go back to being Brady Olson. He will sit down to dinner with his family, think about the math test he has tomorrow, and forget about all the stories he just heard.

And he will be truly happy.

FN

[1 BACK TO TEXT] Your parents never warned you about the dangers of metaphysical abstractions and untempered exploration of such? Really? No late-night sermons on avoiding delving into the philosophical ether, or letting your mind wander erroneously into meta-solipsistic irrationality? No breakfast-table diatribes against those foolish philosophers and academics who would dare touch the surface of the infinite and expect to remain unscathed? Really. Just me? Oh man, I thought everyone got that talk.

[2]See this past year’s The Theory of Everything, or last decade’s A Beautiful Mind, or that new movie starring Spider-Man about Bobby Fischer, the chess prodigy who was “driven insane by the endless permutations of the world’s oldest game of strategy” but in reality just morphed into this really sad little self-hating Jew who denied the Holocaust and who probably would have loved to have shared a drink and an opinion or two with Senator Joe McCarthy.

[3]See Everything and More for Everything and More on this topic, but also you could just search Georg Cantor and see what crops up — the man literally discovered and codified an entirely new dimension of transfinite numbers, complete with operations for all contained within. Makes you almost want to stop complaining about an undergraduate thesis. But we’re all little egocentric pustules so I’m sure that’s a naive notion.

[4]See the attached link for video, but to quickly summarize: Nick Bostrom’s TED Talk on AI tries to boil the whole business down to some cosmically spiritual struggle between Man and Machine. Rather than explore and discuss the complexities and ramifications behind such a discovery, Bostrom spins a sexy yarn about this being “the most significant discovery in the whole of human history” and then winds out a lengthy quasi-nihilist argument about how the invention of AI will essentially result in a war-of-consciousnesses between invention and inventor. All in all, Bostrom proves why anyone who gets their information from TED Talks is the same type of person who jerks themselves off to everyone they meet about how they “listen to podcasts,” thinking that actually might make them some kind of intellectual. Yes, I can listen to the radio too, just like my fucking great-grandfather did. /end off-topic fulmination.

[5]Okay, this history lesson is going to drag on for too long if I attempt to recap the terrifyingly and astonishingly well-researched 2.5 hours of historical data + stark imagery found in the Curtis documentary, but I’ll attempt to offer up some brief exposition, if only for the sake of avoiding confusion. Again, this is history at the most basic level: the United States entered Afghanistan in the mid-1950s at the request of the Afghan king Mohammed Zahir Shah to build up infrastructure in the region. The growth of this infrastructure allowed a certain regime to hold power in the country, a regime which was heavily influenced by Islamic fundamentalists exported from Saudi Arabia by King Faisal in the late 1960s, a move initiated to divert Soviet influence in the region by giving the Soviets a more pressing target to fight against, namely the religious fanatics being trained by the descendants of an ancient Islamic sect known as the Wahhabists from the turn of the twentieth century. There was also a whole lot of business concerning the first Afghan President, Mohammed Daoud Khan, so whenever you hear mention of “Daoud Khan” in a discussion about Middle Eastern history you’ll know someone is about to throw some serious shade. So eventually these Wahhabist-inspired radicals took hold in the foothills of Afghanistan, and became the very rebels fighting against the Afghan government in the early 1980s, a government that the Soviets had attempted to control but utterly botched in 1979, even after multiple assassinations (Presidents Taraki and Amin, to be specific), because the Soviets, just like the United States, underestimated the cultural complexities of entering an entirely foreign nation and attempting to impose upon it some WASPY narrative about “freedom fighting.” Oh, and those Wahhabist teachings we mentioned? Yeah, those guys are the ones who eventually crossed the border into Iraq and Syria after all the Afghanistan business in the late 1990s came to a head and eventually became Al Qaeda, and a very specific branch of Al Qaeda got a little too friendly with Wahhabist teachings and started calling themselves the “Islamic State of Iraq and the Levant,” which I’m sure sounds familiar. Make sense? No, of course not — but then again it never made sense to sixty years’ worth of US and British politicians, so what chance do we have of figuring it out? Sorry for the history lesson, I knew it was going to be longer than I wanted.

[6]Quoted from a 2004 article, so you can see how far back people have been noticing this trend and how little most people seem to care.

[7]Meaning, “the study of the study of history,” a word I owe to my 10th grade history teacher Mr. Cuneo. Mr. Cuneo is a rare breed of educator, the “once-in-a-lifetime” sort. He had enough faith in the academic potential of his 10th Grade U.S. History class to introduce us all to Howard Zinn’s A People’s History of the United States. Upon further consideration, Mr. Cuneo is the perfect antithesis to this whole “narrativization of the modern world” problem: he taught us to think critically and examine history from all angles, rather than just buy into the same old “good vs. evil” accounts we had been taught since preschool.

[8]Most art historians would never clump the prolific El Greco in with this bunch, relegating him to that most vague and sort of incongruous collection known as “Mannerists.” But take one look at El Greco’s extensive use of shadow and focus and you can all but see him presage the birth of Baroque, which he was fortunate enough to witness before his death in 1614.

[9]For example, it turns out the typified black figures emblazoned upon late Archaic Greek pottery (c.620–480 B.C.) were a response to the sparse, even abstract, geometric stylings found on urns buried within Athenian cemeteries during the late 9th and early 8th century B.C.

[10]This is a great one for some Encyclopedia Britannica brush-up, especially if you want to understand the reference that one Arcade Fire song is making.

[11]You might know these by their almost kitschy clickbait titles, presented as so obnoxiously transparent that you wonder, briefly, if some advertising intern might have chosen “This Man Used This Product And You Won’t Believe What This Product Did” as a title simply for the postmodern satirical appeal.

[12]Anyone see those new adidas Y-3 Retro Boosts? Look sick.

[13]Here’s another quick interpolation further demonstrating the Cultural Sine Curve: Notice how high-budget, well-produced pop music a la the Rat Pack and Elvis gave way to lo-fi guttural mumbles courtesy of Bob Dylan and Simon & Garfunkel? And those, in turn, gave birth to big rock & roll bands, which then surrendered to the rise of garage-band grunge, and then Britney Spears, and then the Strokes, and on and on and on and damn if this theory isn’t the most ubiquitously applicable generalization since “All White Men Are Bad.”

[14]If you’re already past the point of exasperation at having to read my own content, and have only grown more agitated at my daring to ask you to read additional content as a retroactive prerequisite, I do apologize. I promise the article is a whole lot better and more concise than anything on this page. And do press on, if you can: we’re only a few short paragraphs away from the bit on Superman and Deez Nuts.

[15]Roger Stone is a whole other can of worms — none of his story pertains to our objectives here, but it does add a bit to the premise if we mention that this is a man most infamous for carrying the “tactical thuggery” of Richard Nixon into the 21st century. He ruthlessly demolished Eliot Spitzer with that prostitution scandal a few years back, sat in the little chair next to the likes of Bob Dole and Roy Cohn (better known as the prosecutor during the Rosenbergs’ trial and Joe McCarthy’s lackey), and staunchly supported Trump during his short-lived bid for the presidency in 2000. He frequents Miami “swingers’ clubs” and requires custom-made suits because he used to bodybuild in his youth. During his de facto exit interview, after being fired by or quitting the 2016 Trump campaign (no one seems to know exactly the details of his exodus), he said of Mr. Trump: “I love the Donald. I love what he’s doing, challenging the orthodoxy of the ruling class…these guys don’t know how to handle it.” And this man — this intrinsically hawkish, hardbodied cretin — has denounced the Trump campaign publicly for essentially being too sensationalist. When one of Washington’s worst-kept secrets, the guy who always shits the bed at parties but still invites himself back for more, publicly slams your campaign…well, let’s just say Donald Trump is really pushing the envelope this time around.

[16] I highly recommend following this link — it leads to the FEC Statement of Candidacy form Deez Nuts filed on July 26th of this year. If you ever wanted to see a fully matriculated joke in its prime, this is the .pdf for you.

[17]This is not meant to be treated as an epiphanic moment. Quite the contrary — it should be the formulaic result of criticizing narrative overload in the middle of an age of narrative overload. It’s simpler and more coherent this way, so we will stick with this theory, because as we’ve seen — there is nothing more tempting than the expulsion of complexity in favor of trite little theorems.
