The Idea That Elected Obama (Twice)


We are good at electing presidents who eventually lose our approval. Over the past 50 years, each of our last 9 presidents has dropped below 40% in his job approval ratings, with Nixon, Carter, and both Bushes dropping below 30%. Historically speaking, there is nothing new about the two-time election of Barack Obama and his drop below 40% in a December 2012 Gallup poll, or his standing at 45% in June 2013. Even the dips below 45% for his handling of health care, the economy, and foreign policy are not unusual.

What is unusual about Obama - beyond his standing as our first African American president - is not fluctuation in his public approval but the entire phenomenon of his presidency. Both Obama victories are more than a year into the record books, and politically speaking, the pollsters got it right. One week before Election Day in 2008 and 2012, most of them predicted a win for Barack Obama - including ABC News/Washington Post, Angus Reid, Daily Kos/SEIU/Public Policy Polling, Democracy Corps, Ipsos/Reuters, JZ Analytics, NBC News/Washington Post, Pew Research, RAND, and UPI/C-Voter. Given the degree of political controversy connected with each contest, these predictions are to the pollsters' credit. But the ability to pick a winning horse after the horses have left the starting gate is not the same as the ability to hang around stables or livery yards and determine whether a colt (or filly) will make it to the gate in the first place. It's this second type of ability that was absent with Barack Obama. The broader cultural phenomenon of his presidency was not something anyone anticipated. In fact, it's a phenomenon that remains at odds with the surrounding facts.

“Not in my lifetime”

That phrase summarized the reaction of many civil rights pioneers when describing their disbelief on election night in 2008. Their disbelief was understandable. It was not as if the past decade had revealed U.S. culture to be teetering on the brink of its first black president. The Bush presidential years had seen the election of only one black U.S. senator and one black state governor. There was the ongoing presence of one black justice on the U.S. Supreme Court - but that was the same head count that had existed for 40 years. And both Bush presidential cabinets had included fewer blacks than the preceding Clinton cabinets.

Nor had Obama been undergoing some meteoric rise in political greatness. In 2000, as a one-term state senator in Illinois, he received only 30% of the votes in a campaign against Illinois Congressman Bobby Rush, losing by a margin of 2:1. In 2004, he had fallen behind by double digits in the Democratic primary for U.S. Senate against businessman and securities trader Blair Hull, until David Axelrod, working as Obama's media advisor, leaked information about domestic violence allegations against Hull to the Chicago Tribune. He had also fallen behind in the subsequent senatorial campaign against Republican Jack Ryan until the Chicago Tribune successfully sued in a California court to get Ryan's sealed divorce and custody records opened late in the campaign, an event which caused Ryan to withdraw four days later. While this series of events enabled Obama to become a U.S. senator, it hardly reflected a natural rise in prominence.

In comparison with the track records of the other U.S. presidents who won office during the last half century, Obama's seemed equally insubstantial. Prior to their presidential campaigns, George W. Bush, Bill Clinton and Ronald Reagan had each been elected at least twice to their state governorships. George H.W. Bush had been elected a U.S. congressman and a two-term vice-president, and he had been appointed Director of the CIA and Ambassador to the United Nations. Even Jimmy Carter had served a full term as governor of his state and two terms as a state senator. Yet Barack Obama had served only one third of his first term as a U.S. senator when he began his first presidential campaign.

Surprisingly few arrows pointed in the direction of an Obama presidency in 2008. Yet equally surprising was the small number of arrows that pointed toward his re-election in 2012. At the time of his re-election campaign in the summer of 2012, more than half of all voters in 37 states expressed disapproval of Obama's job performance. Included in this disapproval was dissatisfaction with his handling of the economy. A national unemployment rate that had remained relatively unchanged above 8 percent for nearly 12 months and home foreclosure filings that had continued at a rate of 190,000 per month were regarded by voters as marks against his performance. A drop in international approval of Obama's approach to foreign policy had also occurred during his first term, with about 20% fewer citizens in Europe, Russia, China, Japan, Mexico and the Middle East voicing support for his decisions. Particularly strong was objection to his drone policy, with over half of all citizens in the countries surveyed worldwide (with the two exceptions of Britain and India) voicing opposition to Obama's use of drones. In Spain, Japan, Brazil, Turkey, Egypt, Jordan and Greece, the drop in approval had become so great that over 75% now voiced objection to the policy.

Adding to these circumstances, which hardly aligned with a second term, were the demographics of Obama's first victory. In November 2008, 65% of white Protestants voted against Obama, as did 58% of adults attending religious services at least once a week, 55% of whites, 52% of white Catholics, and 52% of men. There was a flip side to this coin involving women, blacks, Hispanics, and single voters. But the overall numbers did not reflect a country on the brink of re-electing Obama to a second term.

Explaining the wins

The lack of arrows pointing toward Obama's emergence and continuation as president has made it especially interesting to read analysts' explanations of his victories. While these explanations have contained sound logic, the logic never stretches as far as the analysts need. For example, in 2008, many commentators pointed to George Bush's record-setting presidential disapproval rating of 69%. This factor remains hard to dismiss as playing a role in Obama's first election. But as a two-term president, Bush was leaving office no matter what. And if presidential disapproval ratings were such an important tipping factor in 2008, why didn't Obama lose the election in 2012, when over 50% of voters in 37 out of 50 states disapproved of his performance several months prior to the vote?

Obama's victory in 2008 has also been attributed to changes in U.S. culture and the public's adjustment to civil rights. As Peter Beinart (senior fellow at the Council on Foreign Relations) has pointed out, many causes whose pursuit had seemed disorderly and rebellious in the 1960's - including civil rights, gay rights, and women's rights - had come to seem far less disorderly and rebellious by the time of Obama's 2008 campaign. This cultural shift allowed Obama to represent these causes without seeming threatening. Beinart has made a great point here - except for the fact that Obama shied away from 1960's-type causes during his 2008 campaign. For example, when considered as a group, Obama's campaign speech in Philadelphia on race (March 18, 2008), his nomination acceptance speech at the Democratic National Convention in Denver (August 28, 2008) and his acceptance speech on election night in Chicago (November 4, 2008) produced 114 mentions of the word “America” or “American,” 20 mentions of the word “job” or “jobs,” but only two mentions of the word “rights,” one mention of the word “gay,” and one mention of the word “poor” (beyond three mentions within the generic context of “rich and poor”).

Explanations of Obama's 2008 victory within the context of civil rights have also included a more sweeping look at U.S. history, beginning with the enslavement of blacks at the time of the country's founding. Couching Obama's first victory within this context makes sense, and Obama did little to undermine the legitimacy of such an approach in his 2008 campaign. In his March 18, 2008 speech on race, he referred to the Declaration of Independence as having been “stained by this nation's original sin of slavery.” In that speech, he also embraced his status as “the son of a black man from Kenya” and as the husband of “a black American who carries within her the blood of slaves and slave-owners.” But as logical as it might seem to interpret Obama's 2008 election within the context of slavery and historical transition, it also leaves out some important features of his campaign and the election itself.

The United States did not come together to embrace civil rights in the 2008 election. 55% of whites voted against Obama (including 59% of white males). So did 54% of southern voters, and 62% of U.S. gun owners. If Obama's election could be interpreted first and foremost as a landmark in the quest for civil rights, these numbers would need to be different. Yet more important than the non-corroborative numbers was Obama's basic message (and the slogan he chose for his campaign): “Yes we can.” This message was not an embrace of the “We Shall Overcome” anthem of the civil rights movement. It was not some kind of shorthand for “Yes we can overcome.” Obama described “Yes we can” as a creed that summarized the American spirit. He made it far more reminiscent of the slogan used by Cesar Chavez and the United Farm Workers in 1972 (“sí, sí se puede,” Spanish for “yes, yes, it can be done”) than of the saying “We Shall Overcome.” “Yes we can” was much more a calling out to the Protestant work ethic and to the pursuit of pioneering endeavors than a preaching of the overcoming gospel and walking hand in hand in peace.

Nor did civil rights play a prominent role in Obama's 2012 re-election. For example, when he launched his campaign for re-election on May 5, 2011 at Ohio State University, Obama did not mention civil rights, African Americans, or race. Nor did he mention these topics in his nomination acceptance speech at the Democratic Convention on September 9, 2012. In fact, Obama's handling of civil liberties in general was a matter of ongoing criticism during his first term. After promising during his 2008 campaign to immediately review possible criminal activity by the CIA in the torturing of Afghan and Iraqi detainees (including two torture cases that resulted in death), Obama went on to grant immunity from prosecution to all government officials involved in the torture of detainees. Similarly, after making repeated promises to close the detention camp at the Guantanamo Bay Naval Base in Cuba, and referring back to these promises during his acceptance of the Nobel Peace Prize in Oslo, Norway in December 2009, Obama allowed 169 detainees from more than a dozen countries to remain incarcerated at Guantanamo throughout his first term, some of them jailed for more than 10 years.

Had civil rights played a defining role in Obama's re-election, his approval ratings among the poor - those U.S. citizens most likely to have been deprived of their civil rights in a historical context - would not have fallen so precipitously after his 2008 election. And yet in April of 2009, 76% of individuals earning less than $30,000 per year voiced approval of Obama's job as president, as compared with only 45% in June 2011, two years later.

Expanding the time frame

4+ years of dissatisfaction with Bush; 40+ years of cultural adjustment to the 1960's; 400+ years of slavery and its aftermath: no matter how many times we add a "0" to the time frame, we still don't get a satisfying explanation of Obama's two-term presidency. That's because years and decades and centuries are insufficient to explain the cultural phenomenon of Barack Obama. What is needed are millennia.

"Millennials" - Obama's most inspired block of voters in the 2008 U.S. presidential election - is a word used to describe individuals who turned voting age (18) at the start of our current millennium (2000-3000 AD). Millennials gave Obama more than one third of his total vote count in 2008 and 2012, and more votes than his margin of victory against McCain and Romney. But as important as they are for understanding the outcome of the campaigns, millennials are not the reason we need millennia to explain Barrack Obama. The reason we need millennia involves an idea.

The Swinging Pendulum of Inclusiveness

We've become accustomed to dismissing ideas as deciding factors in real-world events. We view them as too intangible - floating around inside our heads until fleshed out and put into practice. An idea might generate excitement in the form of a campaign slogan. But it cannot set up field offices or spend advertising dollars. It must be implemented to have any real clout.

We've carved out two exceptions to this way of thinking about ideas. The first might be called “trained intelligence.” Doctors, lawyers, engineers and similar types of professionals need ideas in order to practice their trade. For example, a lawyer needs ideas like negligence, hearsay, and discovery. An engineer needs ideas like scale, equilibrium, and elasticity. Ideas are part of what gives these experts their “smarts.”

The second area of our experience where we give ideas more room to operate can be summarized as “ethical principles.” We're comfortable with ideas as inspirational factors in the way we live. Telling the truth, being fair, showing kindness - each of these guidelines requires us to think in terms of ideas, and we feel natural doing so.

Neither of these exceptions violates a fundamental principle that we've adopted about ideas and their role in experience. That principle is conscious choice. Ideas like negligence and hearsay become part of the bar exam because we consciously decide to include them there, incorporated into the required know-how for the practice of law. Ideas like truth and justice and kindness become part of our ethical value system because we single them out as worthy of belief and intentionally adopt them as guidelines for living.

Nowhere in our experience do we grant ideas the ability to operate outside of our chosen purpose. We don't expect ideas to take hold of us against our will and overpower us. The only way that might happen would be through the work of a cult leader, and we would call it brainwashing. Yet even though we deny ideas any sort of autonomous power, some of them draw us in and get us hooked anyway. They find a coveted timeslot of their own in our thinking and turn us into die-hard fans who keep coming back with the regularity of primetime TV addicts. At the top of this list would be the idea of freedom. As a nation, we've gone beyond a simple endorsement of this idea. Consider Romney, writing for the National Review in the midst of his 2012 presidential campaign:

“... one feature of our culture that propels the American economy stands out above all others: freedom. The American economy is fueled by freedom. Free people and their free enterprises are what drive our economic vitality.”

Or consider Republican primary candidate Ron Paul, in his farewell address to the U.S. House of Representatives. In a 161-paragraph speech, he referred to the idea of freedom (using the words “freedom” or “liberty” or “free”) some 87 times. Barack Obama mentioned these three words 67 times in the first three chapters of his book, The Audacity of Hope - 7 more times than the number of pages in those chapters. Yet you will not find a single mention of the word “freedom” in the U.S. Constitution, and only one mention of this word in the U.S. Bill of Rights. In fact, in both of these documents combined, you will find only five mentions of “free” or “freedom” or “liberty.” Our founding fathers got “outfreedomed” by Romney-Paul-Obama by a score of 158 to 5.
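Tallies like these are easy to reproduce. Here is a minimal sketch of the kind of whole-word counting involved, written in Python; the `freedom_count` function and the use of the Romney excerpt quoted above are my own illustration, not the method anyone in this chapter actually used:

```python
import re

# Match "free", "freedom", or "liberty" as whole words, case-insensitive.
# Word boundaries (\b) keep "freedom" from also being counted as "free".
FREEDOM_WORDS = re.compile(r"\b(free|freedom|liberty)\b", re.IGNORECASE)

def freedom_count(text: str) -> int:
    """Return the number of freedom-related word mentions in `text`."""
    return len(FREEDOM_WORDS.findall(text))

# Demonstration on the Romney excerpt quoted above:
romney = ("... one feature of our culture that propels the American economy "
          "stands out above all others: freedom. The American economy is "
          "fueled by freedom. Free people and their free enterprises are "
          "what drive our economic vitality.")
print(freedom_count(romney))  # 4: "freedom" twice, "free" twice
```

Run against plain-text transcripts of the Paul speech, the Obama chapters, and the founding documents, a counter like this yields the 158-to-5 score described above.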

Freedom is a good example of an idea that can be highly visible on our radar screen, part of our chosen purpose, and yet still capable of acting with a tidal force that overwhelms our thinking. But it is not necessary for an idea to be visible on our radar screen in order for that idea to disrupt our thinking. The flow of our thinking can be disturbed by any idea that lacks its rightful place in our experience, by virtue of its having been consistently overlooked, or pushed aside, or crowded out by other ideas.

The idea of inclusiveness is not mentioned in Mitt Romney's National Review editorial or in Ron Paul's congressional address. You'll find it mentioned only twice in Obama's 375-page Audacity of Hope, and in neither instance is it ascribed any overarching value. Yet the idea of inclusiveness - of everything having a place and everybody getting taken into account - is an idea that is finding its way into our thinking and causing things to get tossed around. Our uneasy attraction to this idea is being played out in our social interactions, our politics, and even in the way we entertain ourselves. Here are some examples:

• Immigration. Public opinion polls show that 60-80% of U.S. adults want to keep legal immigration largely unrestricted, and want the government to figure out ways for illegal immigrants to become U.S. citizens. It's the poem by Emma Lazarus engraved on the pedestal of the Statue of Liberty: “Give me your tired, your poor, your huddled masses yearning to breathe free ...” Yet at the same time, 50% of us view immigration as a burden that causes problems in the job market and health care.

• Reality TV. 17% of all television broadcasting is now “reality TV” (non-scripted broadcasting). 2001 was the first year in which a non-scripted TV show hit number 1 (Survivor). Reality TV offerings now include: America's Next Great Restaurant, America's Next Muppet, America's Next Producer, America's Toughest Jobs, Average Joe, Average Joe: Hawaii, Celebrity Rehab with Dr. Drew, The Real World: Austin, The Real World: Boston (and eleven other cities), Mall Cops: Mall of America, The Next Great American Band, The Next Iron Chef, The Next Action Star, The Next Great Champ, The Real Housewives of Atlanta (and seven other cities), My Big Friggin' Wedding, My Big Redneck Wedding, My Dad Is Better Than Your Dad, My Big Fat Obnoxious Boss, My Big Fat Obnoxious Fiancé, The Search for the Next Elvira, and Who Wants To Marry My Dad? Not included in this list because of its record-breaking, most-watched TV status is Fox TV's American Idol, which hosts try-outs for 100,000 singing contestants each year.

We're attracted to a little bit of anything and everything on TV, despite the fact that reality broadcasting bothers us. In public opinion polls, 71% of persons surveyed think there should be fewer reality broadcasts. 63-66% believe that reality TV has “changed things for the worse.” And insofar as reality TV shows (for example, the Jerry Springer Show) contain profanity, violence or sexually oriented material, 58-61% of TV viewers say they are offended by what they watch.

• iPods. Somewhere between 25 and 40 million people in the U.S. now own an iPod - the portable media player first released by Apple in 2001. (For the purpose of this example, I'm sticking with iPods. But the number of people who could be included here is undoubtedly larger, since half of everyone in the U.S. now owns at least one Apple product, most of which are capable of playing music.) On any given day, approximately 225,000 iPods are available for purchase on eBay. My own iPod - a 2005 fifth-generation model - can hold up to 20,000 songs. So far, I've loaded in about 6,500. When I turn on my iPod, I'm like a kid in a candy store: I can listen to any song I've ever loved whenever I want. Yet as weird as it might sound, sometimes I turn on my iPod, start scrolling through all 6,500 of my song options, and cannot find anything I want. At the gym, listening to my iPod through a headset and doing my exercise thing, I look over at everyone else doing their exercise thing while listening to their iPods through their headsets, and I feel disgusted with myself for having cut myself off so completely from any possible social interaction. And adding to my feeling of attraction/uneasiness: when I come across some music that I could either take or leave, I can't decide whether to load it into my iPod or not. There's plenty of space. But is that enough of a reason?

• Facebook. The internet's most popular social network service, Facebook, was launched in 2004. Its U.S. user base has nearly doubled every year since its launch and now stands at 165 million active users - half the U.S. population. Facebook users like it enough to spend several hours each month interacting on the website, and the average Facebook user now has 229 "friends" - other users who agree to share a connection. Some “friends lists” get much longer. (Comedian Steve Hofstetter's “friend quest” earned him just over 200,000 friends before Facebook put the kibosh on it.)

Facebook users cite many reasons for their attraction to it. Unlike a chat room, where the point is often hanging out with people you don't know in real life, Facebook lets you keep up with people you do know in real life without being obtrusive (unless you are commenting on all of their posts, which can seem like stalking). It's a practical way to make plans and track down lost acquaintances, and it's convenient - and essentially free - if you already have a cell phone, tablet, or computer.

But alongside users' attraction to Facebook is an equally strong anxiety about participation in its services. 70 percent of Facebook users say they are “somewhat” or “very” concerned about privacy when using those services. Included among these privacy concerns are fears about identity theft. According to Gallup surveys, worry over identity theft has become the number one crime worry in the minds of Americans, topping our worry about car thefts, home break-ins, or muggings. While Facebook cannot be singled out from other internet services as a catalyst for privacy concerns, its growth from 2004 to the present as the premier social networking site in the U.S. (and the world) parallels the rise in privacy-related concerns within the United States. Identity theft-related complaints to the U.S. Federal Trade Commission increased from about 247,000 in 2004 to 314,000 in 2008, and this same time period saw the establishment of the President's Identity Theft Task Force (2006), the Identity Theft Red Flags Rule (2007), and passage of the Identity Theft Enforcement and Restitution Act (2008).

Immigration, reality TV, iPods and Facebook: the primary force behind these cultural events has not been the genius of Steve Jobs, the passion of Mark Zuckerberg or the 2012 Supreme Court verdict on Arizona immigration law. What's propelled this constellation of events has been an idea. We want everything and everyone included - every immigrant, every wannabe singer, every song we've ever heard, every friend we've ever made. Yet the more we include, the more reservations we have. Unlike freedom, inclusiveness is not an idea that we have adopted as a cause for banner-waving. It's not even part of the public discussion. But it's finding its way back into our experience and creating unexpected turbulence - turbulence that includes Obama's two-time election.

(a) Inclusiveness Before Facebook and iPod

One millennium ago - in 1000 AD - the earth held roughly one twentieth of its current human population, and the continent of North America was home to fewer than 3 million people. Yet included in these 3 million were more than 500 tribes who had developed rich cultural traditions. Included were the Mississippian cultures (Apalachee, Cherokee, Chickasaw, Choctaw, Natchez and Seminole), Plains cultures (Arapaho, Blackfoot, Cheyenne, Comanche, Crow and Lakota), Puebloan cultures (including Acoma, Hopi, Laguna, Rio Grande and Zuni) and Pacific Northwest and Marine cultures (including Chinook, Makah, Salish, Umpqua, Willapa, and Yurok).

While distinct in their specific practices, these tribes were similar in the core character of their experience. By “core character” I am referring to their fundamental sense of space and time and their experience of the here-and-now. For Native Americans in 1000 AD, the most distant reaches of time (primordial, sacred events taking place during the world's earliest moments) and the most distant reaches of space (stars and planets appearing in the morning or evening sky) were included in the here-and-now, instilling it with great power. When the Crow, Nez Perce, Pueblo, and other tribes came across the waterfall where Coyote got into trouble, or the tree where the Big Horn Sheep's horns got lodged, they experienced these locations not as neutral geographic landmarks but as places that were sacred and alive with the distant beyond of primordial events. By journeying to the waterfall or the tree, they gained direct access to that beyond. Similarly, the energy of birth, renewal, and new beginnings was included in commonplace experiences like facing eastward, traveling eastward, or tossing pinches of cornmeal in an eastwardly direction.

One millennium later, our experience has become quite different. The Hopi birth ritual of tossing cornmeal in an eastwardly direction might strike us as interesting. But it is difficult for us to treat this eastward tossing of cornmeal as a practice that literally draws power from the worldly beyond. We regard east as a compass direction, useful for getting our bearings but not possessing any special energy. It is difficult for us to imagine three people - one living in New York, one in Chicago, and the third in California - gaining genuine self-renewal by heading eastward. The Californian would end up in Chicago, the Chicagoan in New York, and the New Yorker afloat in the Atlantic.

(b) How Inclusiveness Got Displaced

It took an unprecedented mix of religious and scientific ideas, carried in the minds of the Europeans who colonized North America between 1000 and 2000 AD, to dislodge the idea of inclusiveness from North American thinking. The combined influence of Christianity and European science turned the colonists away from the idea of a world with innate grandeur. In fact, it turned them away from a world that could make sense on its own terms. Only faith in God and confession of God's sacrifice of His only begotten son could take the incomprehensibility of the world and turn it into something comprehensible. Along with the Christianity practiced by George Washington, John Adams, Thomas Jefferson, John Hancock, Benjamin Franklin, Samuel Adams, James Madison, James Monroe, William Penn, Roger Sherman, Benjamin Rush, John Witherspoon, John Jay, and Patrick Henry came a recognition of the highest good as no longer belonging to the world but rather to the Kingdom of Heaven:

“Do not love the world or the things in the world. If anyone loves the
world, the love of the Father is not in him. For all that is in the world - the
desires of the flesh and the desires of the eyes and pride in possessions -
is not from the Father but is from the world. And the world is passing away
along with its desires, but whoever does the will of God abides forever.”
(1 John 2:15-17, English Standard Version)

Prior to European colonization (beginning with a group of Christian friars brought over by Columbus on his second voyage in 1493), North America had never been a home to religious traditions that looked away from the world in this manner. Instead of treating the world as being contaminated with impermanence, what characterized the spirituality of Native Americans was a sense of divinity inherent in the world itself. “Do you know that trees talk?” wrote Tatanga Mani or Walking Buffalo, a Stoney Indian who lived from 1871-1967. “Well, they do. They talk to each other, and they'll talk to you if you listen. I have learned a lot from trees: sometimes about the weather, sometimes about animals, sometimes about the Great Spirit.”

The impermanence of the world and the unreliability of our connection with it through our bodily senses were themes duplicated in the scientific thinking of the colonists. Like Christianity, European science promoted a view of the world's true nature as inaccessible through daily experience. While we might enjoy seeing vibrant red colors in flowers or fruits, this experience could never help us recognize the reality of retinal receptors, reflectance spectra, and 700-nanometer electromagnetic radiation. Turning to worldly experience for answers about life was equally inappropriate in both the Christian and the science-based thinking of the colonists.

No historical development between 1000 AD and 2000 AD better epitomized rejection of the world's flow than the 1975 denunciation of astrology by 186 leading scientists:

“In ancient times people ... looked upon celestial objects as ... intimately connected with events here on earth. They had no concept of the vast distances from the earth to the planets ...

We are especially disturbed by ... dissemination of astrological charts ... We believe that the time has come to challenge directly, and forcefully, the pretentious claims of astrological charlatans.”

Thus wrote Francis Crick, co-discoverer of the structure of DNA; Sir John Eccles, discoverer of basic nerve cell function; Dr. Gerhard Herzberg, discoverer of free radical structures; and Sir Peter Medawar, discoverer of acquired immune tolerance, along with 14 other Nobel Prize winners and 168 other accomplished scientists of the 20th century. In sharp contrast with Native American tribes, who regarded the flow of planetary rhythms through human experience as essential for the development of personal strengths and the overcoming of personal weaknesses, and who wove seasons and moon phases into the assignment of birth totems, the idea of an intimate connection with the beyond was denounced by the most revered scientists of our time as preposterous. Their position was consistent with the science-mindedness of the colonists two hundred years earlier, and with the Christianity of the colonists as well: let reason and faith inform our thinking about the here and now, not our experience in the everyday world.

(c) Peak Displacement

The extent to which inclusiveness had gotten pushed aside prior to the election of Obama was evidenced by two sayings that became popular during the 1980's and 1990's: the Nike slogan “Just do it” and the nutrition mantra “You are what you eat.” Just do it: set the world aside as a frame of reference, disregard all of the circumstances, and go with your gut. Implied by “just do it” is “just do it regardless.” Determining the “it” in “just do it” is not necessary in order to promote this viewpoint. Similarly with “You are what you eat”: even though this phrase suggests definite consequences (garbage in, garbage out), it still avoids saying anything about what, when, where, why, or how you eat. Like the saying “just do it,” it rests on an absolute indeterminateness. It is completely unnecessary to determine what is eaten in order to champion this viewpoint.

Over the course of the 1970's (Maze War, Spasim), 1980's (MIDI Maze), and 1990's (Wolfenstein, Doom, Duke Nukem, Quake), video gaming in the U.S. witnessed a dramatic rise in first person shooters (FPS). Today's top-selling FPS video games (the Call of Duty series from Treyarch and Activision; the Halo series from Bungie, 343 Industries and Microsoft; the Battlefield series from DICE and Electronic Arts; and the Medal of Honor series from DreamWorks and Electronic Arts) have collectively sold over 160 million copies, with total sales for the Call of Duty series alone standing at $3 billion. FPS games can be exciting and intense. Because you are looking at the world directly through the eyes of your character, it is easier to feel totally immersed in the action. But in exchange for this feeling of immersion, FPS gaming also requires a large amount of disregard for other aspects of the environment. In many FPS games, it is not possible to see what is happening far up ahead of you, far behind you, or out on the periphery. And while you might be able to practice tactical skills (e.g., flanking or occupying) or control assets during a mission, FPS games focus attention on immediate surroundings and immediate tasks (e.g., killing enemies or destroying enemy vehicles). The ability to make split-second decisions determines each player's win record, kill-to-death ratio, and ranking. In comparison with other game types, there is relatively little real-time interaction with the environment. Killing or destroying what gets in your way is the key to success.

Like the sayings “just do it” and “you are what you eat,” the rise in popularity of FPS gaming between 1970 and 2000 reflected the degree to which broad dimensions of experience - including events taking place behind, up ahead, and out on the periphery - became systematically excluded from decision-making. Immediate goals and immediate obstacles moved to the forefront. Not coincidentally, the phrase “you are not the boss of me” and the mental health problem known as Oppositional Defiant Disorder (1980, Diagnostic and Statistical Manual of Mental Disorders III, diagnostic code 313.81) came into existence during this same time period. The common thread linking these cultural events was a conceptualization of freedom as the absence of constraint. Like an FPS mission, freedom became focused on removal of limitations that might be imposed by another person, by a government, or by the world itself. During Obama's second presidential campaign, the phrase “you are not the boss of me” was actually proposed by some Tea Party members as a national slogan for preservation of freedom.

The idea of preserving freedom by removing constraints only makes sense if inclusiveness is removed from experience. Consider the example of a river flowing “freely.” Only from the vantage point of a first person shooter - looking directly across the water's surface at eye level - is it possible to focus exclusively on immediate objects in the foreground like rocks or logs that block the way. From a third person vantage point looking down on the river from above, the importance of rocks and logs gets overshadowed by the influence of the wider landscape. However unacceptably obtrusive a rock in a riverbed might look from the perspective of a first person shooter, its unacceptability is falsely exaggerated once more fundamental constraints are considered - constraints imposed on the river by the presence of hills and mountains, and by the geological formations that determine the width and depth of its banks. In fact, if inclusiveness becomes the standard for evaluating the freedom of the river, it not only becomes impossible to view rocks in the riverbed as primary constraints on the river's flow, but equally impossible to continue treating environmental factors as constraints on the river's freedom and nothing more. Hills and mountains cannot exclusively be viewed as “constraints” on the river, because without them, the river would lose its identity and become something else - like a marsh or lake or swamp.

Of course, rivers do not usually lose their identities and become marshes or lakes or swamps. They do not have the ability to change their behavior and proceed as if constraints did not exist. But humans do, and when we step forth (just do it) regardless (whatever) and proceed as if constraints were non-existent (you are not the boss of me), loss of identity becomes a realistic outcome. If we proceed regardless, we risk losing everyone and everything that might otherwise get held in high regard. If we exclude the world and its history, we leave ourselves with no way to discover what we are good at, what we are good for, what we feel called on to become. These are the experiences that shape identity, and without them, identity gets untied from individual lives - it's the price that gets paid for removal of inclusiveness from experience.

Once untied from individual lives, identity has no choice but to start diffusing outward. A basketball team is described as “needing a Michael Jordan.” The Republican Party is described as “needing a Teddy Roosevelt.” This description could have been phrased, “needing someone like Teddy Roosevelt.” But instead of pointing to a second identifiable someone, we turn the unique identity of Teddy Roosevelt into a category. No guest on The Jerry Springer Show is ever selected on the basis of individual identity. Jerry Springer guests must place their personal identities up for grabs, allow their life stories to be stripped of all but a few particulars, and then take the stage as categories: mistresses who got dumped, brothers who slept with their sisters' boyfriends.

The diffusing outward of identity throws identity up for grabs, and it is accompanied by the worst outcome of displaced inclusiveness: identity theft. Among the 950,000 identity theft complaints that were filed with the U.S. Federal Trade Commission in 2008, you won't find a complaint about FPS gaming, Oppositional Defiant Disorder, or the Nike slogan “just do it.” You won't find any such complaints because we associate identity theft with widespread use of digital technology, computers, and the internet. We blame it on worldwide database access to personal identifiers (social security numbers, driver's license numbers, passwords and PIN codes), widespread use of plastic (credit cards and debit cards), and encoded tags to regulate personnel access (student IDs, company IDs, etc.). But identity theft is simply the price tag for experience being emptied of inclusiveness.

(d) Backward Acceleration

The sayings “just do it” and “whatever” became highly popular in our culture as a result of lost inclusiveness. As sayings founded on dis-regard, they captured the truth of lost inclusiveness. But if inclusiveness had been removed from our experience and nothing more, it would not have been possible for iPods and Facebook and reality TV to skyrocket in popularity, since the attractiveness of these phenomena requires us to feel some “pull” from inclusiveness.

During the years 2000-2008 - the years just prior to Obama's presidency - something had to happen in order for inclusiveness to recapture our attention. That something was the internet. 2000 AD marked the first time in human history that the volume of traffic over computer networks exceeded the volume of voice traffic over telephone lines. It was also the year in which final agreement was reached in Germany on international standards for digital video broadcasting (DVB). Both of these events paved the way for what today is sometimes referred to as the “macroscope”: a phenomenon (the internet) that forces us to look at everything (skopein, “observe” and makros, “great”) all at once and in real-time. Continuous inundation with everything is a defining feature of the internet. It is also the defining feature of inclusiveness.

In the years just prior to Obama's presidency, inclusiveness was not only a lost idea. It was also an idea thrusting its way back into our experience. It was as if this idea had become the ball weight attached to the wire of a pendulum, and we were being struck by the force of the pendulum as it accelerated back into our experience:

Inclusiveness Getting Displaced

In this illustration, 1000 AD is represented as the pendulum's resting position. It is treated as a time when inclusiveness shaped the core character of North American experience. 2000 AD is depicted as a moment of peak displacement. By this time, inclusiveness had been completely pushed aside in favor of whatever.

Backward Acceleration and the Return of Inclusiveness

2000-2008 AD is portrayed as the initial moment of backward acceleration. During this period of time, the inevitability of inclusiveness began sweeping back into our experience and creating confusion and mixed emotions over issues like immigration and reality TV.

This pendulum swing became Barack Obama's magic carpet. He did not need a political track record to get elected in 2008. Nor did he need majority support from men, white Protestants, churchgoers, or gun owners. Similarly, in order to get re-elected in 2012, he did not need a strong first term or majority support from married voters or voters over 65. There was no need for the electorate to choose Obama. Inclusiveness chose him for us. The pendulum swept him into office. Obama's missing rise to greatness and the missing strength of his commitment to civil rights did not hold him back as a presidential candidate, because the potential for inclusiveness was embodied in his presence (and self-identification) as an African American, and because his campaign occurred at a time when inclusiveness had gained fresh footing in our experience. All that Obama needed - as Matt Taibbi, political analyst for Rolling Stone, wrote in 2007 - was to be “big enough to be anything to anyone” while still being “intimate enough to inspire” voters. So long as Obama did nothing to undermine the potential for inclusiveness that he embodied, inclusiveness would sweep him into office.

Corporate marketing firms like GMMB and The Parker Group helped “brand Obama” win Advertising Age's Marketer of the Year award in 2008 - beating out Apple, Nike, Zappos, and Coors. (Obama for America also won two Grand Prix awards at the Cannes Lions International Advertising Festival the following year.) There are many well-identified features of the Obama brand: acknowledging all viewpoints; showing an ability to feel the unique trials and tribulations of all races, religions, and nations; giving an impression of serving widespread public interests (complete with astroturfing as needed); and refusing to dismiss the greatness of private enterprise and entrepreneurship. Yet no aspect of this branding strategy would have been successful without the pendulum swing. The idea of inclusiveness - and its moment of return after centuries of absence - elected Barack Obama. And it allows his ongoing presidency to fly in the face of the facts.
