(21) Thu Dec 8
The App Generation

BEFORE CLASS:
1. Watch
Jonathan Safran Foer is a novelist who teaches creative writing at NYU.
Video: Jonathan Safran Foer’s commencement address to the Middlebury College class of 2013. Note: start watching at 4:00, when Foer begins his main address. Share a one-sentence critical response on Vialogues.
2. Read/Write
Source: Andrew Keen, #digitalvertigo (2012)
1
@alexia: We would have lived our lives differently if we had known they would one day be searchable.
2
Rather than virtual or second life, social media is actually becoming life itself — the central and increasingly transparent stage of human existence, what Silicon Valley venture capitalists are now calling an “internet of people.” As the fictionalized version of Facebook president Sean Parker — played with such panache by Justin Timberlake — predicted in the 2010 Oscar-nominated movie The Social Network: “We lived on farms, then we lived in cities, and now we’re gonna live on the Internet!” Social media is, thus, like home; it is the architecture in which we now live.
3
Nicholas Carr, one of today’s most articulate critics of digital utilitarianism, argues: “What we don’t share is as important as what we do share.”
4
In 1787, at the dawn of the mass industrial age, Jeremy Bentham designed what he called a “simple idea in architecture” to improve the management of prisons, hospitals, schools and factories. Bentham’s idea was, as the architectural historian Robin Evans noted, a “vividly imaginative” synthesis of architectural form with social purpose. Bentham, who amassed great personal wealth as a result of his social vision, wanted to change the world through this new architecture. Bentham sketched out this vision of what Aldous Huxley described as a “plan for a totalitarian housing project” in a series of “open” letters written from the little White Russian town of Krichev, where he and his brother, Samuel, were instructing the regime of the enlightened Russian despot Catherine the Great about the building of efficient factories for its unruly population. In these public letters, Bentham imagined what he called this “Panopticon” or “Inspection-House” as a physical network, a circular building of small rooms, each transparent and fully connected, in which individuals could be watched over by an all-seeing inspector. This inspector is the utilitarian version of an omniscient god — always-on, all-knowing, with the serendipitous ability to look around corners and see through walls. As the French historian Michel Foucault observed, this Inspection House was “like so many cages, so many small theaters, in which each actor is alone, perfectly individualized and constantly visible.” The Panopticon’s connective technology would bring us together by separating us, Bentham calculated. Transforming us into fully transparent exhibits would be good for both society and the individual, he adduced, because the more we imagined we were being watched, the more efficient and disciplined we would each become.
5
(The internet) is finally realizing (Jeremy Bentham’s) utilitarian dream of allowing us to be perpetually observed. This digital architecture — described by New York University social media scholar Clay Shirky as the “connective tissue of society” and by U.S. Secretary of State Hillary Clinton as the new “nervous system of the planet” — has been designed to transform us into exhibitionists, forever on show in our networked crystal palaces. And, today, in an age of radically transparent online communities like Twitter and Facebook, the social has become, in Shirky’s words, the “default” setting on the Internet, transforming digital technology from being a tool of second life into an increasingly central part of real life… As WikiLeaks founder and self-appointed transparency tsar Julian Assange said, today’s Internet is “the greatest spying machine the world has ever seen,” with Facebook, he added, being “the world’s most comprehensive database about people, their relationships, their names, their addresses, their locations, their communications with each other, and their relatives, all sitting within the United States, all accessible to US Intelligence.” But it’s not just Facebook that is establishing this master database of the human race. As Clay Shirky notes, popular geo-location services such as foursquare, Facebook Places, Google Latitude, Plancast and the Hotlist, which enable us to “effectively see through walls” and know the exact location of all our friends, are making society more “legible,” thus allowing all of us to be read, in good Inspection-House fashion, “like a book.” No wonder, then, that Katie Roiphe, a New York University colleague of Shirky’s, has observed that “Facebook is the novel we are all writing.”
6
This contemporary mania with our own self-expression is what two leading American psychologists, Dr. Jean Twenge and Dr. Keith Campbell, have described as “the narcissism epidemic” — a self-promotional madness driven, these two psychologists say, by our need to continually manufacture our own fame to the world. The Silicon Valley–based psychiatrist Dr. Elias Aboujaoude, whose 2011 book, Virtually You, charts the rise of what he calls “the self-absorbed online Narcissus,” shares Twenge and Campbell’s pessimism. The Internet, Dr. Aboujaoude notes, gives narcissists the opportunity to “fall in love with themselves all over again,” thereby creating an online world of infinite “self-promotion” and “shallow web relationships.” Many other writers share Aboujaoude’s concerns. The cultural historian Neal Gabler says that we have all become “information narcissists” utterly uninterested in anything “outside ourselves.” Social network culture medicates our “need for self-esteem,” adds best-selling author Neil Strauss, by “pandering to win followers.”
7
Twenge, Campbell, Aboujaoude, Strauss and Franzen are all correct about this endless loop of great exhibitionism — an attention economy that, not coincidentally, combines a libertarian insistence on unrestrained individual freedom with the cult of the social. It’s a public exhibition of self-love displayed in an online looking glass that New Atlantis senior editor Christine Rosen identifies as the “new narcissism” and New York Times columnist Ross Douthat calls a “desperate adolescent narcissism.” Everything — from communications, commerce and culture to gaming, government and gambling — is going social. As David Brooks, Douthat’s colleague at The Times, adds, “achievement is redefined as the ability to attract attention.” All we, as individuals, want to do on the network, it seems, is share our reputations, our travel itineraries, our war plans, our professional credentials, our illnesses, our confessions, photographs of our latest meal, our sexual habits, of course, even our exact whereabouts with our thousands of online friends.
8
Zuckerberg’s five-year plan is to eliminate loneliness. He wants to create a world in which we will never have to be alone again because we will always be connected to our online friends in everything we do, spewing huge amounts of our own personal data as we do it. “Facebook wants to populate the wilderness, tame the howling mob and turn the lonely, antisocial world of random chance into a friendly world, a serendipitous world” (wrote) Time’s Lev Grossman.
9
Facebook, with its members investing over 700 billion minutes of their time per month on the network, was the world’s most visited Web site in 2010, making up 9 percent of all online traffic. By early 2011, 57 percent of all online Americans were logging onto Facebook at least once a day, with 51 percent of all Americans over twelve years old having an account on the social network and 38 percent of all the Internet’s sharing referral traffic emanating from Zuckerberg’s creation. By September 2011, more than 500 million people were logging onto Facebook each day, and its then almost 800 million active users made it larger than the entire Internet was in 2004. Facebook is becoming mankind’s own image.
10
Whether we like it or not, twenty-first-century life is increasingly being lived in public. Four out of five college admissions offices, for example, are looking up applicants’ Facebook profiles before making a decision on whether to accept them. A February 2011 human resources survey suggested that almost half of HR managers believed it likely that our social network profiles are replacing our resumes as the core way for potential employers to evaluate us. The New York Times reports that some firms have even begun using surveillance services like Social Intelligence, which can legally store data for up to seven years, to collect social media information about prospective employees before giving them jobs. “In today’s executive search market, if you’re not on LinkedIn, you don’t exist,” one job search expert told The Wall Street Journal in June 2011. LinkedIn now even enables its users to submit their profiles as resumes, thus inspiring one “personal branding guru” to announce that the 100-million-member professional network is “about to put Job Boards (and Resumes) out of business.”
11
Writing in 1948, Orwell imagined…“In principle a Party member had no spare time, and was never alone except in bed,” Orwell wrote in Nineteen Eighty-Four. “It was assumed that when he was not working, eating, or sleeping he would be taking part in some kind of communal recreation: to do anything that suggested a taste for solitude, even to go for a walk by yourself, was always slightly dangerous. There was a neologism for it in Newspeak: Ownlife, it was called, meaning individualism and eccentricity.” Orwell coined another Newspeak neologism as well: “facecrime.” “It was terribly dangerous to let your thoughts wander when you were in any public place or within range of a telescreen,” he wrote. “The smallest thing could give you away. A nervous tic, an unconscious look of anxiety, a habit of muttering to yourself — anything that carried with it the suggestion of abnormality, of having something to hide. In any case, to wear an improper expression on your face (to look incredulous when a victory was announced, for example) was itself a punishable offence. There was even a word for it in Newspeak: facecrime, it was called.”…Newspeak’s “facecrime” has been turned on its head in our world of endless tweets, check-ins and status updates. In Nineteen Eighty-Four, it was a crime to express yourself; today, it is becoming unfashionable, perhaps even socially unacceptable, not to express oneself.
12
As best-selling digital evangelists Don Tapscott and Anthony D. Williams argue in their 2010 book Macrowikinomics, today’s Internet represents “a turning point in history.” We are entering what they call “the age of networked intelligence,” a “titanic” historic shift, they pronounce, equivalent to the “birth of the modern nation-state” or the Renaissance. Mark Pincus’s always-on social dial tone, Tapscott and Williams argue, represents a “platform for networking human minds” that will enable us “to collaborate and to learn collectively.” Echoing Mark Zuckerberg’s five-year vision of social media’s revolutionary impact on the broader economy, they predict that politics, education, energy, banking, healthcare and corporate life will all be transformed by what social utopians embrace as the openness and “sharing” of the networked intelligence age.
13
Botsman, Rogers, Tapscott, Williams and the rest of the social media quixotics are wrong that the Internet is resulting in a new age of “networked intelligence.” In fact, the reverse may well be true. From Zuckerberg’s Facebook, Hoffman’s LinkedIn and Stone’s Twitter to SocialEyes, SocialCam, foursquare, ImageSocial, Instagram, LivingSocial and the myriad other digital drivers of John Doerr’s third great wave, the network is creating more social conformity and herd behavior. “Men aren’t sheep,” argued John Stuart Mill, the nineteenth century’s greatest critic of Benthamite utilitarianism, in his 1859 defense of individual freedom, On Liberty. Yet on the social network, we seem to be thinking and behaving more and more like sheep, making what cultural critic Neil Strauss describes as “the need to belong,” rather than genuine nonconformity, the rule. “While the Web has enabled new forms of collective action, it has also enabled new kinds of collective stupidity,” argues Jonah Lehrer, a contributing editor to Wired magazine and a best-selling writer on both neuroscience and psychology. “Groupthink is now more widespread, as we cope with the excess of available information by outsourcing our beliefs to celebrities, pundits and Facebook friends.”
14
The more “friends” you have on Twitter or Facebook, therefore, the more potentially valuable you become in terms of getting your friends to buy or do things. We “manage” our friends in the social networking world in the same way as we “manage” our assets in the financial marketplace. “There is something Orwellian about the management speak on social networking sites,” notes the ever-perceptive Christine Rosen, who adds that such terminology encourages “the bureaucratization of friendship.”
15
In our digital age, we are, ironically, becoming more divided than united, more unequal than equal, more anxious than happy, lonelier rather than more socially connected. A November 2009 Pew Research report about “Social Isolation and New Technology,” for example, found that members of networks like Facebook, Twitter, MySpace and LinkedIn are 26 percent less likely to spend time with their neighbors (thus, ironically, creating the need for social networks like Nextdoor.com and Yatown that connect local communities). A 2007 Brigham Young University research study, which analyzed 184 social media users, concluded that the heaviest networkers “feel less socially involved with the community around them.” Meanwhile, a meta-analysis of seventy-two separate studies conducted between 1979 and 2009 by the University of Michigan’s Institute for Social Research showed that contemporary American college students are 40 percent less empathetic than their counterparts in the 1980s and 1990s. Even our tweets are becoming sadder, with a study by University of Vermont scientists of 63 million Twitter users between 2009 and 2011 concluding that “happiness is going downhill.” Most troubling of all, a fifteen-year study of 300 social media subjects by Professor Sherry Turkle, the director of the MIT Initiative on Technology and Self, showed that perpetual networking activity is actually undermining many parents’ relationships with their children. “Technology proposes itself as the architect of our intimacies,” Turkle says about the digital architecture in which we are now all living. But the truth, her decade and a half of research reveals, is quite the reverse. Technology, she finds, has become our “phantom limb,” particularly for young people who, Turkle finds, are sending up to 6,000 social media announcements a day and who have neither written nor received a handwritten letter. No wonder, then, that teens have not only stopped using email, but also no longer use the telephone — both are too intimate, too private for a digital generation that uses texting as a “protection” for their “feelings.”
16
In describing what she calls the “practice of the protean self,” MIT’s Turkle argues that “we have moved from multitasking to multi-lifing.” But while we are forever cultivating our collaborative self, she argues, what is being lost is our experience of being alone and privately reflecting on our emotions. The end result, Turkle explains, is a perpetual juvenile, somebody she calls a “tethered child,” the type of person who, like one of Turkle’s subjects in her study, believes that “if Facebook were deleted, I’d be deleted too.”
17
So what is the real value of social media in repressive regimes? “Twitter is a wonderful tool for secret policemen to find revolutionaries,” Friedman told me. His analysis reflects the so-called “Morozov Principle” of Stanford University scholar Evgeny Morozov, whose 2011 book, The Net Delusion: The Dark Side of Internet Freedom, argues that social media tools are being used by secret policemen in undemocratic states like Iran, Syria, and China to spy on dissidents. As Morozov told me when he appeared on my TechCrunch TV show in January 2011, these authoritarian governments are using the Internet in classic Benthamite fashion — relying on social networks to monitor the behavior, activities and thoughts of their own citizens. In China, Thailand, and Iran, therefore, the use of Facebook can literally be a facecrime, and the Internet’s architecture has become a vast Inspection-House, a wonderful tool for secret policemen who no longer even need to leave their desks to persecute their own people.
18
Not only is social media being used by repressive regimes or organizations to strengthen their hold on power, but it is also compounding the ever-widening inequalities between the influencers and the new digital masses. If identity is the new currency and reputation the new wealth of the social media age, then today’s hypervisible digital elite is becoming a tinier and tinier proportion of the population… On Twitter, for example, only 0.05 percent of people have more than 10,000 followers, with 22.5 percent of users accounting for 90 percent of activity, thus reflecting the increasingly unequal power structure of an attention economy in which the most valuable currency is being heard above the noise. “Monopolies are actually even more likely in highly networked markets like the online world,” wrote Wired editor-in-chief Chris Anderson… The inequalities between rich and poor nodes are even more exaggerated in the wake of 2009’s Great Recession. “The people who use these [social media] tools are the ones with higher education, not the tens of millions whose position in today’s world has eroded so sharply,” notes Time magazine business columnist Zachary Karabell. “Social media contribute to economic bifurcation… The irony is that social media widen the social divide, making it even harder for the have-nots to navigate. They allow those with jobs to do them more effectively and companies that are profiting to profit more. But so far, they have done little to aid those who are being left behind. They are, in short, business as usual.”
19
The problem is that our ubiquitous online culture of “free” means that every social media company — from Facebook to Twitter to geolocation services like foursquare, Hotlist, and Plancast — relies exclusively on advertising for its revenue. And it’s information about us — James Gleick’s “vital principle” — that is driving this advertising economy. As MoveOn.org president Eli Pariser, another skeptic concerned about the real “cost” of all these free services, argues in his 2011 book The Filter Bubble, “the race to know as much as possible about you has become the central battle of the era for Internet giants like Google, Facebook, Apple and Microsoft.”
20
“It is fundamentally impossible for a digital advertising business to care deeply about privacy, because the user is the only asset it has to sell. Even if the founders and executives want to care about privacy, at the end of the day, they can’t: the economic incentives going the other direction are just too powerful,” Michael Fertik, the Silicon Valley–based CEO of Reputation.com, a company dedicated to protecting our online privacy, told me. Fertik’s argument is reiterated by the media theorist and CNN columnist Douglas Rushkoff, who explains that rather than being Facebook’s customers, “we are the product.”
21
“Bowling Alone syndrome” — a reference to the communitarian theories of Harvard University sociologist Robert Putnam, whose highly influential and best-selling Bowling Alone regards the digital network as the solution to what he considers the crisis of local community. Writing in 2000 — only a couple of years after @quixotic created the first social media business — Putnam saw electronic media as the twenty-first-century means of reinventing community engagement. “Let us find ways to ensure that by 2010 Americans will spend less leisure time sitting passively alone in front of glowing screens and more time in active connection with our fellow citizens,” he argued with communitarian fervor. “Let us foster new forms of electronic entertainment and communication that reinforce community engagement rather than forestalling it.”
22
This intellectual obsession with the social, an obsession with sharing — what today, “as the arc of information flow bends toward ever greater connectivity,” is fashionably called a “meme” (but is, in many ways, a virus) — can be seen across many different academic disciplines. The concepts of togetherness and sharing have acquired such religious significance that, in stark contrast with the research of Oxford University’s Baroness Susan Greenfield, some scientists are now “discovering” their centrality in the genetic make-up of the human condition. One “neuroeconomist,” a certain Dr. Paul Zak of Claremont Graduate University, has supposedly found that social networking activates the release of a “generosity-trust chemical” in our brains. Larry Swanson and Richard Thompson from the University of Southern California are even “discovering” that the brain resembles an interconnected community — thereby triggering the ridiculous headline: “Brain works more like internet than ‘top down’ company.”
23
“The future is already here,” William Gibson observed in 1993, “it’s just unevenly distributed.” One version of the future, at least our social future, may have arrived, a handful of years after Gibson first made this prescient remark, at the very end of the twentieth century.
24
As John Stuart Mill argues in On Liberty, government exists to protect us from others rather than from ourselves, and the reality, for better or worse, is that once a photo, an update or a tweet is published on the network, it becomes de facto public property. So, without wishing to sound too much like the uber-glib Eric Schmidt, the only way to really protect one’s own privacy is not to publish anything in the first place.
25
The European Union has been much more aggressive than the United States government in pushing for privacy rights over social networks. On the all-important issue of online tracking by social media companies, for example, European privacy regulators have been pushing to establish an arrangement in which consumers could only be tracked if they actively “opt in” and permit marketers to collect their personal data. Europeans have also been more aggressive in pushing back against the leading Web 3.0 companies. In April 2011, for example, the Dutch government threatened Google with fines of up to $1.4 million if it continued to ignore data-protection demands associated with its Street View technology. Apple and Google face much tighter regulation in Europe, with the EU classifying the location information that they have been collecting from their smartphones as personal data. European Union data protection regulators have aggressively scrutinized Facebook’s May 2011 rollout of its facial recognition software that reveals people’s identities without their permission… EU justice commissioner Viviane Reding even wants social networks to establish a “right to be forgotten” option that would allow users to destroy data already published on the network. “I want to explicitly clarify that people shall have the right — and not only the possibility — to withdraw their consent to data processing,” Reding told the EU parliament in March 2011.
26
According to the executive editor of The New York Times, friendship has become a kind of drug on the Internet, the crack cocaine of our digital age. “Last week, my wife and I told our 13-year-old daughter she could join Facebook,” confessed The New York Times’ Bill Keller in May 2011. “Within a few hours she had accumulated 171 friends, and I felt a little as if I had passed my child a pipe of crystal meth.” A June 2011 Pew Research Center study of over two thousand Americans reported that electronically networked people like Keller’s daughter saw themselves as having more “close friends” than those of us — those “weirdo outcasts,” according to one particularly vapid social media commentator — who aren’t on Facebook or Twitter. The Pew report found that the typical Facebook user has 229 friends (including an average of 7 percent that they hadn’t actually met) on Mark Zuckerberg’s network and has more “close relationships” than the average American. But this June 2011 Pew study made no attempt to define or calibrate the idea of “friendship,” treating each one quantitatively, like a notch on a bedpost, and presenting Facebook and Twitter as, quite literally, the architects of our intimacies. What this survey failed to acknowledge is that human beings aren’t simply computers, silicon-powered devices with infinitely expandable hard drives and memories, who can make more friends as a result of becoming more and more networked. So how many real friends should we have? And is there a ceiling to the number of friendships that we can actually have?
27
A couple of miles north of the Oxford Mal hotel sits the gray-brick home of Oxford University’s Institute of Cognitive and Evolutionary Anthropology. It is here, in the nondescript academic setting of a north Oxford suburb, that we find a man who has determined how many friends we really need. Professor Robin Dunbar, the director of this institute, is an anthropologist, evolutionary psychologist and authority on the behavior of primates, the biological order that includes monkeys, apes and humans. And he has become a social media theorist too, best known for formulating a theory of friendship dubbed “Dunbar’s Number.” “The big social revolution in the last few years has not been some great political event, but the way our social world has been redefined by social networking sites like Facebook, MySpace and Bebo,” Dunbar says of his eponymous number. This social revolution, he says, attempts to break through “the constraints of time and geography” to enable uber-connected primates like @scobleizer to establish online friendships with tens of thousands of other wired primates. “So why do primates have such big brains?” Dunbar asks, rhetorically. Their large brains, he says, borrowing from a theory known as the “Machiavellian intelligence hypothesis,” are the result of “the complex social world in which primates live.” It’s the “complexity of their social relations,” defined by their “tangled” and “interdependent” personal intimacies, Dunbar argues, that distinguishes primates from every other animal. And because humans are the most successful and most widely distributed members of the primate order, he goes on, our brains have evolved furthest of all to handle the intricate complexity of our “intense social bonds.” Memory and forgetting are the keys to Dunbar’s theory about human sociability. You’ll remember that The New York Times’ Paul Sullivan suggested that the Internet is “like an elephant” because it never forgets. But what really distinguishes primates from animals like elephants, Robin Dunbar explains, is that primates “use their knowledge about the social world in which they live to form more complex alliances with each other than other animals.” Thus primates have a lot more to remember about their social intimacies than elephants — which may be one reason why humans forget things and elephants supposedly don’t. For better or worse, nature hasn’t come up with a version of Moore’s Law that could double the size and memory capacity of our brains every two years. Thus, while our big brains are the result of our complex social relationships, they are still confined by their limited memories. And it’s our biological inability to remember the intricate social details of large communities, Robin Dunbar explains, that limits our ability to make intimate friendships. “We can only remember 150 individuals,” Dunbar says, “or only keep track of all the relationships involved in a community of 150.” That is Dunbar’s Number — our optimal social circle, the size for which we, as a species, are wired.
4a. Link — articles
- According to AVG Technologies, 58% of children aged between 3 and 5 can successfully operate a smartphone but fewer… (www.breitbart.com)
- Owen Lanahan's parents demand that his cellphone be stored in the kitchen by 10 p.m., but sometimes he sneaks it into… (www.nytimes.com)
- "It kind of, almost, promotes you as a good person. If someone says, 'tbh you're nice and pretty,' that kind of like…" (www.washingtonpost.com)
- Boys like Facebook, girls like Instagram. Wealthier kids Snapchat. Lower income kids Facebook. And somehow Google+ is… (www.npr.org)
- Young children — even toddlers — are spending more and more time with digital technology. What will it mean for their… (www.theatlantic.com)
- REDWOOD CITY, Calif. — On the eve of a pivotal academic year in Vishal Singh's life, he faces a stark choice on his… (www.nytimes.com)
- US senator Charles Schumer says some videogames aimed at kids "desensitize them to death and destruction." But dire… (www.wired.com)
- Article on challenges facing advertisers and media and technology companies as they try to reach millennials… (www.nytimes.com)
4b. Link — video
DURING CLASS:
1. Current events
- Discussion leader: Robert
2. Lesson work
- Assigned reading/video discussion leader: A
- Online discussion leader: B
- Links library discussion leader: C
3. Digerati: Katie Davis
- Activity leader: D
Katie Davis is an Assistant Professor at the University of Washington Information School, where she studies the role of networked technologies in teens’ lives. She is the co-author, with Howard Gardner, of The App Generation: How Today’s Youth Navigate Identity, Intimacy, and Imagination in a Digital World (2013), which explores how today’s “digital youth” differ from the youth who grew up in a pre-digital era.
In-class reading (click here for a copy) excerpted from: Katie Davis and Howard Gardner, The App Generation: How Today’s Youth Navigate Identity, Intimacy, and Imagination in a Digital World (2013)
In-class video: Katie Davis talks about her book “The App Generation” (2013).
4. Preview
Preview Sunday Story #11
- David Shields chapter
Preview homework for class 22: Tue Dec 13
- Homework
- Classroom leadership assignments