“The Internet is Over”

Or: “Science Finds, Industry Applies, Man Conforms”

J. Robert Oppenheimer, the father of the atomic bomb, testifying before the Atomic Energy Commission in 1954
“When you see something that is technically sweet, you go ahead and do it, and you argue about what to do about it after you’ve had your technical success.”
— J. Robert Oppenheimer

This idea — this simple, insidious, ubiquitous idea — will be our undoing. It pervades the mind of every Promethean technocrat: “Onwards and upwards! May the public do with our devices what they will.”

But this mantra is dangerous and pervasive for the same reason: it’s easy — and it’s fun. It means dreaming up every idea and chasing each one with zealous fervor. It’s being an experimental bull in the china shop of our collective socioecosystem. It’s recklessness, it’s thoughtlessness — and it’s creativity at its sweetest and its most risky.

But precisely because I’m saying this, I feel I have to say this too:

I am no technophobe. I’m not advocating against innovation here.

In other words: I do not live in constant fear.

I have not smashed my iPhone and its SIM card with a hammer. I have not futilely attempted to delete all digital traces of myself. I have not opted to live, nor do I long for, an analog life, seeking out — perhaps even attaining — a quiet semblance of privacy. The tumultuous relationship between Technology and 21st-century life does not make me cower in tinfoil-lined corners.

I do not want to flee to a cabin in the woods.

So, no, I am not John Zerzan; I am not Ted Kaczynski; and I certainly do not agree with Chuck Klosterman’s immediate affirmative upon being asked whether “it would be good for the world if the Internet spontaneously went black and never returned.”

Of course, this doesn’t mean that I don’t have some very serious questions about our centuries-old love/hate affair with Technology — and, of course, it doesn’t mean that certain governments don’t have some serious explaining to do regarding certain abuses of certain modern technologies.

But as reactions to the ongoing revelations about the scope of digital surveillance conducted by the world’s most powerful governments hover between “Well, I suppose I’ll be a Luddite now” and “Who cares! I’ve got nothing to hide,” I’m constantly brought back to my own attempts at reconciling what exactly to do about recent “technical sweets” (e.g., drones, Google Glass, the World Wide Web) which, like most inventions (e.g., guns, cars, GPS), have the potential to benefit some and hurt a whole lot.

Jathan Sadowski’s critical review of The New Digital Age, by Google’s Eric Schmidt and Jared Cohen, makes one excellent, rarely heard point:

[Technologies] are not impartial conduits through which users enact their wishes. There are indeed people who will seek to use digital tools for specific purposes not intended by their designers, but the technologies themselves also come with a suite of biases, politics, and values that are both consciously and unconsciously programmed into them.
A technology is not… a neutral thing in the world. Technologies afford some activities and types of relationships more than they do others, and it’s crucial for those who design, manufacture, and market those technologies to understand those affordances.

This is wonderfully put: Sadowski turns the facetious, blame-displacing argument of “X doesn’t kill people, people kill people” on its head and rightly chastises otherwise intelligent designers for not seeing an increasingly crucial bigger picture. I, too, advocate for a more conscious approach to design on the part of inventors and creators; I’m a huge fan of Carl DiSalvo’s concept of “adversarial design,” described by Evgeny Morozov as:

…not just [building] an artifact to fulfill some genuine social need but also [making] us reflect on how that need has emerged, how it has become a project worth pursuing, and how it may actually not be worth pursuing at all.

DiSalvo points to inverted crime maps “that… [instead] show which city blocks have the most former residents incarcerated,” or to “browser extensions that add information about military funding to the websites of universities.”

Or take the Natural Fuse, a complex system of plants, sensors, and electric outlets, available to multiple users, powering everyone’s plugged-in devices so long as each participant’s plants can absorb the carbon emissions they’re responsible for.

The Natural Fuse in action.

The catch? A switch toggling between Off, Selfless, and Selfish; use up your share of energy, and you’re forced to go into Selfish mode, now feeding on the carbon reserves of the plants your neighbors and friends may have generously set to Selfless. Overuse the system and you’ll kill their plants, and then — along with everyone else on the network — receive an email declaring the plant’s demise, and your abuse of the system.
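The mechanic is worth making concrete. Below is a toy sketch, in Python, of the Natural Fuse’s carbon-budget logic as described above; every name, number, and rule in it is an assumption made for illustration, not the actual system’s implementation:

```python
# Toy model of the Natural Fuse's Off/Selfless/Selfish mechanic.
# All identifiers and numbers here are illustrative assumptions.
from enum import Enum
from dataclasses import dataclass

class Mode(Enum):
    OFF = "off"            # plant's capacity withheld from the network
    SELFLESS = "selfless"  # spare capacity shared with neighbors
    SELFISH = "selfish"    # drawing on neighbors' shared capacity

@dataclass
class Participant:
    name: str
    mode: Mode = Mode.SELFLESS
    plant_alive: bool = True
    capacity: float = 10.0  # carbon a healthy plant can absorb (arbitrary units)
    used: float = 0.0       # emissions charged against this participant

class NaturalFuseNetwork:
    def __init__(self, participants):
        self.participants = participants
        self.log = []  # stands in for the "plant has died" emails

    def shared_surplus(self):
        # Spare capacity volunteered by Selfless participants with live plants.
        return sum(p.capacity - p.used
                   for p in self.participants
                   if p.mode is Mode.SELFLESS and p.plant_alive)

    def draw_power(self, p, carbon_cost):
        if p.used + carbon_cost <= p.capacity:
            p.used += carbon_cost  # within one's own plant's budget
            return True
        p.mode = Mode.SELFISH      # over budget: forced into Selfish mode
        if carbon_cost <= self.shared_surplus():
            self._drain_neighbors(carbon_cost, p)
            return True
        return False               # nothing left on the network to draw

    def _drain_neighbors(self, carbon_cost, who):
        for n in self.participants:
            if carbon_cost <= 0:
                break
            if n is who or n.mode is not Mode.SELFLESS or not n.plant_alive:
                continue
            take = min(n.capacity - n.used, carbon_cost)
            n.used += take
            carbon_cost -= take
            if n.used >= n.capacity:  # overdrawn: the neighbor's plant dies
                n.plant_alive = False
                self.log.append(
                    f"{n.name}'s plant has died ({who.name} overused the system)")
```

The design mirrors the description above: generosity (Selfless) enlarges the common pool, overuse forces you Selfish at your neighbors’ expense, and killing a plant is broadcast to the whole network.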

The above serve as just rudimentary examples of an approach to innovation that sacrifices a modicum of convenience in favor of a healthy helping of critical thinking and some much-needed self-awareness.

But even still… Sadowski’s declaration remains holey.

To illuminate the fault inherent in his position, we have to turn to a similar(ly incomplete) criticism, this one from halfway around the globe.

This criticism came in a curious article from Russia Today that made the requisite rounds on the social-media circuit a while back — despite being years old at the time — alerting free-speech advocates and their allies to the Apple patent that could enable institutions to “block transmission of information, including video and photographs, from any public gathering or venue they deem ‘sensitive.’”

The article was met with the requisite cursory shock and appall — the sort that fades quickly down the Newsfeed tubes, replaced by selfies and status updates. But like so many articles of its kind, its rhetoric — and the rhetoric of those passing it through the grapevine — pointed the castigatory finger at Apple for inventing the technology in the first place.

The Russia Today piece falls prey to the same fallacy Sadowski does, one often perpetuated by articles like these and their disseminators: an increasingly prevalent rhetoric that continues the dangerous trend of vilifying only inventors, and vilifying them for exercising their creative ferocity. Apple Inc. is surely no saint (what Big Tech company is?), but trying to guilt it and only it (and into what, suspending creativity?) would mean implicating only one part of the complex chain of those “at fault.”


Notorious for a certain flippancy in the early days of the Snowden summer, Obama proclaimed in June of 2013 — amid the (surely nodding) heads of the Silicon Valley elite — that we “can’t have 100 percent security and also then have 100 percent privacy and zero inconvenience.”

A handy guide to the Silicon Valley Supper of 2013, with President Barack Obama at the center (http://searchengineland.com/obamas-silicon-valley-tech-supper-65412)

That statement — while simultaneously problematic, ominous, and overly dismissive coming from a head of state with increasingly inordinate amounts of power — rings true at the level of the individual person, the everyman.

21st-century life — online and off — won’t be as easy or straightforward as it has been since the Internet’s advent, but that doesn’t mean it has to become impossible. And we can find ways to make it easier — but we must also put in a certain amount of effort.

We have to forge an ethos of collaboration between experts and dilettantes, between technophiles and technophobes, between citizens on the ground and citizens in cyberspace.

Much like the simple but comprehensive walkthroughs that, as Glenn Greenwald recounts in his recently released No Place to Hide, Snowden made for him, many a guide to acquiring (a pinch of) privacy in an NSA world popped up post-Snowden. They came paired with detailed explanations of why these techniques could only really give you the tiniest inch of security — and why you should employ them anyway.

Download Mozilla Firefox these days, and you’ll be greeted with a full-fledged manifesto on online privacy and security — complete with discussion questions to pose to friends and family! Mosey over to the website for the Surveillance Self-Defense (SSD) project — a work of the Electronic Frontier Foundation — and you’ll find video guides to beefing up digital protections, tailored for protestors, journalists, human rights activists, and even seasoned security vets.

In other words, rather than security professionals simply scoffing at the public’s shock, dismissing the reaction as juvenile or misplaced (or both), many actually reached out just as Snowden had: to advocate and to educate.

What a concept!

It may seem obvious, but this ethos of collaboration — between experts and dilettantes, between technophiles and technophobes, between citizens on the ground and citizens in cyberspace — is important to foreground when unraveling our relationship with Technology. Bruce Schneier’s solution seems like familiar advice, but no less important for our collective failure to listen:

Curbing the power of the corporate-private surveillance partnership requires limitations on both what corporations can do with the data we choose to give them and restrictions on how and when the government can demand access to that data.
Because both of these changes go against the interests of corporations and the government, we have to demand them as citizens and voters. We can lobby our government to operate more transparently… and hold our lawmakers accountable when it doesn’t.

Now we’re starting to get even further. We have our fingers pointed at the creators, and the manufacturers, and the legislators — the big picture becomes even more visible.

But the key to “seeing it all” — to paraphrase former NSA Director Keith Alexander — is hidden in the full text of the aforementioned Apple patent, resting unread by those who took the Russia Today article at face value (emphasis mine):

A locker room facility may issue a command that prevents use of a cellular phone camera or laptop computer camera while in that area, thereby preventing surreptitious imaging of customers/users. Likewise, an airline operator or airport may cause the mobile device to enter into an “airplane” mode… thereby more affirmatively preventing interference with aircraft communications or instrumentation and enhancing safety. Similarly, if a terrorist threat or other security breach is detected, the airport may disable at least a portion of the wireless communications within a terminal using a policy command, thereby potentially frustrating communications between individual terrorists or other criminals.
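Stripped of patent-speak, the mechanism the excerpt describes amounts to a venue broadcasting a policy command and the device dutifully going dark. A minimal sketch, with all names and policies assumed for illustration (this is not Apple’s actual design):

```python
# Toy model of venue-issued policy commands disabling device features.
# The venues, features, and API below are illustrative assumptions only.
VENUE_POLICIES = {
    "locker_room": {"camera"},                         # no surreptitious imaging
    "aircraft": {"cellular", "wifi", "bluetooth"},     # forced "airplane" mode
    "terminal_security_breach": {"cellular", "wifi"},  # partial wireless shutdown
}

class MobileDevice:
    def __init__(self):
        self.disabled = set()

    def receive_policy_command(self, venue):
        # The venue, not the user, decides which features go dark.
        self.disabled |= VENUE_POLICIES.get(venue, set())

    def can_use(self, feature):
        return feature not in self.disabled

phone = MobileDevice()
phone.receive_policy_command("locker_room")
assert not phone.can_use("camera")  # camera blocked by the venue
assert phone.can_use("wifi")        # everything else untouched
```

The point of the sketch is how little the user figures in it: the only decision-maker in the loop is the institution issuing the command.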

Guns allow hunters to efficiently gather nourishment and murderers to take innocent lives; GPS allows scientists to better map our world and governments to better track their citizens; religion allows us to come to peace with the as-yet-unknowable and extremists to carry out atrocities in the name of a higher power; and so on, and so forth. Indeed, technologies are not neutral things in the world — they can be used to enrich and they can be used to erode. Nothing new here.

But note carefully what sorts of enrichment this patent promotes: “preventions” and “frustrations,” i.e., preemptive attacks upon potential danger. It’s obvious in hindsight, but examination of the patent’s language elucidates a major goal of so many prevalent (and pervasive) technologies: to curtail potential malevolence.

The call to deploy such technologies is compelled by our willingness to fixate on technology as the end-all, be-all answer to social ills. Rather than chip away at the difficult, complex forces behind our not-uncommon tendencies toward perversion and violence, we instead attempt to throw technology at them.

Now I’m pointing the finger backwards — at ourselves, for seeking the path of least resistance, rather than considering the much more difficult, decades-long route for “preemptively attacking” malevolence, the path that not so coincidentally is the only one that truly sticks.

I’m talking here, of course, about education. I’m talking about advocacy. I’m talking (maybe in weird or discomfiting, anachronistically moralistic terms) about the instilling of an ethical code. I’m talking about that national conversation we may not even know how to have.

I’m talking about active coordination — instead of passive incarceration — on the part of every citizen that thinks anything I’ve said thus far is even mildly important.

I’m talking about together teaching everyone just about everything there is to know about navigating modern society. At every age. For life.

How’s that for big picture?

Jeremy Bentham’s panopticon has predictably seen a resurgence in light of the NSA revelations. The panopticon, a prison whose prisoners would be perpetually haunted by “the sentiment of… omnipresence,” is a horror of a structure, allowing a prison guard to survey all prisoners without ever being seen himself.