The Information Game

…or How to Think About Cyber

There’s a gut-wrenching scene at the climax of the World War II biopic The Imitation Game. Alan Turing and the codebreakers at Bletchley Park decrypt a German cable and suddenly they know the enemy’s plan to attack Allied ships and, incredibly, all attacks for the foreseeable future. Their celebration is short-lived. Turing grasps the ephemeral nature of their discovery and has a sickening epiphany: To win the war they can’t tip off the Germans that they’ve decoded Enigma. Instead they must simulate ignorance by choosing strategic victories and sacrificing the rest of their men. Panic sets in. One of the codebreakers has a brother serving aboard a targeted convoy. He begs his colleagues to use what they know to spare his brother’s life but Turing is resolved. Their secret must be concealed at the highest cost. The ensuing choices haunted the intelligence community long after the war was won.

Over the last 14 years, Americans have been conscripted into an information war. Individual privacy is now incidental to the objectives of government and technocratic elites, and vulnerable to the exploits of criminals and extremists. The battle for control over the digital space is a gloves-off, civil-liberties-be-damned free-for-all. To reestablish trust in our oldest institutions, it’s necessary to parse the steps that led to the present situation and decrypt the objectives of contemporary leaders and policymakers.


Nearly 100 years after Nazism flourished in Germany, the question is still asked with incredulity: Why did German citizens permit, and participate in, genocide? There will never be a satisfactory answer to the moral question of why, but there is a clear beginning in the circumstances of how. The rise of fascism in post-World War I Europe began with a confluence of domestic troubles in Italy: a financial crisis, concomitant economic hardship, grief over millions of Italian war casualties, widespread dissatisfaction with political parties that failed to deliver on promises, and a perceived threat to financial security from a foreign (Communist) ideology.

Onto this stage stepped Benito Mussolini, a staunch nationalist and war veteran whose preoccupation with violence inspired the formation of an army of uniformed “Blackshirts” — unemployed youth, funded by the middle and upper classes, who assassinated opposition leaders, suppressed and destroyed opposition newspapers, and eventually marched on the capital to take power in 1922. “A Brief History of the Western World” summarizes Italian fascism thus:

“In the beginning, as Mussolini himself admitted, [fascism] was largely a negative movement: against liberalism, democracy, rationalism, socialism, and pacifism…[Italians] had been cast adrift, let down by failed hopes of progress and happiness. Faceless in a mass society, they also felt alienated from themselves. The Fascists found an answer to this emptiness by arousing extreme nationalism….The fascist myth rejected the liberal reliance on reason and replaced it with a mystical faith. Stridently anti-intellectual, it held that the ‘new order’ would spring from the conviction of the ‘heart.’ Fascists therefore looked upon intellectuals as…suspicious characters…. Most ordinary Italians accepted Fascism with enthusiasm. The individual who formerly felt alone and unneeded enjoyed a new sense of ‘belonging.’”

The rise of fascism in Italy took less than six years from invention to political dominance. Fostered by comparable conditions in neighboring countries, the ideology spread across Europe and fatefully intersected with the political ascent of Adolf Hitler in Germany. The Germans have a word for Hitler’s rise to Führer: Machtergreifung — Macht, meaning power, and ergreifen, to grab or seize. Like Mussolini, Hitler headed up a violent army of unemployed youth and committed illegal acts to dissuade and undermine his opponents, but it was the power vacuum created by ineffective German leadership that paved the way for the Third Reich and Nazism.


Flag of the Soviet Union

A second world war and one Pax Americana later, the world was pumped full of Cold War adrenaline. In 1962, nuclear superpowers bumbled their way into a stand-off and lucked their way out of the unthinkable during thirteen days of diplomatic posturing over Cuba. The rapid advancement of nuclear technology meant there was no room for error, yet error upon error was made. In effect, American leadership failed the test but passed the class. America and the Soviet Union skated by on their shared basic values, but the crisis taught no lessons on how to face an adversary with profoundly different goals, specifically those rooted in tribal conflict and revenge.

In the aftermath of America’s nuclear showdown, political theorist Graham Allison published his seminal 1969 paper “Conceptual Models and the Cuban Missile Crisis,” which would shape American foreign-policy thinking for decades. Allison defined three distinct methods for understanding policy outcomes: the rational policy model (foreign governments behave rationally in pursuit of their goals); the organizational-process model (the military typically wants X, the bureaucracy typically wants Y, and their historical relationship makes outcome Z predictable); and the bureaucratic politics model, in which shapeshifting factors such as interpersonal conflicts, bureaucratic inertia, and availability of resources act on one another to influence foreign policy outcomes. Government elites strongly favored the bureaucratic model, and it became conventional wisdom.

Political theorist Stephen Krasner reassessed Allison’s models, first in 1972, and later at the height of the “first” Cold War. He was troubled that President Kennedy, subcabinet members, and scholars from top public policy programs in the 1960s wholly adopted the bureaucratic approach, where outcomes were viewed as an evolving compromise of inputs. Krasner identified the fundamental flaw in the model as giving elite decision-makers a blanket excuse for their failures. Specifically, he reframed bureaucratic-politics thinking as a biased framework for blaming policy errors on the “self-serving interests of the permanent government,” where elected officials were viewed as powerless to corral the government “machine.” He summarized the infinite loop of accountability thus:

Bureaucracy is a machine, and “[machines] cannot be held responsible for what they do, nor can the men caught in their workings.”

This is a stunning double entendre for the Information Age.


Rights and privacy are dictated by an elite group of decision makers who control the laws (Government) and the digital infrastructure (Technocracy). Internet usage and hardware purchases now constitute a “vote.” The government and technology sectors each employ roughly 1% (3–4 million people) of the American population. The top-level decision-makers, technicians and analysts within those fields are assumed to number less than 0.01% of the American public, and are therefore elite. The technocratic elite, as used here, lumps Anonymous hackers in with tech CEOs, and the government elite includes members of all branches of government along with political influencers with monetary or legislative sway. Since both elites invest billions of dollars successfully marketing themselves to society, the benefits they provide are widely known and will not be discussed here. Instead, the focus is the encrypted cost of advancement. Decoding the costs reveals which services and policies are truly beneficial, and to whom.


The Technocracy

The history of the government’s relationship with computer technology is long and complicated. Perhaps only one fact is universally accepted: Al Gore did not invent the internet. Contrary to popular folklore, he never claimed to have invented it. Gore’s words were twisted; the transcripts are widely available, and he was subsequently defended by two of the “fathers of the internet” as deserving “significant credit for his early recognition of the importance of what has become the Internet.” The urban legend illustrates the strange paradox of the Age of Information: even with unprecedented access to the truth, millions of people are often misinformed.

Internet development began in the 1960s, took its broadly used form in the mid-1970s, was commercialized through the 1980s and came into its own in the early 1990s with the introduction of the World Wide Web, the universally accepted infrastructure for data exchange on the internet. The web is credited to Tim Berners-Lee’s 1989 proposal at CERN; it was developed over the next few years and made free to the public in 1993. This snippet from the then-definitive International Law Anthology, enumerating the issues confronting global governing bodies, reveals the digitally unsophisticated world that received the new technology:

Global Communications: The earliest topics in this burgeoning field were international postal services and the laying of submarine cables. The invention of radio, television, and facsimile and modem communications technology, have led to explosive growth in this area of international regulation. Jamming and counter-jamming of another nation’s radio wave frequencies, channel regulation, remote sensing, and stationary satellite transmission are matters of intense interest. There is a move toward international broadcast standards and transmission quality. But there are also countervailing pressures against freedom of information, with some nations (and religious groups) desiring the suppression of international telecommunications relating to the advocacy of war or revolution, criticism of governmental officials or policies, regulation of commercial messages, and materials depicting real or fictional violence or pornography. — Anthony D’Amato, “Domains of International Law,” International Law Anthology

It reads like a mid-century newspaper clipping but that passage was published in 1994. Bill Clinton was president.

Twenty years later, Laura Poitras’s Oscar-winning documentary CITIZENFOUR is more than an exceptional historical record. The film is also a primer for technocratic culture and ideology. In June 2013, after months of anonymous communications, National Security Agency contractor Edward Snowden sat down face-to-face with Poitras and The Guardian journalist Glenn Greenwald in a Hong Kong hotel room. Snowden spoke eloquently and fluently about the values at the root of his dangerous undertaking to leak classified documents detailing secret surveillance programs run by the United States government.


Glenn Greenwald: So, why did you decide to do what you’ve done?
Edward Snowden: For me, it all comes down to state power against the people’s ability to meaningfully oppose that power. I’m sitting there every day getting paid to design methods to amplify that state power. And I’m realizing that if the policy switches that are the only thing that restrain these states were changed you couldn’t meaningfully oppose these. You would have to be the most incredibly sophisticated technical actor in existence. I’m not sure there’s anybody, no matter how gifted you are, who could oppose all of the offices and all of the bright people, even all of the mediocre people out there with all of the tools and all of their capabilities. And as I saw the promise of the Obama Administration be betrayed and walked away from and, in fact, actually advance the things that had been promised to be curtailed and reined in and dialed back, actually got worse. Particularly drone strikes…That really hardened me to action.
GG: If your self interest is to live in a world in which there is maximum privacy, doing something that could put you in prison in which your privacy is completely destroyed as sort of the antithesis of that, how did you reach the point where that was a worthwhile calculation for you?
ES: I remember what the internet was like before it was being watched and there has never been anything in the history of man that’s like it. You could have children from one part of the world having an equal discussion where they were granted the same respect for their ideas in conversation with experts in the field from another part of the world on any topic anywhere any time all the time, and it was free and unrestrained and we’ve seen the chilling of that, the cooling of that, the changing of that model toward something in which people self-police their own views and they literally make jokes about ending up on “the list” if they donate to a political cause or if they say something in a discussion. It’s become an expectation that we’re being watched. Many people I’ve talked to have mentioned that they’re careful about what they type into search engines because they know it’s being recorded and that limits the boundaries of their intellectual exploration. I’m more willing to risk imprisonment, or any other negative outcome personally than I am willing to risk the curtailment of my intellectual freedom, and that of those around me whom I care for equally as I do for myself. Again, that’s not to say that I’m self-sacrificing because I feel good in my human experience to know that I can contribute to the good of others.

[transcription from video]

It’s striking that Snowden didn’t say “privacy” in his mission statement. Greenwald framed the debate with the question many of us would ask after hearing that we’re being surveilled, and subsequent news reports by outlets across the globe widely referred to “privacy.” It’s unclear whether Greenwald and Poitras heard more of Snowden’s thoughts in which he raised the issue of privacy himself, but on camera he never says the word. He advocated an unmonitored internet from the vantage point of someone who is highly skilled at protecting his own privacy. He recollected the realization, at his NSA desk, that before too long he — a member of the tech elite — would be technologically outpaced and unable to protect his privacy. The technocracy was losing ground to the government.

Society owes Edward Snowden an enormous debt for his decision to blow the whistle on the NSA at great personal risk. To be clear: he enabled a profoundly necessary conversation to begin. However, his poetic description of the unrestrained nature of intellectual advancement is technocratic rhetoric for a digital utopia that never existed. As compelling and passionate as he is, Snowden made several incorrect assertions that should be dispelled in the interest of productive discussion.

First, there have been many inventions in the history of man like the internet, including the space shuttle, the airplane, the telephone, and the galleon, all of which brought people together across vast distances at previously unmatched speeds to have discussions and exchange knowledge. Mankind went through periods of adjustment to those profound changes in infrastructure, and we will navigate this one as well. Innovation is not unprecedented. This invention will mature beyond its makers, and it must assimilate to the needs of civilization, not the other way around.

Second, the children can still spend their days online talking to experts as equals if they want to (though it’s doubtful they do). Invoking chilled children and cooled innocence is misleading rhetoric when it’s primarily adults who spend their time staring at a screen. Further, the tech industry pushes expensive gadgets and software for kids but, as highlighted by the New York Times’ “Steve Jobs Was a Low-Tech Parent,” many technocrats strictly limit gadget exposure for their own families because they’re aware of the harmful effects of internet and technology use on young minds. Teenagers are a more complicated issue with regard to internet freedom, which is especially clear in the case of ISIL’s recruiting techniques, but Snowden wasn’t referring to Muslim children discussing ideas with expert terrorists across the globe. He wasn’t lamenting privacy incursions on thugs. In fact, he didn’t acknowledge the grey areas of internet freedom at all.

The most important falsehood in Snowden’s statement, and the core message of the technocratic ideology, is that the internet was once and should always be free. This is a seductive idea, especially to people with good computing skills and entrepreneurial leanings, but it is patently untrue. Getting online requires expensive hardware and infrastructure that is designed and sold by the same community that dominates the internet through technical expertise.

For the last 20 years the technology industry has hard-sold hardware to citizens, corporations and governments alike, along with software that seamlessly replaced or supplanted infrastructure for everything from financial transactions and brick-and-mortar stores to research and even face-to-face meetings. The technocracy orchestrated one of the greatest heists in history by amassing “free” content from writers and established media publications trying to maintain their brands with a millennial generation that wasn’t taught to pay people for their time, research, and intellectual work. As a final insult to “freedom,” tech companies undertook the systematic repackaging of users’ private information as data useful for advertising, which they bundle and sell at a profit to whomever they choose. (The word “user” rather than “customer” has always implied a barter arrangement, but it is rarely spelled out exactly what is being given and gotten. You open a social media account once, perhaps only use it for an hour or a day, but the service provider owns your personal information forever and can sell it many times over.)

In 2015, Apple, Microsoft, Google, IBM and Samsung rank in the top ten of Forbes’ World’s Most Valuable Brands, and 11 more technology companies appear in the top 100. Six of the world’s 20 richest billionaires are computer technology elite. All of that free internet has paid for mansions and private educations. There’s nothing wrong with companies and people making money off of this invention — America is a proudly capitalist society — but perpetuating myths about intellectual freedom while raging against government misuse of personal data is hypocritical and misleading.

If it appears I’ve misinterpreted Snowden’s meaning entirely, breathe easy. It’s clear that Snowden’s “free internet” refers to freedom of thought, communication and information, not freedom of goods and services. However, the cyber conversation can’t bifurcate those billions of dollars from the billions of devices and trillions of gigabytes of data. Doing so hides the massively lucrative business objectives behind fun, sometimes addictive, products. If technocrats truly want a free, unrestrained internet, they’re now rich enough to forgo that pile of money, make cheap hardware, set chaos-legitimizing rules (First Rule of Internet: There are no rules) and enforce the entropy. I doubt they’d have billions of takers, and no one would type their credit card number into a chaos box.


Screenshot from the Department of Justice website

The Government

Spying, surveillance and covert activity have always been part of America’s security and defense apparatus; that activity just wasn’t legal. Illegality was at the heart of clandestine work, making it extremely risky and therefore far more carefully considered by those commissioning it and those undertaking it. The legalization of amoral behavior came about in the weeks after 9/11 because, ostensibly, the president and his cabinet wanted the freedom to openly plan illegal activity without fear of legal repercussions. The PATRIOT Act inoculated government officials from risk and, many would say, ethical pause. What followed was a confident, risk-free expansion of intelligence infrastructure with no effective oversight and no endgame.

A nation that was once gripped by the unraveling of Richard Nixon now shrugs off revelations of CIA agents breaking into Senate Intelligence Committee computers in 2014. Government workers have spied on elected officials before, but today the public digests these incidents with a vague assumption that all criminal behavior by the government has a footnoted legal justification somewhere. These stories translate as infighting among elites. Fourteen years of the PATRIOT Act have conditioned Americans to expel what little outrage they can muster in a matter of days and then go limp. The groups taking legal action against injustices are typically news or special interest organizations with a financial or moral dog in the fight and powerful legal teams to back them. (The latest New York Times op-ed piece from Wikipedia’s Jimmy Wales and the AP’s lawsuit against the State Department over Hillary Clinton’s emails are two cases in 2015 alone.) Even with funded legal representation, there’s a pervasive sense that their effort is futile. For all of the flagrant rights abuses, the government’s tracks are papered over by the PATRIOT Act.

One way to step off the merry-go-round is to take a page from Alan Turing’s estimable problem-solving approach and look at what isn’t happening in our everyday lives. Government elites have made several huge assumptions on our behalf and, in light of Edward Snowden’s unspooling NSA leaks, it’s worth revisiting their decisions after seeing the results. The government uses negative hypotheses to great effect (if we don’t renew the PATRIOT Act…) and so can the people whose rights are in the balance.

What isn’t being done with NSA-collected data?

Potentially, the important stuff. Through indiscriminate data-collection, the NSA is extensively aware of wrongdoing by the American people, corporations, government agencies and officials. We don’t need Edward Snowden’s evidence to know this is true. Daily news stories show that digital communications include sexually harassing emails in the workplace, threats of murder or violence, faxed paper trails of embezzlement, proof of premeditated theft, telephonic recordings of gender and race discrimination, and documented personal indiscretions by public officials. The American government inadvertently nets evidence of myriad criminal acts, both domestic and foreign. It then employs people to sift through these stores looking for some lawbreakers, but not others. When intelligence officers stumble upon criminal or threatening activity that doesn’t serve their objectives, do they look the other way to conceal their methods? It is conceivable, even probable, that actual lives have been lost to inaction rooted in concealment. What happens in situations like these? What do the numbers look like on paper — lives lost or ruined versus casualties from terrorist attacks? The legal ramifications are mind-boggling but the ethical question is straightforward: Is a government obligated to protect its people or its objectives?

What else isn’t being done with NSA surveillance data? For all of their time spent sweating over Apple’s Xcode, the U.S. government didn’t stop the Tsarnaev brothers, the French government didn’t stop the Charlie Hebdo murderers, and the U.K. government isn’t stopping thousands of teenagers from leaving the country, unaccompanied, to join ISIL. Most disturbing was the story of three teenage girls who left the U.K. in February and may have been aided by a western spy in transit, forcing us to question why governments aren’t helping their most vulnerable citizens return to safety (and whether they may be using them as unsuspecting spy assets instead). With the Snowden data we have proof that our individual rights, and lives, are considered a worthy sacrifice to what the government deems “the greater good.” When spy agencies might be risking the lives of teenagers in the name of future terrorist attack victims, it’s clear government objectives no longer align with the values of the citizens they work for.

What if we don’t have the internet?

When Lindsey Graham weighed in on Hillary Clinton’s email debacle on Meet the Press with an I’ve-never-sent-an-email statement, he pumped a figurative fist of defiance. He’s a loud, proud Luddite in the new millennium. However, ask him where he does his banking, whether he gets money from the ATM, uses a cellphone, watches cable television, or has ever read the news online and he’ll be forced to admit he’s got a digital footprint. His televised statement gives him credibility with the anti-technology demographic, the people who are done with all the smart talk and just want to love America with all of their hearts [see: Fascism, precursor to]. The only people alive today who aren’t consciously reliant on cyber technology are toddlers. The rest of the modern world communicates regularly online and is increasingly aware that public officials lack cyber expertise.

But what if we did live in Lindsey Graham’s la-la-land and didn’t have a digital footprint? A world without the internet is inconceivable today, but that world existed only two decades ago. In that time we traded infrastructure for more than just privacy. What we save in time and gain in information should be held up to what we spend in dollars to participate in the digitized world.

A sliver of the data shows that in 2014, 177 million smartphones were sold in North America, amounting to $71 billion in sales. Globally, 1.3 billion smartphones were sold. Add to that PC, tablet and cellphone sales, software sales, internet and cellphone service contracts…Americans pay a lot of money to go about their daily lives. This is not to suggest we should shun progress and innovation, but we should know what we’re getting for our money. We aren’t getting shiny new laws for the digital infrastructure we depend on. Our brightest technological minds unwittingly innovated a cyber-police state, and elected officials aren’t knowledgeable enough, or confident enough, to walk back what technology wrought. For a country that leads the world in cyber technology, many of our legislators are tech-dumb to the point of ridiculousness. The fatal mistake would be to insist we can separate ourselves from the infrastructure of modern society by never sending an email. Politicians like Graham sell that idea because it sounds freeing [see: Paternalism, Fascism’s sweet-faced uncle], but they’re diverting attention from the pressing issue of lawmaking because they clearly have no idea where to begin. The gridlock in Congress might not be gridlock at all. Perhaps our representatives are simply confused about how to hit “Send.”

Finally, who doesn’t control personal data?

If the answer to this question isn’t obvious yet, it’s worth stepping into the nearest bathroom and checking out the wall above the sink. (Or ask Hillary Clinton. She gets it.) In military jargon, intelligence refers to strategically useful information. Information becomes intelligence when it has an application, and that application is determined by whoever finds, reads, assesses and controls the information. To grasp how important this seemingly obvious statement is, consider the juxtaposition of Director of National Intelligence James Clapper and former NSA contractor Edward Snowden, two men inside the same intelligence apparatus, in control of the same information, who found starkly different uses for it.

From this we must conclude that, within the government, a select group of officials and contractors control our information and they each have specific objectives in mind. Then we must acknowledge that almost none of us can articulate what those individuals’ objectives are so we don’t know if we agree with them. As internet-reliant citizens, we play the odds every time we connect digitally, not knowing which side of the numbers game we’re on. To use the analogy of WWII Britain, are we the majority at home or the unsuspecting brothers on targeted convoys? None of us can answer this question because the government elite draws up the map in secret. To the extent that events unfold in a manner we agree with and our lives aren’t negatively affected, we can only say we got lucky.

Loading screenshot of Google’s Virtual Library project


Living in Asia in the late 90s, I spent time in countries that were then considered “developing” economies. Textbooks were filled with prognostications about the potential growth and downfall of these places, but no bar chart captured the terrifying hilarity of driving an hour outside of Seoul at high speed in a brand new sedan on unpaved roads, with only potholes and feral animals to navigate by. Technology was tangibly out of sync with infrastructure. A blocked road sent drivers veering onto the front steps of houses. Parking was wherever you felt like it, and parked cars were often rendered inaccessible due to other people’s feelings about parking. Disagreements were resolved the old-fashioned way with pointing, yelling, and threat of fists. Over time, enough pedestrians became casualties and enough expensive tires were blown in potholes that laws became necessary, as did the paving of roads. The automobile is no less amazing because society set a speed limit. We mitigate and restrain technology where it threatens and outpaces us. This is how we civilize our innovations.

The most poignant irony of the Information Age is the internet’s role in restructuring our relationship to politics. Snowden avowed his intent to end the tyranny of the snooping government, but technocratic paternalism is equally invasive, and it’s built into the digital realm. Complicated legal documents pop up at the outset of a business relationship, and people with no legal background are conditioned to move ahead with a trust-us, one-click “Agree.” Our relationship to intelligent technology is best portrayed by the routine updates we tacitly agree to without reading or understanding what they entail. I Agree to whatever you’re about to load onto my phone or into my computer, agree to what you think is best for this device and my use of it, agree without stipulation, agree without working knowledge, agree because not agreeing seems time-wasting and foolish and questioning is beyond my technical ability. I always agree with you because everyone else is agreeing with you, so it must be okay. I always agree with you because I don’t know why I should disagree.

This habitual agreement has proved deadly to the exchange of real information. The technocracy devised the fastest, most appealing method for securing a user, and internet users subsequently became desensitized to the act of giving away their rights. The repetitive process has numbed healthy suspicion of any organization that demands legal agreement to a loss of personal agency. Those internet service agreements are not there to protect individuals; they are documents created by expensive legal teams to ensure a company has no responsibility to the consumer. If these statements aren’t disturbing enough, stretch them to apply to the government in the shocking months and years after 9/11. The PATRIOT Act was the federal government’s service agreement, and the majority of the American people agreed to it without understanding what they were signing away.

Fourteen years on, perhaps the greatest misstep in rectifying our mistake is to begin with privacy. Loss of privacy is an end result. Privacy can be protected, it can be violated, but it cannot be given. That notion is a falsehood born of Victorian manners — I’ll give you some privacy — which preempt uncomfortable directives: Leave the room. Get off the line. Turn your head. Don’t read my emails. I need my privacy. The sci-fi notion of “mindreading” is terrifying precisely because it violates the only space on earth that belongs entirely to us. When we communicate with people, through talking, writing, or touch, we consciously extend that private space to include others. A violation of private space is a form of mindreading. In building society around the digital world, we’ve ceded a massive amount of private space in order to move safely. The only recourse upon learning your boyfriend has read your journal is to hide it in a new place, but the only recourse upon discovering people can hack your emails is to stop writing anything sensitive or private at all. By necessity, we’ve retreated inward. Our truly private worlds are almost entirely interior now. That loss of intimacy has already alienated us from one another. Unable to safely extend a hand or share a thought, our knowledge of people stops with avatars and public text. We can’t know people’s deeper feelings, and they can’t know ours. There’s nowhere safe to talk.

When Glenn Greenwald asked Edward Snowden why he would risk imprisonment — the obliteration of privacy — Greenwald identified the one circumstance in which personal agency is taken away entirely. That the cyber debate revolves around the give and take of privacy tells us that we’re already in a prison of sorts. To get out, we need to reestablish laws and agreement. Not the tacit agreement of accepting free stuff in exchange for unknown costs, but overt agreement and an expectation of legal recourse if our rights are violated. As Stephen Krasner observed: “The Constitution is a document more concerned with limiting than enhancing the power of the state.” With the PATRIOT Act, modern lawmakers violated this precept nearly into extinction. There’s no reason to believe our present government will give up the PATRIOT Act of its own volition, and no reason to believe the public has the will to make it. This is where most people drop out of the resistance movement and succumb to prison life.

The other misstep in solving the puzzle is our obsession with predicting the future. Pew Research Center’s Net Threats survey asked over 1,400 technology experts to predict “the most serious threats to the most effective accessing and sharing of content on the Internet.” But with so much emphasis on forecasting, we’re overlooking today’s storm. If you’d asked a South Korean mother living 20 miles from the DMZ in 1997 what the most serious threat to her children’s lives was, most Americans would have expected her answer to be a doomsday scenario of war with the north. However, it’s just as likely she would have said: “See that black sedan driving 50mph over my front doormat…?” Attention-grabbing headlines often obscure imminent dangers. Public discussion leapfrogs over what we could solve today because no one wants to dig in and do the unglamorous work of painting a dotted line down the center of the road. (Why isn’t Pew asking these 1,400 experts to identify today’s most solvable problem and offer a specific solution? That’s 1,400 solutions right there.)

If technology is responsible for creating a state of alienation, then the government is guilty of capitalizing on that alienation. When politicians appeal to people’s confusion over new technology, they perpetuate a dangerous myth: that people can separate themselves from the digital age. Lindsey Graham’s opinion on cyber surveillance is useless if he doesn’t understand how Americans use email or why they might be upset that those emails are intercepted and read by government officials. Perhaps he’d like to turn his diary over to the CIA and see how that feels. Only then would his vote on privacy legislation be cast with the necessary wisdom.

America is a world leader in computer technology and innovation. Every member of Congress, and certainly the next president, should be knowledgeable about computer technology. America’s elite governing body must be prepared to debate cyber. My 90-year-old grandmother has been sending emails for years and she has a Facebook account. If senators can’t keep up with her rudimentary computing skills then they don’t belong anywhere near the Capitol. The most important action Americans can take is to vote for cybersmart House and Senate representatives in upcoming elections.

As backwards as Washington seems, cybersmart politicians do exist. It’s clear from Hillary Clinton’s decision to house computer servers in her home during her tenure at State that she’s knowledgeable about cyber. Despite her public statement, Clinton’s use of personal servers has nothing to do with convenience and everything to do with security. Clinton owns her data. She also possesses depth of knowledge about what goes on in the intelligence community, and I expect that is precisely what drove her to take control of her privacy. If she wants to do the country a great service, in or out of the White House, she should make cyber legislation her top priority and level the playing field for citizens everywhere. It would unite the country to speak plainly about the state of our internet. Honest talk about cyber surveillance from a public figure who can speak to both sides of the debate would be a huge step forward for the country.

What will hopefully become apparent, to decision makers and citizens alike, is that both sides of the ideological struggle derive their power from the online participation of citizens. The present situation has left people with nowhere to turn for trustworthy leadership. The conditions that permitted fascism’s spread — post-war malaise, financial struggles, political distrust — tamp down people’s natural resistance to incremental loss of agency. The circumstances that facilitated the rapid creation of totalitarian governments in previously liberal, rational societies are cropping up again a century later. The situation is again ripe for Machtergreifung, a seizure of power.

Democratic European societies once made a desperate attempt to escape their status quo by funding unstable third parties with disastrous consequences. We are now seeing many radical ideas thrown into the mix, some backed by logical process, others attempting to shake people out of rhetoric fatigue. Reboot the Government! Reboot the Bible! Reboot the Brain! Drop one letter from those slogans and we’re deep in A.I. territory. Bill Gates, Elon Musk, Stephen Hawking and their ilk proclaim their fear of the dark side of artificial intelligence with increasing regularity. We should be afraid too. There’s no precedent for the power vacuum created by a flaccid Congress and a disproportionately wealthy technology sector. This situation could pave the way for the first artificially intelligent leader. The engineering is getting there, and the rest would be…history.


At the end of The Imitation Game, when the Germans have been defeated and the war declared a victory, the British codebreakers sit around a table to be dismissed. They are solemn and alienated from one another because of secrecy, spying, suspicion, and lying, though they each believe their transgressions were the morally responsible thing to do. They’re ordered by their government to keep yet another secret — deny everything they know and deny they know each other. The path they’re on has no exit and no truth. They’re in a prison of past decisions and will be for the rest of their lives. However, the circumstances that created their prison are the opposite of America’s situation today. In WWII the British government was desperate. The enemy was winning. Their strategy wasn’t clandestine by design but by circumstance, and the British public was spared the burden of deciding who to sacrifice.

Today we’re faced with governments and corporations that spy, lie, classify their decision-making, and manipulate online users. These conditions are self-perpetuating. There is no definitive endgame in the shapeshifting political narratives and money-making schemes except to exert more power over the online space. To reclaim private space for the public, we must take the messages we’re being sent and decrypt the puzzle ourselves. Whether your bias is to fault the system or the individuals who make decisions within it, both are responsible for mistakes, and both hold the keys to solving the puzzle. The trick is to look at what isn’t there, and to ask why something is free.