Geopolitical rivalries behind the cyber-threat narratives in the United States.

Aug 20, 2014
Internet meme originated from the now-extinct Weekly World News


Cybersecurity narratives in the United States are influenced by geopolitical representations of vulnerability and threat. Some of these representations challenge the innovation behaviors that are so prevalent in online environments and clash with the entrepreneurial ethos of the country. These securitization moves have been explicitly rejected by an important sector of the Information Technology (IT) community, which sees in the hacker an archetype of creative behavior, not an enemy of the state. This story explores alternatives to de-securitize cyberspace in order to create a more democratic cybersecurity policy, one that rebalances fears and facts to empower the citizen-hacker to protect online environments outside the politics of fear.

Story adapted from the paper Cybergéopolitique : de l’utilité des cybermenaces, published in the scientific journal Hérodote. If possible, please cite the original version; the citation can be exported from Google Scholar.


Cyber attacks play an important role in the construction of vulnerability narratives by security and defense practitioners and scholars in the United States. The idea of hackers producing catastrophic damage is so fundamental to current threat assessments that the Department of Defense created a sub-unified command in 2009 to respond to these challenges. General Keith Alexander became its first commander, occupying a “dual-hatted position” as he simultaneously directed the National Security Agency (NSA) and the US Cyber Command (CYBERCOM). He described the challenges behind the creation of this new command in the following way:

“We will do this as we do it in the traditional military domains of land, sea, air and space. But cyberspace is unique. It is a man-made domain. It is also an increasingly contested domain. That makes everything even tougher. Our job in U.S. Cyber Command is to assure the right information gets to the right user at the right time at the right level of protection.”

This quote contains most of the units of cultural transmission (what Richard Dawkins called memes) of cybersecurity imperatives. It begins by framing the definition of cyberspace as a geospatial environment or a fifth military domain (the other four, land, sea, air and space, all being geospatial in nature) where military forces can operate and maneuver. The speech-act then places the accent on the technological nature of this “man-made territory,” while reinforcing at the same time the idea of a domain besieged by nefarious actors. The citation concludes this securitization move by explaining how the mission of CYBERCOM is to provide Information Assurance (IA) to cyberspace, so that data flows are limited by rules (levels of protection), time and users. The creation of CYBERCOM is a step towards the “militarization of the Internet.”

The key assumptions are that information sharing must occur only within limited environments (in technology terms these environments are often described with the spatial metaphor of “walled gardens”) and that this predictability should define all flows. Hacking can then be defined in opposition to this frame: it is what happens when the system is deliberately tampered with so that it behaves outside those intended rules. When that occurs, information sharing does not follow predesigned paths, and the core mission of CYBERCOM is to prevent hackers from sharing data with the “wrong” users at the “wrong” time and, more importantly, without following those pre-established rules.
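The walled-garden frame described above can be reduced to a simple rule check: a flow is permitted only when the user, the time window and the level of protection all match pre-established rules, and anything else is, by definition, a hack. The sketch below is purely illustrative; all names and values are hypothetical and do not model any real CYBERCOM or NSA system.

```python
# Toy model of the Information Assurance frame: the "right information
# to the right user at the right time at the right level of protection."
# Everything here is a hypothetical illustration, not a real system.

from dataclasses import dataclass


@dataclass(frozen=True)
class Rule:
    user: str                      # the "right user"
    hour_range: tuple[int, int]    # the "right time" (start_hour, end_hour)
    min_clearance: int             # the "right level of protection"


def flow_allowed(rule: Rule, user: str, hour: int, clearance: int) -> bool:
    """Data moves only inside the walled garden defined by the rule."""
    return (
        user == rule.user
        and rule.hour_range[0] <= hour < rule.hour_range[1]
        and clearance >= rule.min_clearance
    )


rule = Rule(user="analyst", hour_range=(9, 17), min_clearance=3)

# The sanctioned flow:
print(flow_allowed(rule, "analyst", hour=10, clearance=3))   # True

# "Hacking", in this frame, is any flow outside those rules:
print(flow_allowed(rule, "outsider", hour=10, clearance=3))  # False
print(flow_allowed(rule, "analyst", hour=22, clearance=3))   # False
```

The point of the sketch is how brittle the frame is: every behavior the rule author did not anticipate, benign or not, falls into the same "forbidden" bucket.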

It is that third condition that is the most problematic. As I will elaborate, there are at least three problems with this cybersecurity frame:

  • First, it is incompatible with key elements of the entrepreneurial culture so admired by the liberalist ethos of American innovation; a culture that still today represents entrepreneurs as heroic underdogs challenging the establishment by breaking the rules, not by following them. In Silicon Valley’s definition of heroism (replicated all over the country in the geopolitics of technology clusters), it is by breaking the rules that one can go from a garage startup in Los Altos, CA to a multibillion-dollar company that can change the world.

In this, the entrepreneurial community differs significantly from hackers in its approach to technological innovation. While entrepreneurs “hack” markets through the use of technology for financial gain, hackers favor tackling technological challenges for the pleasure of problem solving itself.

  • Second, this admiration of the entrepreneur as the ultimate rule breaker is driven by a quest for what Clayton Christensen called disruptive innovation: a challenge to established value propositions through the breaking of rules to disrupt markets. Nevertheless, the hardening of rules that the dominant cybersecurity frame often demands trades most of the flexibility required for disruptive innovation in exchange for security measures, without rationalizing the consequences of this tradeoff.
  • Third, it is a securitization move that has been explicitly rejected by a key sector of the High-Tech community (not the same as the community of entrepreneurs, although with some overlap between the two) that sees the hacking ethic in a positive light, and where many members self-define as hackers. This hacker ethic, like many other moral codes, is a complex set of implicit values and emerging behaviors shared by a community without the need for codification, although two famous postulates do a good job of defining key elements of it:

1) Information wants to be free; and

2) If you cannot open it, you don’t own it.

Steven Levy systematized the hacker’s ethic in his famous book “Hackers: Heroes of the Computer Revolution”:

“Hackers believe that essential lessons can be learned about the systems — about the world — from taking things apart, seeing how they work, and using this knowledge to create new and even more interesting things. They resent any person, physical barrier, or law that tries to keep them from doing this. This is especially true when a hacker wants to fix something that (from his point of view) is broken or needs improvement. Imperfect systems infuriate hackers, whose primal instinct is to debug them.”

The hacker ethic explicitly claims a right to tamper with systems, a conduct that most definitions of cybersecurity explicitly prohibit. This “not playing by the rules” is known in human-computer interaction studies as technological appropriation:

“[Users] adapt and adopt the technology around them in ways the designers never envisaged. Think of your own experience: perhaps you have used a screwdriver to open a paint tin, or a heavy textbook to prop open a door … or tried to open a bottle of wine without a corkscrew. Improvisation is critical to ‘getting things done’.


These improvisations and adaptations around technology are not a sign of failure, things the designer forgot, but show that the technology has been domesticated, that the users understand and are comfortable enough with the technology to use it in their own ways. At this point we know the technology has become the users’ own not simply what the designer gave to them. This is appropriation.”

Through trial and error, hackers are engaged in systematic technology domestication. The central goal of hacking is to appropriate artifacts and systems outside the designer’s intent, sometimes against the designer’s wishes and even outside the law. Hacking is an act of individual will, and often also an act of rebellion against the system designer’s intent. It is also a key behavior behind technological progress and innovation, identifying exploits and recombinatorial possibilities to make something new out of old systems.

The speech-acts through which cybersecurity policies have framed cyberthreats have failed to recognize this link between hacking and innovation capacities, and the heavy cost of cybersecurity enforcement on entrepreneurial potential.

Cyber-geopolitics and its representations

This “techno-clash of civilizations” between cyber-enforcers and hacker/innovators has some of its fracture lines in two under-explored geopolitical representations that I will now analyze, with the goal of expanding the framework used to discuss the securitization of cyberspace to include its costs on innovation potential. I have called these representations cyberscale and cyberubiquity. Finally, I will also include in this framework an educational problem in the United States that operates as a metastructure of cyberscale and cyberubiquity, for which I use the name cyberscare.

1.- Cyberscale: The whole threat narrative that surrounds cybersecurity is geopolitically represented as a problem of scale. Cyberattacks are framed as asymmetrical power rivalries over territories in which the small and the large scales collide. A problem of “cyber” is typically a problem in which small groups or individuals successfully challenge big institutions (mostly governments or big corporations) by appropriating big networks of technological infrastructure, with the potential of affecting a large number of people and producing significant economic damage. Even when these hackers act as soldiers of state-sponsored “cyberarmies” in the context of the so-called “cyberwars,” what makes them different from other adversarial forces is their capacity to defeat those same big institutions from the scale of small groups and from far away, by appropriating the critical systems of the adversary. In this sense, cyber threats can be and often are framed as a “normalizing” force that closes the power differential between the large and the small scale, ultimately benefiting the individual hacker or small group of hackers. Exploited as a threat narrative, the representation of cyberscale is pushed to the extreme to make this power differential a source of fears.

2.- Cyberubiquity: The dominant discursive framework of cybersecurity politics affirms and extends the geopolitical representation of the deterritorialization of battlefields that began when symmetric Cold War communist enemies were replaced by asymmetric Islamic terrorists (by the way, this metaphoric asymmetry also refers to scale). According to it, every system in our technologically dependent civilization is a potential source of threat, as the “man-made” nature of the cyberspace domain places us all behind enemy lines. The hacker’s potential for damage increases with each new technology that is integrated into our human environment, and like the nuclear threat, there is nowhere to escape because there is no possible detachment between the cyber battle-zone and the human geography. The hacker is everywhere (and he can be everybody). In one of the books on the topic of cyber most quoted by government officials, Richard Clarke, former National Coordinator for Security, Infrastructure Protection, and Counter-terrorism, exploits the representation of cyberubiquity in very clear terms:

“Cyberspace. It sounds like another dimension, perhaps with green lighting and columns of numbers and symbols flashing in midair, as in the movie The Matrix. Cyberspace is actually much more mundane. It’s the laptop you or your kid carries to school, the desktop computer at work. It’s a drab windowless building downtown and a pipe under the street. It’s everywhere, everywhere there’s a computer, or a processor, or a cable connecting to one.

And now it’s a war zone, where many of the decisive battles in the twenty-first century will play out.”


3.- Cyberscare: The geopolitical representations of cyberscale and cyberubiquity are supported as threat narratives by a “cyberscare”: a kind of politics of fear not unlike the “red scare” tactics of McCarthyism in the 1950s, the “missile gap” of the 1960s, and the global war on terrorism of the early 2000s.

As Evgeny Morozov warned us in 2009, cyberscare governmental reports

“are usually richer in vivid metaphor — with fears of “digital Pearl Harbors” and “cyber-Katrinas” — than in factual foundation…

It is alarming that so many people have accepted the White House’s assertions about cyber-security as a key national security problem without demanding further evidence. Have we learned nothing from the WMD debacle? The administration’s claims could lead to policies with serious, long-term, troubling consequences for network openness and personal privacy.”

The quote even became an internet meme. Source: Unknown

At the center of these efforts to securitize cyberspace through a cyberscare lies an educational problem that Carl Sagan appropriately described as a “prescription for disaster.” For the famous astrophysicist and science popularizer, the fact that “we live in a society exquisitely dependent on science and technology, in which hardly anyone knows anything about science and technology… is a clear prescription for disaster.” He went on to ask, “how can we decide national policy if we don’t understand the underlying issues?”

Cybersecurity threats in which cyberscale and cyberubiquity play a role do exist, but designing a proportional response to the geopolitical challenges posed by these cyber-geopolitical conditions and representations requires a minimum understanding of the critical technological infrastructures that shape our sociotechnical civilization. Without a solid educational foundation in Science, Technology, Engineering and Math (STEM), those critical systems become unintelligible to the majority of the population (including elected officials), effectively shielding citizens from participating in technology policy debates they do not understand, about matters that are essential to their freedoms and well-being.

As Arthur C. Clarke postulated in his third “law” of prediction, “any sufficiently advanced technology is indistinguishable from magic” and cyberscare is a securitization condition that transforms hacking into threatening black magic and the hacker into a scary warlock.

In the securitization of cyberspace, the hacker is framed as the ultimate homeland security adversary.

Unlike the Islamic terrorist of post-9/11 Islamophobia (unsophisticated, radical) or the communist of the red scare (unpatriotic, a mindless puppet of the USSR), the hacker-as-a-source-of-fear is an agent represented as a superior enemy. To the average techno-layman, the hacker is perceived as a super-empowered individual with knowledge that makes him or her (normally a him) more dangerous than any other previous enemy of the social status quo. He can open doors that are closed to the rest of us, take control of nuclear power plants or any other technology, and, through computing languages that are indecipherable to the vast majority of people, he holds the power to destroy us.

Hackers, individually defined or as part of organized crime syndicates or international cyber-military complexes, have the evil profile of a comic book supervillain: clever and empowered to destroy the world from the shadows, by opening floodgates or detonating nuclear power plants using the internet (in some indescribable way), for selfish purposes or for no reason at all.

For these supervillains, the cyberscare also reinforces the cyberubiquity representation by triggering geographic proximity fears. Mimicking the Islamophobic discourse that exploited the “sleeper cell” meme to produce fear of Muslim people (especially migrants) living in close proximity to non-Muslim populations, the cyberscare exploits the anonymity of the online world for the same effect. For example, in the movie The Matrix, an important film loaded with almost all the cybersecurity representations ever conceived, Agent Smith, a character who represents the authorities enforcing cybersecurity norms, confronts Neo, the protagonist of the story, during their first encounter in the following terms:

“It seems that you’ve been living two lives. In one life, you’re Thomas A. Anderson, program writer for a respectable software company. You have a social security number, pay your taxes, and you… help your landlady carry out her garbage. The other life is lived in computers, where you go by the hacker alias “Neo” and are guilty of virtually every computer crime we have a law for. One of these lives has a future, and one of them does not.”

To sum up, I define cyberscare as the politics (and marketing, by military and cybersecurity corporations) of fear that exploits the lack of technological knowledge and the lack of comfort with science and technology that citizens need in order to understand the complex sociotechnical civilization we live in. The cyberscare has given us an enabling environment for a cyber-geopolitical narrative that transports the battlefields of cyber-wars to the heart of our civilizational infrastructure, allowing the representation of cyberubiquity to emerge. Finally, the cyberscare facilitates the menacing framing of small groups of hackers as super-empowered adversaries embedded in our society, capable of inducing catastrophic collapse in nations or corporations; this is the heart of the geopolitical representation of cyberscale.

All the speech-acts behind these representations are polysemic concepts loaded with subjective values within competing narratives. Cybersecurity is a rich policy environment in which definitions convey political intentions, as stakeholders use them to increase their power (and profits) or to decrease the power of adversaries. Like most geopolitical representations, cybersecurity speech-acts introduce cognitive frames and representations with a policy goal in mind, and like all geopolitical representations, cyberscale and cyberubiquity are political perceptions presented in the context of their relation to territories, introduced into the public conversation by actors who expect to obtain political gain. Their accuracy is irrelevant, because their key role is to support political narratives; geopolitical representations are normally validated by verifiable facts, but those facts can be and often are distorted to achieve the main goal of political mobilization.

For example, for the geopolitical representations of cyberscale and cyberubiquity to be successfully exploited as speech-acts to securitize cyberspace, risk management approaches that balance risk and response are mostly excluded from the discourse. A “perception gap,” defined here as the “potentially dangerous distance between our fears and the facts,” is capitalized on by the dominant narrative to omit the fact that comprehensive evaluations of the cybersecurity threat have shown that the collective costs of anticipation against cyberattacks already surpass the cost of potential damages, and are still on the rise. Most governmental programs and cybersecurity products on the market are anticipatory in nature, and therefore part of this excessive spending.
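The risk-management balance that the text says is excluded from the discourse can be made concrete with the classic annualized-loss-expectancy calculation used in risk assessment. The numbers below are entirely hypothetical, chosen only to show the shape of the comparison, not to reproduce any real evaluation.

```python
# Toy illustration of a risk-management comparison: does anticipatory
# spending exceed the expected annual loss from the threat it addresses?
# All figures are hypothetical, for illustration only.

def expected_annual_loss(probability_per_year: float, impact: float) -> float:
    """Annualized loss expectancy: ALE = annual rate of occurrence * single loss expectancy."""
    return probability_per_year * impact


attack_probability = 0.05          # hypothetical: 5% chance of a major attack per year
attack_impact = 10_000_000         # hypothetical damage in dollars if it occurs
anticipation_spending = 2_000_000  # hypothetical yearly anticipatory defense budget

ale = expected_annual_loss(attack_probability, attack_impact)
print(ale)                          # 500000.0
print(anticipation_spending > ale)  # True: spending already exceeds the expected loss
```

When the comparison comes out this way, a proportional policy would question the spending; the cyberscare narrative, by omitting the calculation altogether, never has to.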

Furthermore, policies enacted in the name of cybersecurity, like the NSA’s dragnet cyber-surveillance programs that employ offensive anticipatory tactics, have imposed a hefty (albeit hard-to-quantify) cost on the democratic credentials of the country’s cybersecurity policies. This reduces the competitiveness of the American High-Tech sector by making it look incapable of protecting the privacy of its clients and thus unreliable. Nevertheless, this extra cost of current cyberdefense programs is omitted, or dismissed as unimportant, in cyberubiquity and cyberscale speech-acts.

Well before the information about the NSA dragnet programs was leaked by Edward Snowden in 2013, Rod Beckström, a member of the aforementioned High-Tech community, became the bold choice to be the first Director of the National Cyber Security Center (NCSC) at the Department of Homeland Security (DHS). Turf wars with the NSA forced his resignation in 2009, less than a year after his appointment. In his resignation letter, the most important cybersecurity official at DHS (who would later become president of the Internet governing body ICANN) declared that:

“NSA effectively controls DHS cyber efforts through detailees, technology insertions, and the proposed move of NPPD and the NCSC to a Fort Meade NSA facility. NSA currently dominates most national cyber efforts. While acknowledging the critical importance of NSA to our intelligence efforts, I believe this is a bad strategy on multiple grounds. The intelligence culture is very different than a network operations or security culture. In addition, the threats to our democratic process are significant if all top level government network security and monitoring are handled by any one organization (either directly or indirectly). During my term as Director we have been unwilling to subjugate the NCSC underneath the NSA. Instead, we advocated a model where there is a credible civilian government cybersecurity capability, which interfaces with, but is not controlled by, the NSA.”

With American allies and transnational companies challenging the centrality of the United States in the geopolitics of the internet as a consequence of these cybersecurity programs, a situation potentially more costly to the interests of the United States than any cyberattack, it is easy to demonstrate that cybersecurity policies have not factored in the important negative costs they generate.

Finally, the move to securitize cyberspace also neglects the negative consequences for the creative economy of demonizing the hacker ethic, as cybersecurity policies make technological appropriation more difficult and criminalize digital rule breaking.

De-securitizing cyberspace: Hacking and entrepreneurship, two sides of the same bitcoin

Now that the previous analysis has deconstructed the cyber-geopolitical representations of scale (cyberscale) and sense of place (cyberubiquity) at the center of current cybersecurity narratives, the stage is set to discuss how they are in direct conflict with the much-courted disruptive innovation, creating a very important (and somewhat schizophrenic) conundrum.

To a certain extent, this confrontation was unavoidable. Hacks induce power rivalries in cyberspace that self-organize to form a sort of evidence-based cultural critique, continuously testing, exposing and exploiting sociotechnical structures. Individual hacks are literally an exegesis of the code that underpins our technologically dependent civilization. Through them, hackers appropriate systems to produce novel behaviors that were either not imagined, not wanted or explicitly forbidden by the power holder (i.e. the system administrator or authority).

On one extreme, by not playing by the rules, hackers defy institutional architectures through technological empowerment, and this sets in motion cyber-geopolitical representations and conflicts. On the other extreme, cybersecurity policies are shaped by those representations to stop unintended behaviors from taking place outside of the system rules. This is why, in their current shape, these two ethoses are not easily reconciled.

Hackers are motivated by the desire to understand and obtain access to systems beyond the designer’s intent, leveraging knowledge to compensate for lack of access. But knowledge is a scarce commodity, and this process empowers the knowledgeable to appropriate systems and multiply their force.

Therefore, the essence of the cyberscale problem is force multiplication through appropriation. The hacker embodies all of humanity’s fears (but also its ambitions, as we will now see) vis-a-vis science and technology as he appropriates systems; his hacks push the limit of what others thought could not be done. As such, hacking is the direct antithesis of the problem of “failure of imagination” that has troubled homeland security since its inception. Hacking is what happens when failure of imagination does not occur and systems are consciously affected at the right point of intervention to produce a desired effect.

What is less evident is that these behaviors are unavoidably intertwined with the entrepreneurial spirit that drives corporate America.

As Gabriella Coleman explains, hackers challenge some of the restrictive tenets of neoliberalism that gave birth to current cybersecurity imperatives (e.g. the criminalization of decryption hacks in the name of copyright protection) not from an external and frightening ideology like communism was, but from some of the most classical principles of western liberalism, in particular free speech (“code is speech” is another key element of the hacker ethic), and the very American meritocratic ideas of laissez-faire and hard work.

As a matter of fact, it is impossible to dissociate startup technological innovation from hacking, because corporate innovation is hacking.

The most admired entrepreneurs behave exactly like hackers. By appropriating new technologies and destroying legacy markets in the process, in particular those markets whose justification for existing is a set of big technological barriers to entry that hacking has rendered antiquated, entrepreneurs impose their will on sociotechnical systems. Encouraging innovation in open and democratic societies requires a certain tolerance for rule breaking, and the right mechanisms to channel that entrepreneurial behavior to improve the human condition through creative endeavors that challenge the status quo. Joseph Schumpeter, the father of innovation economics, famously named this process “creative destruction,” and for him it was actually the key advantage of democratic capitalistic societies:

Capitalism […] is by nature a form or method of economic change and not only never is but never can be stationary. […] The fundamental impulse that sets and keeps the capitalist engine in motion comes from the new consumers’ goods, the new methods of production or transportation, the new markets, the new forms of industrial organization that capitalist enterprise creates. […] The opening up of new markets, foreign or domestic, and the organizational development from the craft shop and factory to such concerns as U.S. Steel illustrate the same process of industrial mutation […] that incessantly revolutionizes the economic structure, incessantly destroying the old one, incessantly creating a new one. This process of Creative Destruction is the essential fact about capitalism. It is what capitalism consists in and what every capitalist concern has got to live in.

So, paradoxical as it may seem, from a Schumpeterian perspective, this constant hacking of systems to appropriate them and modify them from within, this creative destruction, is not only compatible but essential to free market economics.

Nevertheless, if Steve Jobs and Steve Wozniak, founders of Apple and members of the Schumpeterian pantheon of heroes of the computing revolution, had initiated their quest to revolutionize computer markets under current cybersecurity regulatory environments (specifically the Digital Millennium Copyright Act), they would have risked hefty prison sentences for the hacking of the AT&T telephone system that got them into the Information Technology business. Steve Jobs “confessed” these “crimes” to Walter Isaacson in his official biography:

It was then that they [Steve Jobs and Steve Wozniak] reached an important milestone, one that would establish a pattern in their partnership: Jobs came up with the idea that the Blue Box could be more than merely a hobby; they could build and sell them.

“I got together the rest of the components, like the casing and power supply and keypads, and figured out how we could price it,” Jobs said, foreshadowing roles he would play when they founded Apple. The finished product was about the size of two decks of playing cards. The parts cost about $40, and Jobs decided they should sell it for $150.

Blue Box at the Computer History Museum in Mountain View, CA.

Steve Jobs, the “perfect CEO” according to Wired, the “best CEO of his generation” according to the Wall Street Journal, and “the best-performing CEO in the world” according to the Harvard Business Review, transitioned from the commercializer of a hack developed by Steve Wozniak into the poster child of American free enterprise, one who first creatively destroyed and then forever transformed the business markets for computers, portable communication, music, book publishing and movie production, by appropriating technological tools, playing outside of the rules (even breaking the law sometimes), and hacking those industries. Steve Jobs, Jeff Bezos and Elon Musk, to cite just a few examples, all embody the Schumpeterian ideal of creative destruction, intervening in systems to destroy previous value propositions through the appropriation of technologies… like hackers do.

This is why criminalizing appropriation in the way many cybersecurity policies do today, as part of a cyberscare, is not a trivial event. The current state of computing science and information technology in general is the result of the exploratory behaviors we amalgamate under the name of hacking. This experimental, hands-on approach to system intervention is what defines the hacker ethic (see the Hacker Manifesto at the end of this story for another example of the codification of these hacker ethics), and it is impossible to separate its positive consequences from the discovery of vulnerabilities that make cybercrimes possible.

Demonizing hacking in the name of a cybersecurity “über alles” strategy has undeniable negative effects on innovation potential. Hackers are system thinkers unsatisfied with the current state of a system, and they possess the appropriate cognitive tools to try to change it, exactly like entrepreneurs. While business acumen and coding aptitude are different skills, and not all coders are good entrepreneurs nor all entrepreneurs coders, both of these groups, essential for the improvement of sociotechnical systems, mine the vein of technological appropriation. In that sense, both groups are made of hackers.

Like an autoimmune disease attacking the host it is trying to protect, cybersecurity excesses not only hurt the democratic architecture of the United States as the former Director of the National Cybersecurity Center of DHS explained in his previously cited resignation letter, but they also have the explicit objective of limiting the right of individuals to wander outside the rules of the system to innovate by tweaking, exploiting, defeating and improving those systems.

A good understanding of cyber-geopolitics is the first step towards a strategy of de-securitization of cyberspace. While scale and a sense of place do matter, a healthy sense of proportionality with respect to cyberscale and cyberubiquity is also necessary for a democratic and appropriation-friendly cybersecurity policy. The metaphor of a “cyberwar” is unwarranted, and most of the politics of fear behind the cyberscare would have to be subject to a process of de-escalation, because they do not foster an environment conducive to proportional responses and risk management approaches to cybersecurity challenges.

For cybersecurity policies to be considered appropriation-friendly, they must stay out of the way of disruptive innovation and paradigmatic challenges to the status quo. This often means that some rule breaking should be promoted, some tolerated, and only the most heinous kinds of hacking punished by law.

Hackers engaged in a process of Schumpeterian creative destruction identify previously unidentified vulnerabilities in critical systems, and while many of those “zero-day vulnerabilities” end up being exposed at hacking conventions, or become the subject of pranks or other methods of bringing attention to them so they can be patched, some may indeed be appropriated by criminal or terrorist agents to become the source of “zero-day attacks.” Although terrorist attacks like 9/11 are an extreme case of a hacking conduct successfully defeating a sociotechnical architecture in the analog world, and there has never been an attack of the same magnitude launched exclusively from cyberspace, the potential of technology appropriation for deviant acts does exist.

But even if we grant the possibility that clandestine innovation could produce terrorist attacks of the magnitude of 9/11 exclusively from cyberspace (a claim yet to be demonstrated with facts), a democratic cybersecurity policy based on a good risk management approach would balance this fear of hypothetical terrorist attacks or cyberwars with the strategic virtues of serendipitous disruptive innovation. The reason is that disruptive innovation rewards and positively transforms only the open societies where the hacking of rules can take place without fear of drastic punishment.

The difference between this risk-based approach to democratic cybersecurity and the current one produced under cyberscare conditions is a matter of degree, but in policy degree matters. More importantly, this change in degree would also permit the de-securitization of the cyberspace agenda, allowing an open and ongoing global debate on the topic and an adaptive strategy to patch the vulnerabilities of our open society in real time, with the help of the hacking community.


I will conclude with some policy recommendations. Their common denominator is the goal of de-securitizing cyber-geopolitics while at the same time producing a robust and democratic cybersecurity environment that preserves the freedoms and functionalities of cyberspace. The list is not exhaustive, but that is precisely the point: de-securitizing cyberspace will foster a richer conversation in which more voices can be heard and multiple paths can be tried to improve our critical infrastructure. In a way, these concluding thoughts are designed to foster the appropriation of cybersecurity policy by the citizen-hacker.

1.- An innovation-friendly cybersecurity policy criminalizes crimes, not technology

Cyber-child pornography, cyber-harassment or cyber-fraud are nothing but child pornography, harassment and fraud that happen to be mediated by digital communication technologies. In all three examples, the protected legal interest is easily identifiable in the analog world: in the first case it is children’s well-being; in the second, one’s physical and psychological integrity; and in the third, individual property rights. The “cyber” prefix is, under the best of circumstances, accessory; more probably, it is a trigger for a cyberscare speech-act.

When technology appropriation is used to commit well defined crimes, the medium is irrelevant to the harm, even if cyberscare narratives make it central. Children are as traumatized by the distribution of analog pornographic photos as by digital ones; harassment harms individuals whether it arrives through verbal communication or postal mail; and fraud impoverishes the victim just as much when paper cash is stolen.

Law enforcement and a strong criminal legal system are still the best tools states have to deal with crime. Neutering technologies, or turning them into spying tools in the name of fears of “dark magic” (i.e., technologies we do not understand), are poor cybersecurity choices but effective tools to securitize cyberspace.

In fact, police departments with a strong policy of technology adoption would have better situational awareness vis-a-vis the challenges that new technologies pose in the commission of crimes. Collaborative partnerships between hackers and law enforcement can help in that process, but the existence of those partnerships depends on a mutual understanding of the hacking ethic, and as such, trust building should be a critical component of democratic cybersecurity policing. There is a good precedent in the analog world: community policing. In cyberspace, hackers are the local community.

2.- Information wants to be free: Neoliberal copyright policies are a source of dissent actively resisted by some in the hacking community

Certain conducts that are considered criminal in the analog world may not have a metaphorical equivalent in cyberspace, in spite of the fact that current cyberscare narratives do try to build those metaphoric links. For example, a famous publicity campaign launched in 2004 by the Motion Picture Association of America was a speech-act designed to equate piracy with stealing property in the analog world. The cyberscare speech-act read: “You wouldn’t steal a car. You wouldn’t steal a handbag. You wouldn’t steal a television. You wouldn’t steal a movie. Downloading pirated films is stealing. Stealing is against the law. PIRACY. IT’S A CRIME.”

In 2012, United States Attorney Carmen M. Ortiz, the prosecutor who handled the case against hacktivist Aaron Swartz (co-author of the RSS 1.0 specification and Creative Commons advocate) for making public millions of scholarly papers in an act of civil disobedience against the walled gardens of science, framed it again in the same terms: “Stealing is stealing whether you use a computer command or a crowbar, and whether you take documents, data or dollars. It is equally harmful to the victim whether you sell what you have stolen or give it away.”

JSTOR, the non-profit that operates the scientific database from which Swartz downloaded the papers, declined to press charges, but the prosecutor charged the hacker anyway with two counts of wire fraud and 11 violations of the Computer Fraud and Abuse Act, carrying a potential maximum sentence of 35 years in prison and a million dollars in fines. Facing this prosecution for his act of rebellion against the copyright regime, Aaron Swartz committed suicide, a traumatic event for the IT community.

Nina Paley, a well known artist and free culture hacktivist, has responded to the same kind of allegations equating the theft of physical objects with digital copyright violations by composing a catchy song and producing an animated YouTube video (which went viral) titled “Copying Is Not Theft,” funded by a grant from the Andy Warhol Foundation for the Visual Arts. These are the lyrics:

Copying is not theft / Stealing a thing leaves one less left / Copying it makes one thing more; / that’s what copying’s for.

Copying is not theft. / If I copy yours you have it too / One for me and one for you / That’s what copies can do

If I steal your bicycle / you have to take the bus, / but if I just copy it / there’s one for each of us!

Making more of a thing, / that is what we call “copying” / Sharing ideas with everyone / That’s why / copying / is / FUN!

As Kal Raustiala and Christopher Sprigman identified, the video ridiculed the “piracy-is-stealing” cyberscare frame by pointing out a clear economic difference:

“In economic terms, intellectual property is non-rival, whereas tangible property is rival. As a result, the “piracy” of intellectual property is simply not the same sort of zero-sum game that car theft — or theft of any tangible property — is. And that means that when Hollywood or the U.S. government says that music or movie downloaders are “pirates” or “thieves,” they are indulging in a bit of loose rhetoric.”
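The rival/non-rival distinction can even be stated in code. A toy Python sketch of my own (not from the article or from Raustiala and Sprigman): “stealing” a rival good removes it from its owner, while copying a non-rival good leaves one for each of us.

```python
# Rival good: transferring it leaves the original owner with nothing.
def steal(owner: list, thief: list) -> None:
    thief.append(owner.pop())      # the bicycle changes hands; zero-sum

# Non-rival good: copying it duplicates the item instead of moving it.
def copy(owner: list, copier: list) -> None:
    copier.append(owner[-1])       # one for me and one for you

alice, bob = ["bicycle"], []
steal(alice, bob)
print(alice, bob)    # [] ['bicycle'] -- Alice has to take the bus

carol, dave = ["song.mp3"], []
copy(carol, dave)
print(carol, dave)   # ['song.mp3'] ['song.mp3'] -- not zero-sum
```

The asymmetry in the two functions is the whole economic argument: only the first interaction is the zero-sum game that the word “theft” implies.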

The hacker ethic opposes the neoliberal interpretation of property rights that are at the root of copyright regulations and therefore, democratic cybersecurity policies should at least acknowledge that the current regulatory environment for intellectual property is far from enjoying unanimous consent, that it is in a state of flux and that political movements have emerged all over the world to criticize and oppose the current copyright regime. Many of the hacking behaviors surrounding copyright have a strong civil disobedience component, and democracies have learned to treat illegal acts committed in the name of civil disobedience differently from other illegal behaviors. A democratic cybersecurity strategy would do the same.

3.- End-to-end cryptography is the real National Security issue

A democratic cybersecurity strategy would once again give priority to Information Assurance, the stated core mission of CYBERCOM and the NSA. The inviolability of communication transmissions (including the secure movement of financial capital) should again become a governmental priority. The paradox is that, under cyberscare narratives, the federal government has gone from defender of the freedom to communicate privately on the Internet to its worst offender. Defending Americans’ freedom to communicate privately against adversaries would be the primary objective of a democratic cybersecurity policy, even when the adversaries are American governmental agencies.

An indirect collaboration between military and national security authorities and cipher-hackers to reinforce cryptographic capacities can take place under the right architecture. Hackers see in tampering with encryption systems a healthy process to learn about them, improve them and make them more resilient. In the same way that certain companies now reward hackers when they demonstrate a vulnerability, cybersecurity policies should reward, not criminalize as they do today, cipher hacking that exposes weaknesses in current encryption.

4.- Open sourcing Cybersecurity

A key benefit of de-securitizing cyberspace by reversing the cyberscare and the representations it triggers is that bringing cyberspace back into the realm of political debate creates opportunities for the ever-growing community of citizen-hackers to learn about the key issues, appropriate them and their technologies, identify exploits and work as a community to make systems more resilient in the long run.

In the IT world, where hackers have been allowed to do exactly that, the idea that open software is less secure than proprietary software no longer holds. Open source software such as major Linux distributions, the Apache web server or the Firefox browser are in fact the security benchmarks of their respective markets. In cryptography, the most secure crypto-algorithms are those open to public scrutiny, so that everybody can try to crack them. The creative destruction of hackers makes software and encryption more secure in the long run, despite short-term breaches.
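This design philosophy is usually called Kerckhoffs’s principle: security should rest entirely in the secret key, never in the secrecy of the algorithm. A minimal Python sketch (my illustration, using the standard library’s openly specified HMAC-SHA-256 construction) shows the idea: the algorithm is public and peer-reviewed, yet without the key no one can forge a valid authentication tag.

```python
# Kerckhoffs's principle in miniature: HMAC-SHA-256 is a fully public,
# publicly scrutinized construction; the key is the only secret.
import hmac
import hashlib
import secrets

key = secrets.token_bytes(32)          # the only secret in the system
message = b"grid control command: open breaker 7"

# Anyone can read the algorithm's public specification,
# yet cannot forge a valid tag without the key.
tag = hmac.new(key, message, hashlib.sha256).hexdigest()

def verify(key: bytes, message: bytes, tag: str) -> bool:
    expected = hmac.new(key, message, hashlib.sha256).hexdigest()
    # constant-time comparison to resist timing attacks
    return hmac.compare_digest(expected, tag)

assert verify(key, message, tag)                  # authentic message accepted
assert not verify(key, b"tampered command", tag)  # forgery rejected
```

Because nothing here depends on the algorithm staying hidden, the whole construction can be published, attacked and hardened by the community, which is precisely the open-scrutiny dynamic described above.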

Robert Oppenheimer explained the importance of openness for security policies in 1955:

“The trouble with secrecy is that it denies to the government itself the wisdom and the resources of the whole community, of the whole country, and the only way you can do this is to let almost anyone say what he thinks — to try to give the best synopses, the best popularizations, the best mediations of technical things that you can, and to let men deny what they think is false — argue what they think is false, you have to have a free and uncorrupted communication.”

5.- To stop fearing technology, devalue technology

A democratic cybersecurity policy would place the accent on a systematic learning process to understand which technological networked systems have grown to a scale at which they become “too big to fail,” and then encourage hackers to identify ways to reverse this criticality. For example, if cybersecurity hard-liners argue that the electricity grid can be hacked to produce billions of dollars in damage and millions of casualties (again, a claim yet to be demonstrated with facts), the solution of a democratic cybersecurity policy would not be an anticipatory criminalization of hacking, but a cost-effective Critical Infrastructure Protection (CIP) improvement of the energy sector.

Furthermore, the encouragement of radical innovation should be a central tenet of democratic cybersecurity. For example, innovators are already exploring green technologies paired with the batteries of electric cars to either make the grid more resilient, or get rid of it completely. This is hacking behavior in its purest form, and in a non-securitized environment vis-a-vis cyberspace, it would be possible to explore many more catalytic solutions, not yet imagined, to salient cybersecurity challenges.

A risk does exist that criminal actors could exploit vulnerabilities identified through an open process before they can be patched (hardware is harder to patch than software), but the solution to this problem is once again a question of degree. Total secrecy and the criminalization of hacking have given us an aging and outdated electric grid that does not need hackers in order to fail, demonstrating one more time that the alternative to open-sourced cybersecurity is not better security, but a monopoly of knowledge that stifles innovation.

Therefore, a risk also exists in not allowing hackers to identify and demonstrate vulnerabilities early in the process, before systems become “too big to fail.” Learning from hackers in controlled environments and adapting as a result of their hacks creates an environment in which small failures may occur more often, but they will be less catastrophic.

A democratic cybersecurity policy could approach the problem by opening the grid to controlled hacking in order to build a better grid to begin with, allowing exploits to be identified before they become critical and new systemic architectures to be explored. Monopolies and big corporations are particularly vulnerable to innovator’s-dilemma conundrums because of their path dependencies and sunk costs; because of their lean profile, hackers are ideal actors to mitigate this condition.

6.- Reverse the science and technology gap

The ultimate responsibility for the health of a republic resides in the citizen. In our society, “exquisitely dependent on science and technology,” this means that more than ever before it is the citizen’s obligation, and the state’s responsibility, to ensure that science and technology are well understood, and that there are no perception gaps in society’s collective imaginary vis-a-vis the accelerating pace of technological innovation that can be exploited by the politics of fear.

Civic hacking at the White House.

As even the White House learns to engage the hacking community during events like the National Day of Civic Hacking (running its very first “Hackathon”), and a tamed version of the hacking ethic goes mainstream, a democratic cybersecurity policy could take advantage of this change to build an educational environment where hackers are nurtured, technology knowledge democratized and hands-on civic hacking encouraged.

Education is therefore not only a tool for justice, social mobility or innovation; it is also the most powerful cybersecurity tool in the arsenal of democratic societies.

Allow me a paraphrase to conclude. As cyber-geopolitics has demonstrated, Georges Clemenceau was right once again: La cyber-guerre ! C’est une chose trop grave pour la confier à des militaires. (Cyberwar! It is too serious a matter to be entrusted to the military.)

 ==Phrack Inc.==

Volume One, Issue 7, Phile 3 of 10

The following was written shortly after my arrest...

\/\The Conscience of a Hacker/\/


+++The Mentor+++

Written on January 8, 1986

Another one got caught today, it's all over the papers. "Teenager Arrested in Computer Crime Scandal", "Hacker Arrested after Bank Tampering"...
Damn kids. They're all alike.

But did you, in your three-piece psychology and 1950's technobrain, ever take a look behind the eyes of the hacker? Did you ever wonder what made him tick, what forces shaped him, what may have molded him?
I am a hacker, enter my world...
Mine is a world that begins with school... I'm smarter than most of the other kids, this crap they teach us bores me...
Damn underachiever. They're all alike.

I'm in junior high or high school. I've listened to teachers explain for the fifteenth time how to reduce a fraction. I understand it. "No, Ms. Smith, I didn't show my work. I did it in my head..."
Damn kid. Probably copied it. They're all alike.

I made a discovery today. I found a computer. Wait a second, this is cool. It does what I want it to. If it makes a mistake, it's because I screwed it up. Not because it doesn't like me...
Or feels threatened by me...
Or thinks I'm a smart ass...
Or doesn't like teaching and shouldn't be here...
Damn kid. All he does is play games. They're all alike.

And then it happened... a door opened to a world... rushing through the phone line like heroin through an addict's veins, an electronic pulse is sent out, a refuge from the day-to-day incompetencies is sought... a board is found.
"This is it... this is where I belong..."
I know everyone here... even if I've never met them, never talked to them, may never hear from them again... I know you all...
Damn kid. Tying up the phone line again. They're all alike...

You bet your ass we're all alike... we've been spoon-fed baby food at school when we hungered for steak... the bits of meat that you did let slip through were pre-chewed and tasteless. We've been dominated by sadists, or ignored by the apathetic. The few that had something to teach found us willing pupils, but those few are like drops of water in the desert.

This is our world now... the world of the electron and the switch, the beauty of the baud. We make use of a service already existing without paying for what could be dirt-cheap if it wasn't run by profiteering gluttons, and you call us criminals. We explore... and you call us criminals. We seek
after knowledge... and you call us criminals. We exist without skin color, without nationality, without religious bias... and you call us criminals. You build atomic bombs, you wage wars, you murder, cheat, and lie to us and try to make us believe it's for our own good, yet we're the criminals.

Yes, I am a criminal. My crime is that of curiosity. My crime is that of judging people by what they say and think, not what they look like.
My crime is that of outsmarting you, something that you will never forgive me for.

I am a hacker, and this is my manifesto. You may stop this individual,
but you can't stop us all... after all, we're all alike.

+++The Mentor+++


Homeland Security

A Platform by the Center for Homeland Defense and Security For Radical Homeland Security Experimentation. Editorial guidelines (Publication does not equal endorsement):

