Let’s have a go at solving the problem of invasive government surveillance

By Ted Cullen

I promise this will take a lot less time than reading the hyperlinks, the government’s latest 382-page report and watching the literal hours of lectures, talks and videos I’ve included. There are also GIFs, because that’s how you get people to read stuff these days, apparently.

#BigRead #LOL #ActuallyThisIsReallyImportant


As you will see, I have fulfilled the minimum requirements for youth & millennial engagement.

Since completing a degree in which I purport to be a ‘Master’ of The International Politics of the Internet, I still find myself endlessly confused by the fast-paced landscape of technology policy, and simultaneously in awe of, and infuriated by, those I read daily who report and write about it.

Having spent the past year attempting to attain gainful employment, I have stumbled upon no end of job suggestions, interview tips and procrastinating clickbait entitled ‘things all twenty-somethings are finally figuring out’. Apparently, I am not qualified for anything, yet if I am to believe such articles, “everyone is totally just winging it, all the time” anyway.

If this really is true, then my fear of inevitably failing, misunderstanding or completely missing the boat is also something Quinn Norton, Steven Levy, Susan Crawford, Manoush Zomorodi, Tim Wu, Madeline Carr, Andrea Calderaro, Helen Margetts and countless others I idolise, and whose content I vigorously consume like a hungry honey badger, have been through too. On the other hand, if it’s not true, and my incessant confusion about how to effectively craft policy around something which is infusing every iota of our existence on an unprecedented scale is simply a failure to grasp the actual big picture, then I implore those reading to stop doing so and immediately attempt to regain the minute or so’s worth of inane scrolling I have so cruelly taken from you. However, I’m 99% sure I’m not wrong about that, so here. goes. nothing.

In light of this, I appear to have finally given in to my inner anger and apathy and actually attempted to accomplish something. #WatchTheNewsroom

One thing that is certainly clear is that the intersections between politics, economics, society and technology have never been so grey and foggy. The West’s reliance on laissez-faire, back-seat regulation and the principle of a self-governing multi-stakeholder marketplace has actually done well in getting us this far, but at some point during the past five years, these grey intersections have become worn, overpopulated and full of ever-deepening pot-holes that only well-crafted and effective policy and governance can fill in and smooth over. Mary Cummings, an associate professor in the Department of Mechanical Engineering and Materials Science at Duke University, recently said of drones that “People want to blame the technology when it is policy that is the real culprit.” Professor Cummings is right, and this applies to more than just drones.


Edward Snowden. Surveillance. Anonymity. Privacy. Big Data. The Cloud. Corporate Monopolies. Corporate Responsibility. ISP Responsibility. DSL, Cable, Fibre, Mobile & Satellite Infrastructure. Net Neutrality. Streaming Media. Piracy. Copyright. Advertising. Social Media. Citizen Journalism. Right to be Forgotten. Education. Open Sourcing. Encryption. Blockchain. The Deep-Web. Hacks. Cyber-War. Critical Infrastructure. The Internet of Things. Transportation. Self-Driving Cars. Uber. Drones…
And the list will only keep getting longer and longer.

Every one of these issues, and these are only some, is an issue for which our current regulatory frameworks and policies are hugely insufficient as they evolve further. Because they sit on a globally decentralised network, each one interacts with the others in its own way, and each poses a further unique set of problems to our societies; which, if you hadn’t also noticed, are equally diverse and ever-changing, with differing priorities and needs, making the necessary cross-border and global consensus ever more difficult to reach. Now, as we enter a time in which some pretty important trans-national trade agreements are floating around looking to lock in legislation on everything from investment safeguards to state-owned enterprises and infrastructure, getting our policy right here, where it really matters, is critical to ensuring all of the services we have become accustomed to retain their ability to be used to their most effective ends, both now and in the future.


The UK is probably the most closely surveilled nation in the world.
If Pixar did invasive government surveillance…

It’s certainly a challenge. In the UK currently, there is a strong push for new surveillance laws, and rightly so. The UK is probably the most closely surveilled nation in the world. There are certainly others with more intrusive and overt systems of control for observing and curtailing their citizens’ activities, and others with significantly better infrastructure and penetration rates. However, following Snowden’s leaks in 2013, it has become increasingly clear that sometime over the past fifteen years, the UK has exploited its position, and its power. It has coupled its unique stature as both a world leader in technology and an already well-surveilled nation via CCTV, with a rapidly increasing internet penetration rate, to produce some of the most encompassing, widespread, invasive and legally moribund data collection practices, bar none. Even the NSA, the ever-present dartboard for anti-imperialist privacy advocates, reined in its domestic surveillance due to constitutional constraints, effectively outsourcing the job of surveilling whatever small amount of traffic failed to exit US borders to the UK’s GCHQ. The UK, however, found no legal constraints or any need for such outsourcing, and instead conducted its domestic surveillance with about as much legal oversight as its international surveillance efforts. (That being: little to none.)

Jacob Appelbaum is one of the people at the forefront of this debate. He currently works on the Tor Project and as a journalist for Der Spiegel, but also likes to travel the world apologising on behalf of the entire United States and holding talks on the subject of surveillance. Lately, when considering the UK’s surveillance practices, he expressed his delight at the rarity of the US “not being the biggest asshole on the planet”. In his talk at the 2014 Security Summit in South Africa he briefly explains the span of UK surveillance tactics, and the ‘Tempora’ programs, which essentially created a data buffer retaining every single byte of data, both content and metadata, flowing in and out of the UK for around an entire week, in order to analyse it more thoroughly at a later date, regardless of individual particularised suspicion. These passive surveillance practices were coupled with yet more intrusive, active surveillance techniques, both automated and human, such as the widespread clandestine exploitation of large-scale Internet Service Providers and prominent Internet corporations. In doing so, they gained access to private user databases in order to more thoroughly monitor select individuals (read: completely control their technology and any other technology they happen to connect with) through sophisticated programs such as “Quantum” and “BadBios”.

Added to this (and to this end) are the use of Deep Packet Inspection and Injection techniques, and even the literal hijacking of hardware in postal transit to manually install backdoors. It even conducted unwarranted surveillance on its own people; all of this went too far. Whilst these revelations may not be newsworthy anymore, it is worth revisiting them and reminding yourself why our surveillance laws are in desperate need of an overhaul in light of Home Secretary Theresa May’s impending “Snoopers’ Charter”, aimed at enhancing the legality and scope of these powers, rather than providing the “clean slate” many others have been hoping for.


Whilst I find it difficult to completely side with Appelbaum’s notion that, in solving these now seemingly inherent problems, “if you’re not a utopianist, you’re a schmuck”, it’s easy to understand why he and so many others in the community feel that way. Ideally, it seems, we should form some sort of “Digital Proletariat” in which every citizen would be responsible for their own independently audited and verified open source hardware, running only independently audited and verified open source software, ideally communicating through a universally end-to-end encrypted, anonymous and neutral global network.

It’s a sentiment echoed by many in the community, including Dan Geer, chief information security officer for In-Q-Tel (a not-for-profit venture capital firm that invests in technology to support the CIA), and Wired & Medium journalist Quinn Norton. At its core is a fundamental problem: our society lacks a critical mass of knowledgeable people in the area. Even more so, though, there is the sheer far-fetchedness of such an unobtainable utopian ideal, and the certainty that even were this scenario realistic, there would still be no winning against the seemingly omnipotent panopticon of the technologically advanced State, simply because of the nature of the network. “There is only the state of permanent revolution”, as Appelbaum puts it; or as Geer states: “when failure is inevitable… defence becomes irrelevant.”

The lengths, exposed by whistleblowers and journalists, that our security services have gone to in order to monitor us as citizens have certainly gone too far. They have singled out the individuals who exposed them, or were deemed liable to do so, along with countless others without their knowledge, for supposed “Cast Iron” surveillance: a total, 100% monitoring treatment in which every single piece of data attributed to them is retained indefinitely. They have devised ways to compromise and hijack our most advanced personal technology without our knowledge, from relatively simple postage intercepts to complex Bluetooth and microphone hacks designed to get over air-gapping. It has become evident that there is no uncrossable line, no bridge too far.

In which, ironically, perpetual anarchist Jack Black serves as my metaphorical 1984 surveillance state. #WeAreAllInGlassCasesOfEmotion

Yet there remains a deep-seated irony: we who inhabit the few nations capable of such secretive and invasive surveillance believe we are exempt from its practices; that, commonly, they do not concern themselves with us; and best of all, that we have nothing to hide and that only the unequivocal “other” is “targeted”, and targeted correctly and within the confines of the law, regardless of how they came to be targeted or tracked in the first place.

Winning this section of our population over to simply recognising the benefits of the aforementioned ‘Digital Proletariat’ ideal would be nigh on impossible, let alone educating them to graduate-level CompSci in order to accomplish it. It’s difficult enough getting journalists to use PGP and NGOs to stop mass-emailing vulnerable Word .docs and PDFs as attachments.

In this now self-perpetuating cycle, in which powers in the global north engage in an ever-increasing cyber-arms race, Dan Geer is right: defending against attacks is futile. Attempting to provide 100% security for citizens or users is no longer as rational as investing in a good cyber-offence or cutting the cord altogether (the latter unthinkable for a set of economies heavily reliant on Internet infrastructure). That leaves only one status quo, in which States rationally engage in a cyber-arms race with their equals and, globally, society actively engages in a technological power divide, something Dan Geer also picked out at the 2014 Black Hat conference when he stated that:

“the masses who quickly depend on every new thing are effectively risk seeking, and even if they do not themselves know it, the States which own them know, which explains why every State now does to its own citizens what once States only did to officials in competing regimes.”

Empathising with the shadowy other…


Something we mustn’t lose sight of in any debate, however, but particularly one in which secrecy is of the utmost importance to one side, is that humans operate on both sides of this divide. The State is not imposing these regimes purely out of its own will. There are people at the helm, and whether Appelbaum and Geer and Poitras and Snowden and Assange like it or not, those previously shadowy figures will have to be involved in any conversation and crafting of policy and further legislation on the matter, not least because they are the only ones who actually know what we’re up against when it comes to other States’ capabilities versus our own. Maintaining the ability to empathise with those we in the dragnet of surveillance view as the “other” will be imperative to avoiding either a stalemate in discussion or a continuation of the status quo once the media becomes bored of the story and pressure on our politicians is eased.

As investment in our networks’ security became necessary, governmental and contractor salaries at entities such as BAE and GCHQ have eclipsed those at not-for-profits and traditional academic posts such as the Tor Project and the IETF. The sensible place for the brightest minds over the past ten years or so has therefore been behind a government desk, completing everything the hierarchy asked of them, regardless of morals, under the guise of security and secrecy; and even if the brightest minds were to question the direction surveillance was headed in, the veiled threat of becoming ‘Cast Iron’ themselves and being exiled to Russia would hardly have encouraged dissent. Well, except for one, obviously.

No nation’s security services, nor those in charge of them, aspire to watch the next 9/11 happen on their watch. For them, I can assume it would be easy to develop a rationale which reasoned that such an event was to be avoided by any and all means available. And whilst we may think we feel close to the very atrocities we entrust our security services with protecting us from, via 24-hour news, live streams and constant reporting viewed from this side of the fence, the nature of their job means that we do not see everything: the close calls, the genuine threats that never come to pass, or even the deaths of colleagues in providing this security, anywhere near as close up and personal as these people do. Appelbaum likens it to a Lord of the Rings metaphor:

“They want to wield the ring because they believe they’ll be benevolent but I want to melt it down because I don’t believe there is such a thing as human benevolence when you have such a ring.”
Gandalf agrees, and so do the rest of us. (If you’re looking for a twist wherein our pop culture and literature don’t line up with our own morals, though, it’s useful to remember Professor Xavier and his “Cerebro” machine, designed to locate any mutant, anywhere, at any given moment. That power also didn’t go down well when abused.)

The argument that all of this is necessary if it helps stop even one terror plot does have merit, though, whatever you believe. In essence, what has occurred isn’t the endgame of some subset of powerful State-wielding puppetry, but a natural progression of rational human reasoning in the face of unseen threats, coupled with an unregulated and unsecurable technological advance in developed societies. When Vint Cerf, heralded “father of the internet” and one of the inventors of TCP/IP (the protocol which basically makes the internet work), was asked whether there was anything he wished they’d changed, he stated two things: firstly, that they’d have used IPv6 from the get-go, and secondly, that they’d have implemented encryption across the protocols. Had they done so, the landscape we find ourselves in today might be very different. (In this Hangout, he also interestingly states that the levels of encryption they’d have liked to implement remained classified at the time, and that overcoming problems outside of the network is paramount to overcoming them within it, calling the Internet “a mirror” for inherent human problems.)

Understanding the viewpoint of those behind the surveillance will be critical to overcoming the issue and to persuading them that they have overstepped, acted outside of the public interest, and achieved what could most probably have been achieved by means which didn’t violate the privacy of millions, both inside and outside their sovereign borders.


“right now, there is a huge cold war happening on the internet”

In light of what we now know, however, it appears that the NSA and GCHQ have vindicated Gandalf’s position on the fallacy of benevolence and found themselves slapped with The White Hand of Saruman. There is, definitively, corruptible power in highly advanced, unregulated and secretive warrantless dragnet surveillance. And it is highly advanced. It’s a hell of a technological feat, one that only a few nations on earth can ever really aspire to, and with the US and China exchanging cyber-attacks like Pokemon cards whilst involving themselves in a race to wire the Global South through both wired and wireless infrastructure, it would not be too outlandish to echo the thoughts of Costin Raiu, director of global research and analysis at Kaspersky Lab, when he says that “right now, there is a huge cold war happening on the internet.” Whilst this race is ostensibly under the guise of promoting and extending internet access to the poorest countries, the notion that those who install the architecture will inevitably have an edge, if not a pre-installed back-door into that network, will not have been an added bonus floating at the back of the mind, but rather an important swing vote in the decision-making process.


Having noted all of this, it’s still difficult to truly comprehend the power such a network would grant a nation over others, but some insight into just how powerful such an invasive surveillance machine can be comes from Quinn Norton’s latest post, in which she states:

“I could build a dossier on you. You would have a unique identifier, linked to demographically interesting facts about you that I could pull up individually or en masse. Even when you changed your ID or your name, I would still have you, based on traces and behaviors that remained the same — the same computer, the same face, the same writing style, something would give it away and I could relink you. Anonymous data is shockingly easy to de-anonymize. I would still be building a map of you. Correlating with other databases, credit card information… public records, voter information, a thousand little databases you never knew you were in, I could create a picture of your life so complete I would know you better than your family does, or perhaps even than you know yourself. I could accurately diagnose you with mental illnesses, for instance — behaviors that correlate to bipolar, depression, addiction, and so on. I could understand you like no lover ever did, and you would never know I was there. While I could pull you individually out of that database, the real magic is that I would never have to. I could let algorithms understand you, process you, follow you, and never have to know any of you myself. You would be tracked and described by a thousand little bots you could never see.” [emphasis added]

What I’ve neglected to mention, however, is that Norton here is describing her capabilities at a medium-sized advertising agency, using freely available tracking techniques employed by most high-traffic websites, around ten years ago.

“What I’d do next is: create a world for you to inhabit that doesn’t reflect your taste, but over time, creates it. I could slowly massage the ad messages you see, and in many cases, even the content, and predictably and reliably remake your worldview. I could nudge you, by the thousands or the millions, into being just a little bit different, again and again and again. I could automate testing systems of taste making against each other, A/B test tastemaking over time, and iterate, building an ever-more perfect machine of opinion shaping.”

Now imagine a) what these websites can track now, and b) what a State with virtually unlimited resources could achieve today given blanket powers.
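To make Norton’s point about re-identification concrete, here is a minimal sketch in Python. The data, names and field choices are entirely hypothetical; the point is only that once an “anonymous” behavioural log is correlated with any second dataset sharing a few quasi-identifiers, the anonymity evaporates:

```python
# A toy illustration of de-anonymisation by linkage. All data is invented.

# "Anonymised" behavioural log: no names, just an opaque visitor ID and traces.
browsing_log = [
    {"visitor_id": "a91f", "postcode": "CF10", "birth_year": 1991,
     "queries": ["bipolar symptoms", "payday loans"]},
    {"visitor_id": "77c2", "postcode": "SW1A", "birth_year": 1965,
     "queries": ["garden sheds"]},
]

# A second, separately obtained dataset: an electoral roll, a marketing list...
public_records = [
    {"name": "T. Cullen",  "postcode": "CF10", "birth_year": 1991},
    {"name": "A. Example", "postcode": "SW1A", "birth_year": 1965},
]

def link(anonymous_rows, known_rows, keys=("postcode", "birth_year")):
    """Join the two datasets on shared quasi-identifiers (the 'correlating' step)."""
    matches = []
    for anon in anonymous_rows:
        for known in known_rows:
            if all(anon[k] == known[k] for k in keys):
                matches.append({**known, **anon})
    return matches

# Every "anonymous" visitor now has a name attached to their search history.
for person in link(browsing_log, public_records):
    print(person["name"], "->", person["queries"])
```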

The Intercept has only recently exposed some of these abilities:

“it’s a lot more powerful than previously thought… it’s fed a constant flow of data from all over the world straight from fiber optic cables, can store content from three to five days and metadata for even longer (up to around 45 days)… the tool helped the agency look up other private info beyond emails and chats, including “pictures, documents, voice calls, webcam photos, web searches, advertising analytics traffic, social media traffic, botnet traffic, logged keystrokes, computer network exploitation (CNE) targeting, username and password pairs, file uploads to online services, Skype sessions and more.” The NSA even tracked phone connections to Google Play and Samsung’s App Store.”

And the best thing? Anyone who can use Google can use it!

“it’s also incredibly easy to use. Toucan Systems CEO and security researcher Jonathan Brossard told The Intercept that hacking remote computers using the tool takes just a few “minutes, if not seconds.” Plus, doing so is as simple as typing words into Google search — it’s so easy that the agency can train personnel on how to use the tool effectively within a single day.”

In effect, the problem of surveillance we are seeing today has only become so nuanced because of the sheer technological capacity, overprotective moral reasoning and disregard of select nations’ security services, in which the UK has been heavily complicit, and because of a colossal misinterpretation and personal oversight of our own position on privacy today. What is clear, though, is that if we continue down this path without effecting real change, whatever hope nations without such technological capacity may previously have had for their citizens’ privacy is negligible.


Going forward then, the possibility of continuing with any sort of dragnet surveillance seems implausible, and literally incompatible with the notion of particularised suspicion. Jacob Appelbaum has tackled this from a humanistic and purely technological perspective: the need to regain our privacy and integrity on the network through the safety of secure communications. This is no doubt important, but it does rather sideline State actors, both in the debate and in providing the nation’s security; and the State is important in both, along with the delicate state of anarchy in which it, and the rest of the international community, operates. Secondly, it depends on a public extremely well educated in best internet practices, online communication and cryptography. And thirdly, it relies on that public to govern itself, set its own boundaries for privacy and not use that knowledge to criminal ends.

If everyone understood exactly how every aspect of the web, surveillance, hacks, databases etc. worked, do you think more people would quit the net, or more people would commit crimes on it?

There is middle ground, however, most notably heralded by the UK’s independent reviewer of terrorism legislation, David Anderson QC, in his recent and extremely thorough Report of the Investigatory Powers Review: “A Question of Trust”. In an interview on BBC Radio 4’s “Law in Action” he stated that on the issue of surveillance “there is substantial common ground that most people of good sense could probably congregate on.”

And he’s right. Firstly, that the majority of this debate is “A Question of Trust”, and secondly, that there is common ground, but not without a clean slate. He calls our current system “fragmented, obscure, under constant challenge… undemocratic, unnecessary [and] variable in the protections that it affords the innocent”, and calls for “comprehensive and comprehensible new law… drafted from scratch, replacing the multitude of current powers and providing for clear limits and safeguards on any intrusive power that it may be necessary for public authorities to use.”

At first glance, it does seem to echo what most privacy groups have been calling for since the first Snowden leaks, but it also emphasises a strong need to regain trust in the establishment, with whatever these new laws turn out to be requiring popular consent. That is arguably even more necessary now, especially considering that the latest UK election produced the least representative parliament we have ever had. Anderson’s report is fantastic, and unlike every other reporter and journalist out there, I will not purport to have read the entire 382 pages, but what I have read is extremely level-headed and sensible, something I’m sure will do no harm to his argument for level-headed and sensible judicial oversight. However, I am certain that whatever debate we are to have in light of this, and of many other points of view, in the run-up to whatever piece of legislation does hit the floor of the Commons, we cannot discount our continuing need to educate, if not to Appelbaum’s “Digital Proletariat” levels, then at least to a level where there is consensus on what we view as privacy, private space and ‘secure’ communication.

For too long we have assumed that the ‘generational gap’ in technology would be bridged by the time millennials reached their 30s, but the longer I sit on the sidelines of this debate, the more I grow to understand that the people who really understand it are either actively involved in the struggles, or are those who grew up beside the internet, rather than those who grew up not knowing a world without it. “Terms and conditions” are, for one generation, a binding agreement and, for another, just another hurdle in their race to acquire yet another free service. (Don’t worry, I’m guilty too.)


Privacy online is a gamble…


In getting over this then, we have to finally accept that our entitlement to the privacy we think we should be granted may not be the privacy we are getting, nor the privacy we should aspire to have.

“We care about privacy as it relates to the people from whom we want to keep certain pieces of information secret.” Benjamin Wittes & Jodie Liu

Benjamin Wittes, Senior Fellow in Governance Studies at the Brookings Institution and Harvard Law student Jodie Liu have previously posited “The privacy paradox” whereby “we worry obsessively about the possibility that users’ internet searches can be tracked, without considering the privacy benefits that accrue to users because of the underlying ability… to acquire sensitive material without facing another human, without asking permission, and without being judged by the people around us.”

Others have discussed the reality that all the internet has done is serve to create the illusion of a privacy that we’ve never truly had, positing that before we built towns which could house a large populace and had the ability to communicate globally, we lived in such small, close-knit communities that finding privacy within them was all but pointless. Whilst this should serve as more of a conversation starter than a point to finish on, it does explicitly show the divide between those watching us with our knowledge and those doing so without it. Whilst, as Wittes and Liu write, privacy is “not a zero sum game”, it’s a useful binary to be aware of in a society where we freely give over personal information to online services, and ask our most embarrassing questions of a data-retaining algorithm, yet shiver at the prospect of doing so to an unknown next-door neighbour or even a close personal friend.

When your mate asks to use your laptop and you can’t remember the last time you cleared your browser history.
“By entrusting our data with big and small businesses and with the government, we’re putting some of our most treasured secrets in the hands of someone else, where they’re waiting to get hacked or otherwise used against us.” Jason Koebler, Vice

Jason Koebler of Vice asks why we should have to gamble with our personal data, though. If, as we have established, the failure of systems is inevitable in the long run, the trade-off for any online service which seeks to utilise personal data is this very gamble: that our personal secrets are at the mercy of every hack, leak, bug, thorough investigation and rogue or warranted surveillance effort. I’m not excusing poor network security, but blame cannot always be entirely placed at the feet of software engineers for unseen consequences, poor user security or complex social engineering.

The privacy we seek is, for now, an eternal gamble on the internet, and making sure everybody in the debate is aware of this before we attempt to safeguard what we cannot is imperative. After all, Koebler states that the chances of a spurned wife discovering her husband’s infidelity by accessing, searching and analysing a hacked dating site’s database, knowing the correct username or even suspecting this exact situation in the first place, are a “longshot compared to the chances of some parent from your daughter’s softball game seeing you cruising for discreet hookups at a bar.” And even then, divorce attorneys have recently reported over an 80 percent increase in evidence gleaned from smartphone and social network data in their cases. It won’t be long before Koebler’s unlikely scenario becomes ever more likely.

After all, hell hath no fury and that.

The leverage we have to put into legislation, then, is that which lessens the ability of those who may violate our privacy without our knowledge, or without our suspicion of their doing so. This means targeting those who store our personal information, and those with the capacity to access it, when it comes to writing up policy and legislation: everything from providing user-specified data retention to minimum security requirements for holding personal data, and crucially, an active interest in educating their users in best practices for retaining privacy on the internet.

“It’s not OK to not know about the internet anymore. You’ll hurt people.” Quinn Norton

Best practices can include making users easily aware of their privacy settings, something which many large social media companies are currently doing, but also making them aware of data requests made by law enforcement, both pertaining to them personally and on a network-wide scale; informing users of breaches in their security; stating whether they comply with the EFF’s industry-accepted best practices; and publishing their policies on data retention and encryption. And on the user’s side of the responsibility: truly understanding what all of these mean when uploading any piece of personal information to an online application, and making well-informed decisions based upon that. To this end, we cannot, with a clear conscience, allow ourselves to post our faces and ideals online publicly and expect them not to be used by law enforcement if we are suspected of planning to commit, or of having committed, a crime. And to that end, we must encourage transparency and cultivate trust in the law to use the data we make publicly available to them in a sensible manner. After all, the difference between mass dragnet passive surveillance and particularised active surveillance is small. Taking a photo or an artist’s drawing of a suspect and running it against some of the most advanced facial recognition software in the world, to find a match in any other publicly available photos posted to social media profiles or through the extensive CCTV network, is different from raising a suspicion and immediately knowing where that person already is. There are probably only minutes between them, yet one is significantly better, and will sit much better with the public, than the other.
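To give a sense of how small the technical gap between those two scenarios is, here is a hypothetical sketch using the open-source face_recognition Python library. The file names and the tolerance value are illustrative assumptions, not anything any police force has published; matching one suspect photo against a folder of publicly posted pictures is a few lines of code:

```python
# Hypothetical sketch: match one suspect photo against publicly posted photos.
import face_recognition

suspect_image = face_recognition.load_image_file("suspect.jpg")
suspect_encoding = face_recognition.face_encodings(suspect_image)[0]

# A gallery of publicly available photos (e.g. scraped profile pictures).
gallery_paths = ["profile_001.jpg", "profile_002.jpg", "profile_003.jpg"]

for path in gallery_paths:
    image = face_recognition.load_image_file(path)
    for encoding in face_recognition.face_encodings(image):
        # compare_faces returns True when two faces fall within the distance tolerance.
        if face_recognition.compare_faces([suspect_encoding], encoding, tolerance=0.6)[0]:
            print(f"Possible match for the suspect in {path}")
```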


Transparency, Trust and effective legislation…


Once we fully understand where we stand on our notion of privacy, we can finally begin to build policy which all sides can agree upon. Crucially, as much of the State’s power and its capacity for effective law enforcement rely on a modicum of secrecy, there must be trust between the electorate, the executive, the judiciary and those who seek to enforce the laws they enact. Trust in this relationship is key; as Dennis Frank Thompson has so succinctly put it, “Citizens have a right to insist, as the price of trust in a democracy, that officials not give reason to doubt their trustworthiness.”

And boy have they given reason to doubt.

To acquire this trust once again, it is imperative that the executive provide accountability and a measure of transparency to their surveillance operations. An idea pioneered by David Weinberger, Co-Director of the Harvard Library Innovation Lab at Harvard Law School, and echoed by others in the journalism community, from NYU professor of journalism and media critic Jay Rosen to Jeff Jarvis, Director of Interactive Journalism at the City University of New York Graduate School of Journalism, and even the now infamous Intercept & Guardian journalist Glenn Greenwald, is that, in the fourth estate now, “transparency is the new objectivity.” I would argue that, in politics, transparency may also be both the new public accountability and the new basis for legitimacy. Both sets of logic reside in their ability to communicate trust to those whom they are affecting, rather than relying upon a moral notion of assumed trust between parties simply due to hierarchy and voting power. Since we have come to find ourselves untrusting of the State here, that assumed trust is no more, yet trust between parties across the divide is completely necessary for the continuation of a status quo whereby the State continues to provide for its citizens’ safety and the public continues its use of the internet in roughly the same manner it does today.


That the State continues to provide for its citizens’ safety is, in essence, the end goal of any legislation. The caveat is that it should not come at any cost. Somewhere in there lies the question of how effective the law needs to be in order to provide this safety. Lawrence Lessig, whom many know for his work on copyright reform, money in politics and net neutrality, has discussed the arbitrary notion that the law is a set of binary rulings, instead stating that “The law need not be completely effective in order to be adequately effective”, something which David Anderson echoed in his interview with BBC’s Law in Action when he stated that “there is only so much we can do with the law, because we do have a lot of very strong counterterrorism laws” and that “in a free country, with relatively open borders, you are not going to stop everybody.”

If we know both of these things to be true, along with the fact that the eventual failure of any network’s security is inevitable, even the notion of a legal dragnet surveillance network capable of providing 100% security seems to be a fallacy. When you add in the fact that the perpetrators of many of our latest terror atrocities, including Lee Rigby’s murderers and the Charlie Hebdo shooters, were already on the radar of the British and French intelligence services, who decided to stop monitoring them because of a lack of resources, selling the public the idea that with even greater powers capable of infringing on public privacy they could prevent the next attack would seem an increasingly difficult spin to achieve.

Anderson himself still recommends bulk collection, especially outside of a domestic context (technical difficulties aside), but inserts a further layer of judicial review to provide the necessary transparency.

This is the United States, but the ‘judicial review’ explanation holds for most democratic nations. It’s just how they got there, and the separation of powers, which tend to differ.

Firstly, authorisation of a warrant for surveillance would be taken away from the small pool of ministers who can currently sign off on them, and instead be handed to a set of independent judicial commissioners. As the great majority of warrants are issued for police use in serious crimes (drugs, firearms and trafficking) rather than in espionage or counter-terrorism, it is logical to give the decision to a judge, who, by professional default, has a much more thorough understanding of policing, crime and how to put a case together properly than a career politician. This takes suspicion away from decisions made by often inexperienced ministers and provides the ‘ability to respond quickly and effectively to threats of national security or serious crime’, by doing away with the need to clutter up a Home Secretary’s schedule with warrant signings.

Secondly, on the issue of national security, Anderson does accept that the judiciary plays little or no part in shaping foreign policy, or in what the executive would label as important to national security. Here, Anderson admits that where a warrant is sought on this basis, the decision must lie with the executive, and that we must trust the judicial commissioners to understand this line, and the limits of their knowledge and responsibility.

Asking some of the most intelligent people in the country, whose job it is to literally be fair, to get in on this debate is some of the soundest advice I’ve ever heard.

Thirdly though, and key to this, Anderson seeks the oversight of a judicial commission, as most people will never know that a warrant has been obtained: British courts do not allow intercepts to be admitted into evidence, so there is no judicial redress after the event, which is exactly why it is needed beforehand. But this lack of legal recourse after the fact is a huge Achilles heel in the drive for transparency and trust. It still leaves a system which can be abused, whereby judges can be strong-armed, convinced or simply over-ruled by the executive into issuing vaguely written warrants, legalising the exact sort of dragnet surveillance the public is trying to avoid, and covering its tracks through inadmissibility in the courts. All Anderson is really suggesting is adding another layer of red tape, in which already rich judges get more well-paid jobs at the expense of the British taxpayer to continue a status quo that nobody wanted in the first place.

On the question of whether handing power to an unelected official makes the system less democratic, Anderson states that where interception has been conducted, politicians can never be held to account anyway, because it is illegal to disclose that interception has taken place, and the government never confirms or denies intelligence matters relating to it.

For this reason, you have to return to the notion of providing Lessig’s “adequately effective” legislation. Law enforcement and those who seek to break the law are, and always will be, in an endless game of cat and mouse. The whole point of dragnet surveillance is to limit the possibility of crimes being perpetrated, but whilst crime numbers don’t seem to have taken a drastic hit compared to previous years, public trust in government, and the public’s own perception of privacy, certainly have.

When Daniel Rigmaiden was caught filing fraudulent tax returns in California, he eventually discerned that he had been caught through the use of a ‘Stingray’ device, something the police use to fake a cell tower signal, connect to your phone and access your data. However, as Anderson has stated, the way this evidence was acquired was never submitted into evidence, and it seems the FBI and US government refused to allow any mention of the device in court, effectively forcing states to drop cases where they could not put a prosecution together without the evidence gleaned from the device. It has since come out that these devices are also in widespread use within the UK.

When you have law enforcement utilising devices which passively access data on every single mobile phone within their radio radius in order to surveil just one device, and when the fruits of that surveillance prove to be the only useful evidence in prosecuting a now-known criminal, barring the police from bringing that criminal to justice in order to keep such a device secret certainly defeats the object.

Left: police cell-tower device used to capture Daniel Rigmaiden. Right: deadly cartilaginous fish which caused the death of Steve Irwin.

Our nation’s security services have backdoored social networks through PRISM in order to acquire data we believed to be private, without allowing the company in question to perform as the necessary intermediary for governmental requests, and have even forced companies to divulge personal data without letting their users know. It is supposed that they have completely compromised ISPs and the Tor network to pinpoint individuals complicit in deep-web crimes, and lately UK police have scanned the faces of thousands of festival-goers. Having this evidence be inadmissible in court, on the basis that the State distrusts both the law-abiding public and those with criminal intent with knowing its abilities to safeguard its citizens, definitely straddles the borders of democracy, and certainly gives you a sense of the regard those in power have for those they now subjugate.

I would argue that having this information come to light in a court would probably not have the long-lasting implications law enforcement or the security services are afraid of. Following the facial scanning at Download Festival, police stated that faces were only being matched against those of known criminals and that any further data acquired was destroyed afterwards. Had the Stingray been made public, and had law enforcement openly opted for the same data retention strategy, I imagine public distrust relating to the device’s use would not be as high as it is now. Would this lead to criminals opting for other methods of communication or using evasive techniques to hide dominant facial features? Perhaps, but did (intelligent) criminals use Facebook Messenger or WhatsApp to organise their heists before Snowden? At some point, asking a criminal to go the extra mile and do the legwork of evading an extra set of 10–20 surveillance efforts before the fact will deter them, and will cost society and law enforcement less than performing lengthy and costly surveillance on a suspect and arresting them after catching them in the act, having not put in the legwork to avoid surveillance they didn’t know about. This is cat and mouse. This is law that is adequately effective.

What’s an article on the internet without a cat.gif, eh?

What our security services have been striving for is a definitive end to that cat and mouse cycle, in which the cat lives in a mouse-free home, but the secret is that there is no such thing. There is no omnipotent panopticon capable of inhibiting criminal activity completely: not in any democratic society, not in any developing nation, not in China, not in North Korea, not in 1984 and not on the small island nation of Niue. There is only a state of existing in which the State’s capability to protect its citizens far outweighs a criminal’s ability to violate them. Balancing those capabilities is what makes a nation both free and safe, not low crime statistics, and not an all-powerful police force.


Obviously, there should be no safe haven for criminals. As Anderson states, “no channels should be immune from interception”, but the circumstances under which they can be intercepted should be open and properly regulated, and upon fair request, services should be obliged to hand over incriminating details.

However, this isn’t me saying I’m for any sort of governmental criminalisation of encryption or stifling of innovation when it comes to providing private services. Encryption and cryptography are integral to building trust on the network, and are incredibly important in avoiding a large majority of the potential and inevitable breaches, along with, you know, keeping our banks running.

In a cat and mouse system, the onus is on the cat to catch the mouse. However, if we only encourage a ‘Digital Proletariat’ uptake of heavily encrypted, anonymising practices such as Tor browsing, secure VPNs with no IP logging, hardware proxies, blockchain payments for online goods and encrypted PGP communications, without allowing for those means to be legally intercepted, we are not contributing to the solution of the problem of the surveillance State. Whilst for now many of these seem uncrackable, at least until quantum computing arrives, in the long run we will only be exacerbating the problem once these avenues have been thoroughly cracked and are no longer safe. And when, or if, that time does come, we will be all the better placed to deal with it if we get our policy correct now.
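For readers who have never seen what ‘end-to-end’ actually amounts to, here is a minimal sketch using Python’s cryptography package (my choice of library for illustration; the key size and message are arbitrary assumptions). Real PGP adds signing, key management and much more, but the core promise is the same: anyone sitting on the wire, a Tempora-style buffer included, sees only ciphertext, and only the holder of the private key can recover the message:

```python
# Minimal end-to-end encryption sketch with RSA-OAEP (illustrative only).
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# The recipient generates a key pair and publishes only the public half.
recipient_private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
recipient_public_key = recipient_private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# The sender encrypts to the recipient's public key; intermediaries see only this.
ciphertext = recipient_public_key.encrypt(b"meet at the usual place", oaep)

# Only the holder of the matching private key can recover the plaintext.
plaintext = recipient_private_key.decrypt(ciphertext, oaep)
assert plaintext == b"meet at the usual place"
```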

“if there is an app which Paedophiles and Terrorists know they can use without slightest possibility of detection, you are subverting the rule of law, and you are letting the bad guys win… it doesn’t mean it should be easy for State to read information they really want, but potential has to be there.” David Anderson, QC
On the point of suspicion arising from online surveillance: there are few crimes which remain entirely hidden online. Hate speech is public by nature. Cyber-bullying obviously has at least one witness, and the sale of drugs has a physical, illegal market on both ends. Other deep-web criminal services such as ransomware and hacking are sophisticated enough crimes that I imagine their perpetrators understand the risks of sophisticated online surveillance to a degree which would render it largely useless against them.

To know you may be surveilled, and that there is no channel upon which you will be immune from interception, provides you with avenues for and the freedom to pursue legal recourse; to know you are not being surveilled as the norm provides you with integrity and privacy. Implementing a transparent judicial review for legal interception based on warranted, particularised suspicion; liaising with the companies which provide private services to the suspect; and making the suspect, and the public, aware of the full extent of the surveillance conducted against them after its conclusion, regardless of whether a guilty act was perpetrated or not, provides the electorate with trust, the State with accountability and law enforcement with ample capability to do their job.


Outside actors: Balancing policy with network management…


Domestically, it is feasible that the outlines above could help shape policy, but Anderson’s point about judicial commissioners’ knowledge boundaries and what constitutes national security says something about the domestic ability to safeguard against outside actors. Appelbaum provides a nice soundbite to aid in understanding this difference, stating that:

“If we have any regime of spying, where some people spy, some of the time legally, we have a problem. Which is that, someone else, who is not in your legal regime, is spying ALL OF THE TIME.”

His point here, whilst poignant, really only echoes the long-running difference between domestic police and a foreign spy. It is an inescapable inevitability in the anarchic system, now only amplified by the size and nature of such a ubiquitous and global network. We are all at the mercy of dragnet surveillance regimes we have no control over and no ability to protest against. Legally, we can only hope that the law enforcement tasked with protecting us works, and that they will catch those who spy all of the time through the avenues discussed above, and not through their own blanket, dragnet surveillance.

In attempting to solve these murky problems, though, it may not be best to look to politics but instead to possible technological solutions. After all, as Appelbaum also states, really “the problem is not the NSA, the problem is that those capabilities exist at all”, and involving engineers and the people who write protocols in solving this problem, whilst crucially retaining all the things we love about the internet, could prove useful.

Sidenote: Let it be said, mind, that my supposed expertise on these issues ends when it comes to conversations about involving engineers and devising better protocols. I have a basic understanding, and I know that this is just some run-of-the-mill HTML, but I’m always ready to learn more, and well… it looks pretty Hollywood, right? #WhiteText #BlackBackground #DivTags #LeetHacker

In 2000, Lessig wrote on the inbuilt regulations of cyberspace, and the ability of code to effectively change how we view these aspects of human privacy, trust and anonymity in the internet age, stating that:

“as this code changes, the character of cyberspace will change as well. Cyberspace will change from a place that protects anonymity, free speech, and individual control, to a place that makes anonymity harder, speech less free, and individual control the province of individual experts only.”

We live in a world where our citizens no longer live entirely within our territorially defined borders. If they use the internet today, they live outside of them every day, whether they choose to or not. Jeremy Burton, president of product and marketing at EMC, recently and correctly pointed out that “these days, it’s pretty standard to have corporate data triple replicated and geographically dispersed.” This is simply because it’s the only way to operate a reliable online business that works wherever you are in the world. The internet, whilst invented with the express desire of creating a decentralised and therefore veritably indestructible military communications network in the face of all-out nuclear war, is actually a lot more fragile than many would give it credit for. I even wrote about it for my Master’s dissertation. Geographically dispersing data around the world not only safeguards it from destruction, but also serves to provide faster and more reliable access to whomever wishes to access it, regardless of their geographical position.

For these reasons, we cannot request that every piece of data pertaining to a citizen remains within the confines of their national territory. Even implementing such a regime for metadata alone would cripple most services which utilise the ad-supported free business model we have all taken to.

If we’re going to start paying for services with our eyeballs, our clicks and our upvotes, we can be damn sure that whoever is paying for them won’t want their new product market confined geographically.

Therefore, defending civilians’ privacy against threats from outside actors to the best of their abilities becomes increasingly difficult for States, especially for those which do not possess any sort of sovereign hold on large-scale social networks or providers of online services. This, however, echoes Vint Cerf’s “internet as mirror” narrative, whereby the Internet merely reflects or amplifies already existing problems in our society.

We live in a fairly hegemonic world, where certain States exert significantly more power than others in all aspects of global governance. Attempting to fix this decades-long status quo with technological policy, or even just technology, would be like plugging one hole in a colander. The best we can hope for is to leverage the international community, and a hegemon’s need to maintain a level of trust from others, into a fairer, yet still unequal, system of foreign surveillance capabilities. Essentially, we grant those who previously had no say in whether their citizens’ privacy was violated a seat at the negotiating table.

This model also leaves most States the power to retain some autonomy and a measure of control over their citizens’ safety from “ring-bearing”, Cerebro-wielding superpowers. Of course, no “binding” international treaty is wholly enforceable, but for one, democratic Western States have a much better track record of keeping to them, and for two, this would definitively draw a line under the legality and illegality of State-sponsored surveillance actions for legal recourse within international institutions, and also define anything outside of this system as espionage. Anything beyond that is, as always, the prerogative of the State.

The only way to inhibit a strong State’s ability to request data from a national firm on citizens outside of its borders is to make sure every request carries a burden of proof, and to have these services sign binding trade agreements with the States in which they operate, prohibiting them from giving up data on a citizen without certain undertakings. This shifts law-enforcement surveillance into the cross-border policing model they have already been using for years when it comes to drugs, trafficking and international crime, co-ordinating with the country in question. Only if that brings no recourse, and it can be proven to a judge that the avenue has been exhausted but the request is still warranted, may data be requested by the home nation and legally given up by the service.

This addition of State treaties with large companies has worried many, and rightly so. Augmenting the power of a private entity whose sole end goal is its profit margins, rather than the well-being of its users, is a worrisome thing. It’s at the core of the TTIP protest movement, but here, I wager that it is not a bad thing. Firstly, for free web services, the well-being of users can be just as important as turning a profit: it’s well known that companies in the Valley can go years without making it into the black, relying on a constant stream of funding based on valuation and size. And secondly, it’s actually something that other privacy advocates are calling for.

Julia Horwitz works in consumer protection for the Electronic Privacy Information Center. Not only does she echo the sentiments of Mary Cummings when it comes to bringing about change with policy rather than technology, she advocates for the exact legal frameworks I have explored above, saying that “it shouldn’t be up to the consumer to try to protect his or her own privacy.”

“There should be a robust enough legal framework in place that would be incumbent on the company to comply with the law, rather than on the consumer to shop around for the most privacy-protecting service, when by the nature of the service, the consumer’s not going to have all of the relevant information.”

This won’t affect many countries. In fact, it’s liable to really only affect the U.S. and its intelligence-sharing allies. Best of all, it likely won’t stifle innovation, as the last thing Silicon Valley and the US tech industry want is more regulation and more data requests at the end of the big government stick. Giving Facebook the legal ability to rebuke a request for swathes of data will not only make them happy, but also play well with their users across the globe. In the free business model, users do have power.

Sidenote: On the question of whether the government should have a responsibility to protect national corporate interests from cyber-attacks: We must be wary not to create a chain of command in which these corporate interests retain zero say in how security is attained, and what data is accessible to this end. Currently, this isn’t a question I have given much thought to, but I’m open to debate. Also, if you choose to watch anything, watch this. It won an Oscar for a reason.

Opinion: the likelihood of any change happening is sadly small…


Everything above: that’s my two cents. My hat in the ring. Take it or leave it. But as far as I can see, the government in charge of the UK at the moment doesn’t look to be one seeking fair change. A “Snoopers’ Charter” anything like the one laid out by the Conservatives before the latest general election would be a travesty, but it’s likely that something like it will hit the floor before anything sensible taken from the likes of David Anderson or myself. Anderson has stated that a new Snoopers’ Charter needs “a detailed operational case to be made out, and a rigorous assessment conducted of the lawfulness, likely effectiveness, intrusiveness and cost of requiring such data to be retained.” As FFTF have pointed out, “So far the Government hasn’t made such a case.”

38 Degrees/My Instagram

Instead, they have kept top secret a report which suggested a legal alternative, pushed forward with the Transatlantic Trade and Investment Partnership “built for corporations and not citizens”, and distanced themselves from Anderson’s viewpoints on multiple occasions. They have stated that obeying the law does not exempt you from surveillance, attempted to criminalise encryption, cut benefits, redefined core aspects of democratic society and cut legal aid, minimising the ability of those who have been surveilled to acquire any sort of legal recourse.

But before I invalidate everything written up to now by pushing some blasé leftist opinion and find myself accidentally quoting Russell Brand, there may be some hope from the man who started this all off in the first place. When we think about Wittes and Liu’s Privacy Paradox, and their statement about ‘only caring about privacy as it relates to those from whom we wish to keep information secret’, we can possibly glean something about Snowden’s betrayal, and the gambles he is willing to take for the prospect of change.

The crime in Snowden’s betrayal, making public information that was necessarily secret due to the advantage it would provide an adversary, was not a crime perpetrated against the civilians of his nation or of the world; it was against his nation’s security services. (Although, as discussed above, these are rather entwined.) The first thing many, including myself, noted following Snowden’s revelations was, blatantly, “what did you expect the NSA and GCHQ were doing?” However, in exposing their actual cyber-capabilities, more than anything he provided outside nations and actors with crucial, pertinent and honest information, something incredibly difficult to come by within the intelligence community. But his intent was to bring a long-suspected open secret into an open debate. After all, pre-Snowden, any debate we had about dragnet surveillance was purely hypothetical, and mostly only existed in dystopian fiction.

In essence, the US should only really have cared about the privacy of that information as it related to keeping it secret from State adversaries, not from its own citizens.

In trying Snowden, the judge must really take into account Snowden’s criminal intent, and whilst there is no doubt he has weakened the security of his nation’s security services and the cyber-capabilities he has exposed, in doing so he has enlightened the populace to their overstepping and invasive practices; and if the populace does decide he was right in doing so, and they disagree with the practices of their own security services, his exposure may have a legal basis for defence.

MRW I consider sharing a cell with Bradley/Chelsea Manning for the rest of my life.

Possibly then, if any government were to change their surveillance laws before Snowden is tried, they would be de facto admitting he was right, and that he has this legal basis for recourse. I’m not saying that Snowden should or should not get prison time, but there is the possibility, given a new US president will be looking to make an impact in 2016, that if Snowden were to ‘take one for the team’ and serve a sentence, in the aftermath of doing so he may instigate better change, having saved the US, and in turn the UK, from losing face and admitting he was right. Perhaps real policy change can only be undertaken after Snowden gives himself up. Even former US Attorney General Eric Holder has now stated that Snowden could strike a plea deal.

Like our privacy online, though, our trust in governments and security services, and our incessant uptake of new technology: it’s one hell of a gamble, but we have to roll the dice now.


If you would like a printable, plaintext copy of this, I’ve made one here.

Ted Cullen, also confusingly known as Ed Peeters, is an out-of-work politics graduate trying his luck at freelancing. Hire me/PhD me

You can find him on Twitter | The Web | Email | Facebook
