Terrorism: Et tu, Google!

Nick Harkaway
Essays and non-fiction
8 min read · Nov 24, 2015

The headline tells you a lot: “Why is Silicon-Valley helping the tech-savvy jihadists?”

Fixed that for ya.

Oh, why, indeed? We are betrayed! I was so moved by this piece that I colour-coded it as follows:

Inaccurate — orange

Empty rhetoric — purple

Misleading/incomplete — red

The image pretty much speaks for itself, but a few points of information might be helpful in assessing the vaguer parts of the original piece.

Straightforward inaccuracies:

1. WhatsApp uses “fiendishly-complex” encryption to secure user privacy.

WhatsApp is robustly private if you’re sending cat pictures and don’t want anyone to know you have a cat. For the operational secrets of a terrorist cell? You’d literally be better off sending a postcard. (Ermm: here’s a piece in the Telegraph about it.)

Even Telegram, which the article goes on to excoriate as a monster of secret horror, apparently has some fairly serious problems in its implementation. Cryptography isn’t only about codes: it’s also about system architecture. The Telegram system was built by mathematicians rather than cryptographers, and is regarded with considerable scepticism by many. Any system reliant on a mobile phone connection also leaks metadata — from which remarkable amounts of usable information can be gleaned.
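Just to illustrate what “remarkable amounts of usable information” looks like (a toy example of mine, not anything from the article): suppose every message body is perfectly encrypted, but the network still records who contacted whom. That alone is enough to sketch the shape of a group and guess at its coordinators. Something like this, in Python, with entirely invented call records:

```python
import networkx as nx

# Invented call metadata: (caller, callee) pairs only. No content at all.
call_records = [
    ("A", "B"), ("A", "C"), ("A", "D"),
    ("B", "C"), ("D", "E"), ("E", "F"),
    ("A", "E"),
]

contact_graph = nx.Graph()
contact_graph.add_edges_from(call_records)

# Betweenness centrality picks out the people most paths run through,
# i.e. the likely coordinators, without reading a single word of content.
centrality = nx.betweenness_centrality(contact_graph)
likely_coordinators = sorted(centrality, key=centrality.get, reverse=True)
print(likely_coordinators[:2])
```

Real traffic analysis is vastly more sophisticated than this, but the principle holds: the envelope talks even when the letter doesn’t.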

2. GCHQ invented encryption.

The Government Communications Headquarters was officially active from 1919 A.D.; GCHQ, on its own website, puts the date of its practical creation at the beginning of the First World War. The first known use of encryption is dated somewhat earlier, to around 1900 B.C., a discrepancy of 3800 years.

3. Tim Cook and Apple and Google and all those carefree quinoa tech types in their silicon towers could perfectly well solve the theoretical problems and give us hard encryption with a master key for the intelligence services!

Pretty sure you mean “Gandalf”.

It’s hard to know whether this is complete nonsense or technically true in a way that would require a whole new conceptual understanding — the kind of epochal breakthrough that makes cold fusion or anti-gravity possible. Sure, it would be great, but mathematics is merciless: it doesn’t yield to policy memos from Downing Street — which is, of course, why most governments secretly hate science even though they know they oughtn’t.

Damn you, science!

Rhetoric:

There’s quite a lot of this, and it’s largely content-free. A few things are worth picking up on.

1. “What will it take? 129 dead on American soil?”

Americans do, in fact, know what it’s like to be the victims of a mainland terror attack. Aside from the glaringly obvious 9/11, there was the Boston Marathon bombing, and the US also suffers from internal acts of terror such as the Oklahoma City bombing. Given the present mood of the Republican primaries, it’s a bit bizarre to wish for things to get more heated: frontrunner Donald Trump is promising that, if elected to the Oval Office, he will require Muslims in the US to register. It might be best not to go down that road.

2. This is all weaselly Edward Snowden’s fault.

Snowden is responsible for the highest-profile leak about state surveillance in recent times, but it’s risible to claim that all this is his fault. GCHQ has been found to have broken the law — the new IP Bill, which the UN Special Rapporteur on Privacy called “worse than scary”, is in part an attempt to patch that — and the NSA’s overreach is majestic, even given the latitude it enjoys. The FISA court, which grants approval for NSA surveillance, received almost 34,000 requests between 1979 and 2012, and denied just 11 of them.

Snowden is the messenger, and pays the traditional price.

Also: popular superstitions in various countries to one side, there’s nothing inherently bad about weasels. Their somewhat-domesticated cousins, ferrets, have been hunting assistants to humans for thousands of years — appropriately enough because their flexible bodies allow them to go down the rabbit hole in search of prey. Weasels are bendy and a bit bitey. That’s their whole thing. Give them a break.

3. Encryption creates a safe space for terrorists.

The Paris attackers do not appear to have used encryption at all. They stayed in touch on Facebook, quite openly. The 9/11 hijackings took place even though considerable intelligence existed about Bin Laden’s plans, about the hijackers themselves, and about the possibility of using commercial planes as weapons. It was briefly a scandal in 2002 but is now forgotten.

Yes, of course: encryption secures messages, if you’re very, very careful and reasonably competent, and if no one has compromised your hardware or your correspondents’ — something we know the services already do very well. There are limits to how much it can hide, especially if you’re using mobile devices, and user errors can radically reduce its effectiveness.

If you really want to keep information dark you don’t put it online, period. Ever. For anything. You write it in soft pencil on a piece of flashpaper while leaning on a glass surface which you then wipe down, and you walk the message to the person you want to give it to and then you burn the flashpaper. It’s a recipe terror outfits know quite well — and have known since the 70s — and which was amply demonstrated in 2002’s Millennium Challenge military exercise. Even then, though, a courier’s patterns of movement will eventually become familiar and detectable. It’s very hard to stay invisible and function at the same time. The difficulty is not data gathering, it’s knowing what data are important.

4. Terrorists are “dunderheads in the desert”.

This reads a little bit racist-y to me, but let’s assume that’s the inner social justice warrior flaring up. I’ll also give the author a pass on the obvious contradiction of “we need to take extraordinary legal and technological measures against a bunch of highly tech-savvy dunderheads”.

It’s never wise to assume your enemy is stupid. In fact, it’s stupid. So let’s say that somewhere in the large number of people who have put their muscle behind the various incarnations of IS, there might be one or two who are capable of reading a technical diagram and using a soldering iron. I don’t know, but it seems more plausible than the idea that they’re buying IEDs on Amazon — although Amazon does come off remarkably unscathed in this article, which is a disappointing gap in some otherwise hugely inventive journalism. Or let’s say that they have sufficient funding to hire in technical talent under a false corporate banner.

Now suppose that we put a backdoor in every iPhone, phablet, tablet, rabbit, hobbit, gobbet and froglet in the world. I know what I’d do if I were a terrorist determined to use digital comms at that point: I’d build the 21st-century equivalent of the AK-47 — a basic phone handset running open source software on common parts, with encryption prepared by ISIS for ISIS — and then the only people with access to hard encryption would be terrorists. Yay! Freedom!

Misleading/incomplete:

1. It’s all the fault of wicked tech companies for creating these products.

Yes, the tech companies are working to make communications secure. If they don’t, we can’t have online banking, online shopping and so on; the credit card network goes back to signatures and paper slips. Oh, and things like air traffic control, hospitals and power grids become vulnerable to hostile intrusion, which I think we can all agree is a bad thing.

Practical cryptography isn’t straightforward. You might imagine that if you’d created a killer code you’d want to keep it secret, but that’s not how it works. Cryptography is tested in the open, in an acid-bath environment: if you want something to be secure, you have to put it out there and let every crazed cryptofiend on earth try to kick the doors off it. (Cryptographers call this Kerckhoffs’s principle: the security lives in the key, not in the secrecy of the design.) The corollary is that strong cryptographic systems are freely available, so even if Facebook doesn’t put PGP into its consumer interface, you can still get PGP and implement it yourself if you want to. It’s boring and cumbersome, but if you’re plotting a terror attack you’re presumably prepared to put the hours in.
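To make the point concrete, here’s the kind of thing anyone with a laptop can do today using an openly developed, openly attacked library. This sketch uses PyNaCl, the Python binding to libsodium; the parties and the message are invented for illustration:

```python
from nacl.public import PrivateKey, Box

# Each party generates a keypair. Only the public halves are ever shared.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"meet at the usual place")

# Bob decrypts with his private key and Alice's public key.
receiving_box = Box(bob_key, alice_key.public_key)
plaintext = receiving_box.decrypt(ciphertext)
assert plaintext == b"meet at the usual place"
```

A dozen lines and one pip install, no consumer interface required, which is rather the point.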

Except that, as we know, the Paris attackers didn’t bother. The problem isn’t that they were invisible: like the 9/11 hijackers, they were at least somewhat known. The problem is that, despite knowing about them, our intelligence threat assessment systems somehow didn’t recognise them for what they were.

It’s also not true that tech companies don’t cooperate with reasonable requests regarding terrorism. Even Telegram routinely shuts down ISIS channels in response to requests. What various companies have refused to do is hand over user data wholesale so that it can be trawled for possible infractions. It’s a staple of cop TV shows that you can’t go on “fishing expeditions”: you must have some reason to believe someone is engaged in nefarious activity before you invade their privacy. The offline equivalent of what Twitter and the rest don’t want to allow would be the police visiting your house every so often, as a matter of course, to sift through your belongings and make sure you weren’t doing anything bad.

2. The government would only ever use surveillance powers with a warrant and for anti-terror purposes.

The warrant process described in the proposed UK legislation is thin and not subject to judicial scrutiny except with reference to procedure.

Historically, existing RIPA powers have been used to spy on farm animals, fly tippers, houses with poor insulation, and children (to make sure they really live in the catchment areas of local schools). It’s hard to see why this would be any different — especially as the government is cracking down on the Freedom of Information Act and on Judicial Review in the same breath.

3. We must preserve our way of life!

Yep. Now explain to me how our way of life is preserved by introducing blanket surveillance. Historically, we have preferred our government’s writ to end at the front door, and we do not accept causeless intrusion or searches that go beyond the very specific scope of a given investigation.

I’m not a huge fan of the idea that a long history sanctifies intellectual and moral positions, but if you want to invoke British tradition, be my guest: it’s not on your side.

Afterword:

It took me longer to colour in the printout than it did to establish quite how much is wrong with this piece. I didn’t check the claim that ISIS has a helpdesk because if it’s not true I don’t want to know. I’m delighted by the idea of some beleaguered terrorist cryptomastermind sitting behind a desk banging his or her head against the keyboard and yelling into a Skype headset that no, “jenniferlawrence” is not a secure password.

I don’t reject the idea that we need powerful surveillance capabilities for national security. I believe, those tools being powerful and human nature being what it is, that we should have absolutely ferocious and independent judicial oversight. I do not believe it is appropriate for a democratic state to collect in bulk the personal data of all its citizens without specific cause. If, however, such an enterprise were undertaken after a full and informed debate in Parliament, the use of such data should be strictly limited to national security. It would be possible to sift it for all manner of other things — but if that happened, we would literally (yes, literally) no longer be living in a free society. We’d be living in an open prison.

The debate is important, and it is not well-served by nonsense like this.
