Book Review: Click Here to Kill Everybody by Bruce Schneier

Oliver Thylmann
Oliver Thylmann’s Thoughts
Apr 28, 2019

If you combine Bruce Schneier, one of the most widely published security experts out there, with a great title, you have to read the book, and I was not let down.

The premise of the book is simple:

Everything is becoming vulnerable in this way because everything is becoming a computer. More specifically, a computer on the Internet.

Of course there are a lot of details, and everything Bruce cites is linked to its original source, so you really get all the references used to drive that point home. This matters especially when you think a few years down the road: internet connectivity will be like the power grid, something you simply do not think about.

In 1989, Internet security expert Gene Spafford famously said: “The only truly secure system is one that is powered off, cast in a block of concrete and sealed in a lead-lined room with armed guards — and even then I have my doubts.” Almost 30 years later, that’s still true.

and …

Rod Beckstrom summarized it this way: (1) anything connected to the Internet can be hacked; (2) everything is being connected to the Internet; (3) as a result, everything is becoming vulnerable.

If you work with technology, this is all fairly obvious, but it still needs to stay front of mind at all times, because it is so easy to forget that security matters. Bruce also adds some fun historical points:

David Clark, an MIT professor and one of the architects of the early Internet, recalls: “It’s not that we didn’t think about security. We knew that there were untrustworthy people out there, and we thought we could exclude them.” Yes, they really thought they could limit Internet usage to people they knew.

OK, security is important, but on the other hand we have to make sure that we can still hack things and probe for backdoors in order to improve security. Current laws prohibit that, and actually use things like copyright to enforce restrictions that normally wouldn’t be enforceable. Check this:

Keurig coffee makers are designed to use K-cup pods to make single servings of coffee. Because the machines use software to verify the codes printed on the K-cups, Keurig can enforce exclusivity, so only companies who pay Keurig can produce pods for its coffee machines. HP printers no longer allow you to use unauthorized ink cartridges. Tomorrow, the company might require you to use only authorized paper — or refuse to print copyrighted words you haven’t paid for. Similarly, tomorrow’s dishwasher could enforce which brands of detergent you use.
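The kind of software check the quote describes can be sketched in a few lines. This is purely an illustrative toy, not Keurig’s actual scheme: the secret key, the code format, and the function names are all invented. The point it shows is that whoever holds the signing secret decides which pods the machine accepts.

```python
import hmac
import hashlib

# Hypothetical illustration of a vendor lockout check -- NOT the real scheme.
# The vendor keeps a secret key; each authorized pod carries a serial plus a
# short tag derived from it. Without the key, third parties cannot produce
# codes the machine will accept.

VENDOR_KEY = b"vendor-secret"  # known only to the manufacturer (assumption)

def make_pod_code(serial: str) -> str:
    """What the vendor prints on an authorized pod."""
    tag = hmac.new(VENDOR_KEY, serial.encode(), hashlib.sha256).hexdigest()[:8]
    return f"{serial}-{tag}"

def machine_accepts(code: str) -> bool:
    """What the machine's firmware checks before brewing."""
    serial, _, tag = code.rpartition("-")
    expected = hmac.new(VENDOR_KEY, serial.encode(), hashlib.sha256).hexdigest()[:8]
    return hmac.compare_digest(tag, expected)

print(machine_accepts(make_pod_code("POD-0001")))  # authorized pod: True
print(machine_accepts("POD-0001-deadbeef"))        # forged tag: False
```

Note that the lock here is economic and legal as much as technical: the check itself is trivial, but analysing or bypassing it in a real product is what copyright-style law forbids.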

Think about it: you cannot build your own K-cups, because the software that verifies the codes can’t legally be analysed. Before software, that kind of lock-in wouldn’t have been possible. Strange world. But it is not simple either:

For example, some people are hacking their insulin pumps to create an artificial pancreas — a device that will measure their blood sugar levels and automatically deliver the proper doses of insulin on a continuous basis. Do we want to give them the ability to do that, or do we want to make sure that only regulated manufacturers produce and sell those devices? I’m not sure where the proper balance lies.

I agree. It is, as always, complicated. But if the source code for insulin pumps were open, so that everyone could inspect and hack it, and there were authorized update providers, then people could choose who may update their device while also knowing that the wider world has helped make it more secure.

Of course, we then need to make sure there is enough funding for people to find security problems in all those systems, because otherwise nobody will go looking for them.

And this again brings us to something I have been thinking about a great deal. Estonia keeps its resident records in a blockchain because it is far more important that they cannot be altered than that they cannot be read. It is OK if everyone knows my blood type, but having it changed in some important database could be life-threatening.

It’s the same with databases. I am concerned about the privacy of my medical records, but I am even more concerned that someone could change my blood type or list of allergies (an integrity threat) or shut down lifesaving equipment (an availability threat). One way of thinking about this is that confidentiality threats are about privacy, but integrity and availability threats are really about safety.

[…]

In the future, however, we might also see more cyber operations that will change or manipulate electronic information in order to compromise its integrity (i.e. accuracy and reliability) instead of deleting it or disrupting access to it. Decision-making by senior government officials (civilian and military), corporate executives, investors, or others will be impaired if they cannot trust the information they are receiving.
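The tamper-evidence idea behind integrity protection of the Estonian kind can be sketched minimally with a hash chain. This is an illustration of the general technique only, with made-up record formats, not a description of Estonia’s actual system:

```python
import hashlib

# Minimal sketch of a tamper-evident record log (hash chain). Each entry's
# hash covers the previous entry's hash, so silently editing any record
# invalidates every hash that follows it.

def _link(prev_hash: str, record: str) -> str:
    return hashlib.sha256((prev_hash + record).encode()).hexdigest()

def build_chain(records):
    """Compute the chained hash for each record in order."""
    hashes, prev = [], ""
    for rec in records:
        prev = _link(prev, rec)
        hashes.append(prev)
    return hashes

def verify_chain(records, hashes) -> bool:
    """Recompute the chain and compare against the stored hashes."""
    prev = ""
    for rec, h in zip(records, hashes):
        prev = _link(prev, rec)
        if prev != h:
            return False
    return True

records = ["blood_type=A+", "allergy=penicillin"]
chain = build_chain(records)
print(verify_chain(records, chain))   # untouched records: True
records[0] = "blood_type=O-"          # a silent, life-threatening edit
print(verify_chain(records, chain))   # tampering detected: False
```

The scheme does nothing for confidentiality (anyone can still read the records); it addresses exactly the integrity threat the quote is about.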

Maybe, just maybe, one such hack will suddenly lead to the right incentives being put in place to make Internet security happen. Really happen.

[…] once we got the incentives for security right, the technologies came along to make it happen. With spam, it took a change in the e-mail ecosystem to shift the incentives of e-mail providers. With credit cards, it took a law to shift the incentives of banks. Similarly, Internet+ security is primarily a problem of incentives — and of policy.

He also has a lot of suggestions for what could be done, but admits it is a complicated road ahead; we simply need to travel it. Additionally, it is not only about security and technology but also about philosophy and social norms.

I really liked this part, which in essence says we need to actually talk to each other rather than hold everything sacred.

Addressing the 2014 Munich Security Conference, Estonian president Toomas Hendrik Ilves observed: I think much of the problem we face today represents the culmination of a problem diagnosed 55 years ago by C. P. Snow in his essay “The Two Cultures”: the absence of dialogue between the scientific-technological and the humanist traditions. When Snow wrote his classic essay, he bemoaned that neither culture understood or impinged on the other. Today, bereft of understanding of fundamental issues and writings in the development of liberal democracy, computer geeks devise ever better ways to track people . . . simply because they can and it’s cool. Humanists on the other hand do not understand the underlying technology and are convinced, for example, that tracking meta-data means the government reads their emails. C. P. Snow’s two cultures not only do not talk to each other, they simply act as if the other doesn’t exist.

Great book.


Father, Serial Entrepreneur, Developer Whisperer and currently Co-Founder @giantswarm and Co-Host of the Crypto Nerd Show Podcast