IMAGE: Mohammad Izar Izhar — 123RF

Everything is hackable: part two

Enrique Dans

--

Imagine how you would feel if, at around 11:40 pm, all 156 sirens in your city’s public defense system, designed to warn of tornadoes, hurricanes, earthquakes and other emergencies, began to sound at once, and continued to do so at ninety-second intervals, about 15 times, until 1:20 am. This is what Dallas lived through on Friday: the whole city plunged into panic, fearing missile attacks, calling friends and relatives, and flooding emergency services with thousands of phone calls. It turned out to be a cyberattack of unknown origin.

A public warning system is critical infrastructure. The attack in question, in addition to forcing the system’s managers out of their homes to try to combat the problem, could only be stopped by disconnecting the entire network, which would have made it impossible to warn the population if, by chance or by design, a real event had occurred. In addition, such incidents erode trust in the system, which could eventually cause problems in the case of a genuine emergency.

This is the second time that I have written a piece under the heading Everything is hackable, and I doubt it will be the last: in my previous entry, I talked about certain Jeep models that could be controlled remotely from a computer, endangering their occupants. On that occasion it was a relatively controlled experiment: hackers were trying to prove to a Wired journalist that this was possible, and that it was a vulnerability that someone could eventually exploit maliciously to do harm.

We still know virtually nothing about Friday’s incident in Dallas: it could have been simply a prank, a way to warn of a vulnerability in the system and to demonstrate what could happen if it is not corrected; equally, it could have been a politically motivated attempt to spread panic. There are any number of possibilities. In any case, the conclusion remains the same: as we connect more and more things to the network, we have to remember that there is a possibility that someone, whatever their intentions may be, can access and manipulate them.

This is particularly true of structures created or designed before such eventualities were a real possibility. In my earlier entry, the vehicle was made by Jeep, a traditional automotive company now developing connected vehicles, one that presumably lacked cybersecurity expertise because it simply did not need it until now. Cybersecurity is complex: it requires experienced professionals who keep up to speed with new developments and maintain ties to the community of experts. A cybersecurity practitioner knows that total security does not exist: as the great Gene Spafford, Spaf, famously said in 1989, “The only truly secure system is one that is powered off, cast in a block of concrete and sealed in a lead-lined room with armed guards — and even then I have my doubts.”

In short, cybersecurity experts must build sufficient security so that certain scenarios do not take place, within reasonable limits and assuming a certain level of interest on the part of the attacker.

Hackers are not evil criminals: they are people with a specially developed set of skills that allow them to perform certain tasks. The legendary developer Eric S. Raymond wrote an article a few days ago updating what he considers the typology of hackers, which helps us understand a little better their skills and the ethics they work to: hackers are a good thing, they are in many cases highly skilled professionals, and in no way should the term conjure up sinister connotations. If you get a message from a hacker revealing a vulnerability that he or she found in your corporate system, do not call the police or threaten legal action: just go ahead, talk to your IT personnel, and fix the damn hole as fast as you can. If you disregard the message and do nothing, you will find that the hacker, after giving you that proper notice, will use the vulnerability to shame you publicly for doing nothing, and it will be your fault, not the hacker’s. Let’s call things by their proper name: people who attack systems to cause damage and wreak havoc are not hackers, they are simply criminals.

Companies that understand and appreciate the dangers of cyberattacks learn to react to vulnerability warnings and to develop a cybersecurity-oriented culture internally. In the face of events such as last Friday’s, we must try to react calmly: they are a warning that everything connected to the internet is hackable, and that we must review our systems from top to bottom to understand the risks we face, striking a balance between the cost of trying to improve security and the eventual problems arising from the lack of it.

The most positive effect of all this may be that managers in all sectors of society begin to consider the enormous importance of this issue and to think about how to build a culture oriented toward cybersecurity. These cases are simply the consequences of a new environment in which many creators and managers of products and services of all kinds, previously not subject to such problems, are not yet able to respond flexibly and quickly. In the meantime, I fear we will see many more breaches of systems.

(In Spanish, here)

--

Enrique Dans

Professor of Innovation at IE Business School and blogger (in English here and in Spanish at enriquedans.com)