How to patch a human

Why we need to change our thinking from cyber security ‘awareness’ to ‘influence’.

Blair Adamson
8 min read · Mar 6, 2019

Cyber security preparedness is built on three pillars: people, processes, and technology. While technology is a critical element of an effective cyber security program, alone it is not enough to protect against modern cyber threats; even the best technical security efforts can be undermined by a weak security culture.

It’s not only hackers, corporate spies, or disaffected staff who present a threat to organisations; breaches are often the unintended consequence of mistakes made by non-malicious, uninformed employees.

The Office of the Australian Information Commissioner (OAIC) publishes quarterly statistical information about notifications received under the Notifiable Data Breaches (NDB) scheme. In both its 1 July — 30 September 2018 and 1 October — 31 December 2018 reports, it listed human error as a major source (37 and 33 percent respectively) of reported breaches.

While the largest source of reported breaches (57 and 64 percent) was attributed to “malicious or criminal attack”, a significant proportion of these exploited vulnerabilities involving a human factor, such as tricking employees into clicking on a phishing email or disclosing their passwords.

These figures illustrate the fundamental role security awareness can play in an organisation’s cyber security defences and how a strong security culture can act as a ‘force multiplier’.

Awareness doesn’t work

Some in the security community take the view that awareness programs don’t work, and that technical solutions should therefore be pursued over human ones.

But the answer is not to stop investing in these programs; instead, we need to recast them based on an understanding of why they may not be working.

A traditional approach to awareness might be to:

  • force-feed information annually to busy staff and then have them brute-force their way through a multiple-choice questionnaire, only to wonder why they keep making the same ‘mistakes’;
  • publish long, dry policy documents and advisories on an intranet site only to wonder why no one is reading or adhering to them; or
  • lecture and threaten staff with the consequences of doing the wrong thing only to wonder why they then simply do the bare minimum to avoid landing in trouble.

The flaws in these approaches should be obvious to most.

The real problem with awareness, though, is that it is often a means for the security team to project its priorities onto staff — what staff should know, and how staff should consume that information — instead of first considering the needs of staff: specifically, what is driving their behaviour and what is the most effective way to engage them.

From awareness to influence

What is needed is a shift in mindset: from awareness to influence. Influence — which is ultimately the desired outcome of an awareness program — requires a deep understanding of your customer. To effectively influence behaviour, I need more than subject matter expertise: I also need to understand your needs and motivations. For a security team, this requires domain expertise in human behaviour.

The skills required to configure and effectively manage a security system are not the same as the skills required to understand and influence human behaviour. While technical skills are of course critical to building any good security team — and the effectiveness of Telstra’s Cyber Influence team is entirely dependent on the expert technical skills and knowledge we have available to us — having technical experts lead human programs will likely not yield the desired results.

It’s not enough to simply know all about security; we also need specialists who understand humans and what makes them tick. Simply providing information isn’t effective; that information must also be easily and enthusiastically retained in the minds of staff. It must ‘stick’ and be a motivating force for change.

The same can be said of information versus intelligence. Any expert worth their salt can provide information. But for it to be considered intelligence — noting intelligence must inform decision-making — it needs to be relevant to the audience and it must be actionable. Good intelligence takes into consideration business context and the needs of the customer.

Similarly, if our awareness activities are limited to blindly broadcasting information without considering the needs of those we are broadcasting to, then we may as well be shouting from the rooftops hoping that — by sheer dumb luck — someone is listening and acting on our advice.

A model for influence

Harvard psychologist Herbert Kelman published a paper in 1958 titled Compliance, identification, and internalization: Three processes of attitude change. The study examined the drivers of ‘durable’ attitude change: change that is lasting and extends beyond public conformity to private acceptance, where beliefs are integrated into an individual’s own value system.

These processes can conveniently be applied to attitude change with respect to cyber security:

Compliance

Change will occur when individuals look to gain reward or approval, or to avoid punishment or disapproval. For security awareness, this is where many organisations start and finish. Policies lay out the ground rules, and punishments are put in place for breaking them. While it’s essential to provide this clarity to staff, and while policies can be effective in driving acceptable behaviour, they may not necessarily translate into a change in belief.

Compliance can drive change, but not lasting or durable change… which is where the other two processes come into play.

Identification

People will be more inclined to be influenced by those they identify with; think brand and reputation. No matter how good your advice may be, if you haven’t established credibility or trust with your audience, your ability to influence will be diminished.

This requires the security team to look inward and consider how it may be perceived by its customers or stakeholders. Does your security team provide unclear, inconsistent, or impractical advice? Or is it seen as a trusted partner that supports and adds value to the business?

For the Cyber Influence team at Telstra, this means building a consistent narrative and tone of voice that presents a united front. Our whole security group’s ability to influence is undermined if a team or individual in the group provides advice that is inconsistent with, or contradictory to, another team or individual’s advice within the group.

It also means providing review and quality assurance for presentations and major publications (not just those presented or developed by the Cyber Influence team) to ensure a credible identity for Cyber Security is built and maintained.

Internalisation

Lasting change is ultimately achieved when people have internalised the message and taken on your beliefs as their own. They are no longer doing something because they’ve been told to, but because they believe it is the right thing to do and in their interest to do so — the new behaviour is intrinsically rewarding.

A working paper published by the Global Cyber Security Capacity Centre, titled Cyber Security Awareness Campaigns: Why do they fail to change behaviour? argues that “end users [already] know about the dangers. Security experts have warned them, confused them, and filled them with fear, uncertainty and doubt. People base their conscious decisions on whether they have the ability to do what is required and whether the effort will be worth it.” Achieving this type of lasting change requires us to provide advice that is understandable, relatable and — importantly — engaging.

At Telstra, this has meant working with creative agencies, for example, to develop story-driven video content more akin to a Netflix series than a traditional talking-head training video (complete with its very own trailer). Staff who viewed this video miniseries have shared it with friends, family, and colleagues in an amplification of our security message driven not by a request to do so, but because the content is engaging and relatable.

This is influence. Our story has been memorable, shared, and impactful. Staff have responded to our avoidance of information-heavy content and retained and shared key security messages.

Human problems require human solutions

Security education needs to be more than simply providing information to people; it must be targeted, actionable, and achievable, with simple, consistent rules of behaviour that people can follow. It must be relevant to their roles and interests, free of jargon, and use plain language, case studies, metaphors, and allegories to explain the why (not just the what), provide context, and build a conceptual understanding for the audience.

It must ultimately motivate the audience to take an action because it is in their interest to do so — not yours.

It’s not just what you say, it’s how you say it.

Stories and mnemonics

The one form of communication that has stood the test of time is storytelling. Good stories are shared and retold, while mnemonics and slogans — like Telstra’s Five Knows of Cyber Security or the Australian Cyber Security Centre’s Essential Eight — help make information memorable. Long lists of instructions or facts, however, are discarded almost as quickly as they are relayed.

Millions of Australians and New Zealanders easily recall the 1980s skin cancer awareness program ‘Slip, Slop, Slap’ and its message to “slip on a shirt, slop on sunscreen, and slap on a hat”, but precious few would recall the number of deaths per year from the disease.

The stories Telstra’s Cyber Influence team tells often demonstrate how an empowered mindset of scepticism and gut feel trumps rote recall of static security facts (such as ‘phishing emails have typos’, or ‘look for sites with SSL’). Our stories reinforce messages, and feedback has shown that they are impactful, more easily remembered, and actively shared — expanding our reach across the organisation and even to employees’ personal networks:

“Just watched this feature… a highly engaging way to highlight the potential personal impact of being hacked. I will be sharing this with my broader team today.”

“I am going home tonight to speak to the family about how we protect ourselves online.”

Relying solely on static facts is a particularly perilous approach since the landscape is in a constant state of flux. Such facts can be true one day and no longer applicable the next — for example, over 50% of phishing sites now have a valid SSL certificate.

Stories can be enhanced by actions that simplify the complexity of ‘doing’ security. For example, rather than ask staff to select 12-character passwords containing uppercase, lowercase, numerical, and special characters — confusing and difficult advice to follow — a simplified approach is to explain the concept of a passphrase, which is both easy to remember and hard for criminals to guess, or to recommend password managers to maintain strong, unique passwords across all accounts.
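To illustrate why that advice holds up, here is a minimal sketch (not something from Telstra’s program) of diceware-style passphrase generation in Python, with a back-of-envelope entropy comparison; the toy wordlist is a hypothetical stand-in for a published list such as the EFF long wordlist:

```python
import math
import secrets

# Toy stand-in: a real implementation would load a published wordlist,
# e.g. the EFF long wordlist of 7,776 words.
WORDLIST = ["correct", "horse", "battery", "staple", "copper", "kettle"]

def generate_passphrase(n_words: int = 4, sep: str = "-") -> str:
    # secrets (not random) provides cryptographically strong choices.
    return sep.join(secrets.choice(WORDLIST) for _ in range(n_words))

def entropy_bits(pool_size: int, length: int) -> float:
    # Entropy of `length` independent, uniform picks from `pool_size` options.
    return length * math.log2(pool_size)

print(generate_passphrase())  # e.g. 'kettle-staple-horse-copper'

# Four random words from a 7,776-word list carry roughly the same entropy
# as an 8-character password drawn from all 94 printable ASCII symbols,
# yet are far easier to remember.
print(f"4 words, 7,776-word list: {entropy_bits(7776, 4):.1f} bits")  # ~51.7
print(f"8 chars, 94-symbol pool:  {entropy_bits(94, 8):.1f} bits")    # ~52.4
```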

Case studies can be used to demonstrate the risks of poor password practices and, in doing so, provide the ‘why’, which is a crucial motivator for change; simply telling people to do something isn’t enough.

Sustained culture change takes time but, with the right expertise and focus, it can lay the groundwork to ultimately weaponise what is simultaneously the biggest strength and weakness in most companies’ cyber security posture: their people.


Blair Adamson

Telstra Cyber influence manager | backwoods traveler | reformed fashionista | cinephile | intrepid gourmand | Carlton tragic. All views my own. @Reluctant_Us3r