Boom! Behaviour Change!

Why security behaviour change campaigns fail, and how to make sure yours doesn’t.

Joe Giddens
People. Security.
7 min read · Mar 14, 2020

--

On Thursday 5th March I presented at CyNam 20.1. I spoke about three key concepts, and how they can be applied to security behaviour change campaigns. This article is an adaptation of the talk.

Concept 1: Security awareness is dead on its arse.

The old model is broken.

Train, test, analyse. Train, test, analyse. Train, test, analyse.

Repeat until you have a data breach, and you realise this doesn’t actually work.

Old model. Shite.

This strategy may yield some short-term success. You may see improvement. But, long term, it will foster a “them and us” culture.

It will breed mistrust. People will get bored quickly. Ultimately, it is a lot of work for only minimal gain.

It’s what I call the “gun-to-the-head” approach to behaviour change. The model makes some dangerous assumptions:

  • People are interested in cyber security and want to learn. Actually, people don’t give a damn. They have jobs, targets, clients to please, and bonuses to claim. They’re not interested in security. Security is someone else’s job.
  • “Wrong” behaviour can be fixed, without first understanding why people behave the way they do. We’ll see shortly why this is silly.
  • People are rational. We’re not.

Ultimately, it assumes non-optimal behaviour is wrong behaviour.

Human behaviour is not wrong. It’s not broken. And it’s not something that can be fixed.

Human behaviour is just that. Human. Natural. It is why we’re the dominant species on this planet.

Which brings us to concept 2.

Concept 2: People are not “the weakest link”. Our understanding of people is the weakest link.

And that’s great! Once we understand that we are the problem, we can start to change things.

If we just blame users, we have no control. Who are we to think we can control what other people do?

This concept is demonstrated most clearly by a bowl of cashew nuts. Obviously.

A bowl of cashew nuts.

The following excerpt is taken from ‘Nudge’, by Richard Thaler and Cass Sunstein.

“Many years ago, Thaler was hosting dinner for some guests (other then-young economists) and put out a large bowl of cashew nuts to nibble on with the first bottle of wine. Within a few minutes it became clear that the bowl of nuts was going to be consumed in its entirety, and that the guests might lack sufficient appetite to enjoy all the food that was to follow. Leaping into action, Thaler grabbed the bowl of nuts, and (while sneaking a few more nuts for himself) removed the bowl to the kitchen, where it was put out of sight.”

If the guests knew a three course meal was imminent, why were they filling up on cashew nuts? The behaviour seems irrational, right?

If you take a step back, it’s obvious why this behaviour is more natural than we might think. You see, biology is slow. Really slow (as my fierce little diagram shows).

Fierce diagram.

This is not to scale. I’m not a biologist. I’m a security professional. But it gives some idea of the problem we face.

500 million years ago, life as we know it appears on our planet. Crustaceans. Fish. Lobsters.

Jump forward 450 million years and primates have arrived. Tens of millions of years later, they stand up on their hind legs and pick up tools to hunt.

Just ten thousand or so years ago, modern civilisation emerges (farming and religion). About 70 years ago, the first computers are built, depending on what you consider the first computer to be.

Look at the world around us now. Look at where we are. This room is powered by technology (it was). It’s in a building powered by technology (also true). In a city powered by technology (the event was in Cheltenham).

People evolved to survive in environments very different to the ones we now find ourselves in.

It’s easy to see why, then, some behaviours we think of as “irrational”, are actually natural. Like filling up on cashew nuts before dinner.

Half a million years ago, if you found food, you ate it — especially if it was high in calories — because you didn’t know when your next meal might be. The concept of storing food for consumption later hadn’t entered our consciousness yet.

Behaviours like this are inherent to our genetic makeup. They’re not something that can be “turned off”, “unlearned” or “fixed”.

So we come to our last concept, and the crux of our argument.

Concept 3: Don’t try and measure why behaviours are happening. Measure why they are not happening.

Daniel Kahneman, a Nobel Prize winner and possibly the most influential psychologist of the last decade, said of this idea…

“I have never heard a psychological idea that impressed me quite as much as this one.”

When someone like that says something like this, you sit up and take note. So we did.

That’s when it hit us.

The old model focuses exclusively on driving behaviour change. Pushing behaviour change. Forcing behaviour change! It does not account for existing barriers to good behaviour.

The answer was simple. Understand your environment. Identify the barriers. Work to remove them. When you approach the problem like this, instead of asking,

“How can I get them to do it?”

we ask,

“Why aren’t they doing it already?”

If we try and change behaviours before understanding why they’re occurring, we’re just guessing at ways to address the underlying causes.

Our interventions might work, if we’re lucky. There is some correlation between some interventions (like training) and positive results. But really, if we haven’t measured first, we’re just shooting in the dark.

As fun as that may sometimes be, it’s not the smart thing to do. It’s not the right thing to do.

Let me give you an example: Phishing emails.

Most simulated phishing campaigns seek only to understand how many people are clicking. They then guess at ways to stop them.

They don’t stop to consider why people click.

However, if we go from asking, “How do I get people to stop clicking on phishing emails?” to asking, “Why are people clicking on phishing emails?”, a world of possibility opens up.

Here are two example phishing emails. One is paying me a compliment, “Hey Joe, your shoes are awesome.” The other is trying to wind me up, “Hey Joe, I wouldn’t have worn those shoes myself…”

Some phishing emails.

What we have is an example of “Liking” and “Annoyance”. Which one am I more likely to click on? Who knows. But by testing we might find out.

By varying the category, influence techniques and emotions within phishing emails, we can derive a wealth of information.

Think, for example, if you could tell that users within the finance department were susceptible to emails in the legal category that try to evoke fear and use authoritative language.

You no longer need to carpet-bomb them with generic, catch-all training. Training that tries to cover every possible scenario. Instead, you can be focused and intelligent with your intervention.
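To make “measure first” a little more concrete, here is a minimal sketch in Python. It is purely illustrative, not CybSafe’s tooling or any real product’s schema; the field names (department, category, emotion, clicked) are assumptions made up for the example. It simply aggregates simulated phishing results by the attributes you varied, so the data, rather than guesswork, tells you where clicks concentrate.

    # Illustrative only: aggregate simulated phishing results by the attributes
    # we varied, and rank the combinations by click rate.
    # The field names (department, category, emotion) are hypothetical.
    from collections import defaultdict

    results = [
        # (department, category, emotion, clicked?)
        ("finance", "legal",  "fear",      True),
        ("finance", "legal",  "fear",      True),
        ("finance", "social", "liking",    False),
        ("sales",   "legal",  "fear",      False),
        ("sales",   "social", "liking",    True),
        ("sales",   "social", "annoyance", False),
    ]

    def click_rates(rows, *keys):
        """Click rate for every combination of the chosen attributes."""
        clicks, totals = defaultdict(int), defaultdict(int)
        for department, category, emotion, clicked in rows:
            row = {"department": department, "category": category, "emotion": emotion}
            combo = tuple(row[key] for key in keys)
            totals[combo] += 1
            clicks[combo] += clicked  # True counts as 1, False as 0
        return {combo: clicks[combo] / totals[combo] for combo in totals}

    # Measure first: which department / category / emotion combinations get clicked most?
    ranked = sorted(click_rates(results, "department", "category", "emotion").items(),
                    key=lambda item: item[1], reverse=True)
    for combo, rate in ranked:
        print(combo, f"{rate:.0%}")

Run something like this over a real campaign’s results and a pattern such as “finance clicks legal emails that evoke fear” falls straight out of the numbers. That is where your intervention goes.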

Apply this question to other areas of your awareness, behaviour and culture campaign efforts and…

Boom! Behaviour Change!

Stop deploying training, posters, blogs, workshops, phishing simulations, on-premises pen tests and any other strange and fantastic behaviour change interventions before you know why behaviour is occurring.

Measure first!

That’s how you drive real and lasting change.

That’s how you demonstrably reduce human cyber risk.

That’s how you prove ROI.

And that is how you get a promotion. Maybe. (This last point isn’t guaranteed. If it does happen though, next time you see me, you can buy me a beer.)

Which seems like a great place to end.

Further Reading

At CybSafe, we obsess about this stuff. We are determined to make a difference. You can read more about our work here.

If you would like to submit an article for publication, please get in touch. The best way to reach me is on LinkedIn.
