The feeling of security and the reality of security don’t always match, says computer-security expert Bruce Schneier. In his talk, he explains why we spend billions addressing news story risks, like the “security theater” now playing at your local airport, while neglecting more probable risks — and how we can break this pattern.
Security is two different concepts: a feeling and a reality. You can feel secure even if you’re not, and you can be secure even if you don’t feel it.
From an economic point of view, security is a trade-off.
The question to ask is not whether a measure makes us safer, but whether it is worth the trade-off.
There is often no right or wrong here.
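The talk frames this qualitatively, but the trade-off can be sketched with a standard expected-loss comparison. This is not Schneier’s own calculation, and every number below is hypothetical, chosen only to illustrate the reasoning: a countermeasure is worth it only if the expected loss it prevents exceeds what it costs.

```python
# Hedged sketch: a textbook expected-loss comparison, not a method from the
# talk. All numbers are made up for illustration.

def expected_loss(probability_per_year: float, impact: float) -> float:
    """Expected annual loss = probability of the event * cost if it happens."""
    return probability_per_year * impact

# Hypothetical risk: a break-in with a 1-in-1000 annual chance, costing $50,000.
loss_without = expected_loss(0.001, 50_000)   # 50.0 dollars/year

# Hypothetical countermeasure: halves the probability, costs $100/year.
countermeasure_cost = 100
loss_with = expected_loss(0.0005, 50_000)     # 25.0 dollars/year

# Worth it only if the prevented loss exceeds the cost of the measure.
worth_it = (loss_without - loss_with) > countermeasure_cost
print(worth_it)  # False: it prevents $25/year of expected loss but costs $100
```

Under these made-up numbers the measure fails the trade-off even though it genuinely halves the risk, which is exactly the distinction between “does it make us safer?” and “is it worth it?”.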
We make security trade-offs all the time without even realizing it. But we are bad at it.
We tend to respond to the feeling of security and not the reality.
And there are several cognitive biases in risk perceptions:
- We tend to exaggerate spectacular and rare risks and downplay common ones (e.g. flying vs. driving).
- The unknown is perceived as riskier than the familiar (e.g. fear of kidnapping by strangers, even though the data shows kidnapping by relatives is far more common).
- Personified risk is perceived as greater than anonymous risk.
- People underestimate risks in situations they control and overestimate them in situations they don’t control (e.g. terrorism).
- The availability heuristic (e.g. the news repeats rare risks, making them easy to recall).
If it’s in the news, don’t worry about it. Because by definition, news is something that almost never happens.
These biases act as a filter between us and reality, so feeling and reality diverge.
Security theater: products and measures that make people feel secure but don’t actually improve security.
If the market drives security, and if people make trade-offs based on the feeling of security, then the smart thing for companies to do, given the economic incentives, is to make people feel secure. There are two ways to do this. One, you can make people actually secure and hope they notice. Or two, you can make people just feel secure and hope they don’t notice.
People notice when they have a good understanding of the risks, the threats, and the countermeasures: feeling matches reality.
People don’t notice when they have a poor understanding of the risks: when you don’t understand the costs, you get the trade-off wrong. Feeling doesn’t match reality.
Furthermore, feeling works in tandem with a model. Feeling is based on intuition; a model is based on reason. In simple situations there is no need for a model: feeling is close to reality. In complex situations, models help us understand the risks we face.
Models are an intelligent representation of reality. They are limited by our biases, but can override our feelings.
We get models from others (e.g. culture, science, religion, teachers, elders). Models also come from the media and industry.
Models can change. The more comfortable we get in our environment, the more our models match our feelings.
Over the past 50 years, the history of smoking risk shows how a model changes, and also how an industry fights against a model it doesn’t like.
Compare that to the secondhand smoke debate[…]. Think about seat belts.[…]. Compare that to the airbag debate[…]. All examples of models changing. What we learn is that changing models is hard […]. If they equal your feelings, you don’t even know you have a model. And there’s another cognitive bias I’ll call confirmation bias, where we tend to accept data that confirms our beliefs and reject data that contradicts our beliefs.
In a world where we don’t have first-hand experience to judge models, we rely on proxies (e.g. government agencies, codes, experts). This works as long as they are the right proxies.
1982 […] there was a short epidemic of Tylenol poisonings in the United States […]. Someone took a bottle of Tylenol, put poison in it, closed it up, put it back on the shelf, someone else bought it and died. This terrified people […]. There wasn’t any real risk, but people were scared […]. Those tamper-proof caps? That came from this. It’s complete security theater. […] think of 10 ways to get around it. I’ll give you one: a syringe. But it made people feel better. It made their feeling of security more match the reality.
[…] when a baby’s born now, they put an RFID bracelet on the baby, a corresponding one on the mother, so if anyone other than the mother takes the baby out of the maternity ward, an alarm goes off. I said, “[…] I wonder how rampant baby snatching is out of hospitals.” I go home, I look it up. It basically never happens. But if you think about it, if you are a hospital, and you need to take a baby away from its mother […] to run some tests, you better have some good security theater, or she’s going to rip your arm off.
If our feelings match reality, we make better security trade-offs.