Blameless Culture

Blameless culture has been on my mind recently. Aspects of it can be seen in the Toyota Production System, though only more recently did “blameless” start to be used to describe it. Across the software engineering profession, it seems to have become the gold standard, practiced by all the top tech companies.

Personally, I’ve always felt uncomfortable when blame was thrown around. I tended to get defensive quickly or remove myself from the situation, whether I was being blamed or not! I can’t remember a time in my life (work or personal) when blaming led to a positive outcome. So in 2012, when I read John Allspaw’s (Etsy) article on blameless post-mortems, it was a lightbulb moment for me. He articulated a way of doing things that I already felt was right; a fundamental belief of mine had finally been put into words. I was sold.

Let’s remind ourselves of the basic principles — I love this stuff.

Blameless culture fundamentals

  • Assume people are doing the best they can with the information they have.
  • Create an environment where people can explain why the action made sense to them at the time, without feeling insecure or judged.
  • See human error as an effect of deeper systemic vulnerabilities within the organisation. Focus on the system, rather than the person.
  • Dig deeper into what caused the problem. Use the 5 Whys technique.
  • Celebrate that organisations can only improve by constantly seeking out their systemic vulnerabilities. The very best way to find them … through the person who made the mistake.

Alan Malpass wrote a great post years ago on the fallibility of humans. He took us back to ancient times with this anecdote:

The ancient Greeks … would respond to calamities such as plague, famine, or invasion by selecting a person, usually a cripple, or a beggar, or a criminal. This person — the pharmakos — would be beaten, stoned, and driven from the city … This isn’t healthy. It’s also not effective. We don’t learn from our mistakes this way. The Greeks kept succumbing to plagues because it turns out that beating up a beggar is less effective at preventing disease than coming up with a functional sanitation system.

Rather than working in a climate of fear (I’m going to get fired, judged, excluded), a blameless culture creates a workplace where there is a genuine focus on understanding where the system failed, with the intention of stopping it from failing again — basically, problem solving. I find engineers naturally do this really well when given the chance.

And here’s the great benefit that flows from a blameless culture: we get great insight into where our systems are failing! In a climate of fear, people rarely explain the real reason something went wrong. In a blameless culture, people feel more comfortable to speak out, to help solve the problem so it doesn’t happen to someone else.

Of course, I still come across non-believers in the blameless way of thinking. They say: “that’s not really how business gets done”, “people need to know they have made a mistake”, “nobody will be accountable for their actions”. Believe me, when someone makes a mistake that causes an incident, they know! There are edge cases where people aren’t aware of their mistakes, and in those situations it’s management’s responsibility to deal with that separately. That said, this has been rare in the workplaces I’ve worked in.


For people in an environment where blame is common, I’ve included a list of alternatives you can use when a colleague reverts to seeking blame:

Blaming: Who caused this problem?
Blameless: Why did this problem occur?


Blaming: Whose fault is this?
Blameless: Where did the system fail to allow this to happen?


Blaming: Amy deployed a change to production.
Blameless: A commit was deployed to production.


Blaming: Person “X” caused the incident (note: this is not blameless, it’s just nameless).
Blameless: Let’s really try to understand what happened, to prevent it from happening again.


The seeds for Zendesk’s blameless mindset were sown early, most likely before I joined. But in the early days I did notice that people consciously tried to:

  • Recognise hindsight bias. It’s very easy to look back and see how something happened. It’s not always that clear at the time.
  • Use phrases like “led to human error”, to put the emphasis on the system failure.
  • Call people out if they blamed, normally via a private note sent swiftly after the comment.

It’s been nice to see this mindset spread organically to the rest of the organisation. We weren’t that purposeful about it: we didn’t display it in light boxes around the office, chant it, or bake it into fortune cookies. It wasn’t rocket science, and maybe we got lucky, but that’s how it worked for us. We do it well. We aren’t perfect. But I’m confident that when we regress, the organisation will self-correct. It’s one of the things I love about working here, and I want to keep having the conversation about it. It’s fundamental to our success.