Automation in nuclear weapon systems: lessons from the man who saved the world

Nina Miller

Interior of the French Navy attack nuclear submarine ‘Saphir’, during training exercises on 28 February 2009 off Saint-Mandrier, France. Photo: Alexis Rosenfeld via Getty Images.

Petrov’s ‘close call’

What we know about automation bias

  • A higher level of automation: for instance, a system that recommends courses of action rather than one that merely analyzes information.
  • Higher reliability and consistency of automated systems. This is known as the lumberjack effect, because ‘the higher they are, the farther they fall’: the more dependable a system usually is, the more severe the consequences when operators fail to catch its rare errors.
  • Distraction and fatigue in human operators, which could result from multi-tasking and environmental factors.
  • ‘Learned carelessness’, resulting from repeated interactions with an automated system. When operators stop paying attention and lose situation awareness without suffering any consequences, the risk of future complacency grows.

Lessons from Petrov



International Affairs

Celebrating 100 years as a leading journal of international relations. Follow for analysis on the latest global issues.