Tuesday 2 April 2013


Weak signals and the "normalisation of deviance"


Shuttle Enterprise on the Hudson River

On holiday I was re-reading the excellent book "Flirting with Disaster" by Marc Gerstein. This book looks at learning failures and design failures in organisations, and how these can precipitate catastrophic events.

In his analysis of the loss of the Challenger Space Shuttle, caused by failure of the solid rocket booster joint seals, Gerstein explains how there had been a long history of problems with the seals prior to the loss of the Challenger. He claims:

  • "On flight STS 51C in January 1985 .... significant amounts of erosion and a type of hot gas leakage known as blowby were observed on two field joints. .... 
  • Flight STS-51 was launched on April 12, 1985. After the SRBs (solid rocket boosters) were recovered, they showed the largest amount of erosion of any flight to date..... 
  • On flight 51B, which launched on April 29, 1985 ....the primary O-ring of the left nozzle joint ... had burned all the way through, and blowby effects had reached the joint's second O-ring. 
  • Flight 51F had flown successfully on July 29 1985 with no erosion, although a gas path was found cutting through the high-temperature putty that was used inside the boosters to seal the joints"
He goes on to claim that, over time, NASA and their engineering partner Morton Thiokol had become acclimatised to the weak signals of joint failure - an acclimatisation referred to as "the normalisation of deviance". Gerstein explains:
"With each successive flight, acceptance of erosion and blow-by grew; even though those symptoms were clear signs that the joints were not sealing as they were supposed to. However, no matter how many rounds of Russian Roulette you survive, the game is never "safe"
So how do organisations avoid this normalisation of deviance, and pick up the weak signals before they get to the chamber with the bullet in it? Gerstein suggests many of the elements of a rigorous lesson-learning system, including:

1) treating each deviance as a near miss that needs rigorous investigation and learning (What happened that was not planned?)
2) rigorous root cause analysis (Why did it happen?)
3) taking action (What are we going to do about it?)

In addition, there needs to be a culture of looking for, and amplifying, weak signals, which also means that looking for the weak signals needs to be someone's responsibility. In organisations with a well-developed lesson-learning capability, it is often part of the remit of the central Lessons team. They are the ones who see all the lessons, and they can spot (or search for) recurring patterns which indicate that something might well be wrong.
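Purely as an illustration, and not something Gerstein describes: if a central Lessons team keeps its lessons tagged by topic, even a very simple frequency check can surface recurring themes worth treating as weak signals. The lesson records, field names and threshold in this sketch are all hypothetical.

```python
from collections import Counter

# Hypothetical lesson records, each tagged with a topic by the Lessons team.
lessons = [
    {"id": 1, "topic": "joint seal erosion"},
    {"id": 2, "topic": "joint seal erosion"},
    {"id": 3, "topic": "ground handling damage"},
    {"id": 4, "topic": "joint seal erosion"},
]

# Count how often each topic recurs across the lessons database.
recurrence = Counter(lesson["topic"] for lesson in lessons)

# Flag any topic that recurs above an (arbitrary) threshold as a weak signal
# to escalate for near-miss investigation and root cause analysis.
THRESHOLD = 2
for topic, count in recurrence.items():
    if count >= THRESHOLD:
        print(f"Weak signal: '{topic}' has recurred in {count} lessons - investigate")
```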

Certainly the Challenger story seems to suggest that recognising and amplifying those persistent but weak signals of failure is the only way to avoid the normalisation of deviance.



