I blogged recently on the knowledge supply chain, as a metaphor for the transmission of knowledge and lessons in support of corporate decision making. The supply chain I described was a lateral one, peer to peer, designed for identifying learning from activity which can be used to improve future activity.
There is another knowledge supply chain, which is a vertical one. This is the supply of knowledge, and often the supply of warnings, from deep within the hierarchy, up to levels where major decisions need to be made.
It is the failure of this vertical knowledge supply chain that is behind some of the most spectacular disasters of the last century.
Nancy Dixon recently identified the third age of KM as the integrated flow of knowledge up and down the hierarchy. This is still a very difficult thing to get right, as is profoundly illustrated by Christopher Burns in his book "Deadly Decisions: How False Knowledge Sank the Titanic, Blew Up the Shuttle, and Led America into War". Burns mentions several cases where warnings were ignored, downplayed, or rationalised away completely. He cites many high-profile examples:
- Multiple warnings, often very detailed, that Al Qaeda was planning a major assault, using aircraft, within the USA
- Repeated warnings that the O-rings on the Challenger shuttle were at risk at low temperatures (the same O-rings that failed at low temperature, with catastrophic loss of the shuttle and all crew)
- Warnings that the Titanic was steaming into an icefield
- Warnings that the cooling water system on the Three Mile Island plant was faulty, and might lead plant engineers to make decisions that could lead to melt-down
However, if an organisation is to avoid disaster, it must be very sensitive to warnings. Warnings cannot be filtered out or ignored if we want to avoid our own versions of the Titanic, 9/11, the Enron collapse, the Challenger disaster, or Three Mile Island. The knowledge supply chain must carry these warnings faithfully and accurately. Burns says that
"Warnings are a special class of dissonant information, and they are difficult to heed for three reasons. First, warnings .... often come from people deep within the organisation who have few credentials and are often hard to understand. Second, they contain a prediction about the future based on facts, values and concepts which might be different from those of the listener. It is important for the person giving the warning to remove as many of these obstacles as possible. And third, there's a pathology of giving and receiving warnings that needs to be overcome."

He describes this pathology as follows: the warner, anxious to get the message across and worried that the "warnee" will not listen, tends to overstate the danger. The warnee gets used to these overstatements and discounts the significance of the message, which prompts the warner to even greater exaggeration. Burns says the only way around this is to lay out the facts for the warnee and let them connect the dots themselves. The end result is that warners find warning to be exhausting, confrontational and career-threatening. Many of the people Burns identifies as having tried to deliver warnings either lost their jobs or retired soon afterwards.
So to allow warnings to reach the decision-making layer, we need:
- an openness at senior level to dissonant voices and to the "weak signals" of warnings (perhaps using an analysis function specifically to look for these)
- a knowledge supply chain that is as short as possible, either through a flat information hierarchy, or through the sort of cross-hierarchy knowledge-sharing events that Nancy Dixon describes
- to reward warners rather than punish them, much as people in safety-conscious cultures are now encouraged and rewarded for identifying near misses or unsafe conditions. In a safety context, people are encouraged to warn, and a lack of warnings is seen as a sign that something has gone wrong with the system. We need a similar approach to warnings in all areas: not just safety warnings, but warnings of changes in the market, warnings of inefficient processes, warnings of complacency and of obsolete thinking.