Friday, 19 January 2018

Is Learning from Failure the worst way to learn?

Is learning from failure the best way to learn, or the worst?

Classic Learning by Alan Levine on Flickr
I was driven to reflect on this when I read the following quote from Clay Shirky:

"Learning from experience is the worst possible way to learn something. Learning from experience is one up from remembering. That's not great. The best way to learn something is when someone else figures it out and tells you: "Don't go in that swamp. There are alligators in there."
Clay thinks that learning from (your own bad) experience is the worst possible way to learn, but perhaps things are more complex. Here are a few assertions.

  • If you fail, then it is a good thing to learn from it. Nobody could argue with that!
  • It is a very good plan to learn from the failures of others in order to avoid failures of your own. This is Clay's point: learning only from your own failures is a poor approach if you can instead learn from others. Let them fail, so you can proceed further than they did. 
  • If you are trying something new, then plan for safe failure. If there is nobody else to learn from, then you may need to plan a fail-safe learning approach. Run some early stage prototypes or trials where failure will not hurt you, your project, or anyone else, and use these as learning opportunities. Do not wait for the big failures before you start learning.
  • Learn from success as well. Learn from the people who have avoided all the alligators, not just from the people who got bitten. And if you succeed, then analyse why you succeeded and make sure you can repeat the success.
  • Learning should come first, failure or success second. That is perhaps the worst thing about learning from experience - the experience has to come first. In learning from experience "the exam comes before the lesson." Better to learn before experience, as well as during and after.  

Learning from failure has an important place in KM, but don't rely on making all the failures yourself. 


Thursday, 18 January 2018

What did Jack Welch say about KM?

It's always interesting to see what CEOs say about KM. Here's Jack Welch.


Jack Welch, the famous CEO of General Electric, did an excellent job of summarising the business vision for Knowledge Management when he made this statement in the GE 1996 annual report.
“Our behaviour is driven by a fundamental core belief; the desire and the ability of an organisation to continuously learn from any source, and to rapidly convert this learning into action, is its ultimate competitive advantage”
That's a really clear description of Knowledge Management (continuously learning from any source, and rapidly converting this learning into action), and a great link with the business driver (ultimate competitive advantage).

Is "Ultimate competitive advantage" a bit strong? Are there other advantages, such as market strength, or brand leadership?

Probably not - even organisations that have been in uniquely competitive positions still need to focus on knowledge management. Knowledge management is big at Microsoft, despite their near monopoly in the business software and operating environment market. De Beers have had a near monopoly of the world diamond market, but have also invested in knowledge management. You need to manage knowledge to be competitive, and to remain competitive, and to keep your monopoly in a rapidly changing world.

So if you are asked about the value Knowledge Management brings, then remember Jack Welch. KM is not about better taxonomies, or a new portal, or about "getting people to be more social". It's about building a company that continuously learns, from any source, and turns that knowledge rapidly into action.

Learning - actionable learning - is the only continuously renewable business resource, and therein lies the ultimate competitive advantage that Jack Welch was talking about.

Wednesday, 17 January 2018

The new ISO KM Standard explained

The video webinar below is from Judy Payne, one of my co-members on the committee to develop the new ISO KM standard.


Judy delivered this last week for the Association of Project Managers; in it she explains the history of the standard, the thinking behind it, and some of the key messages it carries.





Tuesday, 16 January 2018

What is a knowledge product?

The concept of a Knowledge Product is a common one in the development sector, and is used as a label for many types of document.  But what makes a product a "knowledge product"?

Many organisations working in the development sector create what they call "Knowledge Products". The African Development Bank, for example, has a whole suite of economic briefs, working papers and economic reports published under the heading of "knowledge products". These are written by specialists in the Bank, for the education or reference of future Bank programs and for wider society. The main mechanism of "knowledge transfer" for these products is "dissemination" - publishing reports to target audiences, often on web-hosted repositories.

Other organisations are looking at the same topic, and wondering if Knowledge Products could be defined as a project output, for example.

But what does the "Knowledge" in the term mean? Are these reports "products of knowledge" or are they "products that aim to transfer knowledge to the user"?

If we are to use knowledge products as a component of a KM Framework, then surely they must follow the second definition, not the first?

Dr. Serafin Talisayon, in his lecture notes, suggests that

A knowledge product is something that enables effective action by an intended user, client or stakeholder of a government agency or a non-government or development organization.

This is the second definition. A Knowledge Product must carry knowledge, and must enable action by the reader (knowledge is, after all, the ability to take effective action).  It must be actionable.


  • Therefore a project report is not a knowledge product. Even if it contains a detailed history of the project, the reader does not know whether to copy this history or not. 
  • A lesson learned report IS a knowledge product, provided the lessons it contains are clear and actionable. 
  • An economic summary of a region is arguably not a knowledge product. One could read the report, but still not know what action to take. 
  • A summary of best practice, or recommended practice, is a knowledge product, provided that the description of the best practice is detailed enough, and provides enough contextual background, that people can act on it in their own context.  

We can see in this list from ITAD several knowledge products - lessons, findings, insights. The suite described above from the African Development Bank seems to use a looser definition of Knowledge Product, mixing in a whole variety of reports - lessons reports alongside briefs and working papers.

If we are to follow Dr Talisayon's definition, then writing Knowledge Products requires a lot of care if they are to be usable and used. To quote this blog from the World Bank,

KM should be conceived less as a purely technical information-based area and more as a communication and behaviour-change area ... Knowledge producers need to package the product in a way that can be easily applied, while the users need to be “persuaded” to conceive knowledge as a practical tool that can be applied in their field.

If we are to translate the concept of Knowledge products from the Development sector to the Industrial sector, we would do well to bear this in mind, and use the term "Knowledge Products" only for items that are expressly written to convey knowledge, with the user in mind.

So a set of project reports on a website is not a collection of knowledge products. A wiki containing guidance, good practice and lessons is a knowledge product.  In an ideal world, every project should produce knowledge products which are used to grow and evolve the knowledge base in the organisation.

Knowledge products, if treated the right way, can be a core component of a KM Framework. 


Monday, 15 January 2018

5 reasons why organisations don't learn lessons.

If lesson learning is so simple, why do organisations so often fail to learn the big lessons?

We seem to be able to learn the little lessons, like improving small aspects of projects, but the big lessons seem to be relearned time and time again. Why is this?

Some of the answers to this question are explored in the article "Lessons We Don’t Learn: A Study of the Lessons of Disasters, Why We Repeat Them, and How We Can Learn Them" by Amy Donahue and Robert Tuohy. In this article they look at lessons from some of the major US emergency response exercises, and find that many of them are repeated time and again.

In particular, repeated lessons are found in the areas of

  • Failed communications
  • Uncoordinated leadership
  • Weak planning
  • Resourcing constraints
  • Poor public relations

In fact these lessons come up so often that staff in disaster response exercises can almost tell you in advance what is going to fail.  People know that these issues will cause problems, but nobody is fixing them.  

You could draw up a similar list for commercial projects and find many of the same headings, with the possible addition of issues such as
  • Scoping and scope control
  • Subcontracting
  • Pricing
Donahue and Tuohy explore why it is so difficult to learn about these big issues, and come out with the following factors:

  1. Lack of motivation to fix the issues. As Donahue and Tuohy explain, 
"Individual citizens rarely see their emergency response systems in action. They generally assume the systems will work well when called upon. Yet citizens are confronted every day by other problems they want government to fix – failing schools, blighted communities, and high fuel prices. Politicians tend to respond to these more immediately pressing demands, deferring investments in emergency preparedness until a major event re-awakens public concern. As one incident commander put it, “Change decisions are driven by politics and scrutiny, not rational analysis.” 
In addition, they identify the sub-issues of an inability to sustain commitment, a lack of shared vision on what to do about the lessons, and an unwillingness of one federal or local body to learn from others.

All of these issues are also seen in commercial organisations. There is a reluctance to make big fixes if it's not what you are being rewarded for, a reluctance to learn from other parts of the organisation, and difficulties in deciding which actions are valid.

  2. An ineffective lessons capture process. Donahue and Tuohy identify the following points:
"While some (AAR) reports are very comprehensive and useful, lessons reporting processes are, on the whole, ad hoc. There is no universally accepted approach to the development or content of reports... often several reports come out of any given incident... agencies or disciplines write their own without consulting each other. These reports differ and even conflict ... there is no independent validation mechanism to establish whether findings and lessons are “right" ... concern about attribution and retribution is a severe constraint on candour in lessons reporting ...  the level of detail required to make a lesson meaningful and actionable is lost ... meaning is also diluted by the lack of a common terminology ... AARs typically focus on what went wrong, but chiefs want to know what they can do that is right. Reports tend to detail things that didn’t work, without necessarily proposing solutions. ... those preparing the reports need to understand not only what happened, but also why it happened and what corrective action would have improved the circumstances. Reports of this depth and quality are relatively rare ... many opportunities to learn smaller but valuable lessons are foregone (and) there is no mechanism by which these smaller lessons can be easily reported and widely shared". 
That's quite a list, and again we can see these issues in industry as well. Lesson learning crucially needs a consistent, candid and well-structured capture process, one that records enough detail and context to make lessons meaningful and actionable.

  3. An ineffective lessons dissemination process. Donahue and Tuohy make the following points:
"The value of even well-crafted reports is often undermined because they are not distributed effectively. Most dissemination is informal, and as a result development and adoption of new practices is haphazard. Generally, responders must actively seek reports in order to obtain them. ... There is no trusted, accessible facility or institution that provides lessons learned information to first responders broadly, although some disciplines do have lessons repositories. (The Wildland Fire Lessons Learned Center and the Center for Army Lessons Learned are two prominent examples.)"
In fact, the Wildland Fire lessons center and the Center for Army Lessons Learned represent good practice (not just in technology, but in resourcing and role as well) and are examples that industry can learn from. However the issue here is not just dissemination of lessons, but synthesis of knowledge from multiple lessons - something the emergency services generally do not do.

  4. An ineffective process for embedding change. Donahue and Tuohy address this under the heading of "learning and teaching".
"Most learning and change processes lack a formal, rigorous, systematic methodology. Simplistically, the lesson learning and change process iterates through the following steps: Identify the lesson > recognize the causal process > devise a new operational process > practice the new process > embed/institutionalize and sustain the new process.  It is apparent in practice that there are weaknesses at each of these steps....
The emergency response disciplines lack a common operating doctrine.... Agencies tend to consider individual incidents and particular lessons in isolation, rather than as systems or broad patterns of behavior. ... Agencies that do get to the point of practicing a new process are lulled into a false sense that they have now corrected the problem. But when another stressful event happens, it turns out this new process is not as firmly embedded as the agency thought ... Old habits seem “safer,” even though past experience has shown they do not work. 
Follow-up is inadequate ... Lessons are not clearly linked to corrective actions, then to training objectives, then to performance metrics, so it is difficult for organizations to notice that they have not really learned until the next incident hits and they get surprised".
This is the issue of lesson management, which represents Stage 2 of lesson learning maturity. Many organisations, such as the ones involved in emergency response, are stuck at stage 1. Lesson management involves tracking and supporting lessons through the whole lifecycle, from identification through to validated and embedded change.

There really is little point spending time collecting lessons if these lessons are then not managed through to resolution.
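
To make lesson management concrete, below is a minimal sketch of how lessons might be tracked through that lifecycle so that stalled lessons become visible rather than quietly forgotten. It is purely illustrative (Python; the stage names follow the steps quoted above, and the 90-day threshold is an assumption, not anything from Donahue and Tuohy):

```python
from dataclasses import dataclass, field
from datetime import date

# Lifecycle stages, following the steps quoted above.
STAGES = [
    "identified",
    "cause recognised",
    "new process devised",
    "new process practised",
    "embedded and sustained",
]

@dataclass
class Lesson:
    title: str
    owner: str                  # who is accountable for driving the lesson to closure
    stage: str = STAGES[0]      # current position in the lifecycle
    last_updated: date = field(default_factory=date.today)

    def advance(self) -> None:
        """Move the lesson to the next lifecycle stage and record when it moved."""
        i = STAGES.index(self.stage)
        if i < len(STAGES) - 1:
            self.stage = STAGES[i + 1]
            self.last_updated = date.today()

def stalled(lessons, max_days_in_stage=90):
    """Return lessons that have sat in one stage too long and need chasing."""
    today = date.today()
    return [l for l in lessons
            if l.stage != STAGES[-1]
            and (today - l.last_updated).days > max_days_in_stage]
```

A report from a function like `stalled()` is the sort of follow-up mechanism Donahue and Tuohy find missing: it links each lesson to an owner and makes it obvious when a lesson has been identified but never embedded.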

  5. A lack of dedicated resources. Donahue and Tuohy again - 
"Commitment to learning is wasted if resources are not available to support the process. Unfortunately, funds available to sustain corrective action, training, and exercise programs are even leaner than those available for staff and equipment".

Lesson-learning and lesson management need to be resourced. Roles are needed such as those seen in the US Army and the RCAF, or in Shell, to support the process.  Under-resourcing lesson learning is a major reason why lesson learning so often fails.

Conclusions.

Donahue and Tuohy have given us some sobering reading, and provided many reasons why lesson learning is not working for disaster response. Perhaps the underlying causes are a failure to treat lesson learning as a system rather than as a product (i.e. a report with documented lessons), and a failure to treat lesson learning with the urgency and importance it deserves.

Make no mistake, many commercial organisations are falling into the same pitfalls that Donahue and Tuohy describe.


If learning lessons is important (and it usually is), then it needs proper attention, not lip service.



Friday, 12 January 2018

The 11 steps of FEMA's lesson capture process

The US Federal Emergency Management Agency (FEMA) has a pretty good process for capturing and distributing lessons. Here are the 11 steps.


Every Emergency Services organisation pays close attention to lesson-learning (see, for example, the approach taken by the Wildland Fire Service). They know that effective attention to learning from lessons can save lives and property when the next emergency hits.

The lesson learning system at FEMA was described in an appendix to a 2011 audit document, which showed the following 11 steps in the process for moving from activity to distributed lessons and best practices. Please note that I have not been able to find a more recent description of the process, which may have changed in the intervening 7 years.


FEMA Remedial Action Management Program Lessons Learned and Best Practices Process
  1. Team Leader (e.g., Federal Coordinating Officer) schedules after-action review
  2. After-action review facilitator is appointed
  3. Lesson Learned/Best Practice Data Collection Forms are distributed to personnel
  4. Facilitator reviews completed forms
  5. Facilitator conducts after-action review
  6. Facilitator reviews and organizes lessons learned and best practices identified in after-action review
  7. Facilitator enters lessons learned and best practices into the program’s database
  8. Facilitator Supervisor reviews lessons learned and best practices
  9. Facilitator Supervisor forwards lessons learned and best practices to Program Manager
  10. Program Manager reviews lessons learned and best practices
  11. Program Manager distributes lessons learned and best practices to Remedial Action Managers

This is a pretty good process.
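
One way to make a step list like this operational is to hold the steps as data, so that each after-action review can be ticked off step by step and the next owner is always clear. The sketch below is a hypothetical Python illustration of that idea, not FEMA's actual tooling; the role shown for step 2 is an assumption, since the source only says the facilitator "is appointed":

```python
# The 11 steps above, held as (responsible role, action) pairs.
RAMP_STEPS = [
    ("Team Leader",            "Schedule after-action review"),
    ("Team Leader",            "Appoint after-action review facilitator"),  # role assumed
    ("Facilitator",            "Distribute Lesson Learned/Best Practice data collection forms"),
    ("Facilitator",            "Review completed forms"),
    ("Facilitator",            "Conduct after-action review"),
    ("Facilitator",            "Review and organise lessons learned and best practices"),
    ("Facilitator",            "Enter lessons learned and best practices into the database"),
    ("Facilitator Supervisor", "Review lessons learned and best practices"),
    ("Facilitator Supervisor", "Forward lessons learned and best practices to Program Manager"),
    ("Program Manager",        "Review lessons learned and best practices"),
    ("Program Manager",        "Distribute lessons learned and best practices to Remedial Action Managers"),
]

def next_step(completed_count: int):
    """Given how many steps are already complete, return the (role, action) due next, or None."""
    if completed_count >= len(RAMP_STEPS):
        return None
    return RAMP_STEPS[completed_count]

# Example: with 5 steps complete, the facilitator still owns the next action.
print(next_step(5))  # -> ('Facilitator', 'Review and organise lessons learned and best practices')
```

Holding the process as data also makes it easy to see where an individual review has stalled.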

However, despite this good process, the audit showed many issues, including 
  • a lack of a common understanding of what a good lesson looks like; the examples shown are mainly historical statements rather than lessons, and this example from the FEMA archives has the unhelpful lesson "Learned that some of the information is already available information is available"
  • a lack of consistent application of the after action review process (in which I would include not getting to root cause, and not identifying the remedial action),
  • a lack of use of facilitators from outside the region to provide objectivity, 
  • limited distribution of the lesson output (which has now been fixed, I believe), and 
  • loss of their lessons database when the server crashed (which has also been fixed by moving FEMA lessons to the Homeland Security Digital Library).

So even a good process like the one described above can be undermined by a lack of governance, a lack of trained resources, and poor technology. 

Thursday, 11 January 2018

How well distributed is your KM budget?

There are 4 legs on the Knowledge Management table. Are you investing evenly in each of them?


Knowledge Management requires equal attention to the four key enablers of People, Process, Technology and Governance.

The test of whether you truly are paying equal attention is whether your KM program invests equally in these four crucial elements of the KM Framework.

Most of the time, we find that the companies we speak with spend far more on technology than on the other elements, and most of the time we find that their Knowledge Management program suffers as a result. Technology alone will not deliver knowledge management, and an overspend on technology is usually a bad strategic move.

We don't have a lot of statistics on the proportional spend from KM campaigns, but the attached diagram shows the proportions from one program we were involved with, about 12 years ago. In this program we can see
  • 34% of the spend was on Technology (ideally, should have been 25%)
  • 8% of the spend was on Processes (ideally, should have been 25%)
  • 58% was on People and Governance (ideally should have been 50%)
This is a more equal spend than we see in many organisations, but there still seems to be an under-investment in process (process definition, process trial and testing, and training in KM processes).
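
As a quick check of the arithmetic, the gap between this program's actual spend and an equal split can be computed directly. This is a trivial Python sketch using the percentages quoted above; the targets simply encode the "equal attention" assumption (People and Governance are reported together here, so their combined target is 50%):

```python
# Actual spend from the program described above vs an equal four-way split.
actual = {"Technology": 34, "Process": 8, "People and Governance": 58}
ideal  = {"Technology": 25, "Process": 25, "People and Governance": 50}

for area, spent in actual.items():
    gap = spent - ideal[area]
    label = "over" if gap > 0 else "under"
    print(f"{area}: {spent}% actual vs {ideal[area]}% ideal ({abs(gap)}% {label}-spent)")

# Technology: 34% actual vs 25% ideal (9% over-spent)
# Process: 8% actual vs 25% ideal (17% under-spent)
# People and Governance: 58% actual vs 50% ideal (8% over-spent)
```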

We are happy to say that this KM program is still very much alive and well and delivering value 12 years later. A balanced spend seems to have contributed to their success.
