Friday, 24 February 2017

How do pilots stay disciplined in their use of knowledge?

One of the biggest challenges is knowledge re-use. How does the aviation industry address this challenge?



Image from Wikimedia Commons
I often refer to aviation as a successful example of knowledge management, with lessons captured from every accident and incident and provided to pilots in the form of checklists, or shared through sites such as SKYbrary.

But how does the aviation industry address the issue of knowledge re-use? Why don't experienced pilots skip the checklist?

We know that this is a big challenge in other industries, and that experienced doctors, engineers, programmers and consultants often do not re-use knowledge, but rely instead on the knowledge they already have. How do you make sure that experienced airline pilots, with thousands of hours under their belt, stay disciplined and use the checklists, even after they have become routine?


Here are some of the answers, from a Stack Exchange thread on this very question:
  • Because if they skip it and it was not OK they can be fired and lose their license
  • In four years in an airline cockpit I only encountered one person who didn't respect checklists. Perhaps not coincidentally, he did not make it through his probationary year and was fired 
  • Because if they skip it and it was not OK they could DIE
  • On commercial flights, key checklist items are enforced by a procedure called a "cross check": one pilot must "challenge" the other on those specific checklist items. 
  • The cockpit consists of two people, one reading/actioning the checklist, the other one monitoring and checking/cross-checking. If you say: "Nah, no checklist today", your copilot is bound to say: "Sorry, but we have to!" 
  • The importance of a proper preflight is drilled into you by your primary instructor from day one in light aircraft, and that mentality carries through all the way up to heavy transport-category aircraft: You want to find any problems you can while you're on the ground, because if you take a problem into the air with you it's a decision you can quickly come to regret.
  • If you do it often enough, it becomes a habit. It then feels wrong to not run the checklist. 
  • You stay vigilant by having seen things go wrong.
  • Everyone expects everyone else to do the checklists properly and if you don't do it you will get called out. 
  • In an airline environment you will have recurrent checkrides every 6-12 months, and captains will have line checks every year; proper checklist usage is among the most basic requirements to pass these checks.
  • Every year we sit through a day of crew resource management training and part of that day involves looking at past accidents and understanding what the first thing was that set the accident events in motion (pilot error!). These often serve as vivid examples of how bad things can get if you start ignoring the checklists (among other things) 
  • There are a few tricks that are used to stop you falling into that "Yeah, everything will be fine" mindset and just skipping the checks: 
    • Not doing the checks from memory, but actually doing them in reference to a physical check list. 
    • A prescribed order of checks, starting at a point on the aircraft and moving around methodically 
    • The fear of missing something, such as engine oil levels, which gets very serious once airborne. 
    • Passengers, especially nervous ones, tend to feel safer when they've seen you PHYSICALLY checking the aircraft before flying it. 

We can see several factors at work here, including:

  • a logical and emotional case for learning (we might die, our passengers might die, better to fix things on the ground, fear of missing something), 
  • peer pressure from the copilot (you will get called out) and passengers
  • the threat of losing your job if you skip the checks
  • an awareness of what might go wrong (looking at past accidents)
  • training (from day one, and every year)
  • audit (checkrides)
  • physical lists (not relying on memory)
  • logical lists (prescribed order of checks)
  • habit

Many of these can be transferred from the aviation sector into other sectors. You could imagine a company where the re-use of existing knowledge (in checklists or procedures or
other guidance) was mandatory, trained, supported, checked, believed-in (perhaps through regular analysis of failures), audited and habitual.
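
As a concrete (if playful) illustration, here is a minimal sketch in Python of how the challenge-and-cross-check discipline described above might be modelled in software. All names and items are hypothetical, invented for illustration; this is not a real aircraft checklist or avionics system.

```python
from dataclasses import dataclass, field

@dataclass
class ChecklistItem:
    challenge: str           # read aloud by the pilot monitoring
    expected_response: str   # confirmation required from the pilot flying

@dataclass
class Checklist:
    name: str
    items: list[ChecklistItem] = field(default_factory=list)

    def run(self, respond) -> bool:
        """Walk the items in their prescribed order; every item must be
        explicitly confirmed -- nothing is done from memory."""
        for item in self.items:
            answer = respond(item.challenge)
            if answer != item.expected_response:
                print(f"Cross-check failed on {item.challenge!r}: got {answer!r}")
                return False   # stop and fix the problem on the ground
        return True

# A much-abbreviated, invented example
before_start = Checklist("Before start", [
    ChecklistItem("Mixture", "Rich"),
    ChecklistItem("Fuel valve", "Open"),
])

# The callable stands in for the second crew member's responses
responses = {"Mixture": "Rich", "Fuel valve": "Open"}
print(before_start.run(responses.get))   # True: every item was cross-checked
```

The shape of the structure makes the pilots' points: the list is physical (explicit data, not memory), the order is prescribed, and no item can be skipped without the "other pilot" noticing.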

I agree this is a long way from where many of us are at the moment, but it is a vision for how one industry supports the re-use of knowledge.

To finish, here is a personal story from the Stack Exchange thread, showing how one person re-learned the importance of checklists:
  •  In my case, I learned the discipline to use a checklist for every action on every flight the one time I decided not to use a checklist while taxiing from the fuel pump back to the parking ramp. It was winter, and a snowplow pulled up behind me, so I decided not to use the checklist in the interest of expediency (ha!). I primed the engine, checked the fuel valve, engaged the electrical system, keyed the starter, and the engine responded by firing up and then immediately dying. Repeat about a half-dozen times, at which point, I finally decided to use the checklist because something obviously wasn't right. Once again, primed the engine, checked the fuel valve, move the mixture to the rich posi-- Oh...I had left the mixture in the idle-cut-off position. Oops. I've used a checklist religiously ever since then ;)


Thursday, 23 February 2017

Validation in Knowledge Management

Any knowledge management framework needs to address the issue of Knowledge Validation.


I have an old video of Professor John Henderson, where he says "every Knowledge Management system I have seen addresses the issue of Validation".

Validation means a process to say "this is good quality knowledge. It's not opinion, or conjecture; it is justified and valid. It has the stamp of approval. We can trust it".

If Validation is important, and I believe it is important, then who validates? Who "signs off" on Knowledge?

There are multiple approaches to this:

  1. Validation by an individual, where a single person has validation sign-off. You see this in organisations where people with technical authority sign off on standard procedures.
  2. Validation by a group of individuals. Think of the aviation industry, where new knowledge gained through incident investigation is used to update pilot checklists, but only after sign-off by the manufacturer and the aviation authorities.
  3. Validation by a community of practice. Here a community of practice collaboratively agrees (perhaps through the use of online discussion and/or a wiki) on the validity of knowledge. 
  4. Validation through experience and use. Again you might expect a wiki to be self-correcting, as the knowledge is validated through use. However, you need to make sure that the knowledge is based on evidence, not opinion, preference, prejudice or hearsay. We hear a lot about "evidence-based policy" in Government, or "evidence-based healthcare" in the medical world, and often an individual or group is accountable for reviewing the evidence, which moves us back to option 1 or 2 again.
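
Here is a minimal sketch in Python of how the first three of these routes might be modelled as a validation gate. The route names, sign-off labels and 60% threshold are hypothetical, purely for illustration; the fourth route, validation through use, happens gradually over time rather than at a single gate, so it is not modelled here.

```python
from enum import Enum

class ValidationRoute(Enum):
    INDIVIDUAL = "individual"      # option 1: one accountable person signs off
    EXPERT_GROUP = "expert group"  # option 2: all named experts must sign off
    COMMUNITY = "community"        # option 3: the community of practice votes

def is_validated(route, signoffs=frozenset(), required_experts=frozenset(),
                 votes=(0, 0), threshold=0.6):
    """Return True if a knowledge asset counts as validated under the
    chosen route. 'votes' is a (yes, total) pair for the community route."""
    if route is ValidationRoute.INDIVIDUAL:
        return len(signoffs) >= 1                 # single-point accountability
    if route is ValidationRoute.EXPERT_GROUP:
        return required_experts <= set(signoffs)  # every expert has signed off
    if route is ValidationRoute.COMMUNITY:
        yes, total = votes
        return total > 0 and yes / total >= threshold
    return False

# Option 2, aviation-style: a checklist change needs every required sign-off
print(is_validated(ValidationRoute.EXPERT_GROUP,
                   signoffs={"manufacturer", "aviation authority"},
                   required_experts={"manufacturer", "aviation authority"}))  # True
```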

The more we see Knowledge as being community property rather than the property of any one individual, the trickier the issue of validation becomes. Here are some thoughts about which approach to choose under varying circumstances.

Firstly, there is the issue of the importance of the knowledge itself. The more important it is, the more of a life-and-death issue it is, the more validation becomes the province of a single person or a small group of experts. This is partly for purposes of accountability. Single-point accountability, in business and government, is the cornerstone of good governance and, ultimately, good performance. Without single-point accountability for processes, an organisation has no means of ensuring that the goals it has set are likely to be met. So let's imagine knowledge of Nuclear Power Plant construction. If you want good performance in Nuclear Power Plant construction, then validation of the knowledge requires single-point accountability: one person must sign off, generally using a Community of Practice or a smaller Community of Experts as an advisory board.

If the knowledge is of lesser importance, then the Community of Experts can take collective accountability and validate the knowledge themselves. If single-point accountability is not so important, then let the group decide.

Where experience and evidence are widespread and dispersed, a Community of Practice may be the better validation mechanism. You could give the users of the knowledge some sort of "voting rights", so they can vote on what is useful and what is valid; you would end up with a CoP validation process. This is only possible when the community members are experienced enough to be able to validate. Take a community of amateur bakers validating the best recipe for Victoria Sponge: a community voting process to define the best recipe would be very effective. In other cases the CoP members may be largely inexperienced, and lack the capability to make effective validation judgments. A poll of tabloid newspaper readers regarding the validity of newspaper horoscopes might not give an answer consistent with scientific study, for example.

So when you are addressing the validation issue, ask yourself who knows enough to validate, who can weigh up the evidence, and whether there needs to be single point accountability.


Wednesday, 22 February 2017

The difference between lessons and best practice - another post from the archives

Here is another post from the archives - this time looking at the difference between Best Practice and Lessons Learned.


Someone asked me last week what the difference is between Best Practice and Lessons Learned.

Now I know that some KM pundits don't like the term "Best Practice", as it can often be used defensively, but I think there is nothing wrong with the term itself; used well, Best Practice can be a very useful concept within a company. So let's dodge the issue of whether Best Practice is a useful concept, and instead discuss its relationship to lessons learned.

My reply to the questioner was that Best Practice is the amalgamation of many lessons, and it is through their incorporation into Best Practice that they become learned.

If we believe that learning must lead to action, that lessons are the identified improvements in practice, and that the actions associated with lessons are generally practice improvements, then it makes sense that as more and more lessons are accumulated, so practices become better and better.

A practice that represents the accumulation of all lessons is the best practice available at the time, and a practice that is adapted in the light of new lessons will only get better.
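
Purely to make that relationship concrete, here is a small Python sketch (the structures are hypothetical, invented for illustration): a practice modelled as the running accumulation of its lessons, where a lesson only counts as "learned" once the practice itself changes.

```python
from dataclasses import dataclass, field

@dataclass
class Lesson:
    finding: str
    practice_change: str   # the action: an identified improvement in practice

@dataclass
class Practice:
    name: str
    steps: list[str]
    lessons_learned: list[Lesson] = field(default_factory=list)

    def incorporate(self, lesson: Lesson) -> None:
        """The lesson becomes 'learned' by changing the practice."""
        self.steps.append(lesson.practice_change)
        self.lessons_learned.append(lesson)
```

The best practice available at any time is simply the Practice object after all known lessons have been incorporated.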


Tuesday, 21 February 2017

12 steps to KM implementation

One of the more effective ways to introduce Knowledge Management is through solving a series of business problems. Here is a 12 step approach to doing just that.


Image from wikimedia commons

I came across this paper recently by Ray Dawson, professor of KM at Loughborough University, proposing a 12-step approach to KM implementation based on the successive solution of a series of business problems. Ray illustrates his 12 steps with case studies of implementation of KM technologies.

A similar approach was implemented at Mars to great effect, and is one component of our favoured KM implementation strategy alongside a longer-term framework-based approach.

Here are Ray's 12 steps.


  1. Undertake a problem audit to identify a recognised problem. Start by targeting a problem in the company that is manageable in size, knowledge-related, and also widely recognised by all staff concerned. 
  2. Find out how bad the problem is. If there is a problem then there will be a cost as a result, and these cost figures give a baseline upon which a return on investment for KM can be measured (see the worked sketch after this list). 
  3. Find a knowledge management solution in the context of the problem. 
  4. Check the cost of the proposed solution so you can calculate a business case.
  5. Check the value for each individual. A new knowledge management initiative has two stakeholder groups: it must have financial benefits for the company, but it must also give value to each individual employee who must make it work. 
  6. Get buy-in from management and individuals based on the business case for the identified problem alone. The knowledge management initiative must be “sold” to both management and users. 
  7. Involve the users in the solution. Users should be involved in the requirements process and design of the new KM approach. 
  8. Plan for systems operation as well as the implementation. Neglecting to plan for the operation of a system is likely to mean any initial success cannot be sustained. 
  9. Implement the solution. 
  10. Evaluate the actual savings made. You can find many examples of such savings on this blog.
  11. Use the evidence of success to achieve a wider KM rollout and to get buy-in for new initiatives. This is the social proof aspect we discuss here. The solved problems act as proofs of concept and pilot projects for the wider initiative.
  12.  Use smaller knowledge management projects to build bigger projects.  Large projects can be broken down into smaller projects that can each be implemented with the first 11 steps of this knowledge management implementation methodology. In this way the company can work towards the larger integrated system to which it aspires. 
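
To illustrate steps 2, 4 and 10, here is how the baseline cost, the solution cost and the measured savings might be turned into a simple return-on-investment figure. The numbers are invented purely for the sake of the sketch.

```python
# Illustrative, invented figures -- the shape of the calculation is the point.
problem_cost_per_year = 250_000      # step 2: baseline cost of the problem
solution_cost = 60_000               # step 4: cost of the proposed KM solution
measured_savings_per_year = 180_000  # step 10: actual savings once implemented

roi = (measured_savings_per_year - solution_cost) / solution_cost
print(f"First-year return on investment: {roi:.0%}")  # 200%
```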

This is a very practical approach to KM implementation, which we entirely endorse here at Knoco. The smaller problem-led pilots help build the KM structure brick by brick, and lead you towards implementation of a complete Knowledge Management Framework.

Monday, 20 February 2017

Knowledge Management: eliminating the Archaeology from projects

A client said to me last week, apropos of Information Architecture, that "up to 40% of Architecture is Archaeology". Knowledge Management can help address that 40%.


Image from Wikimedia Commons

By the statement "up to 40% of Architecture is Archaeology" he meant that up to 40% of the effort on an Information Architecture project is spent digging around trying to find out why systems are built the way they are.

IT systems are often built up piece by piece, revision by revision. Each revision project may be fully documented, with the documents filed away somewhere, but often nobody keeps track of the overall design and the overall rationale behind the system. That high-level knowledge is lost, buried below layers upon layers of revisions. Hence the need for archaeology. Each systems architect working on a new revision has to dig through the files of multiple projects to understand how everything hangs together, like an archaeologist sifting through the artifacts of a lost civilisation.

And then, in the absence of a Knowledge Management framework, the knowledge they have compiled through this archaeology is lost again; as if the archaeologist discards her findings and buries them again.

It is not just Information Architecture that suffers from this problem. If knowledge is not stored and shared, then every project needs to do some digging to find out what has been done before, and why.

A simple knowledge management solution, perhaps a wiki, would allow the rationale behind the system to be recorded, and then updated with each iteration. All key documents could be linked to the wiki, and the wiki can evolve over time as the system evolves.
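
As a minimal sketch of what one such wiki record might hold (the fields and the example content below are entirely hypothetical):

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class RationaleEntry:
    """One record of WHY part of the system is built the way it is,
    captured while fresh instead of re-excavated by the next architect."""
    title: str
    decided_on: date
    rationale: str
    revision: str          # which revision project introduced the change
    documents: list[str] = field(default_factory=list)  # links to key documents

system_rationale = [
    RationaleEntry(
        title="Orders held in a separate service",
        decided_on=date(2017, 2, 20),
        rationale="Order volume spikes independently of the rest of the system.",
        revision="Revision 7",
        documents=["wiki/orders-service", "docs/revision-7-design.pdf"],
    )
]
```

Whether this lives in a wiki, a document or a database matters less than the habit of updating it with every iteration.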

A simple KM solution such as this could save up to 40% of project time. It just requires people to work in a different way, and to share the results of their archaeology so the next person does not have to start again from the beginning.

Find out how much archaeology your projects need to do - this could form a good value proposition for Knowledge Management.

Friday, 17 February 2017

The role of Asking in Knowledge Management

Most knowledge sharing in our private lives is driven by Asking. Let's use this in work as well.


Think about the last time you shared knowledge with one of your friends or family. Maybe it was this morning, or yesterday - maybe you shared advice, a tip or hint, or something you had found out that the other person did not know.

I bet you shared this knowledge because you were asked.


  • "Where are the car keys?"
  • "What's the weather going to do today?"
  • "Are you doing anything tonight"?
That's the way that private knowledge sharing seems to work; it follows the three rules below.


When do we share?  Most often, when we are asked

Who do we share with? People who ask us

What is preventing us from sharing? Often, nobody is asking (see here to understand how to tell when sharing is broken)

So how do we take these principles into the workplace?


There are several ways in which you can introduce Asking as part of a Knowledge Management framework.

The first obvious example is in Communities of Practice. The most important and powerful role of CoPs is to provide a forum where members can ASK questions of their peers. The forum allows the person who actually needs the knowledge to ask directly, and the answer comes from the members with knowledge to share. Communities access the long tail of knowledge, and communities work better with a large element of "Knowledge Pull".

The second case is in After Action Reviews. Here someone in the team, such as the team leader, ASKS a series of 5 questions to elicit the knowledge of the team. This knowledge will be used by the same team to improve their practices, so the knowledge providers and knowledge users are the same team.

The third example is in end-of-project Retrospects. Here the questioning is led by an experienced external facilitator. The process is an asking process - structured, quality assured, and aimed at answering (in advance) the likely questions from future projects.

Asking is the most powerful way to drive knowledge transfer - Pull is more powerful than Push.

Thursday, 16 February 2017

Why you need to place some demands on the knowledge sharer

Sharing knowledge is a two-sided process. There is a sharer and a receiver. Be careful that making knowledge easier to share does not make knowledge harder to re-use.

Image from Wikimedia Commons
Sharing knowledge is like passing a ball in a game of rugby, American Football or basketball. If you place no demands on the thrower to throw well, the catcher has no chance; make throwing too undemanding, and the ball becomes too hard to catch. Passing the ball is a skill, and it needs to be practised.

The same is true for knowledge. If you make it too simple to share knowledge, you can make it too difficult to find and re-use. In knowledge transfer, sharing is the easier half of the process: there are more barriers to understanding and re-use than there are to sharing, so if you make the burden too light on the knowledge supplier, the burden on the knowledge user can become overwhelming.

Imagine a company that wants to make it easy for projects to share knowledge with other projects. They set up an online structure for doing this, with a simple form and a simple procedure. "We don't want people to have to write too much," they say, "because we want to make it as easy as possible for people to share knowledge."

So what happens? People fill in the form with the bare minimum: they give no context, they don't tell the story, they don't explain the lesson. As a result, almost none of these lessons are re-used. The feedback they get is "these lessons are too generic and too brief to be any use". We have seen this happen many, many times.

By making the knowledge too easy to share - by demanding too little from the knowledge supplier - you can make the whole process ineffective. 

There can be other unintended consequences as well. Another company had exactly the situation described above: a new project enthusiastically filled in the knowledge transfer form with 50 lessons. However, this company had put in place a quality assurance system for lessons, and found that 47 of the 50 were too simple, too brief and too generic to add value. So they rejected them.

The project team in question felt, quite rightly, that there was no point in spending time capturing lessons if 94% of them were going to be rejected, so they stopped sharing. They became totally demotivated when it came to any further KM activity.

 Here you can see some unintended consequences of making things simple. Simple does not equate to effective.

Our advice to this company was to introduce a facilitation role in the local Project Office: someone who could work with the project teams to ensure that lessons are captured with enough detail and context to be of real value. With this approach each lesson is quality-controlled at source, and there should be no need to reject any lessons.
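
To make the facilitation idea concrete, here is a minimal sketch of a quality check applied at the point of capture. The field names and the word-count threshold are hypothetical; a real facilitator would apply far more judgement than this.

```python
REQUIRED_FIELDS = {
    "context": "What was the project, phase and situation?",
    "story": "What actually happened, in enough detail to follow?",
    "root_cause": "Why did it happen?",
    "recommendation": "What should the next project do differently?",
}
MIN_WORDS = 20   # an arbitrary 'enough detail to be re-usable' threshold

def facilitator_review(lesson: dict) -> list[str]:
    """Return the issues a facilitator would raise with the team, so each
    lesson is improved at source rather than rejected downstream."""
    issues = []
    for name, prompt in REQUIRED_FIELDS.items():
        text = lesson.get(name, "").strip()
        if not text:
            issues.append(f"Missing '{name}': {prompt}")
        elif len(text.split()) < MIN_WORDS:
            issues.append(f"'{name}' is too brief to be re-usable")
    return issues

print(facilitator_review({"context": "Phase 2 commissioning"}))
```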

Don't make it so simple to share knowledge that people don't give enough thought to what they write.

The sharer of knowledge, like the thrower of the ball, needs to ensure that the messages can be effectively passed to the receiver, and this requires a degree of attention and skill. 
