Monday, 26 October 2015

Short break

Sunset Mauritius, by Matthias Ott

This blog is on holiday for a week


Normal service will be resumed soon

Friday, 23 October 2015

3 cultures of management and how they affect KM

In his 1996 article “Three Cultures of Management: The Key to Organizational Learning in the 21st Century”, Edgar H. Schein of the MIT Sloan School of Management describes three cultures of management that he believes are prevalent in organisations.


Classic John Cleese sketch
The walls between these cultures can form impenetrable barriers for knowledge management and the flow of knowledge.

His three cultures are as follows:
  • The Operator Culture evolves locally in organizations and within operational units. It is based on human interaction and high levels of communication, trust and teamwork. The focus is on getting the work done efficiently. Operators know that the world is unpredictable and that sometimes you have to use your own innovative skills to bend the rules.
  • The Engineering Culture is found in the designers of products and systems. The engineers are driven by utility, elegance, efficiency, safety. They are looking for systems that work well, all the time, and ideally without operator intervention.
  • The Executive Culture is the set of shared tacit assumptions that CEOs and their immediate subordinates share. This executive world view is built around the necessity to maintain the financial health of the organization. Executives think in terms of control systems and routines which become increasingly impersonal. Executives feel an increasing need to know what is going on, while recognizing that it is harder and harder to get reliable information, which drives them to develop elaborate information systems alongside the control systems.
I was reminded of this distinction recently when talking with a knowledge manager who had attended a meeting at the Institute of Directors, where all the talk had been about corporate governance, sustainability, personal development, endurance, growth and the like. Nothing about knowledge.

There is a good history of effective Knowledge Management within the engineering and operator cultures, and also some history of Knowledge Management involving both operators and engineers. Many of the major projects we have worked with recently have involved operators in the engineering design stage of the project, often with massive benefit.

However there is far less history of the effective use of KM at executive level, or even of KM crossing the boundaries from Engineering to Executive.

Perhaps this is because we too seldom apply KM to the really big strategic issues?

We assume that Knowledge is something that applies to the tactical levels of operators and engineers, and forget that senior managers are knowledge workers as well, albeit working at a strategic level. 

As I have argued in my blog post "Who are the Knowledge Workers", this assumption is wrong.

Senior managers can easily be engaged in KM. The senior-level knowledge capture we did for mergers and acquisitions at BP and Mars, for example, involved the VPs, the chief legal counsel, the company presidents and so on. It is possible to use KM to address issues such as growth, endurance, sustainability and governance. The executive culture may see organisations in a different light, but executives still need to learn.

Set your sights on the senior level. Make a case for KM solving their problems, and helping illuminate their decisions. You will find that this opens many doors for you.

Contact us for advice on strategic-level KM.

Thursday, 22 October 2015

When rewarding knowledge sharing does not work

Do incentives for knowledge sharing actually make any difference?  Here are some studies from the world of law enforcement that may shed some light on the question.


In law enforcement, the authorities are keen for people to share knowledge of criminal activity, and often offer rewards for information.

But how well do those rewards work?

The BBC Crimewatch program has operated for many years without offering any reward, for example.

Also, according to a letter in the Guardian newspaper (UK), the charity Crimestoppers provides the most rewards in Britain for people informing on criminals. Even people reporting crimes anonymously can claim, without their identity being revealed. But the charity says that even with its anonymity selling point "there is a notable year-on-year decrease in rewards being paid out". Hannah Daws, for Crimestoppers, said:
"In 2007/08 less than 1% of informants claimed the reward they are entitled to of at least £1,000. The key motivating factor for people to phone our number is that they are vulnerable and feel trapped."

In 1994, the charity paid out a peak of nearly £121,000 in rewards, but the figure now is a tenth of that, with no less information being offered. 
Daws said: "We have downplayed the rewards in our marketing and concentrated our messages on 'the communities doing the right thing'."

An interesting comparison is the US Crimestoppers programme, where the rewards payout is around 70%. So in the UK it's more about "doing the right thing" and less about the money, whereas in the US it's the opposite? Or are there more complex factors at work?

Internal and external incentives


Here is an interesting study on the effects of incentives on sharing knowledge of illegal activity, which concludes as follows (my emphasis):

 Where levels of moral outrage are expected to be low, financial rewards will likely be a decisive factor, and the inquiry may shift to discovering the true price tag of the reporting behavior. 
For inherently offensive misconduct ... where the informant is expected to have a greater ethical stake in the outcome, regulation must fully appeal to the informant’s sense of duty. This may mean that financial incentives are not only unnecessary but are counterproductive and offset internal motivations to report.

The world of Knowledge Management


How can this translate to the world of KM? I think the answer lies in an extrapolation of the last quote, which we can take to read...

Where sharing knowledge (in this case, knowledge of illegal activity) is seen as the right thing to do (the knowledge provider having an ethical stake in the outcome), then monetary incentives have no effect, and can be counter-productive. 
Where sharing knowledge is seen neither as right nor wrong, then monetary incentives will work.
As this study also says, "To the degree that people are motivated by legitimacy, people cooperate because they feel it is the right thing to do, not because of material gains or losses".

Let us therefore be careful when incentivising knowledge sharing, and look to build a culture where knowledge is shared and reused, because knowledge sharing and reuse is the right thing to do, rather than a way to gain rewards and bounties.

Wednesday, 21 October 2015

Is it time for a Knowledge Management certification standard?

Knowledge Management has appeared as a line item in the Quality Standard ISO 9001:2015. But is it time for a KM Standard?

I think it is.

The reason behind my view is that I have seen so many KM initiatives failing for the same reasons, and a (simple, principles-based) standard would help avoid this waste of time and effort, and also avoid the bad press that "failed-again KM" can generate.

It's not as simple as that though

There are various KM standards already. I was involved with the British Standards Institution Guide for KM, published in 2001; there is an Australian standard for KM published in 2005, and an Israeli standard, published more recently in 2013.

The British and Australian standards have been more “best practice” guides than standards, reflecting the immaturity of the topic at the time. Since then, two things have happened:
a) There has been a longer time for KM to establish itself, and so to allow us to look at the common factors for long term KM success in organizations, 
b) The new clause in the ISO 9001 standard at least makes the point that knowledge must be considered as a resource and managed as such, if an organisation is to achieve the 9001 quality standard.
The value a certification standard can add to the market is to provide a minimum definition of what “managed as such” in the previous sentence entails.  There is some guidance on this in the new 9001 standard, but not enough.

There are other renewed standards efforts under way.

  • There is a committee set up to develop a new British Standard. I am on that committee for my sins.
  • The American Institute of Information Management is working on a KM standard.
  • A working group has been set up to develop an ISO standard for KM, as part of the family of HR standards. I know that HR is not the ideal home for KM, but there is no ideal home for KM in the ISO stable. I am on that working group, for my sins.


Is a certification standard an option at all? 
You can guess, from my membership of the working groups and committees, that I believe it is. When I talk with others who have been working in KM for a long time, I already see a convergence of views at the level of principles. I believe that a common model for KM principles is emergent but not yet synthesised.

I would suggest we know enough now to be able to define – not a standard set of practices – but a standard set of principles for the good management of knowledge. The challenge will be making this generic enough to apply to all scales of organisation and to all industries (Industrial, Legal, public sector, development sector etc etc), but I think this challenge can be met.

By “standard set of principles” I would include things like

I also believe that deriving such a standard, even at principle level, will require a huge amount of time and patience, but at least it has the backing of ISO, and a good committee in place.

 I am not so sure that we can provide a certification standard for KM implementation, and I would personally steer clear of descriptive maturity models, but in terms of creating a standard set of principles which define effective KM and which every organisation can aim for – I believe we know enough now to do this.

Tuesday, 20 October 2015

What sort of KM roles are in use?

KM roles are one of the four legs of the Knowledge Management table, and represent one quarter of the Knowledge Management framework, but what roles do organisations use?

This is one of the things we wanted to find out as part of our Knowledge Management survey. The diagram below shows the results.




We asked people to tell us which of the following roles they had in place in addition to the KM team:


  • Community of practice leader
  • Community of practice facilitator (in addition to leader)
  • Knowledge Management Champions 
  • Knowledge Manager for a department or division
  • Knowledge manager for a specific project
  • "Owner" for a specific knowledge topic
  • Content management support
  • KM Technology support
  • No other roles
  • Other (please specify)

Results

The results are shown in the bar chart above.  The most common roles are related to technology support and content management support, probably reflecting the (unfortunate) emphasis that companies give to technology and content.

More happily, the third most common role is the Community of Practice leader. 80% of the communities also have a CoP facilitator.

The incidence of Knowledge Owners is disappointingly low.

The people who chose "Other role - please specify" specified the following (some of which, to be honest, are KM team roles):
  • "Knowledge Management Champions" are called "Local Learning Leads" and exist one per operational region. 
  • All the work is done under a generic role 
  • Chief Innovation Officer 
  • chief knowledge officer 
  • Competitive Intelligence
  •  Consultant Knowledge Management 
  • COP leader and KM champion are not funded roles 
  • Director of KM 
  • Document Management 
  • Ambassador enterprise content mgmt 
  • Group Knowledge Sharing Manager 
  • Innovation Program 
  • KM Admins 
  • KM Coordinator for the organisation km department 
  • KM Internal Consultant responsible for KM Service 
  • KM officer to support Knowledge Manager 
  • KM Policy Lead
  • KM link to programs 
  • Knowledge Enthusiasts in offices to support regional knowledge champion 
  • Knowledge Management Ambassadors 
  • Leader for "Learning from Experience" (learning about what we are doing well, doing wrong, etc.) 
  • meeting host (rolling position in CoP membership) 
  • Professional Support Lawyer 
  • Research and Analysis roles related to some sort of knowledge management - e.g. audit/quality etc 
  • Senior Knowledge Advisor and Regional KM Team Leaders 
  • The team is being built.

Monday, 19 October 2015

When investigation is implied as criticism

One organisation's learning culture and behaviours can easily be misinterpreted by another. Here is a cautionary tale from General Stanley McChrystal.


The culture of the US Army is always to review, and always to learn. Any operation which offers learning potential, whether it is conducted by a US Army unit or a coalition partner unit, will be investigated so that the learning points can be gathered. This is particularly the case when the unit suffered casualties.

This investigation is routine for the US Army, for successful or unsuccessful missions.  However the coalition partners do not have the same culture.

When the General asked for an investigation of a partner mission in 2006, this was seen as implied criticism. As the General reports -


"In their culture, a senior officer ordering such an investigation was sending a clear message: “I don’t trust your judgment.” As a result, the coalition operators took a hard look at how they were operating and decided they needed to adjust some of their tactics on their next mission. Their commanders thought that, because I ordered the investigation on their previous raid, I wanted to avoid the use of additional firepower (such as airborne assets) at all costs, even if it meant endangering the safety of operators on the ground. This couldn’t have been further from the truth — but it was their interpretation that mattered, not the intent behind the order that sat in my mind".
Therefore on the next mission, the coalition unit refused to ask for airborne assets, based on their assumption that their use had been implicitly criticised, and had a hard time of it as a result.

The lessons for Knowledge Management


General McChrystal derives several personal lessons from this event, mostly about the need to communicate strategic intent so that it is not misinterpreted at tactical level.

However there is one clear lesson for all Knowledge Management initiatives, which is to clarify the intent of any learning activity. This particularly applies to lesson learning, which many organisations interpret to imply criticism, and therefore see lesson capture events as "witch-hunts".

Before conducting any lesson capture event in an organisation (or area of an organisation) where this is not routine, you need to make it very clear that this is not an evaluation exercise; it is not a means to criticise or to apportion praise and blame - it is instead a no-blame process intended only to derive learning for the future.

If you do not make such an intent clear, then (as General McChrystal's story shows), people will draw their own inferences of the intent, often with negative consequences.

Friday, 16 October 2015

Avoiding the forgetting curve - the Basis of Design document

I blogged last week about the forgetting curve, and how easily vital knowledge can be lost from human memory. Here's one way in which the forgetting curve can be countered.

My post had a reply on LinkedIn from Vladimir Riecicky, who wrote:

"A forgetting curve is a very well known paradox in software engineering: A well designed software system does not require much maintenance and software engineers are not busy fixing its bugs. This is a good news for the maintenance budget and at the same time a bad news for the (expensive) engineering knowledge concerning the respective software design - it erodes since software engineers do not have to utilize it. At the end you are kind of lucky to have a certain level of maintenance just to keep the knowledge fresh. A depreciation curve of a software system in general is quite an interesting issue that keeps (clever) CIOs busy".

Not just in software design either -  in the oil sector the concept of the forgetting curve is well recognised. Whenever there is a hiatus of more than a year in drilling activity on an oilfield, performance is markedly poorer when drilling resumes, even when there is a well documented methodology, because the knowledge associated with that methodology has been lost.

In the oil sector, they counteract this with a document that they call the Basis of Design.

The basis of design is a simple document that tells you why an oilwell was planned the way it was. A well design is based on a whole lot of assumptions, many of which eventually turn out to be wrong. Unless you capture these assumptions, you can never understand the basis for the design. The document is written as a pre-cursor to the detailed plan, and then rewritten at the end of the well to capture best-practice thinking on the well design.

The BOD is rewritten after the well is drilled, and the difference between the pre-well and the post-well BoD provides a learning history for the well, and captures the new knowledge gained.

The “basis of design” documents, for each section of the well, “What is the objective of each design element for this section? What are the performance measures”?  This breaks the well plan down into manageable portions, and sets the context for the detailed plan, which will be based on the required objective for each element.
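As a sketch of the idea, the per-section structure of a Basis of Design, and the comparison of pre-well and post-well versions described above, might be modelled as follows. The field names and the comparison rule are illustrative assumptions of mine, not an industry schema:

```python
from dataclasses import dataclass, field

@dataclass
class DesignElement:
    """One design element of a well section: its objective, measures, and assumptions."""
    name: str
    objective: str
    performance_measures: list
    assumptions: list = field(default_factory=list)

@dataclass
class BasisOfDesign:
    """Per-section record of why the well was planned the way it was."""
    well: str
    sections: dict  # section name -> list of DesignElement

def learning_history(pre: BasisOfDesign, post: BasisOfDesign) -> dict:
    """Compare pre-well and post-well BoDs: changed assumptions are the lessons."""
    changes = {}
    for section, post_elems in post.sections.items():
        pre_elems = {e.name: e for e in pre.sections.get(section, [])}
        for elem in post_elems:
            before = pre_elems.get(elem.name)
            if before and before.assumptions != elem.assumptions:
                changes.setdefault(section, []).append(
                    f"{elem.name}: assumed {before.assumptions}, learned {elem.assumptions}"
                )
    return changes
```

The point of the sketch is the diff: the pre-well document records the assumptions, the post-well rewrite records what was actually found, and the difference between the two is the learning history.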

As one drilling engineer said, referring to the Basis of Design for his oilfield  - "I could move to this office and put out a quality well plan within a week based on this document, and there hasn't been a drilling rig here for two years".

The Basis of Design therefore counters the forgetting curve, by capturing the reasoning behind the methodology.

Thursday, 15 October 2015

Quantified KM value story #94 - time savings at Skandia

According to the book "Knowledge Management in Construction", which contains a table of example benefits from KM:


Skandia AFS reduced the time taken to open an office in a new country from 7 years to 7 months by identifying a standard set of techniques and tools which could be implemented in any office. 

Wednesday, 14 October 2015

Great NASA story on checklists

The NASA CKO office publishes a series of "learning from experience" stories on their blog called "my best mistake".  This is a great way to spread the concept that mistakes make great learning opportunities.


Photo from Wikipedia
I really like this story from David Oberhettinger

It's not so much a "mistake" story as a "value of learning" story. David tells how he was in a small plane preparing for landing when...

Suddenly, dense black smoke begins to fill the cockpit. I flip the checklist over and follow the five steps listed on the back under In-Flight Electrical Fire:
  • (1) Master Switch to Off
  • (2) Other Switches (Except Ignition) to Off
  • (3) Close Vents/Cabin Air
  • (4) Extinguish Fire (in this case, I isolated a faulty transponder)
  • (5) Ventilate Cabin
These steps took maybe 90 seconds. Then we descended to an uneventful landing. The crisis hardly caused a significant increase in heart rate, because I just followed the checklist.

I have written before about how checklists are a fantastic way to provide knowledge to the decision maker at the point of need, and there can be no more graphic illustration of the "point of need" than the smoke-filled cockpit of a small aeroplane.
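To make the "knowledge at the point of need" idea concrete, here is a minimal sketch of a checklist registry keyed by situation. The five steps are the ones from David's story; everything else (the registry, the function, the fallback) is an illustrative assumption:

```python
# Illustrative sketch: a checklist registry that delivers knowledge at the
# point of need, keyed by the situation. Steps are from David's story.
CHECKLISTS = {
    "in-flight electrical fire": [
        "Master Switch to Off",
        "Other Switches (Except Ignition) to Off",
        "Close Vents/Cabin Air",
        "Extinguish Fire",
        "Ventilate Cabin",
    ],
}

def checklist_for(situation: str) -> list:
    """Return the ordered steps for a situation, or a fallback prompt."""
    return CHECKLISTS.get(
        situation.lower(),
        ["No checklist found - use judgement, then capture the lesson"],
    )
```

Calling `checklist_for("In-Flight Electrical Fire")` returns the five steps in order; an unknown situation returns a prompt to improvise and then feed the experience back as a lesson.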


However David then goes on to talk about how that checklist ended up in his hand.

  • The formal checklist as a concept derives from a lesson that was learned on October 30, 1935, during a test flight of the B17 bomber prototype. The pilots attempted to take off with the tail wheel locked; this prevented the wheel from swiveling and resulted in a crash and the death of both pilots. From then on, Boeing provided a printed checklist with each production version B17. (Previously, pilots were expected to make their own checklists.) 
  • Today, all airplane manufacturers provide a pilot’s handbook containing checklists specific to the model of plane.

Therefore, when you buy the plane, you buy the knowledge of what to do in any form of emergency. This is an excellent example of a "knowledge supply chain", which has become standard practice in the aviation sector because of the life-saving value of the knowledge.

As David concludes
Why do I love checklists? Because a checklist helped avert what could have been some serious unpleasantness. And because rather than letting my imagination run amok to my detriment (otherwise known as “panicking”), effective use of checklists allow me to direct my imagination to more productive purposes.


Tuesday, 13 October 2015

Don't limit learning through "top 10" lists

Transferring only the "Top 3" or "Top 10" lessons creates an arbitrary and unnecessary limit on learning.


I reviewed a project a few years ago where the team decided to collect and discuss their Top 10 lessons (actually 20, as they captured the Top 10 engineering lessons, and the Top 10 project management lessons).

It puzzles me why they chose 10 lessons. This was a multi billion dollar project, and I bet they learned more than 10 things. So why restrict it to 10? Why not 12? What if there were 15 lessons - would we not record numbers 11 through 15?

Imagine I had been through a powerful learning experience, and learned many things. Imagine you came to me and asked me to share what I knew, and I said "I learned about 20 things, but I am only prepared to share 10 of them with you". Would you not be a little annoyed? Especially if you ended up making repeat mistakes in areas 11 through 20?

Perhaps people limit investigation to the Top 10 to avoid overloading the organisation.

This may be a worthy aim, but no organisation I know of is overloaded by learning. Generally there is a dearth of good knowledge available, and people are very pleased to receive good helpful material.

I can understand restricting to 10 lessons if the lessons are turgid and boring and not very helpful, but that can’t be our aim, surely? I blogged a while ago about the project which generated 700 lessons, of which 400 were reused, resulting in savings of tens of millions of dollars. What would have happened if they had restricted themselves to 10? 390 opportunities for learning and improvement would never have been re-used, and 97.5% of the value would have been lost.
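The arithmetic behind that claim can be checked in a few lines. The figures are from the post; the calculation assumes, for simplicity, that each reused lesson carried roughly equal value:

```python
# Figures from the project described above: 700 lessons captured, 400 reused.
total_lessons = 700
reused = 400
top_n = 10

# Had only the "Top 10" been captured, at most 10 of those 400 reuses survive.
lost_reuses = reused - top_n                  # opportunities never reused
lost_value_fraction = lost_reuses / reused    # assuming equal value per reuse
print(lost_reuses, f"{lost_value_fraction:.1%}")  # prints: 390 97.5%
```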

Perhaps people limit investigation to the Top 10 to avoid overloading themselves.

Perhaps they think that documenting lessons is not really worth doing and that it would take a lot of effort, so let's high-grade only the most important lessons. This may be a more likely scenario. But I think that's an unhelpful attitude. Why not identify all the valuable and reusable learning points, whether that's 9, 15, 19 or 29? Why not document them all?

Why stop at 10?

Monday, 12 October 2015

The CoP application form?

You can take two main approaches to the creation of Communities of Practice - Top Down, or Bottom Up. One company mixes the two through the use of a CoP application form.


Communities of practice are one of the main building blocks of a Knowledge Management framework, and many organizations will seek to create, launch and sustain a number of networks as a pilot within Knowledge Management implementation. The communities need to be chosen wisely. Choosing the wrong networks – those without the conditions for success – invalidates the pilot. Choosing the correct networks allows a test of the efficiency of this component in delivering knowledge sharing.

But how do you select the CoPs -  top-down or bottom-up?

In top-down selection, the company decides on strategic knowledge areas, and deliberately selects communities to support these, assigning leadership and core members and securing resources. This allows resources to be spent supporting the communities which will have most value to the company, but sometimes these top-down communities may not align with the interests of the workers.

In bottom-up selection, the company provides community tools, and watches for communities that form spontaneously around an area of business need. These CoPs are often high-energy, but may not coincide with areas of knowledge which are strategic for the business. It is also all too common to find multiple CoPs starting up to cover the same topic.

Given that we have two stakeholder groupings in KM, and both need to be satisfied, and given that satisfying one without the other can lead to instability in your KM program, how do we choose which CoPs to support?

The CoP application form


One company seeks bottom-up communities and provides a community technology platform, but asks proposers to complete a CoP application form in order to gain company support and resource. The application form asks:


  • Does your CoP pass the criteria for eligibility?  There are several eligible categories, such as expert groups in specific areas, regional networks around a specialist area related to the business of the organisation, or communities created specifically to address a single issue
  • What is the name and description for your Community of Practice? This includes the welcome text that will be used for new members, and a list of key issues the community will address
  • Proposed membership and scope. Specifically whether this is an open or a closed community, and whether it includes people from outside the organisation
  • The community facilitator.
  • Technology requirements.
  • Engagement plan.

The application form goes to the KM team, who review it and may suggest that:
  • The CoP goes ahead as planned
  • The proposed members join an existing CoP which covers the same area
  • The work of the proposed CoP could be done better through a workshop, or a conference, or some other means

In this way the bottom up energy is combined with some top down control, to give a good balanced approach. 
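As an illustrative sketch only (the field names, outcomes and triage rules below are my own assumptions, not the company's actual form or process), the application-and-review flow might be modelled like this:

```python
from dataclasses import dataclass, field
from enum import Enum

class Outcome(Enum):
    APPROVE = "the CoP goes ahead as planned"
    MERGE = "the proposed members join an existing CoP covering the same area"
    ALTERNATIVE = "the work could be done better through a workshop or conference"

@dataclass
class CoPApplication:
    name: str
    description: str
    key_issues: list            # issues the community will address
    open_membership: bool       # open or closed community
    includes_externals: bool    # members from outside the organisation?
    facilitator: str
    technology_needs: list = field(default_factory=list)
    engagement_plan: str = ""

def review(app: CoPApplication, existing_cops: set) -> Outcome:
    """A hypothetical KM-team triage rule: merge proposals that duplicate an
    existing CoP, redirect single-issue proposals to a workshop, else approve."""
    if app.name.lower() in {c.lower() for c in existing_cops}:
        return Outcome.MERGE
    if len(app.key_issues) == 1:
        return Outcome.ALTERNATIVE
    return Outcome.APPROVE
```

The design point is that the form makes bottom-up energy visible while the review step applies top-down judgement: duplication and single-issue proposals are redirected before resources are committed.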

Friday, 9 October 2015

What the new ISO 9001 really says about KM

I have mentioned here and here the inclusion of Knowledge within the 2015 revision of ISO 9001. The new version of this international quality standard has been published, and we can see the final wording of the Knowledge clause.

The inclusion of Knowledge Management within ISO 9001, 2015 edition marks a huge change within the world of KM. For the first time one of the global business standards makes an explicit mention of Knowledge as a resource, and specifies expectations for the management of that resource.

We can now see the final version of the standard, and the final wording of the relevant clauses.

The new clause (in the English Language version of the European standard) reads as follows:

 Clause 7.1.6. Knowledge 
  • Determine the knowledge necessary for the operation of its processes and to achieve conformity of products and services. 
  • This knowledge shall be maintained and made available to the extent necessary. 
  • When addressing changing needs and trends, the organization shall consider its current knowledge and determine how to acquire or access any necessary additional knowledge and required updates. 
  • NOTE 1: Organizational knowledge is knowledge specific to the organization; it is generally gained by experience. It is information that is used and shared to achieve the organization's objectives. 
  • NOTE 2: Organisational knowledge can be based on: a) Internal Sources (e.g., intellectual property, knowledge gained from experience, lessons learned from failures and successful projects, capturing and sharing undocumented knowledge and experience; the results of improvements in processes, products and services); b) External Sources (e.g., standards, academia, conferences, gathering knowledge from customers or external providers). 

 The new standard offers the following commentary, which gives a little more guidance on the sort of things that an auditor might be looking for:

In 7.1.6 the international standard addresses the need to determine and manage the knowledge maintained by the organization, to ensure the operation of its processes and that it can achieve conformity of products and services. Requirements regarding organizational knowledge were introduced for the purposes of:
a) safeguarding the organization from the loss of knowledge, e.g. through staff turnover, or failure to capture and share information;
b) encouraging the organization to acquire knowledge, e.g. learning from experience, mentoring, benchmarking.

We can see from the text above and in the previous section, that many of the common elements of Knowledge Management are implied or specifically mentioned. These include:

  • an appropriate system for learning from experience, including the use of lesson learning; 
  • an appropriate approach to knowledge retention and reducing the risk of loss of knowledge, including mentoring, tacit knowledge capture, and knowledge sharing; 
  • some form of KM audit, benchmarking and/or KM strategy, sufficient to identify the critical knowledge needed to deliver quality products and services, and the main knowledge gaps which need to be filled; 
  • a framework (roles, processes and supporting technology) for maintaining knowledge and making it available to the extent necessary.

You can find more commentary on the implications of ISO 9001 for Knowledge Management in our most recent newsletter, available for free here.

Thursday, 8 October 2015

The curious case of the forgetting curve

The learning curve is a common phenomenon we see in Knowledge Management. As an organisation acquires more knowledge, their performance increases as they "climb the learning curve".  However without care and attention, the curve can reverse, and become a forgetting curve. Here is a case history where that happened.



I have blogged before about our Bird Island exercise, probably the longest running KM experiment in the world, and about how it demonstrates in a very clear way that Knowledge management can drive performance.

It is like a lab experiment in KM, with very clear learning points.

We had an interesting twist to the game a couple of weeks ago, where we had two people in the class who had done the game before, about 6 months ago. Now you might expect that this previous experience and knowledge would give them an edge.  They ought to remember some of the key design principles from the game, and they should therefore be well ahead of the other teams based on this knowledge.

So I put these two people with experience into the same team, to see if this would happen.

Well, it happened to an extent. The two people remembered some bits and pieces, and these included some high level design principles, and a few tips and hints. However much of the other detail required to succeed had been forgotten over the intervening 6 months. They built a tower slightly taller than the other teams, but one third the height of their performance 6 months previously.

As one of them said in the debrief "a little knowledge is a dangerous thing".

The graph above shows their performance 6 months previously, where the 5 bars on the left represent how they gained knowledge through their previous Bird Island experience. The red line is the learning curve they went through in the game.

The grey bar on the right shows their performance 6 months later, built with the help of a hazy memory and "little knowledge" from the previous exercise, and therefore shows how much had been lost in the interim. The green line is their forgetting curve.


This result reinforces the frailty of human memory as a long-term knowledge store, and therefore the need to support that memory through some form of capture and recording. Even 6 months is too long to leave knowledge in memory alone. We need to capture it as we go, even if only as an aide-mémoire, otherwise we lose it.

Or even worse, we retain a little knowledge, and find that it is just enough to be dangerous.

Wednesday, 7 October 2015

How KM governance evolves as KM becomes embedded

One of the KM elements we looked at as part of our 2014 global KM survey was KM governance.


We asked the respondents to identify which of 12 KM governance elements they applied as part of their KM program. We also asked them to assess the level of maturity of their KM program. Among other things, this allows us to see how KM governance evolves as a KM program matures.

The 12 governance elements were as follows (listed roughly in order of application).



The maturity levels were as follows:


  • We are investigating KM but have not yet started 
  • We are in the early stages of introducing KM 
  • We are well in progress with KM 
  • KM is embedded in the way we work



The figure above shows how the use of these elements varies as KM maturity increases.

Firstly, every element increases in use from the "early" stage to the "well in progress" stage. The increase is greatest for the KM success stories (unsurprising, as you need to be in progress in order to deliver success), and least for the KM business case. Although business cases are uncommon, they tend to be developed early.

It is more instructive to see the change in governance elements from the "well in progress" stage of active KM implementation to the "fully embedded" stage. You can see some of the changes in the graph above, but to make them easier to see, the graph below shows the percentage change in each governance element between "well in progress" and "fully embedded". These are analysed, with some suggested explanations, below.
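As an illustration of the underlying arithmetic, the percentage change for each element can be computed as below. The adoption figures here are purely hypothetical placeholders for illustration, not the actual survey results:

```python
# Hypothetical adoption rates (% of respondents using each element)
# at two maturity stages -- illustrative numbers only, not survey data.
adoption = {
    "KM strategy":      {"in_progress": 60, "embedded": 80},
    "KM business case": {"in_progress": 30, "embedded": 20},
    "Senior champion":  {"in_progress": 70, "embedded": 72},
}

def pct_change(element):
    """Percentage change in adoption between the two maturity stages."""
    before = adoption[element]["in_progress"]
    after = adoption[element]["embedded"]
    return 100.0 * (after - before) / before

for name in adoption:
    print(f"{name}: {pct_change(name):+.0f}%")
```

A positive value corresponds to an element that grows as KM becomes embedded (such as the KM strategy), and a negative value to one that falls away (such as the business case).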



The following governance elements increase considerably as KM becomes embedded

  • KM reference materials and training, needed to support the knowledge workers
  • KM success stories and metrics, needed to keep the focus on KM
  • KM strategy, KM policy and KM Framework, needed to define the way in which KM is expected to be applied



The following governance elements neither significantly increase nor decrease in application as KM becomes embedded (both are replaced, to an extent, by the strategy, framework and policy):
  • Senior champion
  • KM vision


The following governance elements decrease as KM becomes embedded

  • Separate KM incentive system, as this should now be subsumed into normal business incentives
  • Separate KM champions, as there will now be defined KM roles in the business
  • A KM Business case. Once KM is embedded, the time for a business case is past. 

Tuesday, 6 October 2015

How to give knowledge shelf-life

The best knowledge is always the freshest, but sometimes we need to preserve knowledge for longer term use. How can we do this, without losing the freshness?


Every cook knows that it's best to cook with fresh tomatoes. There is nothing sweeter than a fresh tomato, still warm from the sunshine, picked fresh from the vine.

However, a fresh tomato does not stay fresh for very long, and if we want to cook with tomatoes long into the winter, we need to find a way to preserve them for the longer term. Hence the effort invested by many a gardener/cook standing over a hot stove, canning produce for the winter.

Canned tomatoes do not have the same quality as fresh, but we can use them in January when the snow is on the ground and the tomato vines are part of the compost heap.

The key, however, is to preserve them properly. Poorly canned tomatoes can be a source of botulism and food poisoning.

Knowledge is like tomatoes.

The best knowledge is also the freshest. Tacit knowledge - knowledge held in the head - has context, it has depth, it has vibrancy. It lives in practice, grows with practice, and is transferred through conversation in communities of practice.

However, if it is not kept alive in continued practice, this knowledge loses its freshness.

Imagine a task where there is a gap in practice - where we do the task once, then several months pass before we do it again. During this time, human memory starts to leak. It starts to reinvent the past and forget details. Its fallibility becomes apparent, and the three Gorilla Illusions start to work their destructive spell. The tacit knowledge loses its freshness and may go bad.

We need to preserve this knowledge for the future, which is where explicit knowledge comes in - knowledge which has been documented and stored in order to give it some shelf-life.

Explicit knowledge is always second-best to tacit knowledge. It does not have the same quality, the same freshness, as tacit knowledge. However, it has shelf-life and longevity. Its reliability in the longer term offsets many of its shortcomings, and the checklists, the wikis and the knowledge assets become our primary source (or, at the very least, back up the tacit knowledge and fill in the gaps).

Our challenge then, as knowledge managers, is to recognise where knowledge must be tacit and where it must be explicit; and where it has to be explicit, to capture it with the maximum of context, the maximum of depth, and the maximum of life.

Like the gardener standing over the hot stove, preserving knowledge for the future is an investment of time and effort, and must be done properly if the knowledge is to be usable again in the future.

Monday, 5 October 2015

The knowledge interchange system

Knowledge is derived from activity, and re-used in activity. However the path between the two is not linear, and needs some form of "knowledge interchange system".

image by stockarch - stockarch.com

Knowledge is generated in operational work, which very often takes the form of projects. In projects, people from many disciplines come together to deliver an objective or to create a product.

Knowledge is re-used in projects.

However, in its journey from project to project, knowledge is handled and managed in a different dimension. This can be:


  • the dimension of Practice - where knowledge is discussed in communities of practice, compiled into practice guidance, and stewarded by a practice owner (subject matter expert); 
  • the dimension of Product - where knowledge is discussed in product-focused communities, compiled into product guidance, and stewarded by a product lifecycle manager, or product line engineer;
  • the dimension of Customer - where knowledge is discussed in customer-based communities, and stewarded by a key account manager.
Whichever of these three dimensions best fits your corporate context, knowledge needs to be taken from the project dimension, managed in the practice/product/customer dimension, then returned to the projects.

An example


Let's look at lesson learning in a practice-based organisation - a construction company.

  • Let's imagine Project A has generated some excellent learnings about contract management, safety, and site survey
  • Let's imagine Project B has generated some excellent learnings about mobilisation, safety, and piling
  • Let's imagine Project C has generated some excellent learnings about piling, steelwork construction and contract management.

If we leave the lessons in the project reports, effectively filing the knowledge in the project dimension, then future projects wishing to find lessons about mobilisation, or piling, or contract management need to read each project report to see if there are relevant lessons. That's not difficult if there are only 3 projects and 6 topics, but once it gets to 300 projects and 60 topics, filing the knowledge by project is untenable.

Instead, the lessons management system should tag each lesson by topic, so it becomes easy to find piling lessons, safety lessons, steelwork lessons and so on.

In addition, the lessons management system should route those lessons (e.g. by email) to the relevant communities of practice and the relevant practice owners, so they can be notified of new knowledge in their area of practice and update practice guidance accordingly. 

So the practice owner for contract management (head of Contracts, for example) is notified that Project A and Project C have new lessons. The practice owner for safety (head of Safety, for example) is notified that Project A and Project B have new lessons. And so on.
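A minimal sketch of this tagging-and-routing step is shown below. The project names, topics and practice-owner roles are taken from the example above, but the data structures and function names are illustrative assumptions, not a real lessons management system:

```python
from collections import defaultdict

# Each lesson is tagged with the project that produced it and its topics.
lessons = [
    {"project": "Project A", "topics": ["contract management", "safety", "site survey"]},
    {"project": "Project B", "topics": ["mobilisation", "safety", "piling"]},
    {"project": "Project C", "topics": ["piling", "steelwork", "contract management"]},
]

# Hypothetical practice owners for two of the topics.
practice_owners = {
    "contract management": "head of Contracts",
    "safety": "head of Safety",
}

def index_by_topic(lessons):
    """Build the topic -> projects index that makes lessons findable by topic."""
    index = defaultdict(list)
    for lesson in lessons:
        for topic in lesson["topics"]:
            index[topic].append(lesson["project"])
    return index

def route_notifications(lessons):
    """Return which practice owner should be notified about which projects' lessons."""
    index = index_by_topic(lessons)
    return {owner: index[topic] for topic, owner in practice_owners.items()}

print(route_notifications(lessons))
# The head of Contracts is notified about Projects A and C;
# the head of Safety about Projects A and B.
```

The topic index is what shifts the knowledge from the "project track" to the "topic track": the same three lessons can now be retrieved by piling, by safety, or by any other topic, without reading every project report.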

Then once the practice guidance is updated, the future projects know where to go to learn how best to manage a contract, or how best to construct steelwork.

This is the interchange system


This system is like a railway interchange, shifting knowledge from a Project track to a Topic track. Without such a system, it can be very difficult to route knowledge to its correct destination so it can be reused in future. 


Friday, 2 October 2015

Wikis - to edit, or not to edit?

There are two main approaches to Wikis as a Knowledge Management component - providing page editors who act as validators or gatekeepers, or allowing free editing.


In the first case, the contents of the wiki page can be edited by only a few people, while others can suggest edits, or comment on existing content, by using the comments facility.  

In the second case, everyone has editing rights, and the assumption is that if enough people edit, the wiki is self-correcting and that errors and opinions will not last in the long term.

The second model is often quoted as the Wikipedia model, although even Wikipedia is not so simple to edit. Although everyone can make minor edits, the protocol is to discuss major edits on the article's discussion/talk page prior to publishing, and even on Wikipedia there are locked pages that only approved people can edit.
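The two models can be pictured as a simple permission check. The page names, user names and dictionary layout below are illustrative assumptions only:

```python
def can_edit(user, page):
    """Return True if `user` may edit `page` directly.

    "gatekeeper": only the named page editors may edit; everyone
    else suggests edits or comments via the comments facility.
    "open": everyone may edit, on the assumption that the wiki
    self-corrects over time.
    """
    if page["model"] == "open":
        return True
    if page["model"] == "gatekeeper":
        return user in page["editors"]
    raise ValueError("unknown editing model")

piling_page = {"model": "gatekeeper", "editors": {"sme_anna"}}
sandbox_page = {"model": "open", "editors": set()}

print(can_edit("sme_anna", piling_page))   # True
print(can_edit("newcomer", piling_page))   # False - may only comment
print(can_edit("newcomer", sandbox_page))  # True
```

Note that the model is set per page, not per wiki - which is exactly how Wikipedia's locked pages work alongside its freely editable ones.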

How to decide which model to take

Before we look at each model, we need to consider that knowledge in an organisation generally has three levels of validity.


1. Mandatory, or “Must Do” knowledge. This is the level of company standards, and everybody reading this particular guidance document realises that they need to follow exactly what’s written. If there is a major problem, they need to get in touch with the process owner and discuss it, but the default should be to follow this documentation exactly.

2. Advisory, or “Should Do” knowledge. This is the level of best practices, and everybody reading this particular process documentation realises that this is the current best way to approach this particular process, based on existing company knowledge. However, there is always a drive to improve on best practice, and if somebody can find an even better way, then that’s great. So advisory process is advised, but not compulsory. However, if people ignore advisory knowledge and things go wrong, some awkward questions may be asked.

3. Suggested, or “Could Do” knowledge. This is the level of good ideas or good practices that others in the organisation have used, which the reader should feel free to reuse or re-adapt to his or her own context. These good ideas can still save the reader a lot of time and effort, but there is no real requirement to copy them.

Level three is the chit-chat level of knowledge, in social media, discussion forums or meetings, where ideas and opinions are kicked around and solidified over time. There is a validation step to move from level 3 to level 2 (validation by SMEs or a Community of Practice), and another validation step (validation by the company) to move from level 2 to level 1. 
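The two validation steps can be sketched as a small state machine. The class, method and validator names here are illustrative assumptions, not any particular KM system:

```python
# The three validity levels, from informal to mandatory.
LEVELS = ["suggested", "advisory", "mandatory"]

class KnowledgeItem:
    """A piece of knowledge, starting at level 3 ("could do")."""

    def __init__(self, title):
        self.title = title
        self.level = "suggested"

    def promote(self, validator):
        """Move the item up one validity level, recording who validated it.

        Level 3 -> 2 is validated by SMEs or a community of practice;
        level 2 -> 1 is validated by the company.
        """
        i = LEVELS.index(self.level)
        if i == len(LEVELS) - 1:
            raise ValueError("already at the mandatory level")
        self.level = LEVELS[i + 1]
        return f"'{self.title}' promoted to {self.level} by {validator}"

item = KnowledgeItem("Piling checklist")
print(item.promote("community of practice"))   # suggested -> advisory
print(item.promote("company standards board")) # advisory -> mandatory
```

The point of the state machine is that there is no shortcut: a good idea cannot become a company standard without passing through the advisory level and both validation steps.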

Where wikis come into the picture

The question is - what knowledge do you want to host on your Wiki? Level 3, or level 2?

The KM4Dev community use their wiki mainly at level 3 - as they say:

"This wiki is both a working area for the Knowledge Management for Development (KM4Dev) community and a way for us to make our joint work accessible to a wider audience"
Wikis in places like Shell, Pfizer and Conoco use the wiki for level 2. Level 3 knowledge is held in the discussion forums and lessons management systems, and the wiki page editor promotes content to the wiki once it is considered "valid". They make reference to level 1 knowledge by linking to official procedure documents and standards.

Wikipedia, without making this explicit, is primarily for level 2 knowledge. The article discussion/talk pages are where the level 3 interchange happens, and the promotion of major edits to the main page is by "editorial consensus".

Conclusion

If you are using your wiki for level 3 knowledge, then allow everyone to edit.

If you are using your wiki for level 2, then a) introduce a validation step, and b) decide where your level 3 conversations will take place. 

Thursday, 1 October 2015

Quantified success story #93 - lives saved through Knowledge Management

The video below is a KM success story from the customer-service wing of KM.


The story is told by Linda Yeardly of eGain customer support software. It describes how rapid access to high-quality knowledge allowed a new member of a gas utility customer service team to make a call that probably saved many lives.

