Thursday, 29 March 2018

The "learn before, during and after" model in KM

"Learning before, during and after" is one of the oldest models in KM, and still one of the most useful.



Learning before, during and after was one of the early bywords for Knowledge Management at BP in the 1990s - a simple and memorable mantra that project staff can grasp easily and quickly. It forms the basis for an operating philosophy for KM, and describes how Knowledge Management activities can be embedded within the cycle of business activity.

The management of knowledge, like any management discipline, needs to be systematic rather than ad hoc, and needs to be tied into the business cycle. In any project-focused business, where business activities (projects) have a beginning and an end, knowledge can be addressed at three points.

  1. The project team can learn at the start of the project, so that the project begins from the most complete state of knowledge available (‘learning before’). This is where processes such as Knowledge Handover and Peer Assist can be applied. 
  2. They can learn during the project, so that plans can be changed and adapted as new knowledge becomes available (‘learning during’), for example through the use of After Action Review.
  3. Finally, they can learn at the end of the project, so that knowledge is captured for future use (‘learning after’) and entered into the Lesson-Learned workflow.

These activities of "Learning before," "Learning during" and "Learning after" can become an expectation, or even a mandatory activity, for projects.  This model of ‘learn before, during and after’ was developed in BP during the 1990s, and was also developed independently in several other organizations. Shell refers to this as “Ask, Learn, Share”.

However, there is more to the model than the ‘learning before, during and after’ cycle. The knowledge generated from the project needs to be collated, synthesised, and stored as Knowledge Assets (guidance documents such as SoPs, or Wiki-based guidance) in order that knowledge deposited in the “knowledge bank” at the end of the project becomes more useful when accessed at the start of the next project. Communities of Practice need to be established to manage and share the tacit Knowledge Assets.

This five-component framework (learning before, learning during, learning after, synthesis of knowledge into Knowledge Assets, and building Communities of Practice) is a robust model which creates value wherever it is applied.

Wednesday, 28 March 2018

Why you can't have AI without KM

The rise of AI in the form of intelligent agents requires the rise of KM to support it.


Image from wikimedia commons
This is the conclusion of Gartner research, quoted in this Computer Weekly post entitled "IT staff will need to retrain when automation deskills their jobs".

According to the post - 

Before automation and intelligent agents can really take off in the enterprise, IT operations teams will need to build knowledge management systems. 
Gartner said knowledge management is essential for a chatbot or virtual support agent (VSA) to provide answers to business consumers, but the response can only repeat scripted answers when based on existing data from a static knowledge base. It warned that intelligent agents without access to this rich source of knowledge cannot provide intelligent responses. 
As such, Gartner suggested that infrastructure and operations managers will need to establish or improve knowledge management initiatives. Gartner predicted that, by 2020, 99% of AI initiatives in IT service management will fail because of the lack of an established knowledge management foundation.
That's quite a prediction, but really it makes sense. AI in the form of intelligent agents like IBM's Watson is really a delivery vehicle for knowledge, allowing contextual answers to be provided quickly and effectively, and it requires a robust source of knowledge in order to work. Without KM, AI will fail. 

Tuesday, 27 March 2018

A key point in the difference between knowledge and information

I have often used this story as a way to distinguish knowledge and information, but here are a few more points:


map courtesy of NASA
I like to illustrate the difference between Information and Knowledge with a story or an example.

Let's take the example of a geological map of mineral data, which you might use to site a gold mine.
Each point or pixel on the map is a datum - a mineral sample point, with a location in space. 
The map itself is information; built up from the data points in such a way that it shows patterns which can be interpreted by a trained geologist. 
However, to interpret that map requires knowledge. I could not interpret it - I am not a mining geologist - and unless you are a mining geologist, you could not interpret it either. The knowledge - the know-how, acquired through training and through experience - allows a mining geologist to interpret the map and come to a decision - to site a gold-mine, to take more samples, or to declare the area worthless. 
In this example, the data, the information and the knowledge come together to form a decision, but the ignorant person, the person with no knowledge, could never make that correct decision.

The key point in the story is this:

The mining geologist applies their knowledge in order to interpret the information. It is the knowledge which makes the information actionable.

I know there are quite a few people who define knowledge as "actionable information", but that's not quite right. It is the knowledge that makes the information actionable.

Knowledge + Information = Action.

That's the key distinction between Knowledge and Information, right there.

Monday, 26 March 2018

The NATO lesson learned portal

The video below is a neat introduction to the concept behind the new Lesson Learned Portal at NATO



The video is publicly available on the YouTube channel of JALLC, the Joint Analysis and Lessons Learned Centre at NATO.

The YouTube description is as follows:

The NATO Lessons Learned Portal is the Alliance’s centralized hub for all things related to Lessons learned. It is managed and maintained by the JALLC, acting as NATO’s leading agent for Lessons Learned. 
Observations and Best Practices that may lead to Lessons to be Learned can be submitted to the Portal, and the JALLC will ensure that these Observations find their way through the NATO Lessons Learned Process. 
The information shared on the NATO Lessons Learned Portal can help saving lives. The little piece of information you have, may be the fragment missing to understand the bigger problem/solution – make sure you share it.

Friday, 23 March 2018

Using MOOCs to transfer knowledge at WHO

When it comes to transferring knowledge to a massive audience, MOOCs are a potential solution.  Here is a fantastic example from the World Health Organisation.


Image from wikimedia commons
There are some Knowledge Management cases where knowledge is created through research and analysis, then needs to be spread around a wide audience. There are challenges to this sort of knowledge transfer, and just sending out reports and emails often does not work. MOOCs - Massive Open Online Courses - can be an answer.

This article describes an approach taken at WHO, and contains the following quotes from WHO scientists and doctors:

  • “The major epidemics we have seen this century highlighted the need for a system that quickly transforms scientific knowledge into action on the ground,”
  • “The key is actionable knowledge. For us the value of knowledge is when it is shared – and it is especially important that responders have enough knowledge to protect themselves and do good work. We had information on diseases like Plague, MERS and Ebola, we had a number of courses but they were on paper, not accessible from the field.”

WHO have created 34 online courses in multiple languages on a portal they call "OpenWHO", covering 4 areas:

  1. pandemics and epidemics
  2. emergency response operations
  3. soft skills such as risk communications, social mobilization and community engagement, and 
  4. preparation for emergency field work.

So far the response has been very good, and more than 25,000 people have signed up for the courses.

Let's finish with a couple more quotes that show the critical importance of the effective transfer of knowledge.
“We don’t call it training – we call it knowledge transfer. OpenWHO allows us and our key partners to transfer life-saving knowledge to large numbers of frontline responders quickly and reliably,” 
“Too many people have died from lack of knowledge. We want these online courses to help save lives.”

Thursday, 22 March 2018

A case study of a failed learning system

When lesson learning failed in the Australian Defence Force, they blamed the database. But was this all that was at fault?



Here's an interesting 2011 article entitled "Defence lessons database turns off users". I have copied some of the text below, to show that, even though the lessons management software seems to have been very clumsy (which is what the title of the article suggests), there was much more than the software at fault.
 "A Department of Defence database designed to capture lessons learned from operations was abandoned by users who set up their own systems to replace it, according to a recent Audit report. The ADF Activity Analysis Data System's (ADFAADS) was defeated by a "cultural bias" within Defence, the auditor found. Information became fragmented as users slowly abandoned the system".
So although the article title is "defence lessons database turns off users", the first paragraph says that it was "defeated by cultural bias". There's obviously something cultural at work here ......
"Although the auditor found the structure and design of the system conformed to 'best practice' for incident management systems, users found some features of the system difficult to use. Ultimately it was not perceived as ‘user‐friendly’, the auditor found. Convoluted search and business rules turned some users against the system". 
....but it also sounds like a clumsy and cumbersome system

"In addition, Defence staff turnover meant that many were attempting to use ADFAADS with little support and training".
...with little support and training.
"An automatically-generated email was sent to 'action officers' listing outstanding issues in the system. At the time of audit, the email spanned 99 pages and was often disregarded, meaning no action was taken to clear the backlog".
There needs to be a governance system to ensure actions are followed through, but sending a 99-page email? And with no support and follow-up?
 "It was common for issues to be sent on blindly as ‘resolved’ by frontline staff to clear them off ADFAADS, even though they remain unresolved, according to the auditor".
Again, no governance. There needs to be a validation step for actions, and sign-off for "resolution" should not be devolved to frontline staff.
 "Apart from a single directive issued by Defence in 2007, use of the database was not enforced and there were no sanctions against staff who avoided or misused it".
There's the kicker. Use of the lessons system was effectively optional, with no clear expectations, no link to reward or sanction, no performance management. It's no wonder people stopped using it.

So it isn't as simple as "database turned off users". It's a combination of

  • Poor database
  • Poor notification mechanism
  • No support
  • No training
  • No incentives
  • No governance
  • No checking on actions

It's quite possible that if the other items had been fixed, then people might have persevered with the clumsy database, and it's even more likely that if they built a better database without fixing the other deficiencies, then people still would not use it.

What they needed was a lessons management system, not just a database.

So what was the outcome? According to the article,
.....establish a clear role and scope for future operational knowledge management repositories, and develop a clear plan for capturing and migrating relevant existing information ..... prepare a “user requirement” for an enterprise system to share lessons.

In other words - "build a better database and hope they use it". Sigh.

Wednesday, 21 March 2018

Knowledge as "the voice of experience"

Is KM a way of sharing the "voice of experience"?




In many ways, you can look at much of Knowledge Management as being a systematic approach to identifying, distilling and transmitting the voice of experience around the organisation.

Experience is the great teacher, and experience which is shared through Knowledge Management can teach many people other than the person who had the experience themselves. People trust knowledge when they know its provenance and when they know it is based on lessons from real experience.

So how do we make this voice of experience heard? Here are some ways:

  • Always attribute the source of the knowledge. Give its provenance, put people's names against it. It shows that it comes from a real and experienced source.
  • Use people's own words if possible. Include quotes. Let the experience talk.
  • Include pictures of the people who provided the knowledge. Knowledge seems more authentic if you can "see" the person who provided it.
  • Capture the stories. Provided they are true stories, told in the words of the people involved, they convey authentic experience.
  • Include the case studies. These are the record of experience.

Tuesday, 20 March 2018

Military Axioms for Knowledge Management

Courtesy of this LinkedIn blog post, here are some proposed axioms for KM from a military source.


The author of the post is Brett Patron, and Brett has taken a set of established Axioms for Special Operations Forces, and suggested how they can be applied to Military KM. The original SOF axioms are as follows:

  • Humans are more important than Hardware. 
  • Quality is better than Quantity. 
  • Special Operations Forces cannot be mass produced. 
  • Competent Special Operations Forces cannot be created after emergencies occur. 
  • Most Special Operations require non-SOF assistance.

Brett's adaptation of these makes for interesting reading. With Brett's permission, here are his axioms with my comments - you can read his comments in the original post.


1. Humans, organized effectively, drive the processes and tools that get decisions made and achieve outcomes.

This is a reworking of the "Roles, Processes, Technology" mantra, with people at the heart, and with the fourth component of Governance implied in the term "organized effectively".

2. Quality is better than quantity. 

For this one, I would like to quote Brett's text, which puts the point really well.
 "KM is the “demand side” of decision support; Information Management is the “supply side.” ...  Massive expenditures of money, gigabytes of information, and unwieldy programs are far less valuable than a focused understanding of the people in an organization, the organization's goals, and the choices people make to achieve those goals. Elegance comes from the useful joining of the demand and supply sides so the right conditions exist for quality, timely decisions". 
Regular readers of this blog will remember similar reminders to match supply and demand, and to favour quality over quantity.

 3. Trained KM professionals cannot be mass produced. 

Brett believes, and rightly in my view, that the most valuable way to become a KM professional is through experience rather than training, and to "learn by doing" how an organisation can function as a knowledge-based entity.

 4. Competent knowledge workers and effective policies cannot be created after emergencies occur. 

By the time the disaster happens, it's too late for KM. By the time the aircraft is in trouble, it's too late to create the pilot's checklist. 

 5. Successful decision support through effective KM hinges on the efforts of every member of the organization.

This is our vision of the organisation as a knowledge factory as well as a product/services factory, with a knowledge workstream interwoven with the delivery workstream, with a knowledge organisation and knowledge outputs. This is a "whole company" matter.

So I think to a large extent these same Axioms hold true in other sectors, as well as the Military. 

Monday, 19 March 2018

How the Emergency Services are resourcing real-time learning

It is common practice to invest time and resources in learning after a project.  Here are some examples of investment during a project. 


I blogged last week about lesson learning in the Australian Emergency Services, and made passing reference to Real-Time Evaluation. It's worth spending a little more time on this topic, as this is a departure from the general practice of capturing lessons only in the aftermath of an event.

My colleague Ian sent me these two examples from the Australian emergency services putting resources on to the ground to collect lessons during an incident, rather than waiting until afterwards.

Learning from the Nov 2017 Heavy Rain Event
The Victorian State Emergency Service and Emergency Management Victoria teamed up over December 2017 and January 2018 to conduct a series of debriefs at the incident, region and state level relating to the heavy rain event that occurred at the end of November.
For the first time under the new arrangements a Real Time Monitoring and Evaluation (RTM&E) team was also deployed during the event to inform real time learning. The resulting report, together with the debrief outcomes, will be analysed for insights and lessons and included in EM-Share to support ongoing continuous improvement.

RTM&E Deployed into the State Control CentreOn 19 and 20 January 2018 a small Real Time Monitoring and Evaluation (RTM&E) team was deployed for the first time into the Victorian State Control Centre (SCC) to support the real time learning of SCC staff during the recent heat event.
It was a great opportunity to look at new arrangements and inform future continuous improvement activities across the Victorian Emergency Management sector. All outcomes will be also included in EM-Share.

These are examples of what I call "Level 3" lesson learning; the proactive hunting for lessons rather than reactive capture of lessons after the event.  Please note that Real Time Evaluation is not an alternative to Post-Event Evaluation - both are needed. However the benefits of Real Time Evaluation, and the Proactive capture of lessons, are as follows:


  • The level of resourcing is often greater, rather than trying to squeeze in evaluation time after the activity is over
  • Lessons can be acted on, and problems corrected, while the activity is in progress
  • The RTM&E team can look out for early signs of things happening, and can specifically watch out for lessons on specific topics
  • The RTM&E team can capture lessons while memories are still fresh, before people start to forget.
The main reason why RTM&E needs to be partnered with Post-Event Evaluation such as a Retrospect or After Action review is that until the event is complete, you don't yet know the outworkings of the decisions you made earlier. For example, you may take a course of action that speeds things up, record that as a successful lesson through RTM&E, and then after the event find that there were a whole series of unintended consequences which meant that the course of action was, with hindsight, unwise. 

However, given that caveat, Real Time Evaluation, and the capture of lessons as an event unfolds, can be a really valuable partner to more traditional Post-Event Review.

Friday, 16 March 2018

How the Australian Emergency Services manage lessons

Taken from this document, here is a great insight into lesson management from Emergency Management Victoria. 



Emergency Management Victoria coordinates support for the state of Victoria, Australia during emergencies such as floods, bush fires, earthquakes, pandemics and so on. Core to their success is the effective learning of lessons from various emergencies.

The diagram above summarises their approach to lesson learning, and you can read more in the review document itself, including summaries of the main lessons under 11 themes.

  • They collect Observations from individuals (sometimes submitted online), and from Monitoring, Formal debriefs, After Action Reviews and major reviews.
  • These observations are analysed by local teams and governance groups to identify locally relevant insights, lessons and actions required to contribute to continuous improvement. These actions are locally coordinated, implemented, monitored and reported. 
  • The State review team also take the observations from all tiers of emergency management, and analyse these for insights, trends, lessons and suggested actions. They then consult with subject matter experts to develop an action plan which will be presented to the Emergency Management Commissioner and Agency Chiefs for approval.
  • The State review team supports the action plan by developing and disseminating supporting materials and implementation products, and will monitor the progress of the action plan.

This approach sees lessons taken through to action both at local level and at State level, and is a very good example of Level 2 lesson learning.

Thursday, 15 March 2018

What Google Trends really tells us about KM popularity

Again yesterday I was corresponding with someone who used Google Trends as an argument that KM was dying.

Taken at face value this view is understandable. The Google Trends plot for KM decreases over time, as shown below, with a steady reduction in relative searches for the term "knowledge management" over the past 8 years. At first sight this could suggest that the popularity of KM is on the wane, and that fewer and fewer people are searching for the term. However if you dig a little deeper this plot is misleading, and the conclusion that interest in KM is dying is actually a fallacy.


Let me explain why.

Google Trends is not an absolute indicator of the popularity of a topic.

That is because Google Trends measures "how often a term is searched for relative to the total number of searches, globally", and the total number of searches, everywhere in the world, has rocketed (screengrab from this site below).




Any decrease in the relative percentage, as in the first graph, has to be normalised against the increase in the total number of searches in the second graph.  If the top graph is a measure of the percentage and the bottom graph is the total, then all we need to do is multiply them together to get a measure of the total number of KM searches, and then we will be able to say something meaningful.

That is exactly what I have done in the plot below. The numbers are inexact, as I have just read points visually from the first plot (see table at the base of the post for figures) but the conclusion is obvious.



Conclusion 

Google Trends is a meaningless indicator unless normalised against the total number of searches. If you do this, then far from KM being in a decline ...

... the total number of Google searches for Knowledge Management has actually increased steadily from 2004 to 2012.  









Raw data for the 3rd graph
Year | Total searches (billions) | Google Trends measure of KM share | Measure of total KM searches
2004 |   86 | 100 |  8600
2005 |  141 |  70 |  9870
2006 |  230 |  50 | 11500
2007 |  372 |  40 | 14880
2008 |  584 |  30 | 17520
2009 |  792 |  25 | 19800
2010 |  998 |  21 | 20958
2011 | 1109 |  20 | 22180
2012 | 1216 |  19 | 23104
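
To make the arithmetic explicit, here is a minimal Python sketch of the normalisation behind the third graph, using the figures from the table above (the Trends values were read visually from the plot, so the output is indicative rather than exact):

```python
# Estimated global Google searches per year, in billions (from the table above).
total_searches = {2004: 86, 2005: 141, 2006: 230, 2007: 372, 2008: 584,
                  2009: 792, 2010: 998, 2011: 1109, 2012: 1216}

# Google Trends relative index for "knowledge management" in each year,
# read visually from the Trends plot, so inexact.
km_share = {2004: 100, 2005: 70, 2006: 50, 2007: 40, 2008: 30,
            2009: 25, 2010: 21, 2011: 20, 2012: 19}

# Multiplying the relative index by the total search volume gives a number
# proportional to the absolute number of KM searches in each year.
for year in sorted(total_searches):
    print(year, total_searches[year] * km_share[year])
```

The output rises year on year, which is the point of the third graph: the absolute volume of KM searches grew even as the relative index fell.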

Wednesday, 14 March 2018

Using a "pretend customer" in knowledge capture reviews.

When capturing knowledge, sometimes it's useful to have a pretend customer you can introduce. 


Image from wikimedia commons
I have blogged about the need to write knowledge as if it were for a "knowledge customer", and to "document for the customer" when capturing knowledge.  But what do you do if nobody knows who the customer is?

A friend of mine recently came up with an interesting solution to this, by introducing a fictional customer.

He was running a Retrospect, and was discussing the learning points from one particularly tricky set of events. The group were struggling to express the lessons in useful terms for future projects, so he reached into his bag and pulled out a little blue toy dog which he was taking home to his son.

"This is Blue Dog" he said. "Blue Dog is running the next project. What advice would you give to Blue Dog, to help avoid the problems you had this time around?"

OK, a bit corny, but it gave the discussion a focus, and the project team were able to come up with some specific and actionable recommendations for Blue Dog. Blue Dog became a very visible stand-in for the Unknown Knowledge User.

Another friend, Lisandro Gaertner from Brazil, used a similar approach with a three-month-long online training and best-practice-sharing program with sheriffs.

They were challenged to give advice to a new sheriff called Sherlock Silva about the most common problems they faced. That worked very well, and Sherlock Silva turned into a kind of meme in the community: "What should Sherlock Silva do?" 

So if your Retrospect is struggling, maybe it's worth having a blue dog in your bag, or introducing Sherlock Silva!

Tuesday, 13 March 2018

Example KM policy - AVANGRID

Here's a great principles-based KM Policy


Avangrid is a US-based service company in the Energy Market. Their website says "Our 6,800 employees collaborate to deliver projects that power America’s future, provide clean energy and improve customers’ lives and communities". To support this collaboration, they recently published this Knowledge Management Policy, most of which I reproduce below.

I really like the vision of making knowledge a common asset, and aligning it with strategic competencies.



 AVANGRID, INC. KNOWLEDGE MANAGEMENT POLICY 
The Board of Directors of Avangrid, Inc. (“Avangrid”) oversees the management of Avangrid and its business with a view to enhance the long-term value of Avangrid for its shareholders. The Board of Directors of Avangrid (the “Board of Directors”) has adopted this Knowledge Management Policy (this “Policy”) to assist in exercising its responsibilities to Avangrid and its shareholders .....
1. Purpose 
The effective development, dissemination, sharing, and protection of Avangrid’s intellectual capital enhances operational efficiency and is a key element in creating sustainable value for Avangrid’s shareholders. As part of Avangrid’s efforts to implement best practices in knowledge management, this Policy sets forth the main principles that will guide the Avangrid Group in the appropriate dissemination, sharing, and protection of existing knowledge and the implementation of initiatives, procedures, and tools that enable its directors, officers, and employees to benefit from the continuous learning and cultural exchange opportunities. 
2. Principles 
To achieve these goals, Avangrid will endeavor to: 
a) Identify the existing knowledge held by each person and working group within the Avangrid Group and promote the further development of such knowledge. To the extent strategically beneficial and permitted by applicable law, the Avangrid Group will make existing and newly developed knowledge accessible to all other members of the Avangrid Group in order to maximize operational efficiency. 
b) To the extent strategically beneficial and permitted by applicable law, integrate the Avangrid Group’s tangible and intangible assets in order to create an intelligent organizational structure that rewards continuous learning and innovation. 
c) Align knowledge management with the competences and requirements set out in the Avangrid Group’s strategy. 
d) Develop standard systems of knowledge management, identification, and protection across the Avangrid Group that streamlines the proper dissemination and sharing of knowledge within Avangrid Group and enhances operational efficiencies. This will include identifying, developing and putting into places the resources necessary to foster knowledge sharing to the great extent possible through efficient internal dissemination and training; where appropriate, creating and enhancing organizational networks throughout the Avangrid Group; and enhancing the cohesion of existing working groups and teams. 
e) Evaluate the existing knowledge within the Avangrid Group in a consistent manner so that the Board of Directors and, where appropriate, management can assess the effectiveness of the initiatives implemented under this Policy, make changes and improvements where necessary, and promote new innovations in knowledge management. 
f) Respect the intellectual and intangible property rights of third parties in the knowledge management of the Avangrid Group.

Monday, 12 March 2018

Is KM dead? Further evidence of life

We often hear claims that KM is dead or dying, but what does the hard data say?



The "KM is Dead" meme is one with a long history; see articles from 2004, 200820112012, 2015, 2016 to choose but a few. It still seems to resurface several times a year; usually when a software vendor has something to sell (example).

Very seldom are these assertions of the demise of KM accompanied by any data or analysis of trends, other than the Google Trends plot which, as we have seen, is based on searches as a proportion of the total, and would also point to the demise of project management, risk management, financial management, and so on.

I showed some data from our global KM survey last year which suggested that the uptake of KM may actually be increasing, and here is some new data from the academic world.  The authors of this new study looked at academic KM publications since 1974, when the term was first used, and one of the tables in the text of their article is a list of the number of academic KM publications per year. I used this table to create the graph above.

I don't think you could look at this plot and say KM is dead. You might say it has slowed down a little since a peak in 2010, and that the current number of publications is at about 80% of peak levels, but that's a long way from being dead or dying.

Is KM dead? According to the number of academic publications - No!


Friday, 9 March 2018

Why good Titles are important in KM

If you want knowledge in a lesson, post or knowledge article to be found, give it a good title.


One of the occasional recurring themes of this blog is the importance of Knowledge findability. Knowledge needs to be used in order to add value, and before it can be used it needs to be found. This includes the ability to find knowledge in lessons within a database, stories within a story folder, relevant posts within a community blog, or experience within the head of an expert.

One of the key enablers of findability when it comes to documented stories, lessons and knowledge articles is a Good Title. The Title is the most prominent item in any browsing system or set of search results. The purpose of the Title is to enable the reader to understand whether the item is likely to be relevant to them. Based on the Title, they decide whether to open and read the item. If the title makes no sense, then the seeker may not even realise they have found the knowledge, and may pass over it unknowingly.

So part of the role of the publisher of knowledge, in ensuring findability and reusability, is to give a knowledge item a good and relevant title - not a lazy title, or a "clever" title, or an artistic title, but a title that tells the reader what's inside.

Bad titles


Would you know what the lessons listed were about, before opening them? Would the titles help you find relevant content? Would you even bother to open them? (although I could see you might be intrigued, in some cases). Apologies to any of you who wrote any of these, by the way.
  • Duplicate
  • Learning 1 of 3
  • Public Lessons Learned Entry: 0406
  • Additional learning from (Incident X)
  • Spurious event on (Project Y)
  • Z Project - After Action Review (Lesson Learned)
  • When you sweep the stairs, always start from the top (this one was not about stair sweeping by the way)
  • From take-off to landing (and it's not about flying a plane)
  • Problem

Good titles


If you want to see good practice in using titles, browse the NASA lessons database where you can find titles like these:


These titles clearly tell you what the lesson is about, and the reader instantly knows whether the lesson is relevant to their context. In almost every case the lesson is related to a process - handling of panels, mapping PC boards, fabrication of cable - or to a component such as Hypergol checkout panels. Someone planning a similar process, or designing a similar component, can find the lesson based on the title.

If you want knowledge to be found and used - pay attention to the title!





Thursday, 8 March 2018

3 key roles in a KM pilot project

To achieve success in a KM pilot, there are three major roles that need to be in place.


Blickling Hall, Gardens and Park by Martin Pettitt, on Flickr
KM pilot projects are when you take Knowledge Management out into the business, and "try it for real". They are a public road-test for knowledge management, and a crucial step in your implementation. You can't afford for them to fail.

In order to ensure success, you need to set up 3 key roles.

The first role is that of the business sponsor, who acts as the customer for the project within the business. They play an active role in setting the direction, providing resources, and agreeing objectives and deliverables. The business sponsor is likely to be the manager of the business unit, and it is crucial that they be committed to the success of the project.

 The second role is that of the local pilot project manager. This person will be accountable for delivering the results of the project. It is important that this role is owned by somebody within the business, so that the project is seen as internal to the business, rather than something “which is being done to us by outside specialists”. The KM person should never be the pilot project manager.

The third role is that of the knowledge management adviser or supporter, who works closely with the local project manager in implementing the project, providing the knowledge management processes, tools and technologies. The knowledge management adviser will be a member of the KM implementation team, and provides learning from the pilot project back to the KM team. They may work full-time on the pilot project, depending on its complexity and scope.

Wednesday, 7 March 2018

4 ways to increase knowledge flow - lessons from fluid dynamics

If we look at knowledge flowing through a company as an analogue of fluid flowing through a porous medium, can we draw any conclusions to help us with KM?



In their 2011 presentation, Tim Stouffer and Reid Smith did just this. They took an interesting look at the flow of knowledge, and likened it to the flow of oil, water or other fluid in a fluid-bearing rock.

In such circumstances, flow is governed by the equation shown here, which is known as Darcy's law. Flow depends on the permeability of the rock, the dynamic viscosity of the fluid, the pressure difference, and the distance the fluid needs to travel.
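
For readers who cannot see the figure, and assuming the presentation used the standard single-phase form of Darcy's law, the equation is:

$$Q = \frac{k \, A \, \Delta P}{\mu \, L}$$

where $Q$ is the volumetric flow rate, $k$ is the permeability of the rock, $A$ is the cross-sectional area, $\mu$ is the dynamic viscosity of the fluid, $\Delta P$ is the pressure difference, and $L$ is the distance the fluid has to travel.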

Stouffer and Smith used this equation as an analogy to draw some conclusions about how to increase knowledge flow in organisations, and focused on four factors.


  • If we want to increase knowledge flow, we need to make the organisation more permeable to knowledge. This is the area most KM programs focus on - providing the tools and the organisational structures that remove or reduce the barriers to knowledge flow, making the organisation as transparent as possible as far as knowledge is concerned. They do this through the introduction of community forums, good search, well constructed knowledge bases, lessons management systems with good workflow, etc. This is vital to the success of a KM program, but is only 1/4 of the equation.



  • If we want to increase knowledge flow, we need to reduce the viscosity (the stickiness, or flow-resistance) of the knowledge itself. Many organisations will claim on the one hand that knowledge does not flow round their organisation, while on the other hand agreeing that gossip spreads like wildfire. That's because gossip is low-viscosity knowledge - it will find any little gap through which to flow. We need to reduce the viscosity of technical knowledge to a similar level, through packaging it well, through the use of stories, video, examples and lessons. Well written, catchy, punchy, and speaking directly to the reader/listener/viewer.



  • If we want to increase knowledge flow, we need to increase the driving pressures - both Push and Pull. This is the cultural side of the equation, the pressure to share and (more importantly) the pressure to Ask and Learn. The pressure is the sum of Pull and Push, and is the sum of peer pressure and management expectations. The more attention you give to creating expectations for both sharing and learning, the faster the knowledge will flow. 
  • Finally, if we want to increase knowledge flow, we need to decrease the distance the knowledge has to travel - bringing people, knowledge and communities closer together, and making things as easy as possible.




Stouffer and Smith conclude as follows:
"Getting Knowledge to flow is much like the physics contained in Darcy’s Law  
  • Increase “Permeability” 
    • Improve access to knowledge
    • Build knowledge connections: P2P and P2K
  • Increase “Pressure” 
    • Management leadership
    • Metrics
  • Decrease “Viscosity”
    • Turn tacit knowledge into explicit, actionable knowledge
  • Decrease “Distance” (make things easy)
    • Bring people, knowledge and communities closer together."

So although Knowledge flowing through an organisation is unlike Oil flowing through a rock, the factors of Darcy's law can still be used as an analogy to give us insight into 4 ways to improve the flow of knowledge. 

Tuesday, 6 March 2018

The Aha! moment - how to tell when knowledge transfer is successful

There is one immediate test of effective knowledge transfer, and that is the Aha moment.



Image from wikimedia commons
Anyone who has ever, at any time, tried to explain something to someone else, is aware of the "Aha moment".

The Aha moment, otherwise known as the Light-bulb moment, is a moment of sudden inspiration, revelation, understanding or recognition. It's such a common experience that it has made its way into cartoon iconography, with the image of a light bulb lighting up above a character's head when he or she suddenly "gets it".

You can see the light bulb in real life - you can see the moment when understanding dawns. It's a brightening of the features, an increase in the level of engagement, stiffening body posture, a widening of the eyes, a smile. Those are the outward signs of the inward dawning of comprehension.

The Aha moment is a very valuable indicator in Knowledge Transfer, because it means that the recipient has "got it". They have recognised the new knowledge for what it is - namely something better and more valuable than the knowledge they currently hold in their head.

I remember a classic example when I was running a Knowledge Management training session in Alaska, and I was trying to get across the idea that KM is not an abstract notion, but is something that needs to be applied to real business problems. I saw someone at the back of the room suddenly "light up", come to attention, and start scribbling rapidly on a piece of paper. I asked him later what had happened, and he told me that the light bulb had come on when I had said "KM can help you with the things you need to know, right now, to deliver your business" - and he had immediately jotted down 10 business problems that KM would be able to help with. These would become the basis of his KM strategy.

The Aha moment can only be recognised, and happens in the most dramatic form, when knowledge is transferred face to face. In fact any trainer or teacher looks for that moment, and keeps trying different ways to transfer the knowledge until the lightbulb lights. They watch the faces, and watch the eyes, and watch the body language, and look for the moment when people "get it". Until that point, the knowledge has not been received. So for important knowledge, where the light bulb needs to go on and stay on, you need to look at tried and tested mechanisms such as Peer Assist, Knowledge Exchange, Knowledge Handover and so on, where the facilitator can prompt for, and watch for, the Ahas and the light bulbs.

Knowledge Managers, please watch out for the Aha moment. That is your best indicator, metric or KPI to show that Knowledge Transfer has really happened.



Monday, 5 March 2018

What are the outputs of the KM workstream?

KM organisations need a Knowledge workstream as well as a Product/Project workstream. But what are the knowledge outputs?


I have blogged several times about the KM workstream you need in your organisation; the knowledge factory that runs alongside the product factory or the project factory.  But what are the outputs or products of the knowledge factory?

The outputs of the product factory are clear - they are designed and manufactured products being sold to customers. The outputs of the project factory are also clear - the project deliverables which the internal or external client has ordered and paid for. 

We can look at the products of the KM workstream in a similar way. The clients and customers for these are knowledge workers in the organisation who need knowledge to do their work better; to deliver better projects and better products. It is they who define what knowledge is needed. Generally this knowledge comes in four forms:

  • Standard practices which experience has shown are the required way to work. These might be design standards, product standards, standard operating procedures, norms, standard templates, algorithms and so on. These are mandatory, they must be followed, and have been endorsed by senior technical management.
  • Best practices and best designs which lessons and experience have shown are currently the best way to work in a particular setting or context. These are advisory, they should be followed, and they have been endorsed by the community of practice as the current best approach.
  • Good practices and good options which lessons from one or two projects have shown to be a successful way to work. These might be examples of successful bids, plans, templates or designs, and they have been endorsed by the community of practice as "good examples" which might be copied in similar circumstances, but which are not yet robust enough to be recognised as "the best". 
  • More generic accumulated knowledge about specific tasks, materials, suppliers, customers, legal regimes, concepts etc.
The project/product workstream also creates outputs which act as inputs to the knowledge workstream; these are the knowledge deliverables, the lessons which capture hindsight, and the useful items which can be stored as good practices and good options. The link between lessons and best practices is described here, and shows how the two workstreams operate together to gather and deliver knowledge to optimise results. 

Friday, 2 March 2018

Observations, Insights, Lessons - how knowledge is born

Knowledge is born in a three-stage process of reflection on experience - here's how.


I think most people accept that knowledge is born through reflection on experience. The three-stage process in which this happens is the core of how the military approach learning from experience, for example as documented in this presentation from the Australian Army (slide 12).

The three stages are the identification of  Observations, Insights and Lessons, collectively referred to as OILs. Here are the stages, using some of the Australian Army explanation, and some of my own.


  • Observations. Observations are what we capture from sources, whether they be people or things or events. Observations are "What actually happened" and are usually compared to "What was supposed to happen". Observations are the basic building blocks for knowledge, but they often offer a very limited or biased perspective on their own. However, storing observations is at least one step better than storing what was planned to happen (see here). For observations to be a valid first step they need to be the truth, the whole truth (which usually comes from multiple perspectives) and nothing but the truth (which usually requires some degree of validation against other observations and against hard data).
  • Insights. Insights are conclusions drawn from patterns we find looking at groups of observations. They identify WHY things happened the way they did, and insights come from identifying root causes. You may need to ask the 5 whys in order to get to the root cause. Insights are a really good step towards knowledge due to their objectivity. The Australian Army suggests that for the standard soldier, insights are as good as lessons. 
  • Lessons. These are the inferences from insights, and the recommendations for the future. Lessons are knowledge which has been formulated as advice for others, and the creation of lessons from insights requires analysis and generalisation to make the insights specific and actionable. The Australian Army defines lessons as "insights that have specific authorised actions attached.... directed to Army authorities to implement the stated action", and there is a close link between defining an actionable lesson and assigning an action to that lesson (see the sketch after this list).
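
To make the progression concrete, here is a minimal, purely illustrative Python sketch of an OIL data model. Every class and field name is my own invention for this post, not taken from the Australian Army's systems or any lessons-management product:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Observation:
    what_happened: str       # "what actually happened"
    what_was_planned: str    # compared against "what was supposed to happen"
    source: str              # the person, thing or event it was captured from

@dataclass
class Insight:
    root_cause: str                  # the WHY, e.g. reached by asking the 5 whys
    observations: List[Observation]  # the group of observations showing the pattern

@dataclass
class Lesson:
    recommendation: str      # advice formulated for future teams
    insight: Insight         # the insight the lesson is inferred from
    action: str              # the specific authorised action attached
    action_owner: str        # the authority directed to implement it

def formulate_lesson(insight: Insight, recommendation: str,
                     action: str, owner: str) -> Lesson:
    """A lesson is only complete once an authorised action is attached."""
    return Lesson(recommendation, insight, action, owner)
```

Note that in this model a Lesson cannot be constructed without an action and an owner, which mirrors the definition above of a lesson as an insight with a specific authorised action attached.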

This progression, from Observation to Insight to Lesson, represents the methodology of learning by reflection. The Retrospect meeting and the (smaller scale) After Action Review both provide a structured discussion format which moves increments of knowledge through the three stages.

In other organisations these three stages are separated. Observations are collected, analysts use these to derive insights, and then an authoritative body adds the action and turns the insights into lessons. My personal preference is to address all three steps as close as possible to the action which is being reviewed, using the same team who conducted the action to take Observations through to Lessons.

But however you divide the process, and whoever conducts the steps, these three stages of Observation, Insight and Lesson are fundamental to the process of learning from experience. 





Thursday, 1 March 2018

What would it take, to get you to share more of your knowledge?

"What would it take, to get you to share more of your knowledge



Image from wikimedia commons
This was a question Shell asked in an internal survey, several years ago, in order to understand the incentives and barriers for knowledge sharing. The top 6 answers were as follows:

  1. More time 
  2. More feedback on use of the knowledge 
  3. Recognition from peers 
  4. Knowing that it made an impact 
  5. An easier way to do it 
  6. Thank you from colleagues
Missing from the list of answers were:
  • Money 
  • Prizes
  • Badges
  • Hard incentives, and 
  • Directives from management. 

If you want people to share, then make it easy, free up some time for them, and give them feedback on the difference it made (including some "thank-you"s).

That "making a difference" piece is important, and sits behind factors 2,3,4 and 6. Make sure this is built into your Knowledge Management system, so people don't feel that they are just dropping their knowledge into a black hole, with no idea of where it's going, or who is benefiting.

If you want people to share more knowledge, show them that it makes a difference when they do.

