Friday, 29 November 2019

The 6 stages through which a knowledge topic matures

Business in a world of change is a learning race. The winner is the organisation that can develop and mature knowledge more quickly than the competition, bringing new and improved products and processes into the market first, and so gaining First Learner Advantage.


Therefore one way of viewing Knowledge Management is to see it as a strategic approach to maturing critical and competitive knowledge domains in the most rapid and effective way. 

In this view, Knowledge passes through a series of stages, as shown and described below, and the task of Knowledge Management in each stage is to move as effectively as possible to the next stage. As the knowledge domain matures, so the management approach for that knowledge evolves.

The stages are shown in the plot below.

Stage 0. Innovation, or Knowledge Creation.

This stage is where ideas are born. Knowledge Management helps innovation by creating proactive processes for generating new ideas in the areas of greatest business need, often incorporating networked innovation (Deep Dive, for example). 

Stage 1. Research

Research is Idea Testing - moving from an Idea to Knowledge through practical experimentation. Knowledge Management helps here by introducing roles, processes, technologies and governance for capturing that first knowledge, as well as by capturing what did NOT work (and why), and also the most promising research leads that there was no time to explore. The knowledge evolves rapidly, through the use of blogs and wikis (see this example). Once the main theoretical problems are solved, the knowledge needs to be passed to the Development team, and also retained for future Research programs.

Stage 2. Knowledge Development

The development stage involves taking the best research ideas and testing them further to develop a viable process or product which can be rolled out in the business, or delivered to a customer. Knowledge Management helps here by introducing a framework for learning during development, both to make the development process more effective and efficient, and to ensure the "knowledge workstream" is well managed (i.e. the creation of knowledge for the benefit of the users further along the value chain).  The techniques of After Action review, Retrospect, and Knowledge Asset development are important here. Once the main practical problems are solved, the knowledge is passed to sales, manufacturing or operations. 


Stage 3. Establishment of Best Practice

Even when the process or product is in use, the knowledge can be further perfected. The organisation can still improve the process or product, and can learn to improve its application. Because the product or process is now in use in many locations, Knowledge Management helps by introducing a framework of knowledge sharing and re-use, so that people all over the organisation can learn together. The techniques of Communities of Practice, Lesson-Learning and development of Knowledge Bases become important. 

Stage 4. Standardisation

Once the knowledge has been perfected through use, the next step is to standardise the knowledge, as further experimentation would now be wasteful "reinvention of the wheel".  The knowledge becomes codified in manuals, reference materials and training. Knowledge Management helps now by ensuring these "knowledge assets" are well constructed and easy to find. 

Stage 5. Reinvention.

However no knowledge lives for ever. There are often cycles of reinvention, where old knowledge is replaced by new ideas, and the cycle begins again with Innovation. Knowledge Management should promote constant challenge of the status quo, to test whether there could be a better way to do things, and to decide whether the maturity cycle needs to be restarted.

The most successful organisations will be those who can run this maturity cycle at optimum speed, and so out-learn their competitors.  


Thursday, 28 November 2019

How should you arrange your knowledge store - by topic, by type, or by organisational unit?

How do you structure your knowledge store? There are 3 options, but one is far better than the other two in meeting the needs of the knowledge seeker. 


Image from Robins.af.mil
There are three ways to organise your knowledge store. You can organise it by operational unit, so that each unit has control of its local knowledge. That way you have a European Division knowledge base, a Pittsburgh Plant portal, etc. Or you can organise it by the type of knowledge, so you have a website for lessons, another for videos, another for training material. Or you can organise it by topic, so there is a store for knowledge about preventative maintenance, another for knowledge about project management, and so on.

I don't have any statistics to prove which of these works better, but for me it is option 3 every time. I always recommend storing knowledge based on the topic. The knowledge may come from many operational units, and it may be of many types - lessons, good practices, training materials etc. - but it is all about the same knowledge topic or knowledge domain.

I recommend this approach for 6 main reasons:

 1. Part of the value of Knowledge Management is enabling knowledge to be shared across organisational units. Imagine a manufacturing function divided into regional units. What is the point of organising knowledge by those regional units? Teams within a region already have a closer working relationship than teams in different units. Far more value is to be gained by exchanging knowledge between units but within the same function. 
2. Organisational structure changes far more frequently than topics and subjects. Maybe one day, instead of a functional organisation, the company reformats into a regional structure, and the manufacturing function becomes split between European manufacturing, US manufacturing, etc. Should the old manufacturing knowledge base be divided up between the regions? No. Manufacturing remains a core knowledge topic, and the manufacturing knowledge base should remain, covering all regions. 
3. The real value of knowledge management comes when discussing know-how. A subject equates to a practice area, which equates to know-how. Hence the value of developing Communities of Practice, which create, share, apply and steward the development of practice know-how. Communities of Practice are a fundamental of KM; communities of organisational unit are not. The top level taxonomy of your knowledge base should equate to your CoP structure, which should equate to your list of strategic knowledge topics, and all the knowledge associated with that topic should be owned and managed by the community, in one single knowledge store. In our case above, the manufacturing community of practice manages the manufacturing knowledge base. 
4. Arranging the content by Subject will make it easier to demonstrate compliance with ISO 9001. 
5. People seeking knowledge on a topic will not necessarily know which organisational unit sourced the knowledge, and will not necessarily care. So long as it helps with their manufacturing issues, they don't care where it comes from.
6. People seeking knowledge on a topic don't really care which format it comes in - whether it is text, pictures, audio, video, PowerPoints, lessons learned, or training material. To be honest, they want to find all the knowledge, no matter the format, and if it's multimedia, so much the better! They don't want to have to look for video in one place and PowerPoints in another.
Imagine looking through a library reference section where, instead of organising the reference books by topic, they organised them by the author's home town! Anyone browsing a library for reference on a topic knows that this is the best way to organise: a section on gardening, one on cooking, one on sport. They organise the library for the knowledge-seeker, and you should do the same.


Having said that:
 a) your knowledge base should be arranged by Subject but also tagged by operational unit (and by contributor) to allow search by whichever method is needed, 
 b) your larger knowledge management framework needs elements that apply within the organisational unit (AARs, KM plans, lessons capture, KM champions) and elements that apply across units (CoPs, Lesson-sharing, knowledge bases, SMEs, Practice Owners), so even though the knowledge store is organised by topic, there will be activities which are organised by unit
 c) there are organisations where the Subjects should be organised by Practice topics, others where the Subjects should be organised by Product topics, and some where the Subjects should be organised by Customer topics (see here). 
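The topic-first-but-tagged approach in point (a) can be sketched in a few lines of Python. This is a minimal, hypothetical illustration (the class and field names are my own invention, not any particular product): items are organised primarily by topic, but carry unit and format tags so either view remains available.

```python
from dataclasses import dataclass


@dataclass
class KnowledgeItem:
    title: str
    topic: str            # primary organising key, e.g. "manufacturing"
    unit: str             # operational unit, kept as a tag, not a folder
    fmt: str              # "lesson", "video", "training", ...
    contributor: str = ""


class KnowledgeStore:
    def __init__(self):
        self.items: list[KnowledgeItem] = []

    def add(self, item: KnowledgeItem) -> None:
        self.items.append(item)

    def by_topic(self, topic: str) -> list[KnowledgeItem]:
        # The seeker's main route in: everything on a topic,
        # regardless of origin or format.
        return [i for i in self.items if i.topic == topic]

    def by_unit(self, unit: str) -> list[KnowledgeItem]:
        # The unit tag still allows a unit-level view when needed.
        return [i for i in self.items if i.unit == unit]
```

A query for "manufacturing" then returns lessons, videos and training material from every region in one place, which is the seeker-first behaviour argued for above.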

Arrange your knowledge store by topic, so the seeker can find all the knowledge on the topic in one place, regardless of origin or format.

Wednesday, 27 November 2019

Why you should never leave your lessons in the project report

Probably the worst place to store project lessons is the End of Project Report.


Tombstone created using http://tombgen.appspot.com/

I know this is still the default approach for many project organisations, and 19% of organisations that run KM programs still use this approach (according to our KM survey). However this approach was linked to the second-lowest satisfaction score for lesson learning, beaten only by "we don't store our lessons anywhere". So leaving lessons in project files is only slightly better than not storing them anywhere.

But why is this the case?

It makes sense for the project team to leave the lessons in the report, because lessons are a project output, and surely all project outputs are documented in the report? Once the report is written, the job of the project team is done.

However although the job of the lesson writer may be over, the job of the lesson-seeker is only just starting.

People searching for lessons (whether they are people in future project teams, or subject matter experts maintaining process documentation) will be searching for lessons to help with a specific issue, risk, process or task. They would like to find all the lessons associated with that issue, risk or task in the same place. They almost certainly will not know which projects learned lessons on which topic. They don’t want to have to go back through 20 or 30 project reports, looking through the lessons learned section of each one, just in case there is something of relevance.

Therefore they don't bother.

The lessons remain unread; the project report becomes a tomb in the lessons graveyard. Learned once, and destined to be relearned in future.

Two of the basic principles for effective lesson learning are that you need to 1) capture lessons individually, and 2) store them centrally. As I pointed out in yesterday's blog post, lessons are a knowledge output from a project, and should be stored with the rest of the knowledge, not the rest of the project files. In order to make it easy for the lessons to be found, you need to store the individual lessons under themes or topics in a lessons management system until they make their way into updated practice and procedure, guidance and training.

 If lessons are being learned from many projects, for example from Retrospects and After Action reviews, then these lessons need to be stored somewhere where they can be compared, reconciled, and actions assigned, notified, tracked and managed. They need to be captured in a consistent format, and stored in a central system.

 This is often done in some sort of Lessons Management System, set up either for a major business stream such as Sales or Engineering or for the entire organisation, to store and track lessons from many projects.

An effective lessons management system should be structured according to the needs of the “knowledge user”. People who come to access lessons from the database should be able to find what they are looking for very easily. If they don't find relevant and useful knowledge within a few minutes, they will leave and never come back.

Think about the needs and interests of the knowledge user, think about how the knowledge will be re-used, and think about how you should structure the database to give them the easiest access to what they need. The knowledge seeker is looking for lessons about, for example, procuring steel pipe, or all the lessons to do with safety while working at height, or all the lessons to do with partnering with a particular national authority; in other words, lessons to do with the particular issue they face. The lessons therefore need to be grouped into a structure, index or taxonomy based on work activity or work process. Think about how supermarkets gather and present produce, and use ideas such as this to stock your knowledge supermarket.

In addition, the lessons management system should be able to proactively route lessons and associated actions to the relevant process owner, so that the lessons can be embedded in improved process, can be declared properly "learned", and the learning loop can be closed.
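The capture-centrally, index-by-topic and route-to-owner ideas above can be sketched in a few lines of Python. Everything here (class names, status values, topic strings) is hypothetical illustration, not a real lessons product:

```python
from dataclasses import dataclass


@dataclass
class Lesson:
    title: str
    topic: str                 # taxonomy node, e.g. "procurement/steel-pipe"
    recommendation: str
    source_project: str
    status: str = "open"       # open -> routed -> embedded (loop closed)


class LessonsSystem:
    def __init__(self, process_owners: dict[str, str]):
        # Map from a topic prefix to the owner of that process area.
        self.process_owners = process_owners
        self.lessons: list[Lesson] = []

    def capture(self, lesson: Lesson) -> None:
        # All lessons land in one central store, in a consistent format.
        self.lessons.append(lesson)

    def find(self, topic_prefix: str) -> list[Lesson]:
        # Seekers search by issue, risk or task - not by originating project.
        return [l for l in self.lessons if l.topic.startswith(topic_prefix)]

    def route(self) -> None:
        # Send each open lesson to the owner of its process area,
        # so it can be embedded into updated practice and procedure.
        for lesson in self.lessons:
            for prefix, owner in self.process_owners.items():
                if lesson.status == "open" and lesson.topic.startswith(prefix):
                    lesson.status = f"routed to {owner}"
```

A real system would add notification, action tracking and reconciliation of lessons across projects; the point of the sketch is simply that lessons live in one central, topic-indexed store rather than scattered across per-project reports.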

Leaving them in the project report graveyard destroys the learning loop at the first step.





Tuesday, 26 November 2019

The 3 types of knowledge output from a project

All projects deliver not just a product, but knowledge as well, and there needs to be a clear understanding of what form that knowledge will take. 


Part of any Knowledge Management policy therefore has to be a definition of the expected knowledge output from project work. This knowledge output is distinct from any information products, or project files, or project reports.

Remember what we mean by "Knowledge" - "a human or organizational asset enabling effective decisions and action in context" according to the ISO standard. Therefore any knowledge output should be something that enables others to make better decisions or take better actions.

There are three main types of Knowledge output, which should be carefully defined in the project Knowledge Management plan.


  1. Better ways of doing things. These are recorded as Lessons throughout the project, and especially at the end of project stages. The lessons will be entered into the company Lessons Management System, and fed through into improvements in guidance, training, procedures or standards. For product development projects, there may also be product design improvements that were introduced, and these can be documented as Toyota-style A3 sheets, and stored in the product folder. For R&D projects, the increments in learning are recorded in the R&D knowledge base, to allow future R&D projects to build on the insights gained. In each case, these outputs allow future workers to make better decisions based on better process, better design, or a better knowledge base. 
  2. Good examples. These are project outputs that are "best in class" and can be offered to other projects as templates. They will be stored in the knowledge base associated with that process. These are offered to future workers as things to copy to give them a head start.
  3. Knowledge that is needed further down the value chain. The project may deliver "information products" like installation manuals, as-built designs and so on, but in addition there may need to be knowledge products. These describe WHY products were designed the way they were (captured in documents such as a Basis of Design, or the Rolls Royce Decision Rationale editor), or may capture HOW knowledge, such as how best to sell, maintain, install or operate products. They help people further down the value chain - the sales staff, the service staff and the maintenance staff - to make better decisions and take better actions.

These knowledge outputs benefit people other than the project team, and therefore should not be stored and archived with the project files, but should find their way into the "common knowledge" of the organisation, through the lessons management system, the communities of practice, and the organisational knowledge bases.

The projects should know that it is their responsibility to create this output and to share it through these designated channels. 

Monday, 25 November 2019

The benefits and limitations of KM change and maturity models

This is a reprise and rewrite of a post from 5 years ago about KM change models vs KM maturity models. AKA "why KM change is more like spread of a forest fire than the growth of a tree".


Photo from the US National Parks Service
The use of a maturity model allows an organization to have its methods and processes assessed according to management best practice, against a clear set of external benchmarks. Maturity is indicated by the award of a particular "Maturity Level". The majority of KM maturity models (and ours is no exception) have a series of descriptors of various levels of KM maturity, and the implication is that an organisation can progress from one level to the next in a smooth maturation process.

The analogy, if you like, is that of a tree. As a tree matures, it passes from a seedling to a sapling to a mature tree, but this is a continuous progression. You can describe, using metrics such as the number of branches, the size of the trunk, or the number of fruit, where the tree is on its maturation journey. If you own an orchard, you can describe the average maturation level of the trees as (for example) 2.5 on the maturation scale.

Knowledge Management is more like a forest fire than a tree.


A forest fire does not mature slowly. It catches in one small place, then sweeps across the landscape. In a forest fire, change is neither top-down nor bottom-up, but side to side (see my blog post http://www.nickmilton.com/2019/07/km-change-is-not-top-down-or-bottom-up.html for more details). A forest fire is not a maturation process; it is a phase-change, from unlit to lit. There are various measures of readiness for forest fires - they can be enabled by hot weather, strong winds and a build-up of combustible material, or disabled by fire-breaks and rain - but it still is not a process of maturation.

I am aware as I write this that a forest fire is also a highly dangerous life- and property-threatening phenomenon and a lethal consequence of global warming. If you find this metaphor too negative, please use another, such as kindling a bonfire, or adoption of a virally-marketed product.

Knowledge Management is a forest fire rather than a tree, because implementing KM is a culture change process. It involves changing hearts and minds, and hearts and minds are changed one at a time. We have all seen the moment when a heart or mind changes and someone "gets lit". It's that lightbulb moment, like "catching fire". There is no maturation level for such a process, only the question "has it caught fire?". Once it has, the question becomes "how much is burning?".

I describe here a change model for hearts and minds which you can apply to your key stakeholders, that takes them up to a commitment threshold, beyond which KM can be adopted. Below this threshold they are unlit kindling. Above this threshold they are alight.

KM then works only if all the conditions are sufficiently right to change the hearts and minds. Once the conditions are right, you light the KM fire in a small part of the organisation (a KM pilot), and once this is burning, adjacent areas will also catch fire, until finally the whole area has caught the KM habit.

That's the change model - what's the problem with maturity models?


Maturity models are popular, and give the organisation a chance to compare themselves against a standard, and to identify room for improvement. This can be a useful check, but maturity models have a number of drawbacks (for a deeper discussion, see chapter 27 in the new edition of the Knowledge Manager's Handbook).



  • The first is that the model may have gaps or be based on inaccurate assumptions. There are many maturity models, for example, which ignore the issue of governance, and others that include content as a key component (thereby assuming that Knowledge Management is basically Content Management). Choose your model wisely, or (better) use more than one.
  • Most maturity models make assumptions about the sequence in which things have to happen, and these assumptions do not hold true universally. 
  • In large and complex organizations, where the organizational landscape is heterogeneous, a maturity model tends to gloss over or average out significant differences across portions of the landscape, removing them from visibility and from opportunity for action. The maturity model may say "the forest is cool" when in fact a fire is already blazing somewhere locally.
  • Finally, as discussed above, KM implementation is not one of gradual maturation across the organization at large, but of spreading the adoption of a new paradigm, and thus the idea that the organisation matures in a stepwise process is inappropriate.


Take Leadership, for example.

Senior management support is the biggest enabler of KM (and lack of senior management support is the biggest barrier). Leadership is vital. Imagine a leadership scale from 0 to 4, and imagine you have moved leadership from level 1 to level 2. Is this progress? If level 4 is "whole-hearted support from senior management", what is level 2? Half-hearted support? That's as bad as no support at all. Until you get to level 4, you don't have what you need for sustained KM.

Rather than trying to move the whole organisation to level 2, why not find the one leader who you can help reach level 4? Leave the rest at level 1 for the moment, and find the early adopter. Gain their whole-hearted support to pilot Knowledge Management in their part of the business, deliver success, and use this to change the next Heart and the next Mind.

The indicator of progress is therefore not the average level of KM leadership maturity, but the presence or absence of the "first sponsor" in the organisation.


What is the conclusion regarding maturity models?


For all these reasons, maturity models are much better used:

(a) not for assessment and objective benchmarking, but as part of an internally driven diagnostic and planning mechanism along with a lot of independently gathered data, where the question is not “how mature are we against external assumptions?” but “what can this external model suggest to us about our strengths and weaknesses, and which of these areas should we prioritise based on known needs?”; or  
(b) in homogeneous, well defined contexts such as communities of practice, knowledge-base development, or expertise transfer, where there are specific, well-known good practices and reliable precursors that hold true in most cases.

At Knoco we do in fact offer a maturity model, available as a free online survey (choose Maturity Survey in the box at the top of the page). It is of some use, but treat it with caution for all the reasons mentioned above.

In addition we suggest you measure a number of other things.
The real message behind all of this is that KM is a change program, and needs to be measured using change models.

KM does not mature like a tree; it catches hold like a flame, and that is how it should be measured.







Friday, 22 November 2019

Should you use a single technology platform for KM?

Does KM need a single technology platform? More likely it needs several technologies.


This blog post was prompted by a thread in Stan Garfield's SIKM community asking what technology platform people use for KM. My immediate thought was that a single platform probably is not sufficient. However let's look at our survey results to see what people actually use in real life.

Firstly, let's look at how many tools people use for KM. 


In our global KM survey, run in 2014 and again in 2017 and answered by over 700 knowledge managers, people were asked to select, from a list of technologies, which ones they applied as part of their KM program, and if their tool was not on the list, to identify it in an "Other" box. The number of selections (including the Other box) is shown in the pie chart below. This particular question was answered by 270 people.

Please note, from the pie chart below, that 37% of respondents use a single technology platform for KM, while 63% use more than one. 


Secondly, let's look at which technologies people say they use.


The table shows which technologies people said they use for KM. Please note that respondents were asked to identify every tool they used, so 63% of respondents identified more than one of these tools. Please also note that 162 people replied "other", making it the third highest category, behind only the two SharePoint options.

Technology brand: number of users

SharePoint customised: 249
SharePoint "out of the box": 188
In-house tools: 140
Yammer: 100
SAP: 54
Confluence: 40
SalesForce: 39
Lotus: 32
Jive: 32
OpenText/LiveLink: 28
MediaWiki: 27
Drupal: 27
IBM Social Content Management: 13
Alfresco: 12
Google: 8
Oracle: 7
ServiceNow: 4
Wordpress: 4
Other: 162

The people who replied "other" identified the following additional tools:

.NET Cloud Program - switching to Confluence, 365, Tallyfox, Dartfish tv, 3ms internal , Adobe Defense Connect Online, Microsoft Outlook, Aptify, Basecamp, Bespoke, Black board, BMC Remedy, BMC Remedy, O365 (collaboration), Box, Caltura Video, SAP Jam, SAP Portal, Confluence, Adobe Connect, CISCO WebEx and Jabber, Cloud based server - Dropbox, Controlled Access Folders, Cornerstone, Cornerstone OnDemand, Cosential; Newforma; Fairsail; Ajera, CRM Dynamics, CubicWeb Semantic Web Framework, current still using static tool for Intranets (Dreamweaver)., Customized app, cyn.in, Cynapse Cyn.in, Day Communications Intranet, Decisiv search (Recommind), develop own portal , DMS iManage, do not know, Documentum, Documentum , Documentum, Drop Box for external collaboration until we have a solution, eGain, eGain, ELGG, EMC2 E-View, excel, exo, Filesite Document Management - imanage, HP Content Manager, Pega, Huddle, Huddle; Tibbr, IBM Connections, Inquira, iManage, iManage Worksite DMS, ConceptSearching, Recommind Decisiv Search, Outlook, HighQ, In house developed, Institutional repository - DSpace, Integration within ticket tool (CA USD/Service Now), Internal developed platform, Internal support portal, internal system, Internally developed systems, jabber, webx, asana, JAM, JIRA, Joomla, Joomla, Joomla, Knowledgeplaza, Laserfiche, lenus - library  open access repositary , Liferay, LifeRay, Reverb, Linked-In, listservs, Lithium, many in-house tools, Market Logic Software, MEETSYS / I2Kn, Melling and graphic representation applications, Microsoft CRM, Migrating from Lotus to Sharepoint, MindTouch, ServiceNow and Zendesk, MOODLE, moodle, Ms Dynamics, Newsgator, not public information, Office365, OneNote, open asset, pipedrive, 10,000ft, our DMS, PipelineDeals & The Box, platform by ourself, PLM, Plone, Plone by Google, Podio, QDAMax, Qlikview, qimingkeji, radio     internet, Recommind Decisiv Search, Remedy Knowledge Management, Research Management System, Saba, ScienceDirect, 
SharePoint customized is coming in 2018, SharePoint has been slightly customised on a look and feel basis, ShoreTel Connect, Joomla!, Worldox, Sitefinity, O365/Lync, Sitrion, Sitrion NewsGator, Sitrion, Kapow, Semaphore, Skype, skype for business, Skype for business, Newforma, KA Synthesis, Social Network interface - cannot remember the name of the provider, Social Sites, Social squared, Socialtext wikis, Software platforms due to be replaced, Soutron, SSRS, Starmind; Exalead, Synthesis, Synthesis, Tailored EPI Server, Tibbr, Tibbr, Tortoise Subversion / svn, Trello, TRIM, Unily, Verint, Vivisimo, vivo, We are retiring SharePoint and moving to Google in 2017, We leverage the O365 suite of tools, so I would also include Sway, OneDrive, Skype, OneNote, and to a lesser degree, Delve and Planner, as key platforms, We use our DMS, iManage as a repository for explicit knowledge, Workplace (facebook at work), WorkSite by iManage, XING Groups Management; mixxt wiki and doc mngt., Yammer free version, Yolean Checksheets, Zendesk
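As a quick illustrative sanity check on the ranking, the counts transcribed from the table above can be sorted in a few lines of Python (nothing here is assumed beyond those counts):

```python
# Tool usage counts transcribed from the survey table above.
counts = {
    "SharePoint customised": 249,
    'SharePoint "out of the box"': 188,
    "In-house tools": 140,
    "Yammer": 100,
    "SAP": 54,
    "Confluence": 40,
    "SalesForce": 39,
    "Lotus": 32,
    "Jive": 32,
    "OpenText/LiveLink": 28,
    "MediaWiki": 27,
    "Drupal": 27,
    "IBM Social Content Management": 13,
    "Alfresco": 12,
    "Google": 8,
    "Oracle": 7,
    "ServiceNow": 4,
    "Wordpress": 4,
    "Other": 162,
}

# Sort descending by number of users.
ranked = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)
for name, n in ranked[:3]:
    print(f"{name}: {n}")
# "Other" (162) ranks third, behind the two SharePoint variants.
```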


Thirdly, let's look at the tools used by the 37% of people who use only one technology.


SharePoint dominates this category: 62% of people who use a single tool use SharePoint.

Please note that this does not say whether SharePoint is a good or bad solution, only whether it is commonly used.

Finally, let's look at why people use technology in KM in the first place. 


Ideally this should be the place to start - to ask "what functions do we need of a KM technology platform". As you can see from the graph below, there are many functions.

For many functions, you need many technology platforms. SharePoint, for example, is good at enterprise content management, document publishing, and collaborative working on documents (the second most popular function in the graph above). Its document search (the number 1 function) can be described as poor, and it is poorer still at searching for people (the number 3 function). It is OK as a tool for community forums, but gets few recommendations as a wiki substitute, and you would not use it for video publishing or lessons management.

So it handles documents OK, but probably needs to be supplemented with one or more other KM tools to cover the more tacit areas of knowledge.

In conclusion; 


You probably need your KM technology to provide multiple functions, and therefore you are likely, like 63% of the survey respondents, to use more than one tool. If you choose a single platform, then recognise what it's good at and what it's poor at, and be prepared to supplement it where needed.

Thursday, 21 November 2019

Two managers' questions that drive a KM culture

If you are a leader who wants to help develop a Knowledge Management and Organisational Learning culture in your organisation, you can do this simply, by asking two questions. 



The two questions are:
Who have you learned from?
Who have you shared this with?

If you are a leader, then every time someone comes to you with a proposed solution to a problem, or a proposed course of action, you ask “Who have you learned from”? Through this question, you are implying that they should have learned from others before proposing a solution – that they should have “learned before doing”.

Also, every time someone comes to you to report a problem solved, a process improved, or a new pitfall or challenge addressed, you ask “Who have you shared this with”? Through this question, you are implying that they should share any new learnings with others.

The great thing about leaders’ questions is that they drive behaviour. People start to anticipate them, and to do the learning before, and the sharing afterwards. People hate being asked these two questions and having to answer “umm, well, nobody actually”.

They would much rather say “we have learned from X and Y, and have a Peer Assist planned with Z”, “We have shared with the A community, and are holding a Knowledge Handover next week with B project”.

And once you drive the behaviours, the transfer of knowledge will happen, the value will be delivered, and the system will reinforce itself.

But the moment you stop asking the questions, people realise that you, as a leader, are no longer interested in KM, so they will stop bothering.

There’s an old saying – “What interests my manager fascinates me”, so make sure you are interested, and ask the questions.

Two similar questions - "Show me that you have shared knowledge" and "Show me how you have re-used knowledge" - are embedded into staff appraisals at Microsoft, as a way of driving the right Knowledge-friendly behaviours. However appraisals happen on an annual basis, and if you want to keep a focus on knowledge all year round, then the two questions described here are very powerful. 

Wednesday, 20 November 2019

The twin KM approaches of Connect and Collect

I have blogged quite a bit recently on Connect and Collect approaches to KM, aka the transfer of tacit and explicit knowledge. Here is a reprise and extension of a useful table which describes the two.

Three of my recent blog posts have touched on this comparison.

Each of these posts deals with knowledge transfer through tacit and explicit knowledge, comparing the efficiency and effectiveness of the two.  These two approaches to knowledge transfer are the connect approach, where knowledge is transferred by connecting people, and the collect approach, where knowledge is transferred by collecting, storing, organising and retrieving documents.

Each method has advantages and disadvantages, as summarised in the table below and the blog posts referenced above.  Effective Knowledge Management strategies need to address both these methods of knowledge transfer. Each has its place, each complements the other. These are not "either/or" choices, they are "both/and".


Advantages

Connect:
  • Very effective
  • Allows transfer of non-codifiable knowledge
  • Allows socialization
  • Allows the knowledge user to gauge how much they trust the supplier

Collect:
  • Easy and cheap
  • Very efficient
  • Allows systematic capture
  • Creates a secure long-term store for knowledge
  • Knowledge can be captured once and accessed many times

Disadvantages

Connect:
  • Risky - human memory is an unreliable knowledge store
  • Inefficient - people can only be in one place at one time
  • People often don’t realize what they know until it is captured

Collect:
  • Ineffective - much knowledge cannot be effectively captured and codified
  • Capturing requires skill and resource
  • Captured knowledge can become impersonal
  • Captured knowledge cannot be interrogated

Transfer medium

Connect: conversation, whether face to face or electronically mediated, or in team processes such as Knowledge Exchange, Retrospect and Peer Assist.

Collect: content in the form of documents, files, text, pictures and video.

Need for balance

Connect: managing conversation without content leads to personal rather than organisational learning. Unless new knowledge becomes embedded in process, guidance or recommendations, it is never truly "learned", and without this we find knowledge is relearned many times.

Collect: a focus on content without conversation results in a focus on publishing; on the creation of knowledge bases, blogs and wikis as a proxy for the transfer of knowledge; on Push rather than Pull. Unless people can question and interrogate knowledge in order to internalise it, learning can be very ineffective.

Types of knowledge suitable for this form of transfer

Connect:
  • Ephemeral, rapidly changing knowledge, which would be out of date as soon as it is written down
  • Knowledge of continual operations, where there is a large, constant community
  • Knowledge needed only by a few

Collect:
  • Stable, mature knowledge
  • Knowledge of intermittent or rare events
  • High-value knowledge
  • Knowledge with a large user base

Organisational demographics which suit this approach

Connect: a largely experienced workforce.

Collect: a largely inexperienced workforce.

Comments

Connect: one traditional approach to Knowledge Management is to leave knowledge in the heads of experts. This is a risky and inefficient strategy.

Collect: a strategy based only on capture will miss out on the socialization that is needed for culture change, and may fail to address some of the less codifiable knowledge.
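The "types of knowledge suitable for this form of transfer" criteria amount to a simple decision rule. Here is a minimal sketch of that rule; the attribute names and the audience threshold are my own illustrative choices, not a standard model:

```python
def suggest_route(stable: bool, audience: int, codifiable: bool) -> str:
    """Suggest the Connect or Collect route for a piece of knowledge.

    stable     -- is the knowledge mature and slow-changing?
    audience   -- rough number of people who need it
    codifiable -- can it be written down without losing its meaning?
    """
    # Ephemeral or hard-to-codify knowledge suits conversation (Connect);
    # stable, codifiable knowledge with a large user base suits documents (Collect).
    if not codifiable or not stable:
        return "Connect"
    if audience >= 20:          # arbitrary threshold for "a large user base"
        return "Collect"
    return "Connect"            # knowledge needed only by a few

print(suggest_route(stable=True, audience=500, codifiable=True))   # Collect
print(suggest_route(stable=False, audience=500, codifiable=True))  # Connect
```

The point of the sketch is the "both/and" conclusion above: the same organisation will route different pieces of knowledge down different paths, depending on the nature of the knowledge itself.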

Tuesday, 19 November 2019

The biggest barriers and enablers for Knowledge Management

This post is an update of an earlier post in 2014, brought up to date with new survey data.

As part of our global surveys in 2014 and 2017, answered by over 700 KM professionals, we asked respondents to rank a number of barriers in order of the impact they had had on their KM programme, ranking these from 1 to 8 (Knoco 2017).

 The results are shown in the table below, with high numbers representing high ranking and therefore high impact.




Barrier - average ranking
  • Cultural issues: 5.8
  • Lack of prioritisation and support from leadership: 5.0
  • Lack of KM roles and accountabilities: 4.8
  • Lack of KM incentives: 4.8
  • Lack of a defined KM approach: 4.6
  • Incentives for the wrong behaviours (inability to time-write KM, rewards for internal competition etc.): 4.3
  • Lack of support from departments such as IT, HR etc.: 4.1
  • Insufficient technology: 4.0


They were then asked to prioritise the main enablers for KM which had proved powerful, ranking them from 1 to 9. The resulting figures are shown in the table below (high numbers being high ranking).


Enabler - average ranking
  • Support from senior management: 6.2
  • Championship and support from KM team/champions: 6.2
  • Evidence of value from KM: 5.9
  • Easy to use technology: 5.6
  • A supportive company culture: 5.6
  • Effective KM processes: 5.5
  • Clear KM accountabilities and roles: 5.4
  • Personal benefit for staff from KM: 4.6
  • Incentive systems for KM: 4.2
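The averages in both tables are simple means of per-respondent rankings. The aggregation can be sketched as follows, using invented example responses rather than the actual Knoco survey data:

```python
from statistics import mean

# Hypothetical responses: each respondent ranks every barrier from 1 (low
# impact) to 8 (high impact). These numbers are illustrative only.
responses = [
    {"Cultural issues": 8, "Insufficient technology": 3},
    {"Cultural issues": 6, "Insufficient technology": 4},
    {"Cultural issues": 7, "Insufficient technology": 2},
]

# Average ranking per barrier, across all respondents
averages = {
    barrier: mean(r[barrier] for r in responses)
    for barrier in responses[0]
}

# Sort high to low, as in the tables above
for barrier, avg in sorted(averages.items(), key=lambda kv: -kv[1]):
    print(f"{barrier}: {avg:.1f}")
```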




So what does this tell us?

  • The number two barrier and the number one enabler are both support from senior management. Without this you will struggle; with it you will succeed. This blog contains much advice about gaining senior management support (see here for example, or here), and if you need more help, we will be happy to advise. Get this support, and all else will be much easier.
  • Although culture is the number one barrier, it is much lower in the enablers table. I think this is because the highest enablers - leadership support, champions and evidence of value - are all means by which the culture can be changed. Culture is therefore not the enabler; culture change is the enabler. 
  • Although roles and incentives are seen as major barriers, they are much lower in the enablers table. These are perhaps not the barriers that they might seem to be, even though they are a key part of your Knowledge Management Framework. 
  • Technology is seldom a barrier, nor is it at the top of the enabler list. Anyone thinking that the solution to effective KM is technology alone is ignoring the lessons from the past 2 decades of successful KM.

Monday, 18 November 2019

Why transferring knowledge through discussion is over 10 times more effective than written documents

Connecting people is far less efficient than Collecting while being far more effective - but how much more effective?

Knowledge can be transferred in two ways - by Connecting people so that they can discuss, and Collecting knowledge in written (explicit) form so others can find and read it (see blog posts on Connect and Collect). 

Connecting people is less efficient than transferring documented knowledge, but more effective.  We can never be sure about the absolute effectiveness of knowledge transfer without some good empirical studies, but there are two pointers towards the relative effectiveness of these two methods:

First, the often repeated (and sometimes challenged) quote that “We learn...
  • 10% of what we read 
  • 20% of what we hear 
  • 30% of what we see 
  • 50% of what we see and hear 
  • 70% of what we discuss 
  • 80% of what we experience 
  • 95% of what we teach others.”
This is similar to Media Richness theory, which ranks media on the basis of their richness, with unaddressed documents as least rich, and face-to-face as most rich.

Second, David Snowden's principle that

  • We always know more than we can say, and 
  • We will always say more than we can write down
Our assumptions

Let's make two assumptions here: firstly, that the percentages in the first list are correct; and secondly, that we equate the "more than" in Snowden's principle to "twice as much as". OK, the first assumption is highly dubious and the second is entirely arbitrary, but I want to see what the consequences are.


With these assumptions, the effectiveness of the Connect route (knowledge transfer through discussion) is as follows
  • I know (100%)
  • I say (50%) 
  • You learn through discussion (70%)
The effectiveness of transmission of knowledge through Connecting is therefore 35% (100% x 50% x 70%) provided there is discussion involved.

If you connect people through video (seeing) the effectiveness drops to 15%. Through hearing only (eg podcasts) it drops to 10%. The most effective way to transfer knowledge would be to work together, so the knowledge donor does not need to tell or write, they just have to show, while the knowledge receiver learns by experience. That way you minimise the filters.

The effectiveness of the Collect route for knowledge transfer through documents is as follows
  • I know (100%)
  • I write (50% x 50% = 25%)
  • You learn through reading (10%)
The effectiveness of transmission of knowledge through Collecting is therefore 2.5% (100% x 25% x 10%).

Transfer through discussion is 35% effective, transfer through documents is 2.5% effective. In the first case you can transfer a third of what you know, and in the second case you transfer one fortieth.

Therefore transferring knowledge through Collecting is 14 times less effective than transferring knowledge through Connecting people.

If we change the proportions in Snowden's principle then we change this conclusion. If for example 
we always know 3 times more than we can say, and we will always say 3 times more than we can write down, Collecting becomes 21 times less effective, and so on.
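The chain of filters is easy to parameterise, so you can test other assumptions for yourself. A sketch of the arithmetic, using the same (admittedly arbitrary) figures as above:

```python
def transfer_effectiveness(say_fraction: float, learn_fraction: float,
                           write_filter: float = 1.0) -> float:
    """Fraction of the donor's knowledge that the receiver absorbs.

    say_fraction   -- how much of what we know we can say (Snowden filter)
    learn_fraction -- retention rate for the medium (the "we learn..." list)
    write_filter   -- extra filter for writing (1.0 when no document involved)
    """
    return say_fraction * write_filter * learn_fraction

# Assumed figures from the post: we say half of what we know, write half of
# what we say, learn 70% of what we discuss and 10% of what we read.
connect = transfer_effectiveness(0.5, 0.70)        # discussion: 0.35
collect = transfer_effectiveness(0.5, 0.10, 0.5)   # documents:  0.025
print(round(connect / collect))                    # 14

# With Snowden's "more than" set to 3x instead of 2x:
connect_3x = transfer_effectiveness(1/3, 0.70)
collect_3x = transfer_effectiveness(1/3, 0.10, 1/3)
print(round(connect_3x / collect_3x))              # 21
```

Changing any of the input fractions changes the ratio, but under all plausible values the Connect route comes out an order of magnitude more effective.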

I know all these figures are arbitrary and inexact, but what we are looking at here is some sort of estimate of relative effectiveness.

Note that this does not mean that Collecting knowledge has no place in Knowledge Management - quite the opposite. Despite being very ineffective, it is very efficient. Knowledge has only to be documented once, to be re-used one thousand times. Efficiency can trump effectiveness. However we can conclude the following
  • Because of this difference in effectiveness, knowledge should be shared in explicit form (the Collect route) only when it is relatively simple and when it can be codified with minimum loss of context. 
  • Where efficiency is more important than effectiveness (i.e. broadcasting relatively straightforward knowledge to a large number of users), the Collect route is ideal.
  • The Collect route is also necessary when a Learner (a recipient for the knowledge) cannot be immediately identified, so no Connection is possible (see "speaking to the unknown user").
  • Even then, it is worth "keeping the names with the knowledge" so that readers who need to know more detail can call the originator of the knowledge and have a discussion.
  • Where knowledge is more complex or more contextual, it should be shared through discussion (the Connect route) - for example through conversational processes such as Peer Assist.

Given that transfer of knowledge through documents is so ineffective, choose your KM strategy carefully!

Friday, 15 November 2019

How to identify a knowledge "near miss"

In organisational safety management, they identify a "near miss" as evidence that safety practices need to be improved.  We can do the same in knowledge management.


Image from safety.af.mil
I have often used Safety Management as a useful analogue for KM, and here's another good crossover idea.

In safety management they identify safety breaches (accidents, injuries, "lost time incidents") as metrics and indicators that safety management needs to be improved.

They also track "near misses" - incidents where nobody was harmed, but only by luck, or "unplanned events that did not result in injury, illness or damage – but had the potential to do so". A hammer dropped from height and landing a few feet away from a worker on the ground, a bolt blown past someone's head by an escape of compressed gas, a near collision between two aircraft, all are examples of near misses indicating that safety management needs to be improved. 

In KM we can  track lost knowledge incidents, where time, money or effort was wasted because knowledge should have been available but "got lost" along the way. The knowledge is or was once available to the organisation, but failed to reach the person who needed to act upon it, with resulting cost to the organisation in terms of recovery cost, rework, lost sales, delay etc. If you are lucky you can quantify this cost as part of the Cost of Lost Knowledge, aka the Cost of Ignorance, and use this in your KM business case.

But we can also track Knowledge Near Misses. This is where the knowledge was not lost, and no cost was therefore incurred, but it was only found or transferred by lucky chance.

I heard a great example recently in a client organisation (and I paraphrase below).

The organisation was planning an activity. It seemed a little risky but quite doable, and there was management pressure to go ahead. They were discussing this activity in a meeting, and someone from another part of the business who happened to be in the meeting by chance (he was not invited to discuss this particular activity) spoke up and said "I was part of a team that tried this before. It was a complete disaster, and we are still recovering from the mess it created".

The lessons from this previous project had not been captured, they were not in the lessons database, and the project report was not findable but buried in a mass of project files on a hard drive somewhere. Had that person not by chance been at the meeting, the "complete disaster" would most likely have been repeated with resulting costs in manpower, money and reputation.

This was a knowledge near miss. This event did not result in cost to the organisation through lost knowledge, but had the potential to do so, and was only avoided through luck. With a proper KM framework in place, and followed by all staff in a systematic way, this knowledge would not have been lost, and the planned activity could have been assessed in the full light of historic lessons.

You can find another KM near miss story here

The knowledge near miss is a useful metric which provides evidence of the value of, and need for, effective KM.
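If you want to track the metric, a simple incident log that distinguishes lost-knowledge incidents from near misses is enough to start with. A minimal sketch - the field names and figures are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class KnowledgeIncident:
    description: str
    near_miss: bool      # True if the knowledge arrived only by luck
    estimated_cost: float  # cost avoided (near miss) or incurred (lost knowledge)

log = [
    KnowledgeIncident("Risky activity almost repeated; stopped by a chance attendee",
                      near_miss=True, estimated_cost=250_000.0),
    KnowledgeIncident("Supplier lesson not found; rework needed on contract",
                      near_miss=False, estimated_cost=40_000.0),
]

# Near misses feed the KM business case as cost narrowly avoided
near_misses = [i for i in log if i.near_miss]
print(len(near_misses), sum(i.estimated_cost for i in near_misses))
```

Summing the estimated costs across near misses gives an indicative figure for the Cost of Lost Knowledge that a proper KM framework would have removed from the realm of luck.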



Tuesday, 12 November 2019

Charts and Pilots - an illustration of tacit and explicit knowledge

If you are a competent ship's master, what types of knowledge do you need to be able to navigate on a new voyage to an unknown port?  You need two types - explicit and tacit, charts and pilots.


The charts, and the associated tide tables and weather forecasts are the explicit knowledge, whether these are paper charts or electronic. They record the unchanging features of coastlines, currents, buoys and lighthouses. With a good set of charts, any competent mariner can navigate any sea anywhere in the world, provided they have their own set of tacit knowledge - how to read a chart, how to determine position, how to plot a course allowing for wind and tide.

A chart is never complete - there are often uncharted hazards - but charts are constantly being improved, and it is now possible to order a set of charts which are updated as of the day of ordering.

But when it comes to entering the narrow congested waters of an unfamiliar foreign port, the ship master's tacit knowledge is not enough, even bolstered by the explicit knowledge in the charts and manuals. Here you need a Pilot.

A pilot is a sailor who maneuvers ships through dangerous or congested waters, such as harbors or river mouths. They are navigational experts possessing knowledge of the particular waterway such as its depth, currents, and hazards.

The economic and environmental risk from today's large cargo ships makes the role of the pilot essential, and most ports have compulsory pilotage for big ships. Pilotage is one of the oldest professions, as old as sea travel, and one of the most important in maritime safety.

Pilotage is an example of deep tacit knowledge which cannot safely be codified, and the master and pilot work together, combining their knowledge of the ship and knowledge of the harbour, to finally berth the ship and end the voyage.

Similarly your knowledge workers need access to two types of knowledge.

They need the documented guidance of the "broad areas" of their work, so they can be guided through most of the job. Then they need access (through communities of practice, or subject matter experts) to the deep knowledge of the most risky or complex tasks that can never be made explicit.

Once again, this takes us back to the dual nature of KM - Connect and Collect, Tacit and Explicit. These both need to be components of your Knowledge Management Framework, so that you can take your knowledge workers to the mouth of the harbour, and then guide them to safe anchorage.





Monday, 11 November 2019

Engaging leaders through a KM "proof of concept" demonstration at De Beers

The story here is taken from the book "Performance through Learning", and tells of a crucial "proof of concept" exercise at De Beers, the global diamond company, which was instrumental in demonstrating the value of Knowledge Management to senior management, and gaining their support and buy-in.


De Beers Headquarters, Johannesburg;
Image from wikimedia commons
 One of the earliest stages of the De Beers Knowledge Management strategy was to try some simple KM processes on some of the key activities or projects within the organization; to see if they worked, to see if they generated value, to come up with some early wins, and to create some success stories which could be used for marketing. 
Ian Corbett, the De Beers Knowledge Management lead, had already identified one or two possibilities, and more had come up during the strategy workshop. There was one very interesting and challenging possibility though, which would be a real test of the power of Knowledge Management; the !Gariep project. 
 !Gariep had been a blue-sky technology project for De Beers Marine. The De Beers Marine team had planned to build a piece of mining technology beyond anything that currently existed. The project had been an ambitious challenge, and many many learnings had been generated; so many learnings that the organisation had been unsure how to harvest them for reuse. Some of the key players were still in the organisation, others had left. 
Ian saw the possibility of using the Retrospect process as a powerful and non-confrontational way of harnessing some of that knowledge. 
The !Gariep retrospect took place over two days in Cape Town, involving 25 members of the project team, some with up to four years' history with the project. We divided the project into four stages, and spent half a day on each stage. With the benefit of hindsight, we can see that we invited too many managers and not enough "workers" to the retrospect, because many of the most valuable insights came from some of the more junior members of the project team. 
 However some very interesting and powerful lessons were captured, and we took the opportunity to record some video advice from the participants, as well as some feedback on the retrospect process itself. Although the lessons from !Gariep were extremely valuable, and have already been carried forward to future projects to great effect, those video clips were an even more powerful demonstration of the value of Knowledge Management. 
 Shortly after the retrospect had been completed, Ian attended a meeting of the senior management team of the De Beers group to talk to them about KM. He had recorded one of the engineers at the retrospect, a young credible and eloquent contributor with some excellent knowledge and advice to offer, and he embedded the video in his presentations. 
This was the turning point for some of the senior managers; it transformed the whole presentation and got them on the edge of their seats. It was a real-life, highly relevant demonstration of what KM could deliver within the De Beers context, and from a complex and high-profile project as well. And then, when the senior managers asked "and how did the participants feel about the Retrospect process?" Ian was able to show them a second video clip of enthusiastic feedback.

Turning the "proof of concept" into high level support


Ian later reported the following;

"The embedded video was the best way to market how powerful this approach is. I recorded one of the engineers talking. He is young, credible and eloquent, and I put his video in a presentation for the senior management team. I gave the talks, and I showed Steve, and it transformed the presentation and got people on the edge of their seat. This was the turning point for the Director of operations, who became the high-level sponsor for Knowledge Management in De Beers”

This approach for engaging key leaders can be replicated by any courageous knowledge manager.

  • Find a big business issue
  • Apply Knowledge Management as a "proof of concept" exercise to solve, address, or learn from that issue
  • Ask the people involved in the KM exercise to tell the story, in their own words, on video
  • Use that video to engage your senior managers

Friday, 8 November 2019

Why a Performance Culture drives KM

KM requires a learning culture, and motivation to learn comes from motivation to improve.  That's why KM thrives in a high-performance culture, where people are not content with existing performance, and actively seek new knowledge that will help them perform even better.

Image from eglin.af.mil
U.S. Air Force photo/Samuel King Jr

This was a topic of conversation in one of our recent Bird Island exercises where, as always, the teams made massive leaps in performance as their knowledge base increased.

The real value of this exercise comes in the debrief, when we analyse the success factors and relate them back to KM at work. One of the topics we analysed was Motivation and Target setting. What motivated the teams to set themselves aggressive performance targets, and what motivated them to use the knowledge from others, to deliver those targets?

The primary motivation factor was "to do a great job". 

They didn't set the target of beating the world record; they set the target of being "up there with the leaders" and delivering a great result. They wanted to be proud of their achievement - that it should stand well in the rankings. Of course, in order to set the target properly, they needed good benchmark data, and data from historical performance. We gave them that data, so they knew what to aim for, and they knew what was possible.

And of course, in order to reach the benchmark, they needed the entire knowledge base of how to complete the task, so they could reproduce the successes of previous teams, and ideally could innovate further.

That combination of knowledge of past performance and of how to achieve it allowed them to set targets that were two or three times better than the results achieved so far, and in each case, through reapplication of knowledge, they exceeded those targets and achieved top-quartile performance.

So how do teams set performance targets?


In the class, the teams set their own targets, based on knowing the benchmark, and knowing they had the knowledge to reach the benchmark.

Then I asked them how targets are set at work, and how motivating these are. You could hear the groan go round the room. "Targets are set from the centre" "Nobody buys into them" "Targets are a joke" "We set our targets, then define the metrics later so we can beat the target" "Targets are political". Such a contrast from the motivation in the exercise, where the motivation was internal and the targets were self-selected.

Knowledge and Performance are so closely linked, that we often say that if you don't manage performance, you can't manage knowledge.

If people are not motivated to perform and improve, how can they be motivated to learn? And if nobody wants to learn, then KM is a waste of time. Although many of the participants were from companies which already had KM programs, and were introducing the tools and techniques, they were really struggling with motivation issues.

Is it possible to motivate in the same way that we did during the exercise? Yes it is, and you can see that clearly in the Shell Technical Limit process I have blogged about before. Here teams have access to the performance data from the past, have access to all the knowledge they need, and work out for themselves how they are going to innovate beyond the best of what has been achieved to date (it is taken for granted that they can match the best historic performance; after all, that has been proven to be possible - that's "in the bag"). They set their own targets, and of course they are heavily motivated, through performance bonuses, when they beat the targets.

It was a very interesting discussion, contrasting the success in the exercise with the struggles at work, with motivation and target setting coming out as a key factor.

So when you are introducing KM, start first where the performance culture is well defined, and where people are already motivated to perform. 

If instead you choose a part of the business with no performance management, joke targets, and where people play political games with metrics, then you may very well struggle to develop a learning culture.

Thursday, 7 November 2019

How to find a KM analogue you can learn from

In Knoco, we very often have clients contact us and ask something like "do you have any examples of successful Knowledge Management in the Canadian farm machinery manufacturing business?" The answer to such a detailed question is almost always No. 


Although we have 20 years' experience working in Knowledge Management, with over 130 different clients, we have not yet covered every variant of every industry in every country.  However, the client is looking for similarities - for analogues - and maybe they don't have to narrow their field quite so closely.

The techniques of Knowledge management, and the details of the Knowledge Management framework, are largely independent of the content of the knowledge (just as  the techniques of Risk Management are largely independent of the nature of the risks).

However if you ARE a client in the Canadian farm machinery manufacturing business and you are looking for Knowledge Management analogues, then here are some of the things to consider.

Guidance for KM analogues.


Firstly there will be no perfect analogue - nothing you can copy and paste. Your Knowledge Management framework needs to be integrated with your business, and with the structures and systems you already have working for you. It needs to be tailored to fit, not bought off the peg.

Secondly, consider the nature of your business, and the nature of your important knowledge. Specifically, are you a Process company, a Product company, or a Customer company? Does your critical knowledge concern your processes, your product, or your customer base? The way KM addresses these three - the practices you use, the roles you develop - are different. Process based organisations such as the military and the oil sector approach KM one way, product based organisations such as the legal world and the aircraft manufacturers approach it a different way.

Thirdly, consider your size. Are you a one-office company? If so, you will manage knowledge in a very different way to a multinational. Communities of practice, for example, may be much less relevant to you, and much of your knowledge transfer should be face to face. If you operate a single hotel, or a single airport, then your KM needs will be very different from an owner of a hotel chain, or an operator of multiple airports, who will need to compare practices, and share knowledge, across multiple sites.

Fourthly, look at your demographics. Are you a company full of young but inexperienced people who need to learn together, and learn fast? Are you a company full of  ageing experienced staff, where knowledge retention is a big issue? Are you a start-up, needing to spread the knowledge currently held in the head of the founder?

Finally look at your national culture. Different nationalities have different approaches to elements such as leadership, tolerance of ambiguity, and individualism. Methods of Knowledge Management that work in Thailand may not work in the Netherlands, and vice versa. Cutting and Pasting from different national cultures is a risky business.
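When screening candidate analogues, the considerations above can be combined into a rough similarity score. A sketch using made-up dimension labels and equal weights (an arbitrary choice - in practice some dimensions will matter more than others):

```python
def analogue_score(a: dict, b: dict) -> float:
    """Rough similarity between two organisations for KM-analogue purposes.

    Compares the dimensions discussed above: business focus (process/product/
    customer), size, demographics and national culture. Equal weighting is an
    arbitrary simplification.
    """
    dims = ["focus", "size", "demographics", "culture"]
    matches = sum(1 for d in dims if a.get(d) == b.get(d))
    return matches / len(dims)

farm_machinery = {"focus": "product", "size": "multi-site",
                  "demographics": "ageing", "culture": "western"}
pump_maker = {"focus": "product", "size": "multi-site",
              "demographics": "ageing", "culture": "western"}
single_hotel = {"focus": "customer", "size": "single-site",
                "demographics": "young", "culture": "western"}

print(analogue_score(farm_machinery, pump_maker))   # 1.0 - a strong analogue
print(analogue_score(farm_machinery, single_hotel)) # 0.25 - a weak one
```

On this sketch, the Canadian farm machinery manufacturer and the Swedish pump manufacturer score as close analogues despite being in different industries and countries, which is exactly the point.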

Can you have KM analogues at all?


You most certainly can have KM analogues, within the limits described here. All western-based, project-focused multinationals manage knowledge in a similar way, for example; and that is a wide field to choose from (it is also the field that has published the most case studies).  Western engineering companies with ageing demographics are beginning to converge on similar strategies for Knowledge Retention. I expect similar patterns to emerge in Eastern organisations as well, through conferences such as KM Asia and KM Singapore.

Yes, you can have analogues.

Don't look in too narrow a field, but look at your industry type, your company size, your demographics and your national culture, and maybe a Canadian farm machinery manufacturing business can learn from the KM practices of a Swedish pump manufacturer, or a Swiss chemicals company.

And somewhere among our clients, we probably have an analogue for you.

