Wednesday 9 September 2009


more detail from the lessons survey


Here are some more details from the lessons learned survey results I mentioned yesterday. You can find the full results from the survey on the Knoco downloads page.


74 responses were received. The organisations represented fell into the following categories, with 11 respondents not identifying their organisation.

• Academic (1)
• Automotive (1)
• Aviation (2)
• Consulting and services (9)
• Engineering and construction (7)
• Insurance and banking (2)
• IT (4)
• Legal (2)
• Manufacturing and sales (5)
• Military (4)
• Mining (1)
• Oil and gas (10)
• Pharmaceutical (4)
• Public sector (7)

76% of respondents said that their organisation has a lessons learned system in place in at least one major part of their activity. A further 7% were in the process of introducing one. 6% had previously had a lessons learned system but had stopped, while 11% had no system. Lessons learned systems seem to be most common in oil and gas, the military, and engineering and construction, but the numbers are too small to be sure.


The respondents were asked which part of their business applied lessons learned. Answers are listed below. At least half of the respondents apply lessons learned within the project context.

• Project management (24 responses)
• All activity (7 responses)
• Software deployment and release (4 responses)
• Bidding and pitching (3 responses)
• Industrial safety occurrences (3 responses)
• Research and development (2 responses)
• Operations (2 responses)
• Other (11 responses)

The survey asked those respondents who had, or were introducing, a lessons learned system to rate its effectiveness, using a rating from 0 (not at all effective) to 5 (excellent). The responses are shown in the graph.

• 6% rated the system as 0 (not at all effective)
• 6% rated the system as 1 (slightly effective)
• 48% rated the system as 2 (moderately effective)
• 15% rated the system as 3 (good)
• 18% rated the system as 4 (very good)
• 6% rated the system as 5 (excellent)



Participants who scored highly (3, 4 or 5) were asked to identify the success factors behind the high score. Responses were very varied, with no real consistency, suggesting that many factors are needed.

Participants who scored low (0, 1 or 2) were asked to identify the barriers that resulted in the low score. Responses are shown below, and several common factors can be identified.

• Senior management (11 responses)
• Culture (10 responses)
• Lack of follow through and application (15 responses)
• Time issues (4 responses)
• Other barriers (13 responses)

The respondents were given a list of components of a lessons learned system, and asked to identify whether they applied these components (see graph to the right). The most common was the use of a defined process for identifying lessons from activity: 46 of the respondents (80% of those with a lessons learned system) had such a process. The least common was the use of rewards to incentivise lessons submission.


The figure to the right shows how each of these components correlates with the effectiveness score for the lessons learned system. For each component, the blue bar represents the average effectiveness of those systems with that component, and the red bar the average effectiveness of those systems without it. For example, those lessons learned systems which include the definition of actions arising from the lessons (the top component in figure 6) score nearly 3 on average, while those which do not include the definition of actions score less than 2. We can therefore infer that, on average, defining actions helps make lessons learning more effective.

Therefore all components where the blue bar is longer than the red bar are likely, from the data provided by the respondents, to make a positive contribution to lessons learning. The greater the difference in length, the more positive the contribution.
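The with/without comparison behind those blue and red bars is simple arithmetic; a minimal sketch is below, using hypothetical respondent data and component names invented for illustration (the survey's actual responses are not reproduced here).

```python
# Sketch of the with/without-component comparison described above.
# The data and component names are hypothetical, for illustration only.
# Each respondent has an effectiveness score (0-5) and the set of
# lessons-learned components their system uses.

responses = [
    {"score": 4, "components": {"actions_defined", "accountable_person"}},
    {"score": 3, "components": {"actions_defined"}},
    {"score": 2, "components": set()},
    {"score": 1, "components": {"rewards"}},
]

def average_score(component, present):
    """Average effectiveness of systems that do (present=True)
    or do not (present=False) include a given component."""
    scores = [r["score"] for r in responses
              if (component in r["components"]) == present]
    return sum(scores) / len(scores) if scores else None

for component in ("actions_defined", "rewards"):
    with_avg = average_score(component, True)      # the "blue bar"
    without_avg = average_score(component, False)  # the "red bar"
    print(f"{component}: with={with_avg}, without={without_avg}")
```

A component whose "with" average exceeds its "without" average would plot with a longer blue bar than red bar, which is the pattern the survey reads as a positive contribution.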

In general terms, all except two components seem to make a positive contribution to lessons learning. However, we can (somewhat arbitrarily) group them as follows:

Strong positive contribution
• Actions defined arising from the lessons
• Clear high level expectations from senior management that the lessons learned process will be applied
• A method to measure whether actions have been completed and lessons closed out
• A process for validating/agreeing the actions
• Accountable person/people assigned to complete the actions
• A defined process for identifying lessons from activity

Moderate positive contribution
• A person or people to track the metrics
• An escalation method if the lesson or action needs to be addressed at a higher level
• A clear accountability for identifying lessons from activity
• A high level sponsor of the lessons learned process
• Quality assurance of this process (eg trained facilitation)
• A method for disseminating the lessons
• A lessons learned database which can hold lessons from multiple projects or units

Fairly neutral
• Quality control of the lessons to ensure they are well written
• A method to measure whether lessons have been captured
• A search function within the lesson database

Strong negative contribution
• Rewards to incentivise submission of lessons

Respondents were asked which components were missing from this list. There were a variety of answers but no common factors.

Respondents were asked to list the methods they use to identify lessons. These are grouped below. Many respondents identified more than one method.

• After action reviews (17 responses)
• Other project-related review (28 responses)
• Learning from incidents and events (5 responses)
• Individual (ad hoc) submission (7 responses)
• Other (9 responses)

You can find the full results from the survey on the Knoco downloads page, and the conclusions in yesterday's blog post.
