NASA found out the hard way that just collecting lessons into a database is not enough.
Image from Wikimedia Commons
According to the levels of use and contribution found by the auditors five years ago, there was plenty of room for improvement in lesson-learning. Specifically:
"We found that NASA program and project managers rarely consult or contribute to LLIS even though they are directed to by NASA requirements and guidance.
In fact, input to LLIS by most Centers has been minimal for several years. Specifically, other than the Jet Propulsion Laboratory (JPL), no NASA Center consistently contributed information to LLIS for the 6-year period from 2005 through 2010.
For example, the Glenn Research Center and the Johnson Space Center contributed an average of one lesson per year compared to the nearly 12 per year contributed by JPL .....
Taken together, the lack of consistent input and usage has led to the marginalization of LLIS as a useful tool for project managers."
With minimal contributions (other than at JPL), and with rare consultation, the system was simply not working.
Why did it not work?
The project managers who were surveyed offered a variety of reasons for not using or contributing to LLIS, including:
- A belief that LLIS is outdated and not user-friendly
- A belief that LLIS does not contain information relevant to their project
- Competing demands on their time in managing their respective projects
- Policy requirements that have been weakened over time
- Inconsistent policy direction and implementation
- Lack of monitoring
It is interesting that three of these six reasons are directly related to governance. One wonders whether, even if a brand-new LLIS were introduced, anyone would bother to use it without better governance.
The auditors suggested a number of improvements, covering process, policy and resources, but one of the main issues is that a lessons database is itself a clumsy solution. Lessons should not be stored in an ever-accumulating database - lessons need to be embedded into design, into principles and into process.
Levels of lesson learning
I described, earlier this year, three levels of lesson learning, and the approach reviewed by the auditors is Level 1 - reactive capture of lessons in the hope that others will review them and learn from them.
Ideally, any organisation should aim for Level 2, where lessons lead to changes in designs, practices or procedures. A lesson is therefore an increment of knowledge, and those little increments are used to build an ever-improving body of knowledge. Once a lesson has been embedded as a practice change, a design-principle change, or a change to a checklist, it can be removed from the database.
Ideally the NASA database would be empty - all lessons would have been incorporated into a body of knowledge somewhere. The only lessons in the system would be those pending incorporation.
If this system works well and quickly, then there should be no need for anyone to consult a lessons database - instead they should go to the designs, the checklists, and the design principles.
1 comment:
I liked this article because of my experience with 'Lessons Learned' and the almost identical result from gathering them. The people who should be reviewing them never do; the lessons just fall into a black hole (like the NASA lessons) and are never seen again. You need to take the lessons, break them down into categories relevant to the skill groups, and actively shove them up their noses! Otherwise just bin them from the beginning and write off the cost; sad but true.