Fact-checking is an unfortunate part of political life, but what is its role in Knowledge Management?
Image from Wikimedia Commons and IFLA
Sometimes the disputes over the veracity of statements turn into disputes between fact-checkers, as in this story from Forbes magazine:
"I was sitting in the Hong Kong airport and watching Chris Matthews interview Republican presidential candidate and U.S. Rep. Michelle Bachmann, of Minnesota, about a number of topics. Bachmann had been in the news that week because she had claimed that, "now we have the federal government … taking over ownership or control of 51 percent of the American economy." Matthews couldn't resist jumping on the statement, noting that MSNBC’s fact checkers couldn't come up with anything close to that number. Bachmann flippantly replied something like, and I’m paraphrasing, “Well, Chris, you have your fact checkers and I have mine. I think I’ll go with mine.”
Facts are important if people are to make correct decisions, whether in politics or in organisations, and it is in organisations that Knowledge Management comes in. If organisational knowledge is to be used to support effective decision-making, then that knowledge needs to be based on factual evidence.
Facts and KM
Although we might cynically anticipate the need to check facts in politics, where politicians have long bent facts to their own ends, things are different in organisations. The issue we need to deal with is not political bias, but cognitive bias. All of us are human, and all of us are prone to effects such as confirmation bias (which leads us to select observations that reinforce what we already think) and attention bias (where we often fail to notice things we are not looking for).
These biases mean that what we record in our knowledge bases and knowledge assets can be as much opinion as fact, or even prejudice and misconceptions.
So how do we deal in facts, when applying Knowledge Management within an organisation?
- Firstly, when creating knowledge assets, we address the issue of validation. Generally the community validates, either through discussion and dialogue, or through co-creation of material and co-editing of wikis or collaborative documents. As Peter Kemper said in my Lessons Learned Handbook, "A Wiki should be self correcting. The moment I write something wrong, people all over the company will notice and become alarmed".
- Secondly, in our customer-facing knowledge bases, the contact agents are constantly testing every article to see if it answers the customer's problem, and will flag, review or correct content which they find to be wrong. The watchword is "Re-use is Review" - knowledge is tested in re-use and corrected where wrong.
- In community of practice discussions, we again allow the community to self-validate. If someone posts an incorrect answer into a discussion forum, members of a mature and trusting community will quickly offer corrections. If there is disagreement, the community can talk this through.
- Then in After Action Reviews and Lessons Learned meetings, we discuss what actually happened, aiming firstly for ground truth and secondly for root cause analysis. Where people express opinion, we ask for the stories on which those opinions are based, and we look for the factual core of the story before we look for interpretation.
- The lesson learning system is also where the knowledge assets are tested and reviewed in use. Many times, the action associated with a lesson will be to correct, edit or update the relevant knowledge asset.
So we rely on analysing evidence at the source of knowledge creation, and on validation and self-correcting mechanisms within the wikis and knowledge bases.