Monday 20 March 2017

Tacit Knowledge and cognitive bias

Is that really Tacit Knowledge in your head, or is it just the Stories you like to tell yourself?


Image: "IMAGINATION" by archanN on Wikimedia Commons
All Knowledge Managers know the difference between tacit knowledge and explicit knowledge, and between the undocumented knowledge you hold in your head and the documented knowledge that can be shared. We often assume that the "head knowledge" (whether tacit or explicit) is the Holy Grail of KM: richer, more nuanced, more contextual and more actionable than the documented knowledge.

However, the more I read about (and experience) cognitive bias and the failures of memory, the more suspicious I become of what we hold in our heads.

These biases and failures are tendencies to think in certain ways that can lead to systematic deviations from good judgement, and to remember (and forget) selectively and not always in accordance with reality. We all create, to a greater or lesser extent, our own internal "subjective social reality" from our selective and flawed perception and memory.

Cognitive and memory biases include:

  • Confirmation bias, which leads us to take on new "knowledge" only when it confirms what we already think
  • Gambler's fallacy, which leads us to think that the most recent events are the most important 
  • Post-investment rationalisation, which leads us to think that any costly decisions we made in the past must have been correct
  • Sunk-cost fallacy, which makes us more willing to pour money into failed big projects than into failed small projects
  • Observational selection bias, which leads us to think that things we notice are more common than they are (like when you buy a yellow car, and suddenly notice how common yellow cars are)
  • Attention bias, where there are some things we just don't notice (see the Gorilla Illusions)
  • Memory transience, which is the way we forget details very quickly, and then "fill them in" based on what we think should have happened
  • Misattribution, where we remember something but attribute it to the wrong source, time or person
  • Suggestibility, where suggestions from others become false memories of our own

So some of those things in your head that you "Know" may not be knowledge at all. Some may be opinions which you have reinforced selectively, or memories you have re-adjusted to fit what you would have liked to happen, or suggestions from elsewhere that feel like memories. Some of them may be more like a story you tell yourself, and less like knowledge.

Do these biases really affect tacit knowledge? 

Yes, they really do, and they can affect the decisions we make on the basis of that knowledge.  Chapter 10 of the 2015 World Development Report, for example, looks at cognitive biases among development professionals, and makes for interesting reading.

While you would expect experts in the World Bank to hold a reliable store of tacit knowledge about investment to alleviate poverty, in fact these experts are as prone to cognitive bias as the rest of us. Particularly telling, for me, was the graph comparing what the experts predicted poor people would think with the actual views of the poor themselves. 

The report identifies and examines four "decision traps" that affect development professionals and influence the judgements they make:

  • the use of shortcuts (heuristics) in the face of complexity; 
  • confirmation bias and motivated reasoning; 
  • sunk cost bias; and 
  • the effects of context and the social environment on group decision making.

And if the professionals of the World Bank are subject to such traps and biases, there is no reason to believe the rest of us are any different.

So what is the implication?

The implication of this study, and many others, is that one person's "tacit knowledge" may be unreliable, or at best a mish-mash of knowledge, opinion, bias and falsehood. As Knowledge Managers, we can do a number of things to counter this risk.

  1. We can test Individual Knowledge against the knowledge of the Community of Practice. The World Bank chapter suggests that "group deliberation among people who disagree but who have a common interest in the truth can harness confirmation bias to create “an efficient division of cognitive labor”. In these settings, people are motivated to produce the best argument for their own positions, as well as to critically evaluate the views of others. There is substantial laboratory evidence that groups make more consistent and rational decisions than individuals and are less “likely to be influenced by biases, cognitive limitations, and social considerations”. When asked to solve complex reasoning tasks, groups succeed 80 percent of the time, compared to 10 percent when individuals are asked to solve those tasks on their own. By contrast, efforts to debias people on an individual basis run up against several obstacles (and) when individuals are asked to read studies whose conclusions go against their own views, they find so many flaws and counterarguments that their initial attitudes are sometimes strengthened, not weakened". Therefore community processes such as Knowledge Exchange and Peer Assist can be ideal ways to counter individual biases.
  2. We can routinely test community knowledge against reality. Routine application of reflection processes such as After Action Review and Retrospect requires an organisation to continually ask "What was expected to happen?" versus "What actually happened?". With good enough facilitation, and careful management of the resulting lessons, reality becomes a constant self-correction mechanism against group and individual bias.
  3. We can bring in other viewpoints. Peer Assist, for example, can be an excellent corrective to group-think in project teams, bringing in others with potentially very different views. 
  4. We can combine individual memories to create a team memory. Team reflection, such as a Retrospect, is more powerful than individual reflection, as the team notices and remembers more than any individual can.
  5. We can codify knowledge. Poor as codified knowledge may be, it acts as an aide-mémoire, and counteracts the effects of transience, misattribution and suggestibility. 
But maybe the primary thing we can do is to stop seeing individual tacit knowledge as safe and reliable, and instead concentrate on the shared knowledge held within communities of practice.  

Think of knowledge as Collective rather than Individual, and you will be on the right track.

2 comments:

Unknown said...

Excellent blog article Nick - thoroughly enjoyed reading this.

Dries Velthuizen said...

Excellent article. Therefore the art of complementary reflection and at the least triangulation of different perspectives
