Confirmation Bias is one of the most pernicious cognitive biases, and is a major challenge to Knowledge Management. See it in action below.
Confirmation bias is a powerful cognitive bias, which means that people:
- Tend to select evidence that supports what they already believe, and
- Set up tests that confirm their belief, rather than test it.
You can see how this would be a thorn in the side of KM. How do you know whether what you are dealing with is Real Knowledge, or Fake Knowledge - an opinion which has been reinforced through selective evidence and only confirmatory testing?
Below is a short video of a team exercise designed to expose confirmation bias, which is also an excellent example of confirmation bias in operation. I explore this further down the page.
In each of the five rounds of the game, the facilitator provided a set of names that fit a hidden rule; the participants then suggested other names, and estimated their confidence that they knew what the rule was.
| Round | Names provided | Names added (all deemed correct) | Confidence level |
|---|---|---|---|
| 1 | John Adams, Thomas Jefferson, George Washington | Alexander Hamilton, James Madison, Andrew Jackson, John Hancock | 76% |
| 2 | Abraham Lincoln | Ben Franklin, Ulysses S. Grant, Theodore Roosevelt, JFK | 53% |
| 3 | Martin Luther King | Columbus, Jesus, Nelson Mandela, Rosa Parks | 56% |
| 4 | Gandhi | Mother Teresa, Julius Caesar, Mohammed (PBUH), Saddam Hussein | 64% |
| 5 | Philip Seymour Hoffman | Golda Meir, Fidel Castro, Michael Jackson, Amy Winehouse | 76% |
For example, in round 1 the three names provided will be familiar to Americans as "Founding Fathers", or signatories to the Declaration of Independence. All the names suggested by the participants were also Founding Fathers, and the group was 76% sure that the rule was "Founding Fathers".
As the facilitator added more names, it became clear that these were not all founding fathers.
- Maybe (round 2) they were "Famous American political figures (male)"
- Maybe (rounds 3 and 4) they were "Famous political/religious figures (male or female)"
- Maybe (round 5) they were "Famous dead people"
However, notice one important thing:
All (or almost all) the suggested names were confirmatory. They conformed to the rule that the participants thought was in operation.
In no case did anyone suggest a name that tested the rule, only names that fitted the rule. Each suggested name, each so-called test of the rule, was already inside the set the group had defined. Nobody said "Donald Trump" (to test whether the person had to be dead), or "My granny" (to test whether the person had to be famous), or "Homer Simpson" (to test whether the person had to be real), or "Ming Ming the Panda" (to test whether the person had to be human).
The only example of a test I can see in this list, rather than a confirmation, is when someone suggested Rosa Parks, even though all other names to date had been male. This was a true test.
People prefer to confirm, rather than to test.
Also note how confident the group were in their first guess at the rule, back when the sample set was smallest and when they were most wrong. As new names were added, their confidence fell, then rose again. But maybe they are still wrong; maybe if we added Donald Trump, Homer Simpson and Ming Ming the Panda, these would be correct as well. Maybe the rule is "sentient beings, alive or dead."
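The group's failure mode can be sketched in code. In the toy example below (the name sets and the hidden rule are assumptions for illustration only), confirmatory probes drawn from inside the hypothesised set always pass and so can never distinguish "Founding Fathers" from a broader hidden rule; only a probe from outside the hypothesis can falsify it.

```python
# Illustrative sketch of the naming game. The hidden rule and the
# name sets here are assumptions, chosen to mirror the rounds above.
FOUNDING_FATHERS = {"John Adams", "Thomas Jefferson", "George Washington",
                    "Alexander Hamilton", "James Madison"}
FAMOUS_DEAD = FOUNDING_FATHERS | {"Abraham Lincoln", "Rosa Parks",
                                  "Golda Meir", "Michael Jackson"}

def fits_rule(name):
    """The facilitator's hidden rule (assumed): a famous dead person."""
    return name in FAMOUS_DEAD

# Confirmatory probing: only suggest names already inside the hypothesis.
# Every probe passes, confidence rises, and nothing is learned.
confirmatory = ["Alexander Hamilton", "James Madison"]
print(all(fits_rule(n) for n in confirmatory))  # True - but uninformative

# A real test: a name OUTSIDE the hypothesised set. If it also fits,
# the hypothesis "Founding Fathers" is falsified and must be broadened.
probe = "Michael Jackson"
hypothesis_falsified = fits_rule(probe) and probe not in FOUNDING_FATHERS
print(hypothesis_falsified)  # True - the rule is broader than we thought
```

The point of the sketch is that a passing confirmatory probe and a passing disconfirming probe carry very different amounts of information: only the second one changes what you know.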
The lessons for Knowledge Management are these:
- If everything seems to conform to what you "know", beware confirmation bias, especially when your sample set is small.
- Just because you are confident of what you know does not mean you are right.
- If you want to test whether your knowledge is correct, don't seek confirmatory examples; seek counter-confirmatory examples. Test, don't just confirm.
- The first valid counter-confirmatory example must result in a re-think of what you know.
- All of this is difficult; as humans we are programmed to seek confirmation, not to test theories.