In any collaboration, people make judgments and decisions, but to what extent are we aware of the potential errors of judgment and choice our minds can put in our way? Recently, I wrote a post on cognitive biases, and I want to continue that theme here.
I’ve been reading Daniel Kahneman’s book Thinking, Fast and Slow, which I recommend to anyone. Its core message is that we are not as rational as we like to think we are: even when we have all the information we need and the logic is simple, we can still be very wrong. Let me begin by describing a core model in the book.
Kahneman divides cognition into two parts: System 1 and System 2. System 1 is the fast, intuitive, effortless part of the mind, often running on simple heuristics (rules of thumb); it runs on automatic pilot, with no sense of voluntary control. System 2, by contrast, is slow, controlled, and more deliberative. On seeing a face, we don’t need to expend a lot of mental energy in determining that it is an angry face (System 1). If we are asked what 17 x 24 is, System 2 is called upon; we need to expend mental energy and follow rules and procedures to calculate the answer (408). While System 1 can be very useful in situations requiring swift judgment and decision making (it is right a good deal of the time and often gets its way despite System 2), it is prone to mental traps, which Kahneman calls cognitive illusions. Let me describe a few of these illusions:
Affect heuristic: Making judgments and decisions guided directly by feelings of liking and disliking; even expert, professional intuitions do not all arise from true expertise.

Anchoring: When people consider a particular value for an unknown quantity before estimating that quantity, the estimates stay close to the number considered. For example, if asked whether Gandhi was more than 114 years old when he died, you will end up with a much higher estimate of his age at death than if the anchoring question had referred to death at 35.
Availability illusion: People tend to assess the relative importance of issues by the ease with which they are retrieved from memory (often put there by the media).
Focusing illusion: Any aspect of life to which attention is directed will loom large in a global evaluation.

Framing: Participants asked to imagine that they have been given $50 behave differently depending on whether they are told they can ‘keep’ $20 or must ‘lose’ $30, even though the outcome is the same ($50 minus $30 leaves the same $20). We dislike losses much more than we like gains of equivalent size; losses loom larger than gains.
Priming effect: Exposure to a word causes immediate and measurable changes in the ease with which many related words can be evoked. If you have recently heard the word EAT, you are more likely to complete the word fragment SO_P as SOUP; if you heard WASH, you are more likely to complete it as SOAP.

Substitution: When faced with a difficult question, we often answer an easier one, usually without noticing the substitution; that is, we often revert to a simplifying heuristic to try to solve a complex problem.
What you see is all there is: Making predictions based on what may be a momentary coincidence of random events; the exaggerated expectation of consistency is a common error. We are prone to think the world is more coherent and predictable than it is, and to downplay the role of chance. The bias toward coherence favors overconfidence.
There are many, many more. What are some lessons for us?

We are not as rational as we think we are. The mind is a system for jumping to conclusions, and we are prone to be far more confident in our conclusions than we should be. We need cognitive diversity in our collaborations: people who can challenge the coherent stories we make up about how the world works. We need people who can sometimes disrupt the automatic-pilot heuristics of System 1 and lead us into the more deliberate, effortful world of System 2. Basically, we need each other to help recognize and manage our illusions, and to be skeptical when we feel most confident.