Our minds don’t work as well as we think they do.
Don’t get me wrong: our minds are incredible. Every day we make hundreds of decisions, process enormous amounts of sensory information, and perform complex analyses and calculations. It is easy to be so amazed by the accomplishments of our grey matter that we overlook the many shortcuts and mistakes we make along the way.
But one of the ways the mind is able to do as much as it does is by prioritizing speed over accuracy. Our mind looks for easy-to-answer questions, easy-to-recognize patterns, and fills in the gaps where necessary. We jump to conclusions, make hasty decisions without considering all the facts, and are constantly led astray by irrelevant pieces of information.
If you think your mind is working just fine, and that what I’m saying does not apply to you, then you might be interested in reading Daniel Kahneman’s new book, “Thinking, Fast and Slow.” Kahneman has spent his career studying the mental errors we all make. Here are just a few examples:
- Duration Neglect: we tend to ignore how long an experience lasted when assessing it in retrospect.
- Peak-End Rule: our evaluations of past experiences are heavily biased by the peak moments and how the experience ended.
- Anchoring effect: our answers to questions are swayed by whatever “anchor” we happen to have in our heads (even if it has no relation whatsoever to the question at hand).
- Availability heuristic: we tend to overestimate the probability or importance of things that are easily called to mind. (This is often driven by media reporting of sensational events, such as airplane crashes and shark attacks.)
- Planning fallacy: we tend to overestimate benefits and underestimate costs of future projects.
The short list above is by no means complete. There is also “base-rate neglect”, the “representativeness heuristic”, the “illusion of validity”, “loss aversion”, and many more of the mental shortcuts that Kahneman has found, which sometimes get us into trouble.
According to Kahneman, the mind has two systems, and neither of them works as well as we would hope. System 1 is designed to give quick answers. It tends to jump to conclusions, looking for a quick intuitive answer to questions and ignoring relevant information if it isn’t readily available.
System 2 is more analytical, and will actually take the time to do a more thorough investigation before coming up with an answer. But System 2 is lazy. If System 1 has an appealing answer, System 2 is likely to sit back and accept it, even if it has not been carefully thought out. System 2 will usually work with whatever information it has readily at hand, sometimes failing to do the work required to fill in important gaps. It will search for an easier question to answer, and mistakenly substitute that answer for the harder question that really needs to be solved. And even when System 2 does fully engage, it makes plenty of mistakes, attaching meaning to things it shouldn’t or being pulled in different directions by situational factors prone to induce biases.
Don’t be surprised if you are still thinking that your mind is above all of these inconsistencies. One of the biggest mistakes the mind makes is in failing to recognize its own shortcomings. While our thinking is muddled with biases, false conclusions and inaccurate assumptions, we feel like we are carefully thinking things through.
A clear example of this is the “Dunning-Kruger effect,” named for the researchers who found that a large majority of people, including the least skilled among us, rate themselves as above average across a wide variety of domains. “As a result, such people remain unaware of their incompetence and accordingly fail to take any self-improvement measures that might rid them of their incompetence,” says social psychologist Daniel Hawes. This is easily observed in the first weeks of any season of American Idol, when some of the worst singers in the world are shocked and amazed to be rejected by the judges.
So the challenge becomes: how do we recognize or protect ourselves from these mistakes if we don’t even realize we are making them? Unfortunately, this is easier said than done. If you want to know what it feels like to be wrong, I would suggest listening to the enlightening TED Talk “On Being Wrong” by “wrongologist” Kathryn Schulz. It turns out there is no internal signal we can rely on: “being wrong feels just like being right.”
References and recommended reading:
Kahneman, D. (2011). Thinking, Fast and Slow. Farrar, Straus and Giroux.
by Jeremy McCarthy