Cognitive Biases
Reprinted from: https://yourbias.is
See also: https://en.wikipedia.org/wiki/List_of_cognitive_biases
1. Anchoring
The first thing you judge influences your judgment of all that follows.
Human minds are associative in nature, so the order in which we receive information helps determine the course of our judgments and perceptions. For instance, the first price offered for a used car sets an ‘anchor’ price which will influence how reasonable or unreasonable a counter-offer might seem. Even if we feel like an initial price is far too high, it can make a slightly less-than-reasonable offer seem entirely reasonable in contrast to the anchor price.
Be especially mindful of this bias during financial negotiations over houses, cars, and salaries. The initial price offered has been shown to have a significant effect.
2. The sunk cost fallacy
You irrationally cling to things that have already cost you something.
When we've invested our time, money, or emotion into something, it hurts us to let it go. This aversion to pain can distort our better judgment and cause us to make unwise investments. A sunk cost is, by definition, unrecoverable, so it's rational to disregard it when evaluating our options. For instance, if you've spent money on a meal but you only feel like eating half of it, it's irrational to continue to stuff your face just because 'you've already paid for it'; especially considering the fact that you're wasting actual time doing so.
To regain objectivity, ask yourself: had I not already invested something, would I still do so now? What would I counsel a friend to do if they were in the same situation?
3. The availability heuristic
Your judgments are influenced by what springs most easily to mind.
How recent, emotionally powerful, or unusual your memories are can make them seem more relevant. This, in turn, can cause you to apply them too readily. For instance, when we see news reports about homicides, child abductions, and other terrible crimes, it can make us believe that these events are much more common and threatening to us than is actually the case.
Try to gain different perspectives and relevant statistical information rather than relying purely on first judgments and emotive influences.
4. The curse of knowledge
Once you understand something you presume it to be obvious to everyone.
Things make sense once they make sense, so it can be hard to remember why they didn't. We build complex networks of understanding and forget how intricate the path to our available knowledge really is. This bias is closely related to the hindsight bias, wherein you will tend to believe that an event was predictable all along once it has occurred. We have difficulty reconstructing our own prior mental states of confusion and ignorance once we have clear knowledge.
When teaching someone something new, go slow and explain like they're ten years old (without being patronising). Repeat key points and facilitate active practice to help embed knowledge.
5. Confirmation bias
You favour things that confirm your existing beliefs.
We are primed to see and agree with ideas that fit our preconceptions, and to ignore and dismiss information that conflicts with them. You could say that this is the mother of all biases, as it affects so much of our thinking through motivated reasoning. To help counteract its influence we ought to presume ourselves wrong until proven right.
Think of your ideas and beliefs as software you're actively trying to find problems with rather than things to be defended. "The first principle is that you must not fool yourself – and you are the easiest person to fool." - Richard Feynman
6. The Dunning-Kruger effect
The more you know, the less confident you're likely to be.
Because experts know just how much they don't know, they tend to underestimate their ability; but it's easy to be over-confident when you have only a simple idea of how things are. Try not to mistake the cautiousness of experts for a lack of understanding, nor to give much credence to lay-people who appear confident but have only superficial knowledge.
“The whole problem with the world is that fools and fanatics are so certain of themselves, yet wiser people so full of doubts.” - Bertrand Russell
7. Belief bias
If a conclusion supports your existing beliefs, you'll rationalise anything that supports it.
It's difficult for us to set aside our existing beliefs to consider the true merits of an argument. In practice this means that our ideas become impervious to criticism, and are perpetually reinforced. Instead of thinking about our beliefs in terms of 'true or false', it's probably better to think of them in terms of probability. For example, we might assign a 95%+ chance that thinking in terms of probability will help us think better, and a less than 1% chance that our existing beliefs have no room for any doubt. Thinking probabilistically forces us to evaluate more rationally.
A useful thing to ask is 'when and how did I get this belief?' We tend to automatically defend our ideas without ever really questioning them.
8. Self-serving bias
You believe your failures are due to external factors, yet you're responsible for your successes.
Many of us enjoy unearned privileges, luck and advantages that others do not. It's easy to tell ourselves that we deserve these things, whilst blaming circumstance when things don't go our way. Our desire to protect and exalt our own egos is a powerful force in our psychology. Fostering humility can help countermand this tendency, whilst also making us nicer humans.
When judging others, be mindful of how this bias interacts with the just-world hypothesis, fundamental attribution error, and the in-group bias.
9. The backfire effect
When some aspect of your core beliefs is challenged, it can cause you to believe even more strongly.
We can experience being wrong about some ideas as an attack upon our very selves, or our tribal identity. This can lead to motivated reasoning, which causes a reinforcement of beliefs despite disconfirming evidence. Recent research shows that the backfire effect certainly doesn't happen all the time. Most people will accept a correction relating to specific facts; however, the backfire effect may reinforce a related or 'parent' belief as people attempt to reconcile the new information with their broader understanding.
“It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.” - Mark Twain
10. The Barnum effect
You see personal specifics in vague statements by filling in the gaps.
Because our minds are given to making connections, it's easy for us to take nebulous statements and find ways to interpret them so that they seem specific and personal. The combination of our egos wanting validation with our strong inclination to see patterns and connections means that when someone is telling us a story about ourselves, we look to find the signal and ignore all the noise.
Psychics, astrologers and others use this bias to make it seem like they're telling you something relevant. Consider how things might be interpreted to apply to anyone, not just you.
11. Groupthink
You let the social dynamics of a group situation override the best outcomes.
Dissent can be uncomfortable and dangerous to one's social standing, and so often the most confident or first voice will determine group decisions. Because of the Dunning-Kruger effect, the most confident voices are also often the most ignorant.
Rather than openly contradicting others, seek to facilitate objective means of evaluation and critical thinking practices as a group activity.
12. Negativity bias
You allow negative things to disproportionately influence your thinking.
The pain of loss and hurt is felt more keenly and persistently than the fleeting gratification of pleasant things. We are primed for survival, and our aversion to pain can distort our judgment in the modern world. In an evolutionary context it makes sense for us to be heavily biased to avoid threats, but because this bias affects our judgments in other ways, it means we aren't giving enough weight to the positives.
Pro-and-con lists, as well as thinking in terms of probabilities, can help you evaluate things more objectively than relying on a cognitive impression.