Cognitive Biases
Cognitive biases are systematic errors in human judgement — patterns of deviation from rationality that recur predictably across people and contexts. They are not random noise; they are directional and repeatable, which means they can be anticipated, named, and partially corrected.
Primary source: kahneman-2011-thinking-fast-and-slow (Kahneman, 2011).
Origin: The Heuristics-and-Biases Programme
daniel-kahneman and amos-tversky launched this research programme at Hebrew University in the early 1970s. Their founding insight: people use heuristics (efficient cognitive shortcuts) that work well on average but produce systematic biases in specific conditions. Earlier models assumed roughly rational agents; Kahneman and Tversky documented the precise, reproducible ways rationality fails.
Taxonomy of Major Biases
Biases from the Representativeness Heuristic
Judging probability by resemblance to a prototype or narrative:
| Bias | Description |
|---|---|
| Base-rate neglect | Ignoring prior probabilities when a vivid case description is available (the Steve-the-librarian example: ~20× more male farmers than male librarians in the US, yet “meek and tidy” makes people say librarian) |
| Conjunction fallacy | Judging “Linda is a feminist bank teller” as more probable than “Linda is a bank teller” because the conjunction is more representative |
| Stereotyping | Assigning individual probability based on group resemblance, ignoring individual variation |
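Base-rate neglect is easy to check with Bayes' rule in odds form. The sketch below uses the note's ~20:1 farmer-to-librarian ratio; the 4:1 likelihood ratio for the "meek and tidy" description is an assumed illustrative number, not from the source:

```python
# Base-rate neglect: a quick Bayes check on the Steve-the-librarian example.
# Assumed numbers: 20 male farmers per male librarian (from the note), and a
# guessed 4:1 likelihood ratio that a "meek and tidy" description fits a
# librarian rather than a farmer.

def posterior_odds(prior_odds, likelihood_ratio):
    """Bayes' rule in odds form: posterior odds = prior odds x likelihood ratio."""
    return prior_odds * likelihood_ratio

odds = posterior_odds(1 / 20, 4)      # prior 1:20, description 4:1 for librarian
p_librarian = odds / (1 + odds)       # convert odds to probability

print(f"posterior odds librarian:farmer = {odds:.2f}")       # 0.20
print(f"P(librarian | description)      = {p_librarian:.2f}")  # ~0.17
```

Even a description four times as diagnostic as chance leaves "farmer" the far better bet; the vivid description feels decisive only because the base rate is ignored.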
Biases from the availability-heuristic
Estimating frequency by ease of recall:
| Bias | Description |
|---|---|
| Availability cascade | A story that spreads widely becomes perceived as common/probable, independent of actual base rates |
| Affect heuristic | Emotional salience (fear, disgust) inflates perceived frequency of vivid risks (plane crashes, shark attacks) vs. dull risks (car accidents, heart disease) |
anchoring-bias
Initial numbers influence final estimates even when the anchor is arbitrary (a rigged wheel-of-fortune number shifts estimates of the percentage of African member states in the UN). Adjustment away from the anchor is typically insufficient.
Overconfidence Biases
| Bias | Description |
|---|---|
| Illusion of validity | Subjective confidence in predictions exceeds their empirical accuracy (stock pickers, interviewers) |
| Planning fallacy | Projects finish later and over budget because inside-view planning ignores the reference class (the base rate of similar projects) |
| Hindsight bias | After the fact, past events feel inevitable; inflates confidence in future predictions |
Biases from prospect-theory
| Bias | Description |
|---|---|
| loss-aversion | Losses weigh ~2× gains psychologically; drives extreme risk-aversion for losses, risk-seeking to avoid locking in a loss |
| framing-effect | “200 lives saved” vs. “400 lives lost” from the same policy produce opposite preferences |
| Sunk cost fallacy | Prior investments (sunk costs) influence continuation decisions even though only future costs/benefits are relevant |
| Endowment effect | Owning something makes it feel more valuable; people demand more to give something up than they would pay to acquire it |
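The asymmetry in the table can be sketched with the standard prospect-theory value function. The parameter values (α = β = 0.88, λ = 2.25) are Tversky and Kahneman's published median estimates, used here only for illustration:

```python
# Prospect-theory value function: concave for gains, convex and steeper
# (by the loss-aversion factor lam) for losses, relative to a reference point.

def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Subjective value of a gain or loss x of money."""
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** beta)

# A fair coin flip: win $100 or lose $100. Expected money value is zero,
# but the felt value is negative because the loss looms larger than the gain.
felt = 0.5 * value(100) + 0.5 * value(-100)
print(f"felt value of the 50/50 gamble: {felt:.1f}")  # negative
```

This is why people reject fair gambles (loss-aversion) and why the same outcome framed as a loss rather than a forgone gain flips preferences.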
Narrative and Coherence Biases
| Bias | Description |
|---|---|
| Narrative fallacy | People prefer coherent causal stories to accurate probabilistic accounts |
| WYSIATI (What You See Is All There Is) | System 1 builds high-confidence conclusions from incomplete evidence, ignoring what is absent |
| Halo effect | A positive impression in one dimension generalises to all dimensions (attractive people seem more competent) |
| Regression neglect | Inventing causal stories for outcomes that are simply statistical regression toward the mean |
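Regression neglect, in particular, can be seen in a toy simulation (illustrative, not from the source): model performance as stable skill plus luck, select the top performers on day 1, and their day-2 average falls back toward the mean with no causal story required.

```python
# Regression toward the mean: performance = skill + noise.
# The day-1 top decile was selected partly for lucky noise, so its
# day-2 average regresses toward the population mean (0).
import random

random.seed(0)  # fixed seed so the sketch is reproducible
N = 10_000
skill = [random.gauss(0, 1) for _ in range(N)]
day1 = [s + random.gauss(0, 1) for s in skill]
day2 = [s + random.gauss(0, 1) for s in skill]

cutoff = sorted(day1)[int(0.9 * N)]            # day-1 top-decile threshold
top = [i for i in range(N) if day1[i] >= cutoff]
avg1 = sum(day1[i] for i in top) / len(top)
avg2 = sum(day2[i] for i in top) / len(top)

print(f"day-1 average of top decile: {avg1:.2f}")
print(f"day-2 average of same group: {avg2:.2f}")  # closer to 0
```

Any causal explanation offered for the day-2 "decline" (complacency, pressure, a coach's praise) is pure narrative; the selection itself guarantees it.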
Why Biases Persist
- Evolutionary origin: most heuristics are adaptive in natural environments; biases occur when applied to statistical or financial contexts the mind did not evolve for.
- Immediate vs. delayed feedback: System 1 learns from rapid feedback (social cues, physical danger); it does not self-correct in domains with slow or absent feedback (financial markets, policy effects).
- Emotional charge: System 1 tags events with emotional intensity, not probability; vivid events feel probable.
Correctives
| Problem | Corrective |
|---|---|
| Base-rate neglect | Force consideration of reference class before making predictions |
| Overconfidence | Pre-mortems; track record analysis; reference-class forecasting |
| Anchoring | Generate your own estimate before seeing any anchor number |
| Loss-aversion | Reframe decisions as policies (repeated choices) rather than one-off gambles |
| Planning fallacy | Use outside view: how long did similar projects actually take? |
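The last corrective, reference-class forecasting, can be sketched in a few lines. The sample overrun ratios below are made-up illustration data; the idea is simply to scale an inside-view plan by what similar projects actually did:

```python
# Reference-class forecasting (outside view), sketched:
# forecast from the actual outcomes of similar past projects rather
# than from an inside-view plan. Sample data is hypothetical.

past_overruns = [1.3, 1.6, 1.1, 2.0, 1.4, 1.8, 1.2]  # actual/planned duration

def outside_view(inside_estimate_weeks, reference_overruns):
    """Scale an inside-view estimate by the median overrun of the reference class."""
    s = sorted(reference_overruns)
    n = len(s)
    median = s[n // 2] if n % 2 else (s[n // 2 - 1] + s[n // 2]) / 2
    return inside_estimate_weeks * median

print(outside_view(10, past_overruns))  # ~14.0 (median overrun 1.4)
```

The inside view asks "what are the steps of *this* project?"; the outside view asks "how did projects *like this one* turn out?" and treats the plan as one more member of that class.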
Related Concepts
- system-1-system-2 — the cognitive architecture that produces biases
- heuristics — the mental shortcuts that generate biases as side effects
- prospect-theory — formal theory of loss-aversion and framing
- loss-aversion — the most consequential single bias
- availability-heuristic — bias from ease-of-recall
- anchoring-bias — insufficient adjustment from initial numbers
- overconfidence — systematic over-estimation of one’s predictive accuracy