Biases

The invisible architecture of human judgment and decision-making

The first principle is that you must not fool yourself - and you are the easiest person to fool.

What human beings are best at is interpreting all new information so that their prior conclusions remain intact.

We see the world not as it is, but as we are - or as we are conditioned to see it.

The eye sees only what the mind is prepared to comprehend.

It is the peculiar and perpetual error of the human understanding to be more moved and excited by affirmatives than negatives.

Our perception of the world is shaped by our past experiences, fears, and desires.

The confirmation bias is the mother of all misconceptions. It is the tendency to search for, interpret, favor, and recall information that confirms or supports one's prior beliefs or values.

We are prone to overestimate how much we understand about the world and to underestimate the role of chance in events.

The most dangerous biases are the ones we don't know we have. They operate in the background, shaping our decisions without our awareness.

We judge others by their actions, but ourselves by our intentions. Closely related is the fundamental attribution error: we explain others' behavior by their character while discounting their circumstances, and this asymmetry distorts much of human interaction.

The availability heuristic makes us fear the wrong things. What we can easily recall seems more common and threatening than it actually is.

Anchoring drags our judgment toward arbitrary numbers. Once an anchor is set, subsequent judgments are made by adjusting away from it, and those adjustments are typically insufficient.

Survivorship bias causes us to overestimate our chances of success because we only see the winners, never the invisible failures.
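The statistical core of survivorship bias can be made concrete with a small simulation (the numbers and threshold here are hypothetical, chosen only for illustration): when we can observe only the ventures that cleared some success bar, the average outcome among those "survivors" badly overstates what a typical entrant should expect.

```python
import random
import statistics

# A minimal sketch: 10,000 hypothetical ventures with normally
# distributed returns (mean 0). We then pretend that only ventures
# returning more than 1.0 remain visible -- the "winners" we read about.
random.seed(42)
returns = [random.gauss(0.0, 1.0) for _ in range(10_000)]
survivors = [r for r in returns if r > 1.0]  # failures are invisible

print(f"true average return:       {statistics.mean(returns):.2f}")
print(f"survivors' average return: {statistics.mean(survivors):.2f}")
```

Because the sample is filtered on success before we ever see it, the survivors' average is guaranteed to exceed the true average; no amount of careful analysis of the visible winners can recover the odds faced by everyone who started.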

The sunk cost fallacy makes us throw good money after bad. We continue investing in losing propositions because we've already invested so much.
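The antidote to the sunk cost fallacy is a decision rule that looks only forward. A tiny sketch (with hypothetical dollar figures) shows the point: whether to continue a project depends on remaining cost versus expected payoff, and the amount already spent appears nowhere in the calculation.

```python
def should_continue(remaining_cost, expected_payoff):
    """Decide whether to keep investing in a project.

    Note what is deliberately absent: money already spent (the sunk
    cost) plays no role. Only future costs and benefits matter.
    """
    return expected_payoff > remaining_cost

# Hypothetical project: $900k already spent, $200k more needed,
# but the finished project is worth only $150k.
already_spent = 900_000  # irrelevant, however painful
print(should_continue(remaining_cost=200_000, expected_payoff=150_000))
```

The fallacy is precisely the urge to pass `already_spent` into the decision: the $900k is gone whether the project is finished or abandoned, so finishing it only converts a $900k loss into a $950k one.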

Hindsight bias makes the past seem inevitable. After an event has occurred, we see it as having been predictable, despite there having been little or no objective basis for predicting it.

Our brains are prediction machines that would rather be wrong than uncertain. We fill gaps in our knowledge with assumptions that feel like facts.

The Dunning-Kruger effect protects the incompetent from awareness of their incompetence. The skills needed to produce right answers are exactly the skills needed to recognize what a right answer is.

We suffer from optimism bias when planning our own projects, but from pessimism bias when evaluating others' plans.

Groupthink is the tendency to prioritize harmony and consensus over accurate analysis and critical evaluation.

The halo effect causes one trait to influence how we perceive other traits. A single positive quality can make us see everything else in a positive light.

We are blind to our blindness. We have very little idea of how little we know. We're not designed to know how little we know.

Recency bias gives recent events more importance than they deserve, while neglecting the long-term patterns that truly matter.

The curse of knowledge makes it hard to remember what it was like not to know something. This helps explain why experts are often poor teachers.