The fourth-quarter comeback to win the game. The tumor that appeared on a second scan. The guy in accounting who was secretly embezzling company funds. The situation may be different each time, but we hear ourselves say it over and over again: “I knew it all along.”
The problem is that too often we didn’t actually know it all along; we only feel as though we did. The phenomenon, which researchers refer to as “hindsight bias,” is one of the most widely studied decision traps and has been documented in various domains, including medical diagnosis, accounting and auditing decisions, athletic competition, and political strategy.
In a new article in the September 2012 issue of Perspectives on Psychological Science, a journal of the Association for Psychological Science, psychological scientists Neal Roese of the Kellogg School of Management at Northwestern University and Kathleen Vohs of the Carlson School of Management at the University of Minnesota review the existing research on hindsight bias, exploring the various factors that make us so susceptible to the phenomenon and identifying a few ways we might be able to combat it. The article is the first overview to draw together insights from across these different disciplines.
Roese and Vohs propose that there are three levels of hindsight bias that stack on top of each other, from basic memory processes up to higher-level inference and belief. The first level of hindsight bias, memory distortion, involves misremembering an earlier opinion or judgment (“I said it would happen”). The second level, inevitability, centers on our belief that the event was inevitable (“It had to happen”). And the third level, foreseeability, involves the belief that we personally could have foreseen the event (“I knew it would happen”).
The researchers argue that certain factors fuel our tendency toward hindsight bias. Research shows that we selectively recall information that confirms what we now know to be true, and that we try to create a narrative that makes sense of the information we have. When this narrative is easy to generate, we interpret that ease to mean that the outcome must have been foreseeable. Furthermore, research suggests that we have a need for closure that motivates us to see the world as orderly and predictable, and to do whatever we can to promote a positive view of ourselves.
Ultimately, hindsight bias matters because it gets in the way of learning from our experiences.
“If you feel like you knew it all along, it means you won’t stop to examine why something really happened,” observes Roese. “It’s often hard to convince seasoned decision makers that they might fall prey to hindsight bias.”
Hindsight bias can also make us overconfident in how certain we are about our own judgments. Research has shown, for example, that overconfident entrepreneurs are more likely to take on risky, ill-informed ventures that fail to produce a significant return on investment.
While our inclination to believe that we “knew it all along” is often harmless, it can have important consequences for the legal system, especially in cases of negligence, product liability, and medical malpractice. Studies have shown, for example, that hindsight bias routinely afflicts judgments about a defendant’s past conduct.
And technology may make matters worse. “Paradoxically, the technology that provides us with simplified ways of understanding complex patterns – from financial modeling of mortgage foreclosures to tracking the flow of communications among terrorist networks – may actually increase hindsight bias,” says Roese.
So what, if anything, can we do about it?
Roese and Vohs suggest that considering the opposite may be an effective way to sidestep this cognitive flaw, at least in some cases. When we are encouraged to consider and explain how outcomes that didn’t happen could have happened, we counteract our usual inclination to throw out information that doesn’t fit with our narrative. As a result, we may be able to reach a more nuanced perspective on the causal chain of events.