Expert Political Judgment, by Philip Tetlock, a psychologist who teaches at the University of California, Berkeley, presents findings from a nearly 20-year project in which he asked 284 “pundits” (people who get paid large sums of money for offering opinions and advice on political and economic trends) to make predictions on a variety of issues. Tetlock also studied the pundits’ decision-making processes, assessed how they handled information that contradicted their views, and measured their reactions when their predictions failed to pan out.
When it comes to forecasting what the future holds, the so-called “experts” are no better than the rest of us. In fact, they often do worse, precisely because of their inflated opinions of their own expertise.
After reviewing more than 82,000 predictions from the study, Tetlock discovered that:
- The majority of pundits performed worse than random chance; fewer than half of their predictions actually came true.
- Specialists in a particular field are only slightly more accurate than non-specialists at predicting what will happen in their areas of expertise. A little knowledge in a specific area may make you a more accurate forecaster; a lot can actually make you less accurate.
- The greater the experts’ confidence in their own abilities, the less reliable their predictions.
- People who simply keep up with current events through the mainstream media can predict almost as accurately as specialists.
- The better known and the more frequently quoted the expert, the less reliable their predictions.
What I found most interesting was how the pundits responded when proven wrong.
Rather than acknowledge their errors, the experts in Tetlock’s study tended to do one of four things: claim that their prediction was basically correct but their timing was off; argue that they had been blindsided by an unexpected or improbable event; insist that they almost got it right; or grudgingly admit they got it wrong, but for the right reasons.
In other words, they used the same excuses and self-justifications that we less famous and lesser-paid mortals use. And they almost never changed their beliefs about the way the world works just because they happened to make a mistake.
It’s human nature to dismiss new information that doesn’t fit with our core attitudes, beliefs, and assumptions about how the world works. So it’s not surprising that the pundits would readily accept any evidence that supported their theories, while casting a highly critical eye on any that argued against them. After all, this is the same human tendency that causes conservatives to watch only Fox News while liberals flock to movies by Michael Moore. And it’s the same human tendency that causes us to make terrible blunders in our businesses.
The issue for business leaders, then, becomes one of identifying the pundits in our organizations and evaluating the influence they have on our decisions.
Who are the people — by virtue of their knowledge and expertise in a certain area, their positional power, their status among management or co-workers, or even their longevity with the company — whom we turn to first when trying to discern what the future holds for the business? And do we automatically accept their word as gospel, even though the evidence now suggests that we could predict future events just as well by randomly throwing darts at a board? Or do we challenge them to present real evidence to support their suppositions?
I’m not suggesting that we discount the knowledge and experience of the specialists and experts in our organizations. But we must accept that they are equally prone, perhaps even more so, to being blinded by their own attitudes and assumptions about how the world works. We need to pay closer attention to how we incorporate their ideas and opinions into our decision-making processes. And we need safeguards in place to ensure that their biases and overconfidence don’t lead us into decisions with devastating long-term consequences.
The bottom line from Tetlock’s study is that the experts we look to for opinions, advice and, yes, even a glimpse into the future, are just as vulnerable to human error as the rest of us. They cling just as tightly to what they “know” to be true. And, like us, they hate to be wrong. The difference is that their pronouncements tend to carry more weight.
I predict that we’ll get much better results if we keep this in mind when making decisions about the future of our organizations.