(This is the first of four condensed excerpts from Daniel Kahneman’s new book, “Thinking, Fast and Slow.”)
Oct. 25 (Bloomberg) -- Most of us view the world as more benign than it really is, our own attributes as more favorable than they truly are, and the goals we adopt as more achievable than they are likely to be. We also tend to exaggerate our ability to forecast the future, which fosters overconfidence.
In terms of its consequences for decisions, the optimistic bias may well be the most significant of the cognitive biases. Because it is both a blessing and a risk, you should be both happy and wary if you are temperamentally optimistic.
Optimism is normal, but some fortunate people are more optimistic than the rest of us. If you are genetically endowed with an optimistic bias, you hardly need to be told that you are a lucky person -- you already feel fortunate.
Optimistic people play a disproportionate role in shaping our lives. Their decisions make a difference; they are inventors, entrepreneurs, political and military leaders -- not average people. They got to where they are by seeking challenges and taking risks. They are talented and they have been lucky, almost certainly luckier than they acknowledge.
A survey of founders of small businesses concluded that entrepreneurs are more sanguine than midlevel managers about life in general. Their experiences of success have confirmed their faith in their judgment and in their ability to control events. Their self-confidence is reinforced by the admiration of others. This reasoning leads to a hypothesis: The people who have the greatest influence on the lives of others are likely to be optimistic and overconfident, and to take more risks than they realize.
The evidence suggests that an optimistic bias plays a role -- sometimes the dominant role -- whenever people or institutions voluntarily take on significant risks. More often than not, risk-takers underestimate the odds they face and, because they misread the risks, optimistic entrepreneurs often believe they are prudent, even when they are not. Their confidence sustains a positive mood that helps them obtain resources from others, raise the morale of their employees and enhance their prospects of prevailing. When action is needed, optimism, even of the mildly delusional variety, may be a good thing.
An optimistic temperament encourages persistence in the face of obstacles. But this persistence can be costly. A series of studies by Thomas Astebro sheds light on what happens when optimists get bad news. (His data came from Canada’s Inventor’s Assistance Program, which provides inventors with objective assessments of the commercial prospects of their ideas. The forecasts of failure in this program are remarkably accurate.)
In Astebro’s studies, discouraging news led about half of the inventors to quit after receiving a grade that unequivocally predicted failure. The remaining 47 percent, however, continued development efforts even after being told that their project was hopeless, and on average these individuals doubled their initial losses before giving up.
Significantly, persistence after discouraging advice was relatively common among inventors who had a high score on a personality measure of optimism. This evidence suggests that optimism is widespread, stubborn and costly.
In the market, of course, belief in one’s superiority has significant consequences. Leaders of large businesses sometimes make huge bets in expensive mergers and acquisitions, acting on the mistaken belief that they can manage the assets of another company better than its current owners do. The stock market commonly responds by downgrading the value of the acquiring firm, because experience has shown that such efforts fail more often than they succeed. Misguided acquisitions have been explained by a “hubris hypothesis”: The executives of the acquiring firm are simply less competent than they think they are.
The economists Ulrike Malmendier and Geoffrey Tate identified optimistic chief executive officers by the amount of company stock that they owned personally and observed that highly optimistic leaders took excessive risks. They took on debt rather than issuing equity and were more likely to “overpay for target companies and undertake value-destroying mergers.” Remarkably, the stock of the acquiring company suffered substantially more in mergers if the CEO was overly optimistic by the authors’ measure. The market is apparently able to identify overconfident CEOs.
This observation exonerates the CEOs from one accusation even as it convicts them of another: The leaders of enterprises who make unsound bets don’t do so because they are betting with other people’s money. On the contrary, they take greater risks when they personally have more at stake. The damage caused by overconfident CEOs is compounded when the business press anoints them as celebrities; the evidence indicates that prestigious awards to the CEO are costly to stockholders.
The authors write, “We find that firms with award-winning CEOs subsequently underperform, in terms both of stock and of operating performance. At the same time, as CEO compensation increases, CEOs spend more time on activities such as writing books and sitting on outside boards, and they are more likely to engage in earnings management.”
I have had several occasions to ask founders and participants in innovative startups this question: To what extent will the outcome of your effort depend on what you do in your company? The answer comes quickly, and in my small sample it has never been less than 80 percent. Even when they are not sure they will succeed, these bold people think their fate is almost entirely in their own hands. They know less about their competitors and find it natural to imagine a future in which the competition plays little part.
Colin Camerer, who coined the concept of competition neglect, illustrated it with a quote from a chairman of Disney Studios. Asked why so many big-budget movies are released on the same holidays, he said, “Hubris. Hubris. If you only think about your own business, you think, ‘I’ve got a good story department, I’ve got a good marketing department’ … and you don’t think that everybody else is thinking the same way.” The competition isn’t part of the decision. In other words, a difficult question has been replaced by an easier one.
This is a kind of dodge we all make, without even noticing. We use fast, intuitive thinking -- System 1 thinking -- whenever possible, and switch over to more deliberate and effortful System 2 thinking only when we truly recognize that the problem at hand isn’t an easy one.
The question that studio executives needed to answer is this: Considering what others will do, how many people will see our film? The question they did consider is simpler and refers to knowledge that is most easily available to them: Do we have a good film and a good organization to market it?
Organizations that take the word of overconfident experts can expect costly consequences. A Duke University study of chief financial officers showed that those who were most confident and optimistic about how the Standard & Poor’s index would perform over the following year were also overconfident and optimistic about the prospects of their own companies, which went on to take more risks than others.
As Nassim Taleb, the author of “The Black Swan,” has argued, inadequate appreciation of the uncertainty of the environment inevitably leads economic agents to take risks they should avoid. However, optimism is highly valued; people and companies reward the providers of misleading information more than they reward truth tellers. An unbiased appreciation of uncertainty is a cornerstone of rationality -- but it isn’t what organizations want. Extreme uncertainty is paralyzing under dangerous circumstances, and the admission that one is merely guessing is especially unacceptable when the stakes are high. Acting on pretended knowledge is often the preferred approach.
Overconfidence also appears to be endemic in medicine. A study of patients who died in the intensive-care unit compared autopsy results with the diagnoses that physicians had provided while the patients were still alive. Physicians also reported their confidence. The result: “Clinicians who were ‘completely certain’ of the diagnosis ante-mortem were wrong 40 percent of the time.” Here again, experts’ overconfidence is encouraged by their clients. As the researchers noted, “Generally, it is considered a weakness and a sign of vulnerability for clinicians to appear unsure.”
According to Martin Seligman, the founder of positive psychology, an “optimistic explanation style” contributes to resilience by defending one’s self-image. In essence, the optimistic style involves taking credit for successes but little blame for failures.
Organizations may be better able to tame optimism than individuals are. The best idea for doing so was contributed by Gary Klein, my “adversarial collaborator” who generally defends intuitive decision-making against claims of bias.
Klein’s proposal, which he calls the “premortem,” is simple: When the organization has almost come to an important decision but hasn’t committed itself, it should gather a group of people knowledgeable about the decision to listen to a brief speech: “Imagine that we are a year into the future. We implemented the plan as it now exists. The outcome has been a disaster. Please take 5 to 10 minutes to write a brief history of that disaster.”
As a team converges on a decision, public doubts about the wisdom of the planned move are gradually suppressed and eventually come to be treated as evidence of flawed loyalty. The suppression of doubt contributes to overconfidence in a group where only supporters of the decision have a voice. The main virtue of the premortem is that it legitimizes doubts.
Furthermore, it encourages even supporters of the decision to search for possible threats not considered earlier. The premortem isn’t a panacea and doesn’t provide complete protection against nasty surprises, but it goes some way toward reducing the damage of plans that are subject to the biases of uncritical optimism.
(Daniel Kahneman, a professor of psychology emeritus at Princeton University and professor of psychology and public affairs emeritus at Princeton’s Woodrow Wilson School of Public and International Affairs, received the Nobel Memorial Prize in Economic Sciences for his work with Amos Tversky on decision making. This is the first in a four-part series of condensed excerpts from his new book, “Thinking, Fast and Slow,” just published by Farrar, Straus and Giroux. The opinions expressed are his own.)
--Editors: Mary Duenwald, David Henry.