Humans make big thinking mistakes in predictable ways. Collectively, these errors are called ‘cognitive biases’. Business leaders sometimes make billion-dollar decisions, and neither their businesses nor wider society can afford ‘hardware glitches’ on that scale. This three-part series on cognitive biases and leadership starts with how cognitive biases systematically distort our view of the past, and how that distortion affects today’s decisions.
In 2001, I was called into British Airways because they were struggling with a failing change project, Project F (an unfortunate choice of letter when things became difficult). Within the first few hours, I was warned (in hushed tones) not to talk about ‘Project Phoenix’. Project Phoenix, as far as I could tell from these snatched conversations, had recently been cancelled after a two-year implementation had produced no results. Staff were demoralized and executives were humiliated. Phoenix was the elephant in the room, but in my desire not to rock the boat during these early conversations, and in what proved to be a significant error on my part, I acceded to these wishes.
A struggling project has specific causes, say bad planning or budgeting, but it also has general, systemic causes that will dog every project the organization undertakes. For example, if a business has low trust, weak leadership, inexperience leading complex change, or a history of poor labor-management relations, it will not matter how well conceived an individual project may be. Before diving into the specifics of Project F, I should have gone deeply into what went wrong with Project Phoenix, because those systemic factors were surely going to make my work more difficult. I chickened out, or more formally, I colluded with BA to suppress a history that was making them uncomfortable.
Researchers call this the ‘Ostrich Effect’: the tendency to avoid information about risky or difficult situations. In bad markets, investors check the value of their holdings up to 80% less often than in good markets. When things are difficult, we screw our eyes tightly shut, which only makes things worse.
Hindsight allows me to offer the following counsel for change leaders (counsel that applies much more widely): ‘that which you do not wish to discuss is the thing you most need to discuss’.
The Ostrich Effect makes leaders ignore difficult situations or failed projects at the cost of learning, all but guaranteeing that their mistakes will be repeated. Another family of biases has the opposite effect: rather than underweighting the past, these biases cause leaders to overweight it in their deliberations. An important bias here is the ‘sunk cost fallacy’, which leads to a strategic error called ‘escalation of commitment’.
Say you and a friend have tickets to a sporting event a considerable distance away. You paid $75 for yours; she was given hers for free. On the day of the event, the weather forecast is abysmal: cold and rainy. Who is more likely to skip the event? When Amos Tversky, a pioneer of cognitive-bias research whose work with Daniel Kahneman later earned Kahneman the Nobel Prize, consulted a new financial advisor, he was asked to provide a list of his stocks and what he had paid for them. He asked, ‘Why is what I paid for them important?’ The financial advisor looked at him as though he were crazy [i].
Perhaps your gut response is similar? Of course what you paid for the ticket or the stocks matters! Yet such thinking is irrational. If the stocks are down, the money is gone. What matters is whether the game will be a fulfilling use of your time despite the weather, and whether the stocks will rise or fall from here. If you have bought a lemon of a car or a money pit of a house, or are in a relationship or job that stinks, it may be profitable to invest more and escalate your commitment, or it may be right to fold. How much you have paid, or how long you have invested, is (rationally) irrelevant.
The sunk cost fallacy described above leads to escalation of commitment, which is costly because when projects or investments turn sour, decision making typically gets worse. Barings Bank was founded in 1762, but a single trader, Nick Leeson, brought it down in 1995 by escalating his commitment to a losing trading strategy; his losses, small at first, grew to $1.3 billion.
Escalation of commitment is not only costly but commonplace: escalation experts suggest that 30-40% of all IT projects involve some degree of escalation. So what causes this bizarre tendency to double down when things are going poorly? Escalation of commitment happens because people do not like to feel that they have squandered resources or that past efforts have been in vain. Human beings self-justify: we like to feel that time and money have been well spent, and that past decisions were good (or at least not that bad). Another bias, confirmation bias, seals the trap. Having invested, leaders seek out confirming evidence that things are going well. In executive teams, leaders do not want to lose face, so projects they have sponsored continue to be endorsed long past what rationality would suggest. At one failing change program at the UK’s National Health Service, the program directors, facing government investigations, repeatedly pointed to small successes to avoid the bigger questions and issues. The program was eventually cancelled, having cost the UK $20 billion with scant benefit. Listening to the transcripts, one could conclude the project was 90% a success, not 90% a failure!
Given the scale and breadth of the problem, what can a leader do? Projects can fail in many ways, but the path to success is a narrow one, and the escalation problem, with its complex psychological, cultural, social, and political causes, defies trivial answers.
In general terms, and using the language of virtue ethics, the challenge for the leader is to balance the virtue of persistence against the vice of stubbornness. There are several strategies a leader can use; here are two:
Increase vigilance with troubled projects. If there is one time you must second-guess your decisions, it is when a project or investment is failing. Find an independent observer who can objectively assess the project. Your team will want to believe things are going well and will look for confirming evidence; combat that confirmation bias by actively seeking out evidence that would contradict the decision to commit more money. Most importantly, if more resources are committed, create stopping rules now, because the stakes will be higher, the losses greater, and the escalation force more extreme if (say, six months from now) you face the same decision again.
Create imaginary scenarios. Intel entered the memory business in 1969 and by 1981 was the market leader. During the early 1980s, Japanese competition and the growing scale of the computer industry had turned memory into a commodity, and the business was becoming increasingly unprofitable for Intel. Memory still accounted for over half of Intel’s $1 billion revenue, but overall profitability had nose-dived from $198 million to just $2 million, and the memory business was losing money. Andy Grove felt the writing was on the wall, but Intel’s executives were fixated on Intel’s history, its identity as ‘the memory company’, and its legacy investment in technology and manufacturing capacity, and simply could not believe the mounting evidence that they were being outgunned in a market they had created and which made up over half of their top line.
In his book Only the Paranoid Survive, Grove recounts a conversation with Gordon Moore (of Moore’s Law fame and co-founder of Intel): ‘If we got kicked out and the board brought in a new CEO, what do you think he would do?’ Gordon answered without hesitation, ‘He would get us out of memories.’ I stared at him, numb, then said, ‘Why shouldn’t you and I walk out the door, come back, and do it ourselves?’ And so they did. Thirty years later, Intel’s revenue exceeds $50 billion, suggesting that their painful cut, getting out of a business they had founded, was the right one.
These two past-based biases, the Ostrich Effect and escalation of commitment, cost businesses hundreds of billions of dollars. In the first, leaders refuse to confront the past; in the second, they are too attached to it. When kicking off a major change, leaders must courageously ask whether the past has lessons for them, then detach from it and start, as it were, from a clean slate, perhaps saying: ‘I appreciate we have put a lot into this, but we need to park that and consider only what we are going to get out of it.’
In the next article in this series, we turn to present-based biases and how they affect problem selection and problem solving in leadership teams.
Contributed article: This article is part of a three-part series by Paul Gibbons based on his upcoming book Why Change Fails: Science and 21st Century Leadership. He writes on science and business leadership at www.paulgibbons.net.
[i] Professor Mark Keil of Georgia State University, who specializes in IT project management and escalation issues, provided helpful resources.
Click here for “Cognitive biases and leading change: Part 2”