{"id":6127,"date":"2013-12-18T15:48:12","date_gmt":"2013-12-18T19:48:12","guid":{"rendered":"http:\/\/calleam.com\/WTPF\/?p=6127"},"modified":"2020-01-15T11:30:32","modified_gmt":"2020-01-15T19:30:32","slug":"cognitive-biases-and-leading-change-part-3","status":"publish","type":"post","link":"https:\/\/calleam.com\/WTPF\/?p=6127","title":{"rendered":"Cognitive biases and leading change &#8211; Part 3"},"content":{"rendered":"<p>In <a href=\"http:\/\/calleam.com\/WTPF\/?p=6099\">parts 1<\/a> and <a href=\"http:\/\/calleam.com\/WTPF\/?p=6112\">2<\/a> of this series we have looked at the affect cognitive bias have on our view of the past and the present. \u00a0In this third and final part we&#8217;ll be looking at how such biases effect our view of the future.<\/p>\n<hr \/>\n<p>Physicist Niels Bohr (a contemporary and collaborator of Einstein) said \u201cprediction is difficult, especially about the future\u201d.\u00a0 If humans are biased in our views of the past and present, our views of the future are even more fraught.\u00a0 We are caught between the Scylla of hopes, fears, emotional attachments, self-justification, identity, psychological projection and the Charybdis of our limited cognitive abilities in areas such as planning, probability, judgment, and causal inference.<\/p>\n<p>Just what do future-oriented biases look like, and what can be done about them by leaders contemplating major change?<\/p>\n<h3>Future-oriented biases<\/h3>\n<p align=\"center\"><em>\u201cLake Wobegon \u2026where all the men are strong, all the women good looking, and all the children above average\u201d (Garrison Keillor)<\/em><\/p>\n<p>For executives making strategic decisions and decisions involving major change, three areas are important: optimism biases, thinking probabilistically, and the planning fallacy.<\/p>\n<h4>Optimism: a double-edged sword<\/h4>\n<p align=\"center\"><em>\u201dVer la vida como es y no como\u00a0deber\u00eda\u00a0ser\u00a0<\/em>[1]<em>.\u201d 
(Cervantes)<\/em><\/p>\n<p>Scan the internet for quotations and articles on optimism and you will find an overwhelmingly optimistic view of optimism. Yet optimism proves to be at the root of several biases which doom change from the start, and sometimes destroy businesses.\u00a0 Optimism renders us \u201cconfident without competence\u201d in our judgments, leads us to analyze risk and probability badly, and makes us Pollyannaish when planning change.<\/p>\n<p>Lest it be thought that optimism is a bad thing, in general terms it is a very good thing.\u00a0 An optimistic outlook means you live longer, are healthier and more emotionally resilient, earn more, attempt more, and succeed more often[2].\u00a0 To be sure, when a new strategy or change program is contemplated, there is enormous value in full-throated commitment and enthusiasm.<\/p>\n<p>The challenge for leaders, therefore, is not to stamp optimism out; indeed, it may be right to stoke it. Leaders must be conscious of when optimism becomes hubris, and when a positive outlook becomes self-delusion.\u00a0 Leaders must know whether to abandon, forge ahead, or proceed with caution.<\/p>\n<p>British Petroleum&#8217;s (BP) engineers and staff faced such a decision on the Deepwater Horizon drilling platform. \u00a0In that case they did indeed forge ahead with the simplest and fastest approach to the drilling despite very clear warning signs of danger. What were those warnings?<\/p>\n<ol>\n<li>An internal BP report recommended against a \u201clong string casing\u201d, but engineers on the Horizon countered that using one would \u201csave at least 3 days, and $7 &#8211; $10m dollars\u201d.\u00a0 Expedience won.<\/li>\n<li>Halliburton modelling recommended using 21 \u201ccentralizers\u201d. 
Some BP engineers agreed: \u201cwe need to honor the modelling having chosen the riskier casing\u201d.\u00a0\u00a0 Other engineers argued \u201cit\u2019s a vertical hole so hopefully the pipe stays centralized due to gravity\u201d.\u00a0 Hope (and time and money) won again, and BP used only six centralizers.<\/li>\n<li>There were portents that the well had gas leak problems, and a test could have validated or disproven that, but other engineers argued that the test \u201cwas only a model\u201d and could wait until full production began.<\/li>\n<\/ol>\n<p>Here we see optimism\u2019s dark side: denial of expertise, mistrust of models, and hope versus prudence.\u00a0 We see people trusting their gut where hundreds of billions of dollars, and many lives, are at stake.\u00a0 My Irish grandmother used to say \u201cdon\u2019t get a dog and bark yourself\u201d, yet in the Deepwater Horizon disaster, recommendations, models and formal decision processes were wilfully ignored because decision makers preferred the sound of their own bark to expert recommendations.<\/p>\n<p>There is no easy answer to when optimism becomes denial, or hubris.\u00a0 Had things gone smoothly, the managers involved might have been applauded for their can-do, practical attitudes.\u00a0 The world of risk means that very poor decisions can turn out well, or \u201cperfect\u201d decisions can fall prey to <em>force majeure<\/em>.<\/p>\n<h4>Confidence without Competence<\/h4>\n<p align=\"center\"><em>The problem is not just that our thinking is flawed, but that we are confident in our flawed thinking.<\/em><\/p>\n<p>In my Strategic Decision Making program at US business schools, I ask executives to answer ten questions, some of which are:<\/p>\n<ol>\n<li>How many books are in the Old Testament?<\/li>\n<li>How long is the Nile River?<\/li>\n<li>How many neurons are in the human brain?<\/li>\n<li>How far is it from Earth to the Moon?<\/li>\n<\/ol>\n<p>Nobody but the saddest nerd would get them all right, but I ask them to provide their answer as 
a range where they would be 90 percent sure of being correct, what statisticians call a 90 percent confidence interval.\u00a0 The most common score is 40 percent, four out of ten answers within their range.\u00a0 My executives get over half wrong despite being 90 percent sure of each of their ten answers!<\/p>\n<p>This phenomenon does not just apply to subjects where our knowledge is limited.\u00a0 Berkeley professor Philip Tetlock has spent a career studying expert predictions.\u00a0 The experts that he studied came from every imaginable area of expertise and, despite their credentials and reputation, fared little better than pure chance and worse than simplistic methods (such as merely averaging the election results in a precinct over the last few cycles).\u00a0 Fully 15 percent of events that experts said had no chance of happening did happen, and a whopping 25 percent of events they were absolutely sure of failed to happen.<\/p>\n<p>Tetlock categorized experts as hedgehogs or foxes.\u00a0 Hedgehogs have a Big Idea, and are specialized and ideological.\u00a0 Foxes know less, but about lots of things; they are self-critical, study the evidence, and are cautious and flexible.\u00a0 It sounds as if we want our world run by foxes, but quite the opposite is the case.\u00a0 We vote for hedgehogs, and employ them as pundits, journalists, and writers.\u00a0 And CEOs.\u00a0 Al Dunlap built a very (very) lucrative business career as a master cost-cutter, earning the moniker \u201cChainsaw Al\u201d.\u00a0 He slashed and burned his way through CEO roles with charisma and competence.\u00a0 It worked well when it was the correct strategy, and abysmally when it was the wrong strategy.<\/p>\n<p>Humans are confident without necessarily being competent, but in boardrooms, it is worse.\u00a0 \u201cStrong leadership\u201d, \u201cknowing one\u2019s stuff\u201d, and \u201cexuding confidence\u201d are good personas to project.\u00a0 Are not leaders supposed to be sure of themselves, and to have the answers?\u00a0 To demur, 
reflect, or to change one\u2019s views is seen as \u201cflip-flopping\u201d (even more so in the national political arena).\u00a0 To employ Tetlock\u2019s dichotomy, boardroom culture is heavily hedgehog.<\/p>\n<p>Hedgehogs respond badly to challenge, deal badly with complexity, like straightforward solutions, and generally do not play well with foxes.\u00a0 Those are exactly the attributes that often win promotions, yet they are very poor for managing major change or complex, risky endeavors (such as ultra-deep offshore oil wells). Which is your team\u2019s culture?<\/p>\n<h4>Thinking risk and probability<\/h4>\n<p align=\"center\"><em>The illusion of control deludes leaders into banking on a single outcome in situations that are highly risk dependent. (Kahneman)<\/em><\/p>\n<p>Imagine an ambitious Vice President who presents a 30-slide business case presentation.\u00a0 He concludes with:<\/p>\n<ol>\n<li>\u201cThis project will take 18 months and cost $65 million; however, there is a 40 percent chance that it will take twice as long and cost 75 percent more.\u201d<\/li>\n<li>\u201cThis $10 million investment clears our hurdle rate, but 50 percent of the time we will only break even and 15 percent of the time we will lose money.\u201d<\/li>\n<li>\u201cBased on my analysis, the probability of this product succeeding is 35 percent.\u201d<\/li>\n<\/ol>\n<p>Our VP will not last long: corporate culture does not applaud probabilistic analyses. 
President Truman (a hedgehog) once moaned, \u201ccan someone please find me a one-handed economist\u201d, because foxes bore the pants off hedgehogs with their \u201con the other hands\u201d and \u201chowevers\u201d.<\/p>\n<p>While probabilistic reasoning is tedious and leads to more ambiguous conclusions, reality operates probabilistically.\u00a0 The more complex the system, the more variable (risky) the outcomes.\u00a0\u00a0 The profound implications of this essential feature of reality still elude us in all the practical disciplines.\u00a0 A complex project, or an oil rig, or an economy, can be blown off course (or blown up) by a myriad of factors.\u00a0 Each part of a project, each small task, each person who must be persuaded, each milestone to be met has its own distribution of possible outcomes. The expected delivery date of a project that has 10,000 tasks is the sum[3] of those distributions.<\/p>\n<p>When leaders consider a single plan and budget, they are trying to pin down, to the day and the dollar, how long 10,000 tasks will take and how much they will cost.\u00a0 They arrive at a best-case presentation, perhaps a most-probable one.\u00a0 In the hundreds of consulting proposals I have written or reviewed, not a single one had anything other than a single end-date and cost[4].<\/p>\n<p>In a world where reality looks like bell-shaped curves, one falls back on proxies for reality: means, medians and modes.\u00a0 However, if you cannot swim, walking across a river that is an average of four feet deep, or four feet deep in most places, will get you into big trouble.\u00a0 The river\u2019s depth is a distribution, and using just one slice of that distribution is, well, absurd.<\/p>\n<p>Venture capitalists and traders think like this: knowing they will only get some bets right, their job is to get the big ones right and the small ones wrong[5].\u00a0 Every endeavor will have a distribution of returns, and a business has to be resilient enough to handle deviations from the 
mean.<\/p>\n<h4>Planning fallacy<\/h4>\n<p align=\"center\"><em>In preparing for battle, plans are useless, but planning is indispensable. (Eisenhower)<\/em><\/p>\n<p>Professor Bent Flyvbjerg, of Oxford\u2019s Sa\u00efd Business School, specializes in analyzing major projects.\u00a0 He studied 1,471 such projects and found an average overrun of 27 percent.\u00a0 That number may strike you as tolerable, perhaps even good?\u00a0 But, bearing in mind our caution about crossing a stream an average of four feet deep, that figure should raise eyebrows rather than give comfort. Flyvbjerg also found that one in six had a cost overrun of 200 percent, and a schedule overrun of almost 70 percent.<\/p>\n<p>To recap some of the more grotesque overruns from chapter 1:<\/p>\n<ol>\n<li>US Air Force, 100 percent ($600 million overrun)<\/li>\n<li>NHS Connecting for Health, 700 percent (a $15 billion overrun)<\/li>\n<li>Boston\u2019s Big Dig, 190 percent ($12 billion overrun)<\/li>\n<li>Denver International Airport, 100 percent ($3 billion overrun)<\/li>\n<\/ol>\n<p>Lest you be seduced by thinking this phenomenon only applies to projects of enormous complexity, recall that Flyvbjerg studied 1,471 projects.\u00a0 It applies equally to projects of tiny scale.\u00a0 One study asked students to estimate how long it would take to complete a term paper a) if everything went as well as it could, or b) if everything went as poorly as it could. They provided a range of 27 to 49 days.\u00a0 Term papers were handed in after an average of 55.5 days.<\/p>\n<p>Kahneman summarizes as follows: \u201cExecutives make decisions based on delusional optimism rather than on a rational weighting of gains, losses, and probabilities. They overestimate benefits and underestimate costs. They spin scenarios of success while overlooking the potential for mistakes and miscalculations. 
As a result, they pursue initiatives that are unlikely to come in on budget or on time or to deliver the expected returns \u2014 or even to be completed.\u201d<\/p>\n<h3>What does this suggest for change strategy?<\/h3>\n<p>The first is the management of capital.\u00a0 Banks \u201cstress test\u201d their portfolios (not very well, as 2007 proved) against numerous scenarios, and corporations must do likewise, estimating their ability to shoulder the one-in-six 200 percent overrun; they must also consider the investment in view of middle-of-the-distribution events, such as realizing only 50 percent of the benefits.<\/p>\n<p>The second is to diagnose and name \u201cfat-tail\u201d events and to rationally reassess prospects.\u00a0 Fat tails are part of the fabric of the universe, and the best decisions will encounter them.\u00a0 Executives must use the language of this chapter (\u201cfat tails\u201d and \u201cescalation of commitment\u201d) to rigorously reassess projects, bearing in mind the sunk-cost bias and impression-management effects (that is, not wishing to look bad to the business, to Wall Street, or to customers).<\/p>\n<p>The third is a procedure proposed by Kahneman, who did the pioneering work on the planning fallacy in 1979.\u00a0 He suggests the following procedure for \u201cgrounding\u201d planning in reality:<\/p>\n<ol start=\"1\">\n<li>Identify an appropriate reference class (e.g., school building project, IT project, family room addition, etc.)<\/li>\n<li>Obtain the statistics of the reference class (e.g., percentage by which expenditures exceeded budget, project delays, cost per square foot, etc.). 
Use this objective research to generate a baseline prediction.<\/li>\n<li>If, despite your disciplined efforts, you believe optimism bias is still at play, adjust the baseline prediction as necessary.<\/li>\n<\/ol>\n<p>Most writing on leadership contains a great deal about creating a compelling and optimistic view of the future.\u00a0 Little is written about its darker sides: hubris and denial.\u00a0 Less still is written about how to balance those forces.\u00a0 Given the uncertain and probabilistic nature of change, leaders must be conscious of how often 100 percent confidence means only 75 percent correct (as Tetlock showed); they must ensure that big decisions are evaluated probabilistically, and that the multiple biases (illusion of control, planning fallacy) that affect project budgets and timescales are factored in.<\/p>\n<h3>Conclusion<\/h3>\n<p>One could conclude after this tour of human fallibility that it is miraculous how successful we are at building cathedrals, computers and corporations given how wrong our judgments can be.\u00a0 There are interesting hypotheses about the roots of bias, and about how the evolution of our brains and of our social technologies (such as decision making) has not kept up with our technological prowess.<\/p>\n<p><em>Contributed article: This article is part three of a three-part series by Paul Gibbons who is writing <strong>The Science of Organizational Change: How to Leverage 21<sup>st<\/sup> Century Intelligence on Human Behavior<\/strong>.\u00a0 He blogs at <\/em><a href=\"https:\/\/paulgibbons.net\">www.paulgibbons.net<\/a><em> and his most recent book <strong>Reboot Your Life: A 12-day Program for Ending Stress, Realizing Your Goals, and Being More Productive<\/strong> is available on Amazon.com and from <\/em><a href=\"http:\/\/www.paulgibbons.net\/rebootyourlife\">www.paulgibbons.net\/rebootyourlife<\/a><\/p>\n<div>\n<p>&nbsp;<\/p>\n<hr align=\"left\" size=\"1\" width=\"33%\" \/>\n<div>\n<ul>\n<li>[1] To see life as it is and not as 
it should be.<\/li>\n<li>[2] As in most social science research, the arrow of causality could run either way, or more probably both ways.\u00a0 Healthier, wealthier people are (understandably) more optimistic.<\/li>\n<li>[3] It is not really a simple sum; combining the distributions involves the same sort of mathematical complexity as weather systems.<\/li>\n<li>[4] Assessing the variability and risk of projects is in fact an important and relatively recent addition to the core capabilities of big consulting firms.<\/li>\n<li>[5] Strategies vary and are more complex than this, but both VCs and traders expect to lose some of the time.<\/li>\n<\/ul>\n<\/div>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>In parts 1 and 2 of this series we looked at the effect cognitive biases have on our view of the past and the present. \u00a0In this third and final part we&#8217;ll look at how such biases affect our view of the future. Physicist Niels Bohr (a contemporary and collaborator of Einstein) said [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[27,40,37,31],"tags":[],"class_list":["post-6127","post","type-post","status-publish","format-standard","hentry","category-blog","category-leadership-blog","category-management","category-people"],"_links":{"self":[{"href":"https:\/\/calleam.com\/WTPF\/index.php?rest_route=\/wp\/v2\/posts\/6127","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/calleam.com\/WTPF\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/calleam.com\/WTPF\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/calleam.com\/WTPF\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/calleam.com\/WTPF\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=6127"}],"version-history":[{"count":13,"href":"https:\/\/calleam.com\/WTPF\/index
.php?rest_route=\/wp\/v2\/posts\/6127\/revisions"}],"predecessor-version":[{"id":8994,"href":"https:\/\/calleam.com\/WTPF\/index.php?rest_route=\/wp\/v2\/posts\/6127\/revisions\/8994"}],"wp:attachment":[{"href":"https:\/\/calleam.com\/WTPF\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=6127"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/calleam.com\/WTPF\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=6127"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/calleam.com\/WTPF\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=6127"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}