"The fact that people's intuitive decisions are often strongly and systematically biased has been firmly established over the past 60 years by literally hundreds of empirical studies."
Human error and bias make up the first of my six reasons that organizations choose the wrong projects. Here's an example:
In 2005, a billion-dollar investment was initiated for Astypalea, a Greek island described in tourist guides as "one of those secret islands that receive few foreign visitors." With funding support from the Greek government and European Union, a plan was devised to transform the island into a whole-island vacation destination. The plan included building 70 km of new roads, 3 luxury hotels, 5000 vacation homes, a golf course, and a "mega-yacht" marina. Developers believed the plan would not just be good for investors, it would provide high-paying jobs for locals and enable the island to avoid the environmental harm from uncontrolled development experienced at neighboring islands. Panagiotis Chatzipanos was installed as the project portfolio manager, responsible for 19 projects organized into 5 programs.
To succeed, Chatzipanos needed to complete the projects in his portfolio. However, the "AstyProject" immediately encountered difficulties. According to a paper by Chatzipanos and coauthor T. Giotis, cognitive biases—perceptual distortions, inaccurate judgments, and illogical interpretations by project participants and stakeholders—"dramatically increased the complexity of what was already a difficult-to-manage project portfolio."
First, it took more than a year to persuade the local community of the mutual benefits of the project. The authors place part of the blame on community members, who seemed to have an "unreasonable resistance to change" (status quo bias) and were inclined to interpret situations "through the narrow lens of their own past experiences" (e.g., "beware of foreigners bearing gifts"; framing bias). Changing opinions was difficult due to people's "tendency to favor information that confirmed their existing beliefs" (supporting evidence bias). Conformity to group opinion trumped open-mindedness (groupthink).
Chatzipanos also blames AstyProject personnel—program managers, project managers, individual project team members, and subject matter experts. They were all subject to a "uniqueness bias," a sense that the project was the first of its kind for the Eastern Mediterranean, a bias that "impeded us from diligently obtaining lessons learned from other similar projects around the world." Team members were overly optimistic, often overestimating favorable outcomes and underestimating unfavorable ones (overoptimism bias). Time and cost estimates for completing tasks were too low (planning fallacy), and there appeared to be instances of strategic misrepresentation and misstatement of facts in response to incentives in the budget process (motivational bias). There was excessive confidence in individuals' beliefs, knowledge, and abilities (overconfidence bias) and a reluctance to openly discuss risky or difficult facts and situations (ostrich effect). There was also a tendency to believe that more should be invested in failing approaches because so much had been spent already (sunk cost bias).
In short, the AstyProject portfolio manager found that his project portfolio was failing. Projects were over budget and behind schedule. Desired project outcomes were not being obtained. Portfolio planning had failed to anticipate and address the biases that were encountered. Would choosing a different set of projects and employing methods for countering biases (see below and subsequent pages) have enabled Chatzipanos' portfolio to succeed? In the end it didn't matter. The AstyProject experienced a black swan event; that is, an unforeseen event with massive negative implications: the project was halted when government funding support was withdrawn in response to the Greek debt crisis that began in late 2009 and continues to this day.
Overcoming the first reason that organizations choose the wrong projects requires an understanding of human errors and biases.
Heuristics and Judgmental Biases
Why do we need decision aids? Can't people make good choices on their own? Like many decision analysts, I was first attracted to the science of decision making as a result of reading about the errors and biases that affect people's judgments.
Remarkably, it appears that our brains have been hard-wired to make certain, predictable kinds of errors, errors that psychologists have termed biases. Hundreds of different biases have been identified and categorized, including biases that distort our judgments, introduce errors into the estimates and forecasts that we produce, cause us to overlook critical considerations, and compel us to make poor choices. Clearly, if managers responsible for selecting projects are subject to these errors and biases (they are), then organizations may not be choosing the best projects for their project portfolios.
If you're not already familiar with the major results from this fascinating field of research, this introduction should help you to appreciate the value of methods for avoiding biases. Without the assistance of such methods, the decisions made within organizations, including choices of which projects to conduct, are likely to be in error.
The fact that people's intuitive decisions are often strongly and systematically biased has been firmly established over the past 60 years by literally hundreds of empirical studies. Behavioral psychologist Daniel Kahneman received the 2002 Nobel Prize in Economics for his work in this area. The conclusion reached by Kahneman and his colleagues is that people use unconscious mental shortcuts, termed heuristics, to help them make decisions. Heuristics usually involve focusing on one aspect of a complex problem and ignoring others [4]. "In general, these heuristics are useful, but sometimes they lead to severe and systematic errors."
Understanding heuristics and the errors they cause is important because it can help us find ways to counteract them. For example, when judging distance, people use a heuristic that equates clarity with proximity: the clearer an object, the closer we perceive it to be (Figure 1). Although this heuristic is usually correct, haze can trick us into thinking that objects are more distant than they are. The effect can be dangerous. Studies show people often drive faster in fog because reduced clarity and contrast make their speed seem slower than it is. Airline pilots are similarly tricked, so pilots are trained to rely more on instruments than on what they think they see out the cockpit window.
Figure 1: Haze tricks us into thinking objects are further away.
Dozens of heuristics and related errors and biases have been well documented. Some of the most important are described below.
The following summary of judgmental biases, including ideas for countering them, is derived from some of the many excellent papers on the subject, including the groundbreaking papers by Daniel Kahneman and Amos Tversky and the popular 1998 Harvard Business Review article by Hammond, Keeney, and Raiffa.
Status Quo Bias
Status quo bias refers to the tendency people have to prefer alternatives that perpetuate the status quo. Psychologists call this a comfort zone bias based on research suggesting that breaking from the status quo is, for most people, emotionally uncomfortable. It requires increased responsibility and opening oneself up to criticism. For example, if you are a company executive considering introducing a lower-cost/lower-quality version of an existing product to your product line, you face the risk of adversely affecting the perceptions of customers who choose your high-quality products. If your company's reputation for quality declines, you could be accused of making a bad choice. Just considering the change forces you to confront the trade-off between increased profits and the risk of damaging your brand image. Sticking to the status quo is easier because it is familiar; it creates less internal tension.
Admittedly, there are often good reasons for leaving things unchanged. But studies show that people overvalue the status quo. A famous experiment involved randomly giving students a gift consisting of either a coffee mug or a candy bar. When offered the chance to trade, few wanted to exchange for the alternative gift. Since the gifts were assigned randomly, it is unlikely that most students happened to receive the one they naturally preferred. Apparently, "owning" what they had been given made it appear more valuable.
The power of this bias was quantified in a related experiment. Students were randomly chosen to receive mugs. Those with mugs were asked to name the minimum price at which they would sell their mug; those without were asked to name the maximum price at which they would buy one. The median selling price was more than twice the median buying price. Again, ownership increased perceived value. Sometimes referred to as the endowment effect, this bias may help explain why investors are often slow to sell stocks that have lost value. Likewise, it may help explain why executives have trouble terminating failing projects.
Social norms tend to reinforce preference for the status quo. For example, courts (and many organizations) view a sin of commission (doing something wrong) as more serious than a sin of omission (failing to prevent a wrong). As another example, government decision makers are often reluctant to adopt efficiency-enhancing reforms if there are "losers" as well as "gainers." Any change is seen as unfair. The burden of proof is on the side of changing the status quo.
Lack of information, uncertainty, and too many alternatives promote holding to the status quo. In the absence of an unequivocal case for changing course, why face the unpleasant prospect of change? Thus, many organizations continue to support failing projects due to lack of solid evidence that they've failed. Killing a project may be a good business decision, but changing the status quo is typically uncomfortable for the people involved.
What causes status quo bias? According to psychologists, when people face the opportunity of changing their status quo, the loss aspects of the change loom larger than the gain aspects. Losses represent the certain elimination of visible, existing benefits. Gains, in contrast, are prospective and speculative. We know what we have; who knows what we will get? We fear regret, and this fear is amplified by our desire to maintain the respect and approval of others. In business, the key to success is often bold action, but for many CEOs, the only thing worse than making a strategic mistake is being the only one in the industry to make it. Sticking with the status quo is safer.
The best advice for countering status quo bias is to consider carefully whether the status quo really is the best choice or merely the most comfortable one.
Sunk Cost Bias
We know rationally that sunk costs—past investments that are now irrecoverable—are irrelevant to current decisions. Sunk costs are the same regardless of the course of action that we choose next. If we evaluate alternatives based solely on their merits, we should ignore sunk costs. Only incremental costs and benefits should influence our choices.
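The logic of ignoring sunk costs can be made concrete with a little arithmetic. The figures below (amount already spent, cost to finish, expected payoff) are invented for illustration; the point is that the sunk amount appears in the total for every option and therefore cancels out of any comparison.

```python
# Hypothetical illustration of why sunk costs are irrelevant to the choice
# between continuing and abandoning a project. All dollar figures are invented.

SUNK = 800_000          # already spent, unrecoverable
FINISH_COST = 300_000   # incremental cost to complete the project
PAYOFF = 250_000        # expected value if the project is completed

# Correct comparison: incremental payoff versus incremental cost.
continue_value = PAYOFF - FINISH_COST            # negative here, so abandon

# Comparison with sunk costs mistakenly included:
continue_total = PAYOFF - FINISH_COST - SUNK     # SUNK appears here...
abandon_total = -SUNK                            # ...and here, identically.

# Because SUNK is in both totals, it cancels: the ranking of the options
# is exactly the same with or without it.
assert continue_total - abandon_total == continue_value
print("Continue only if the payoff exceeds the *remaining* cost; here it doesn't.")
```

However the books record the past spending, only the incremental term can change which option ranks higher.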
Yet, the more we invest in something (financially, emotionally, or otherwise), the harder it is to give up that investment. For example, when making a telephone call, being on hold and hearing the recording, "Your call is important to us...Please stay on the line," often means that you've got a lot longer to wait. Still, having already invested the effort to make the call, it's hard to hang up and call another time.
There is a great deal of research demonstrating the influence of sunk costs. In one study, students were shown to be more likely to eat identical TV dinners if they paid more for them. Another study arranged to have similar tickets for a theater performance sold at different prices—people with the more expensive tickets were less likely to miss the performance. A third study found that the higher an NBA basketball player is picked in the draft, the more playing time he gets, even after adjusting for differences in performance.
The Concorde supersonic airplane is often cited as an example of sunk cost bias. It soon became obvious that the Concorde was very costly to produce and, with limited seating, was unlikely to generate adequate revenue. Few orders for planes were coming in. Still, even though it was clear that the plane would not make money, France and Britain continued to invest.
Sunk cost reasoning shows up frequently in business. For example, you might be reluctant to fire a poor performer you hired because doing so would feel like an admission of earlier poor judgment. You might be inclined to give more weight to information you paid for than to information that was free. You might find it harder to terminate a project if you've already spent a lot on it.
Why is it so difficult to free oneself from sunk cost reasoning? Many of us appear to be born with strong feelings about wasting resources. We feel obligated to keep investing because, otherwise, the sunk cost will have been "wasted." We would then need to admit (at least to ourselves if not to others) that we made a mistake. It has even been suggested that sunk cost reasoning may be a kind of self-punishment. We may unconsciously force ourselves to follow through on commitments that no longer seem desirable in order to instruct ourselves to be more careful next time.
The key technique for countering sunk cost bias is to deliberately set past investments aside and evaluate each alternative based only on its incremental costs and benefits.
Supporting Evidence Bias
Also called confirmation bias, supporting evidence bias is our tendency to want to confirm what we already suspect and look for facts to support it. This bias not only affects where we go to collect information, but also how we interpret the evidence that we receive. We avoid asking tough questions and discount new information that might challenge our preconceptions.
Suppose, for example, you are considering an investment to automate some business function. Your inclination is to call an acquaintance who has been boasting about the good results his organization obtained from doing the same. Isn't it obvious that he will confirm your view that, "It's the right choice"? What may be behind your desire to make the call is the likelihood of receiving emotional comfort, not the likelihood of obtaining useful information.
Supporting evidence bias influences the way we listen to others. It causes us to pay too much attention to supporting evidence and too little to conflicting evidence. Psychologists believe the bias derives from two fundamental tendencies. The first is our inclination to decide, subconsciously, what we want to do before figuring out why we want to do it. The second is our inclination to be more attracted to experiences that make us feel good than to experiences that make us feel uncomfortable.
Despite our inclination to look for supporting evidence, it is usually much more informative to seek contradictory evidence. Confirming evidence often fails to discriminate among possibilities. To illustrate, in one study students were given the sequence of numbers 2, 4, 6 and told to determine the rule that generated the numbers. To check hypotheses, they could choose a possible next number and ask whether that number was consistent with the rule. Most students asked whether a next number "8" would be consistent with the rule. When told it was, they expressed confidence that the rule was, "The numbers increase by 2." Actually, the rule was, "Any increasing sequence." A better test would have been to check whether a next number incompatible with the hypothesis (e.g., "7") was consistent with the unknown rule.
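The logic of the 2-4-6 study can be sketched in a few lines of code. The two rules below are the ones described in the experiment; the specific test sequences are illustrative choices.

```python
# A minimal sketch of the 2-4-6 experiment, showing why a disconfirming
# test is more informative than a confirming one.

def hidden_rule(seq):
    """The experimenter's actual rule: any increasing sequence."""
    return all(a < b for a, b in zip(seq, seq[1:]))

def hypothesis(seq):
    """The typical subject's guess: the numbers increase by exactly 2."""
    return all(b - a == 2 for a, b in zip(seq, seq[1:]))

# Confirming test: a sequence the hypothesis predicts is valid.
# Both rules answer "yes", so the test cannot tell them apart.
confirming = [2, 4, 6, 8]
assert hypothesis(confirming) and hidden_rule(confirming)

# Disconfirming test: a sequence the hypothesis predicts is invalid.
# The hidden rule still answers "yes", so the hypothesis must be wrong.
disconfirming = [2, 4, 7]
assert hidden_rule(disconfirming) and not hypothesis(disconfirming)
print("'Increase by 2' is refuted; a broader hypothesis is needed.")
```

Only the second test generates evidence capable of overturning the subject's belief, which is exactly what the students failed to seek.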
Supporting evidence bias can cause us to perpetuate our pet beliefs. For example, if a manager believes people are basically untrustworthy, that manager will closely monitor their behavior. Every questionable act will increase suspicions. Meanwhile, employees will notice that their actions are being scrutinized. Closely watching employees will make it impossible to develop trust. Studies show that when people are placed in situations where authority figures expect them to cheat, more of them do, in fact, cheat. The behavior pattern reinforces itself to everyone's detriment.
Changing what we believe takes effort. When first encountered, data that conflict with our preconceptions are often attributed to error or to some other external factor. Only after repeated exposure to the conflicting information are we willing to make the effort to change our beliefs.
The best advice for avoiding supporting evidence bias is to deliberately seek out, and seriously weigh, evidence that could contradict your current view.
Framing Bias
The first step in making a choice is to frame the decision, but it is also where you can first go wrong. The way a problem is framed strongly influences the subsequent choices we make. People tend to accept the frame they are given; they seldom stop to reframe the problem in their own words. A frame that biases our reasoning causes us to make poor decisions.
Edward Russo and Paul Schoemaker provide a story to illustrate the power of framing. A Jesuit and a Franciscan were each seeking permission from their superiors to smoke while they prayed. The Jesuit asked whether it was acceptable for him to smoke while he prayed. His request was denied. The Franciscan asked the question a different way: "In moments of human weakness when I smoke, may I also pray?" That frame, of course, elicited the opposite response.
Whether outcomes are described as gains or losses influences people's choices. In one experiment, participants were asked to express their preferences among alternative programs impacting community jobs. They were told that due to a factory closing 600 jobs were about to be lost. However, if program A is adopted, 200 jobs will be saved. On the other hand, if program B is adopted, there is a 1/3 probability that 600 jobs will be saved and a 2/3 probability that none of the 600 jobs will be saved. Most people preferred program A. Another group was given a rephrasing of the choice. If program C is adopted, they were told, 400 people will lose their jobs. If program D is adopted, there is a 1/3 probability that nobody will lose their job and a 2/3 probability that 600 will lose their jobs. This group mainly favored program D.
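The four programs in the experiment can be written as lotteries over jobs saved, which makes the equivalence of the gain and loss frames easy to verify:

```python
# The four programs from the framing experiment, expressed as lotteries over
# jobs saved (out of the 600 at risk). The check shows all four describe the
# same expected outcome; only the wording differs.

def expected_jobs_saved(lottery):
    """lottery: list of (probability, jobs_saved) pairs."""
    return sum(p * saved for p, saved in lottery)

programs = {
    "A (gain frame, certain)": [(1.0, 200)],
    "B (gain frame, gamble)":  [(1/3, 600), (2/3, 0)],
    "C (loss frame, certain)": [(1.0, 600 - 400)],       # "400 will lose their jobs"
    "D (loss frame, gamble)":  [(1/3, 600), (2/3, 0)],   # "1/3 chance nobody loses"
}

for name, lottery in programs.items():
    print(f"{name}: expected jobs saved = {expected_jobs_saved(lottery):.0f}")
```

A and C are the identical certain outcome, and B and D the identical gamble, yet describing the outcomes as losses rather than gains flipped most people's preference from the certain option to the gamble.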
Similar effects occur in everyday decision making. For example, the typical New York taxi driver chooses how long to work each day based on a personal target for daily earnings. Failing to achieve the target is perceived as a loss. Thus, on slow days the driver works more hours in order to achieve the target and avoid the loss. On busy days the driver hits the target more quickly and quits early. However, it would be more efficient to work longer hours on busy days and quit early on slow days. The driver would end up with more income and fewer hours worked.
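A quick calculation shows why the target-based strategy is inefficient. The hourly rates and the daily earnings target below are invented for arithmetic only:

```python
# Hypothetical illustration of the taxi-driver result over one slow day and
# one busy day. All figures (rates, target) are invented for illustration.

SLOW_RATE, BUSY_RATE, TARGET = 15.0, 30.0, 180.0  # dollars/hour, dollars/day

# Target-based strategy: on each day, drive until the daily target is reached.
target_hours = [TARGET / SLOW_RATE, TARGET / BUSY_RATE]   # 12 h slow, 6 h busy
target_income = TARGET * 2

# Efficient strategy: shift hours toward the busy day and quit early when slow.
efficient_hours = [4.0, 12.0]
efficient_income = 4.0 * SLOW_RATE + 12.0 * BUSY_RATE

print(f"Target strategy:    {sum(target_hours):.0f} h, ${target_income:.0f}")
print(f"Efficient strategy: {sum(efficient_hours):.0f} h, ${efficient_income:.0f}")
```

With these numbers the target-chasing driver works 18 hours for $360, while reallocating hours to the busy day yields $420 in only 16 hours: more income for less work, exactly as the text claims.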
Framing can also distort resource allocation decisions through interactions with other biases. Partition bias describes the tendency for decision makers to prefer "fair" allocations wherein participants are perceived to be treated more or less equally. In one study, MBA students were asked to play the role of the capital allocation manager for an international product-oriented company. The company, they were told, has three main divisions: health care, beauty care, and home care. Each division has one or more geographic regions: health care in the USA only; beauty care in the USA and Europe; and home care in the USA, Europe, and Latin America. All students were given the same data on the differing performance of each division and its geographical subunits. However, half the students were told that the company practiced centralized decision making, meaning that the capital allocation manager should distribute funds directly to each of the six subunits, while the other half were told the company practiced decentralized decision making, meaning that the allocation manager should only choose the allocation across the three divisions.
Students who allocated resources across the three divisions directly gave health care an average allocation of 30% (roughly 100% divided by the number of divisions). Students who allocated resources across the six subunits gave health care an average allocation of 20% (roughly 100% divided by the number of subunits). It appears that the students paid little attention to the data regarding the relative performance of units and thought mainly about how to distribute resources equally (fairly). Partition bias can cause organizations to provide too much funding to business units that are performing poorly and too little to those performing well. Moreover, the effect of the bias depends on the way the choices are framed.
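The 1/n benchmarks behind this result are easy to check. The subunit counts come from the study's setup (health care has one region, beauty care two, home care three); the comparison with the observed averages is the arithmetic described above:

```python
# Arithmetic behind the partition-bias result: an "equal" split gives health
# care a very different share depending on how the choice is partitioned,
# independent of any performance data.

divisions = ["health care", "beauty care", "home care"]
regions = {"health care": 1, "beauty care": 2, "home care": 3}  # per division

# Frame 1: allocate equally across the 3 divisions.
share_by_division = 1 / len(divisions)                   # ~33% to health care

# Frame 2: allocate equally across the 6 division/region subunits.
n_subunits = sum(regions.values())
share_by_subunit = regions["health care"] / n_subunits   # ~17% to health care

print(f"Equal split over divisions: health care gets {share_by_division:.0%}")
print(f"Equal split over subunits:  health care gets {share_by_subunit:.0%}")
```

The observed averages (30% and 20%) sit close to these 33% and 17% benchmarks, consistent with students anchoring on equal division rather than on the performance data they were given.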
Framing bias impacts funding choices in well-recognized ways as well. Project proponents intuitively understand the advantage of focusing attention on upside potential rather than down-side risk. It sounds more positive to say that a new product launch has a "1-in-3 chance of succeeding" compared to the mathematically equivalent statement that it has a "67% chance of failing." If people are rational, they should make the same choice in every situation in which the important factors, outcomes, and outcome probabilities are identical. It shouldn't matter whether those outcomes are described as "gains" or "losses" or as "successes" or "failures." But the words establish different frames, and decisions may differ because of it.
Another example, described by Hammond, Keeney and Raiffa, involves automobile insurance laws voted on in New Jersey and Pennsylvania. Each state gave voters a new option: By accepting a limited right to sue they could lower their insurance premiums. New Jersey framed the initiative by automatically giving drivers the limited right to sue unless they specified otherwise. Pennsylvania framed it by giving drivers the full right to sue unless they specified otherwise. Both measures passed, and in both cases large majorities of drivers defaulted to the status quo. But, because of the way Pennsylvania framed the choice, drivers in that state failed to gain about $200 million in expected insurance savings.