Implementing project portfolio management

Bounded Awareness and Decision-Making Heuristics

In addition to the biases that degrade judgments, research shows that problems arise from the processes people use to make decisions, including bounded awareness and reliance on flawed decision-making heuristics.

Bounded Awareness

Bounded awareness, described by Max Bazerman and Dolly Chugh [6], refers to the well-documented observation that people routinely overlook important information during the decision-making process. One cause is our tendency to become overly focused. Focus limits awareness, and important information outside the range of focus can be missed. Thus, an airplane pilot attending to status monitors and controls can overlook the presence of another airplane on the runway. Cell phones can distract drivers and contribute to car accidents.

Psychologist Daniel Gilbert cites research suggesting that the human mind is inherently incapable of seeing certain things [7]. We can detect patterns in what we see, but because we are limited by the extent of our own imagination, we have a tough time imagining what is not there. Due to blind spots, we miss things, and the mind then replaces what we didn't see with what it expects to experience.

Iridium phone: unnoticed developments

George Day and Paul Schoemaker [8] observe that companies are often slow to recognize developments at the periphery of the business that ultimately turn out to be strategically important. A classic example is Motorola's Iridium project, initiated in the 1980s. The plan was to provide around-the-world mobile phone service for the business community through a network of 66 low-orbiting communications satellites. After 15 years and $5 billion of spending, service was launched in 1998. By then, however, the build-out of cellular phone service was well underway, and Motorola's $3,000, brick-sized phones couldn't compete. Had company executives been paying attention to market developments, they would likely have abandoned Iridium. Similar examples include the music industry's failure to foresee the threat of Napster-type services, Polaroid's bankruptcy following the rapid rise of digital photography, and Wal-Mart's surprise that social concerns would cause communities to resist the opening of new stores.

Studies show that unexpected information is particularly easy to miss. In one well-known experiment, participants watched a video of players passing a basketball and were asked to count the passes. During a roughly 5-second interval, a person in a gorilla costume walks through the game, thumping its chest. About half of viewers failed to notice the gorilla.

Changes that occur slowly are often not recognized until it is too late. This may explain, in part, business scandals. At Enron, for example, accounting irregularities were adopted slowly. Like the boiling frog, executives may have been lulled into a sense of comfort, unaware that they were in hot water [9].

The Challenger space shuttle disaster provides an example of a situation in which important information existed but was not sought out. On the day of the launch, decision makers argued about whether the low temperature would be a problem for the shuttle's O-rings. They examined the seven prior launches that had experienced some sort of O-ring problem, but no pattern linking the problems to temperature emerged. After the fact, data were analyzed for all 24 previous launches. The expanded data, which included the problem-free launches flown in warmer conditions, indicated that the Challenger had more than a 99% chance of malfunctioning [10].
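
The statistical trap here is that a failures-only sample removes the comparison needed to reveal a pattern. The sketch below uses illustrative temperatures, not the actual flight record, to show how the incident launches alone can look temperature-neutral while the full data set exposes the cold-weather risk:

    # Illustrative launch temperatures in degrees F (not the actual flight record)
    incident_temps = [53, 57, 63, 70, 70, 75, 75]    # launches with O-ring problems
    clean_temps = [66, 67, 67, 68, 69, 70, 72, 73,   # launches without problems
                   75, 76, 76, 78, 79, 80, 81]

    # Incidents alone: temperatures span cold and warm days, so no pattern stands out
    avg = sum(incident_temps) / len(incident_temps)
    print(f"average temperature of incident launches: {avg:.0f} F")   # ~66 F

    # Adding the problem-free launches supplies the missing comparison: in this
    # illustration, every launch below 66 F had an O-ring problem
    cold = [t for t in incident_temps + clean_temps if t < 66]
    cold_incidents = [t for t in incident_temps if t < 66]
    print(f"{len(cold_incidents)} of {len(cold)} launches below 66 F had problems")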

Vioxx: was risk ignored?

Sometimes information is known but not acted upon. For example, the drug company Merck did not withdraw its pain relief drug Vioxx from the market until 2004. By that time, it has been estimated, the drug may have been associated with as many as 25,000 heart attacks and strokes [11]. Evidence of the drug's risks was reported in a medical journal as early as 2000. The delay in taking action cost Merck dearly; over $200 million has been awarded in claims. Why didn't Merck withdraw the drug sooner? The warning was available, but it was ignored.

Group dynamics hinder information sharing. Teams may discuss information, but the discussion usually conveys information that is widely known rather than information known only to a single team member. Psychologists attribute this to reinforcement: sharing something that others already know produces positive feedback in the form of their agreement, while introducing something that only you know typically doesn't: "People just sit there, and it is unclear what happens next."

Organizational structures can institutionalize bounded awareness. Organizational silos and multiple layers of management hinder information flow. Information doesn't get transmitted well across silos, and management layers can filter out messages management doesn't want to hear.

Advice:

  1. Stay alert to peripheral threats and opportunities. Look for unknown unknowns. For example, watch for regulatory, technological, and market changes and trends, and be prepared to modify your strategies.
  2. Instead of using the information that happens to be in the room at the time, identify what information is needed to make the decision and then get it.
  3. In meetings, assume everyone has unique information and ask for it. Meeting agendas should request individual reporting.
  4. Get outsiders' perspectives. Outsiders may help you to see critical information that you could easily overlook when immersed in day-to-day activities.
  5. Break down organizational silos and other barriers to information flow.

Decision-Making Heuristics

Just as people use heuristics to make judgments, executives (and the rest of us) often rely on simplified reasoning to make difficult decisions. Kahneman [12] argues that the mind can be considered to have two processing components, System 1 and System 2. System 1 makes rapid intuitive decisions based on associative memory, images, and emotional reactions. System 2 monitors the output of System 1 and overrides it when the result conflicts with logic, probability, or some other decision-making rule. The trouble is that System 2 is lazy — we must make a special effort to pay attention, and such focus consumes time and energy. Therefore, we don't always apply as much reasoning effort as we should.

According to decision theorist Herbert Simon, decision complexity coupled with limited time, laziness, and inadequate mental computational power reduces decision makers to a state of "bounded rationality." Decision-making shortcuts can save time and ease the psychological burden of decision making. However, as with judgmental heuristics, there are circumstances in which these shortcuts result in bad choices.

Selective Focus

If a decision involves many considerations, a natural response is to simplify by focusing on a subset of what matters. However, the simplification can mislead. As an illustration, an experiment was conducted on the gambling floor of a Las Vegas casino. Subjects were given chips worth 25 cents each and shown two alternative gambles:

Bet A: 11/12 chance to win 12 chips, 1/12 chance to lose 24 chips (a bet with a high chance of payoff)

Bet B: 2/12 chance to win 79 chips, 10/12 chance to lose 5 chips (a bet with a high potential payoff)

Subjects were asked to indicate which gamble they would rather play. They were then asked to assume they owned a ticket to play each bet and to state the lowest price for which they would sell each ticket. Subjects often chose Bet A, yet stated (87% of the time) a higher selling price for Bet B. Researchers explained the inconsistency by concluding that when asked to choose, subjects focus on the odds of winning, but when asked to set a selling price, they focus on the winning payoff [13].
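
What makes the reversal especially striking is that, at the stated odds, the two bets have essentially the same expected value, so arithmetic alone cannot justify either preference. A quick check:

    # Expected values of the two bets, in chips, at the stated odds
    ev_a = (11/12) * 12 + (1/12) * (-24)   # 11 - 2 = 9.0 chips
    ev_b = (2/12) * 79 + (10/12) * (-5)    # about 13.2 - 4.2 = 9.0 chips

With the expected values equal, the pattern of responses can only reflect what subjects attend to: probabilities when choosing, payoffs when pricing.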

Zero defects: oversimplification

Oversimplifying likewise creates problems for business decisions. The "zero defects" program popular with industrial firms in the 1960s provides an example. The program was based on the idea that management should use all possible means to get a job done right the first time. However, once the program was implemented, many firms discovered that they could not live with the consequences of making quality the primary goal. Quality rose, but productivity declined, production deadlines were missed, and the amount of spoiled and scrapped goods increased. A high percentage of firms dropped the program.

Intuition

The quickest way to make decisions is by intuition. Let the subconscious mind decide and make the choice that feels right. Intuitive choices are based on emotions without deliberate reasoning. Unfortunately, our emotions can persuade us to make choices that aren't in our best interests. Failing to get positive results from a project, for example, might naturally cause some project managers to become angry. Is the manager's anger a factor in the decision to keep plugging away rather than admit failure?

Poor safety choices

An area where researchers think emotions routinely get in the way of sound choices is personal health and safety. In an experiment conducted by Andrew Gershoff and Jonathan Koehler [14], people were asked to choose between cars with two different airbag systems. In car one, the airbag reduced the chance of death in a crash to 2%. The second car did even better; the risk of death was 1%. However, the second car used a more powerful airbag, and there was a very small risk, 0.01%, that the driver would be killed by the force of the airbag deploying. Although the overall risk with car one was very nearly twice as high as with car two, most people preferred car one. Why did the researchers think emotion was the key factor? They found that people were more likely to make the lower-risk choice when making safety decisions for others rather than for themselves. Also, participants who scored high on a personality test measuring intuitive thinking were the most likely to avoid safety products with a small potential risk of an adverse outcome caused by a malfunction.
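
Comparing the stated risks directly shows how lopsided the preference was:

    # Overall fatality risk in a crash, using the figures described above
    risk_car_one = 0.02               # 2% chance of death
    risk_car_two = 0.01 + 0.0001      # 1% crash risk plus 0.01% deployment risk
    print(risk_car_two / risk_car_one)   # 0.505: car two is about half as risky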

Reasoning by Analogy

Copying Dell's business model

Reasoning by analogy means choosing a course of action based on an approach found effective in an analogous situation. For example, the business model for Toys R Us was supposedly developed based on an analogy with supermarkets; that is, let the customer see toys arranged like food down supermarket aisles. Similarly, Intel reports that it moved into low-end chips to avoid U.S. Steel's well-studied mistake of not pursuing the market in low-end concrete reinforcing bars.

Dangers arise when analogies are built on surface similarities only. For example, in the late 1990s, numerous companies, including Ford, tried copying Dell's strategy of "virtual integration" with suppliers. Dell produced build-to-order desktop computers, and ongoing declines in microprocessor prices were what made supplier integration so advantageous: Dell saved money because it could delay purchasing a processor until just before shipping the computer. The analogy fails in industries (like autos) that aren't experiencing rapidly declining supply costs. Dell's strategy itself became much less effective as PC configurations standardized and fewer customers sought customized hardware.

Reasoning by Metaphor

Reasoning by metaphor is similar to reasoning by analogy, except that the comparison is to some easily visualized physical image. For example, when Steve Jobs and Steve Wozniak were designing the Macintosh computer, they adopted the metaphor of making the interface perform like a desktop, where users could see folders and move them and their contents around. The familiar image is responsible for the enduring success of this ubiquitous user interface.

Domino Theory

One of the most powerful metaphors shaping U.S. national defense policy during the Cold War was the domino theory. The metaphor portrays countries facing communism as a line of dominoes: allow one country to fall under the influence of communism, and the others will also fall, one after the other. Thus, every domino was critical, and none could be allowed to topple no matter what the costs. The domino theory was used from the 1950s to the 1980s by successive U.S. presidents as an argument for propping up governments claiming to be anti-communist. Yet many have questioned the theory's validity. The administration of John F. Kennedy cited the domino theory as a reason for supporting the costly Vietnam War, yet when the U.S. finally left Vietnam, after some 58,000 Americans had died, communism did not then take hold in Thailand, Indonesia, or the other large Southeast Asian countries.

Rules of Thumb

Popular business rules of thumb include, "the processing power of computer chips doubles every 18 months," "a new business needs to have 4 consecutive quarters of profits before launching a public offering," and "a public health risk is acceptable if the probability of fatality is less than one-chance-in-a-million."

The problem with rules of thumb is that they ignore specifics and can become outdated as circumstances change. For example, if you used a rule of thumb that a fast-food restaurant is worth 2 to 3 times its annual revenues, you might end up paying too much for a restaurant with declining sales, old equipment, and health code violations. Mortgage lenders who stuck with the old rule of thumb that borrowers can spend no more than 25% of income on mortgage payments lost business in areas of rapid price appreciation to more aggressive lenders.
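
To see how much the specifics can matter, consider a sketch with purely hypothetical numbers for the restaurant's sales decline and repair bill:

    # Purely illustrative numbers: rule-of-thumb price vs. adjusting for specifics
    revenue = 800_000
    rule_price = 2.5 * revenue                  # midpoint of the 2-3x rule: $2,000,000

    # Suppose sales are declining 10% per year and the buyer faces $150,000 of
    # overdue equipment replacement
    revenue_3_years_out = revenue * 0.9 ** 3    # about $583,000
    adjusted_price = 2.5 * revenue_3_years_out - 150_000   # about $1,308,000

    print(rule_price - adjusted_price)          # the rule overprices by ~$692,000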

Seeking Means Rather Than Ends

If it is difficult to determine the choice that best achieves the end goal, choosing based on some presumed means to the end can seem like a reasonable approach. Means-based decision making is especially likely if the end goal is difficult to measure. For example, organizations often fund learning initiatives based on the goal of improving the productivity of their workers. Because it is difficult to estimate the impact of a training course on people's productivity, many corporate learning departments simply report the number of hours employees spend in the classroom. A program that manages to get 80% of employees to spend 40 hours in class isn't necessarily providing the best service to the company.

Seeking means rather than ends leads to decision errors when the selected means is only one characteristic or influencer of the desired end, not the cause of it. With regard to choosing projects, the end goal is to produce the greatest possible value for the organization, subject to the constraint on available resources. Because value is difficult to measure (see Part 3), a popular heuristic is to prioritize projects based on strategic alignment: projects are ranked based on how well they fit elements of stated corporate strategy. Translating strategy to action is important; however, being well linked to an element of strategy doesn't ensure that a project will be successful or that it will generate the most value for the organization.
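
To make the contrast concrete, here is a minimal sketch, with hypothetical project data and a deliberately simple greedy heuristic (not a method endorsed in this series), comparing ranking by alignment score to ranking by value per unit of cost under a budget constraint:

    # Hypothetical projects: (name, cost, estimated value, alignment score)
    projects = [
        ("A", 400, 700, 0.90),
        ("B", 300, 650, 0.50),
        ("C", 250, 300, 0.95),
        ("D", 150, 400, 0.60),
    ]
    budget = 700

    def select(projects, budget, key):
        """Fund projects in descending order of `key` until the budget runs out."""
        chosen, spent = [], 0
        for name, cost, value, align in sorted(projects, key=key, reverse=True):
            if spent + cost <= budget:
                chosen.append(name)
                spent += cost
        return chosen

    # Ranking by alignment funds the best-aligned projects regardless of value
    print(select(projects, budget, key=lambda p: p[3]))          # ['C', 'A']
    # Ranking by value per unit cost pursues the end goal directly
    print(select(projects, budget, key=lambda p: p[2] / p[1]))   # ['D', 'B', 'C']

With these made-up numbers, ranking by alignment funds C and A for 1,000 units of value, while ranking by value per unit cost funds D, B, and C for 1,350 units within the same budget.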

Incrementalism

Incrementalism is an approach to decision making wherein any large change is made slowly, through a series of small adjustments. The idea is to be cautious, to experiment and learn. For example, an organization might avoid a high-risk, big investment decision by making a series of small investments. The first step might be to conduct a low-cost, low-risk pilot project. If the pilot appears successful, a bolder move might then be taken, or an expanded pilot might come next. Home Depot, for example, reportedly uses this approach: the company's policy is to test proposed changes to retail strategy in individual stores before making any large-scale changes.

A major advantage of incrementalism is that it preserves flexibility while providing useful information. However, there are two potential problems. One is that experimenting takes time, and the window of opportunity may close before the organization is prepared to fully act. Another problem relates to sunk cost bias. It may be hard to abandon a plan as costs accumulate because decision makers don't want to admit that earlier investments were "wasted."
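
The tradeoff can be framed as a simple expected-value comparison. The numbers below are purely illustrative, and the pilot is idealized as perfectly predicting success or failure:

    # Purely illustrative: commit $10M now vs. spend $1M on a pilot first
    p_success = 0.5
    payoff = 30_000_000

    invest_now = p_success * payoff - 10_000_000   # expected value: $5M

    # Idealized pilot that perfectly reveals success or failure; invest only on
    # good news. The delay shrinks the payoff by 20% (the window partly closes).
    pilot_first = p_success * (0.8 * payoff - 10_000_000) - 1_000_000   # $6M

    print(invest_now, pilot_first)

Here the pilot wins despite the cost of delay; if waiting cut the payoff in half instead, committing immediately would be the better bet. The value of what will be learned must be weighed against the costs of delay.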

Over-Reliance on Experts

Because decisions often involve complex technical considerations, experts frequently play a necessary role in decision making. Experts have credibility and command respect from others, both inside and outside the organization.

There can be problems, however, with relying on experts. Experts can easily dominate discussion and discourage constructive dissent. Over-reliance on experts can lead to bad choices. Experts may be biased toward solutions that rely heavily on technology, and they may not fully understand important non-technical considerations relevant to the decision. The Bush administration, for example, was criticized for relying too heavily on experts in the Central Intelligence Agency in its decision to take down the government of Iraq.

Due to the risks associated with using decision-making heuristics, you should:

  1. Be careful using solutions based on analogies. Look for differences in the respective situations and ask how those differences might cause the analogy to fail.
  2. Don't make a decision based on a rule of thumb without thinking carefully about whether the rule applies in the current situation. Have things evolved such that the rule of thumb no longer makes sense?
  3. Think about whether your decision logic is confusing means with ends. Will what you intend to do guarantee that you get what you want? Or, are you merely choosing something that is only associated with (does not cause) the desired result?
  4. Watch out for the natural attractiveness of incremental solutions and the subsequent power of sunk cost bias. Is the incremental approach popular solely due to reluctance to upset the status quo? Is what will be learned that valuable, compared to the incremental costs including the costs of delay?
  5. Be sure to get the judgments of experts, but don't allow them to dominate the decision-making process. Institute controls that ensure the objectivity of expert opinions, and allow others to play a role and share responsibility for choices.
  6. If the stakes are high and the probability of success must be maximized, use formal decision analysis rather than intuition or decision-making heuristics.

Decision Errors

Figure 3 shows the three essential steps to benefiting from decisions: (1) recognize the opportunity to choose, (2) make a good choice, and (3) implement the choice.



Figure 3:   Essential steps for benefiting from decisions.



Missing the opportunity to make a decision means that the "do nothing" or status quo option is chosen by default. Warren Buffett has reportedly said that he considers his worst investment failures to be missed opportunities to take action: "The things I could have done but didn't do have cost us billions of dollars" [15]. Taking advantage of an opportunity to decide doesn't, of course, ensure success. There are many reasons why a conscious business choice may be poor, including misinterpreting the problem to be solved (poor framing), not being clear about objectives, failing to identify good alternatives, and using heuristics or unaided intuition rather than reasoned logic. Finally, there is nothing to be gained from making a good choice if it is not implemented. A study by an Ohio State University researcher found that about one-third of nearly 400 business decisions studied were never implemented by the organizations involved [16].

It is often assumed that a bad decision will be evident to us after the fact. However, this is not necessarily the case. In one experiment, researchers used a card trick. They showed participants pairs of pictures of female faces and asked them to choose the face they thought more attractive. Later, they switched some of the cards and asked the participants to explain their choices. In the majority of cases where the cards were switched, participants failed to recognize the switch—they simply justified the choices they thought they had made [17].

To benefit from the power to decide, be sure to:

  1. Be alert to the opportunity or need to make decisions.
  2. Consciously frame the decision to be made.
  3. Be clear about decision objectives.
  4. Identify distinct and creative alternatives.
  5. Use analysis and logic to make a good choice.
  6. Follow through and implement what you decide.