
Estimating and Forecasting Biases

People are notoriously poor at estimating and forecasting. They interpret statistical correlation as implying cause-and-effect. They tend to naively extrapolate trends that they perceive in charts. They ignore or don't correctly use probabilities when making choices. They draw inferences from samples that are too small or unrepresentative. They routinely overestimate their abilities and underestimate the time and effort required to complete difficult tasks. Estimating and forecasting biases are a special class of biases important to project-selection decision making.

Misestimating Likelihoods

Uncertain situations are particularly troublesome. Studies show that people make systematic errors when estimating how likely uncertain events are. As shown in Figure 2, likely outcomes (those with probabilities above about 40%) are typically estimated to be less probable than they really are, while quite unlikely outcomes are typically estimated to be more probable than they are. Furthermore, people often behave as if extremely unlikely, but still possible, outcomes have no chance whatsoever of occurring.


Figure 2: People systematically over- or underestimate probabilities (probability estimates versus actuals).
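
The S-shaped pattern in Figure 2 resembles the probability weighting function from prospect theory. As a rough illustration (not the actual curve from the figure), the sketch below uses the Tversky-Kahneman weighting function; the gamma value is an assumed, commonly cited fit, not a number from this article:

```python
# Illustrative only: the Tversky-Kahneman (1992) probability weighting
# function, a standard model of the over/underestimation pattern shown
# in Figure 2. gamma = 0.61 is a commonly cited fitted value, assumed here.

def perceived_probability(p: float, gamma: float = 0.61) -> float:
    """Map a true probability p to its typical perceived weight."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

for p in (0.01, 0.10, 0.40, 0.75, 0.99):
    w = perceived_probability(p)
    print(f"true p = {p:.2f} -> perceived ~ {w:.2f} "
          f"({'over' if w > p else 'under'}estimated)")
```

Running this shows small probabilities inflated (0.01 perceived as roughly 0.06) and large ones deflated (0.99 perceived as roughly 0.91), matching the pattern described above.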


In addition to systematically misestimating low and high probabilities, studies show that people consistently misestimate the likelihoods of events with certain characteristics. For example, availability bias refers to the tendency to overestimate the frequency of events that are easy to recall (available in memory), such as spectacular or sensational events. Recency bias relates to the tendency to attribute more salience to recent stimuli or observations, which can lead to the overestimation of the likelihood that a rare event that occurred recently will soon reoccur. Illusion of control bias refers to the tendency of people to believe they can control the probabilities of events when in fact they cannot. For example, in the game of craps, a player may throw the dice softly for low numbers and hard for high numbers. Clustering or correlation bias refers to the tendency to see patterns or correlations that don't really exist, for example, believing that the geographic location of a supplier is related to the quality of its products.

Overconfidence

Overconfidence has been called "perhaps the most robust finding in the psychology of judgment" [4]. We believe we are more accurate at making estimates than we actually are. To illustrate, I've often repeated a well-known demonstration of what I call the "2/50 rule." Participants are asked to provide confidence intervals within which they are "98% sure" that various uncertain quantities lie. The quantities are selected from an almanac, for example: "What's the elevation of the highest mountain in Texas? Give me low and high values within which you are 98% sure that the actual value falls." When the true values are checked, up to 50% of the time they fall outside the specified confidence intervals. If people were not overconfident, values outside their 98% confidence intervals would occur just 2% of the time.
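
If you want to run this demonstration yourself, scoring the results is simple. The sketch below tallies the "surprise rate" (the fraction of true values falling outside the stated 98% intervals); the interval data shown are hypothetical placeholders, not results from the exercise described above:

```python
# Score a "2/50 rule" exercise: what fraction of true values fall outside
# participants' 98% confidence intervals? (Should be ~2% if well calibrated.)

def surprise_rate(answers):
    """answers: list of (low, high, true_value) tuples."""
    misses = sum(1 for low, high, truth in answers if not (low <= truth <= high))
    return misses / len(answers)

# Hypothetical responses, for illustration only.
answers = [
    (10, 50, 62),      # miss: true value above the stated interval
    (100, 200, 150),   # hit
    (1, 5, 0.4),       # miss: true value below the stated interval
    (30, 90, 75),      # hit
]
print(f"Nominal miss rate: 2%; observed: {surprise_rate(answers):.0%}")
```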

Titanic: the ship that couldn't sink

Overconfidence is also demonstrated by the many examples of people expressing confidence about things that are subsequently proven wrong. For example, British physicist Lord Kelvin said, "Heavier-than-air flying machines are impossible." Thomas Watson, chairman of IBM, reportedly said, "I think there is a world market for about five computers." The Titanic was the ship that couldn't sink. Likewise, surveys show that most drivers report that they are better than average, and most companies believe their brands to be of "above-average" value.

Overoptimism

Overoptimism describes the human tendency to believe that things are more likely to turn out well than poorly. Although generally regarded as a positive trait, optimism has been blamed for a variety of problems relevant to corporate and personal decision making, including overestimating the likelihood of positive events and underestimating the likelihood of negative events. Overoptimism is cited as a reason that project managers so frequently experience project overruns, performance shortfalls, and completion delays. Economists believe the bias contributes to the creation of economic bubbles; during periods of rising prices, investors are overoptimistic about their investments. It has been suggested that in many cases of corporate disclosure fraud, the offending officers and directors were not consciously lying but instead were expressing honestly held but irrationally optimistic views of their firms' condition and prospects. A related bias is wishful thinking, a tendency to focus on outcomes that are pleasing to imagine rather than on what is objectively most likely.

Halo Effect

The halo effect refers to a bias wherein the perception of one, typically overarching, trait (e.g., a person is likeable) bleeds over into expectations regarding specific traits (e.g., the person is intelligent). The effect was first demonstrated by a study of how military officers rated their soldiers in various categories. The officers usually rated their men as either good or bad across the board; in other words, there was very high correlation across all positive and negative ratings. Few people were judged to perform well in some areas and badly in others.

The perceptions of Hollywood stars demonstrate this bias. Because they are typically attractive and likeable, we naturally assume they are also intelligent and friendly and display good judgment, though there is often ample evidence to the contrary. It's as if we cannot easily separate our evaluations with respect to different criteria. The explanation may have something to do with our desire to avoid cognitive dissonance: if we evaluate something as good in one category and bad in another, expressing an overall evaluation becomes more difficult.

Apple's halo effect

The halo effect has considerable influence in business, and not just in interviewing and hiring. In the automotive industry, the term "halo vehicle" refers to a limited-production model with special features that is intended to help the manufacturer sell other models in the category. Similarly, Apple's success has been attributed in part to a halo effect that Steve Jobs cast over Apple's products. Jobs's brilliant design sense made Apple's products, and everything else about the company, including its stock price, seem attractive. Conversely, investors who see Apple as highly profitable tend to believe its products are high-quality and its advertising honest.

Anchoring

A bias related to overconfidence is anchoring. Initial impressions become reference points that anchor subsequent judgments. For example, if a salesperson forecasts next year's sales by looking at sales in the previous year, the old numbers become anchors, which the salesperson then adjusts based on other factors. The adjustment is usually insufficient. Likewise, if, when estimating an uncertainty, you begin by thinking of a middle or most likely value and then consider potential deviations from that value, you will underestimate your uncertainty. Anchoring and adjustment leads to confidence ranges that are too narrow.

Minimum payment anchor

Anchors can be set by any mechanism that creates a reference point. For example, in one study, groups of consumers were shown credit card bills that either did or did not contain minimum payment requirements and were asked how they would pay off the debt given their real-life finances. Among those who indicated they would pay over time, payments were 70% lower for the group who saw the minimum payment information than for those who did not. Apparently, the minimum payment works as an anchor, causing the card holder to pay a smaller amount than would have been paid in its absence.

Recent events are easy to recall and often become anchors (recency bias, mentioned above). Thus, investors tend to believe that the direction a stock is currently moving is the direction it will continue to move (anchoring thereby contributes to stock price volatility, since it prolongs up- and downswings). Knowing that recent job performance has a more pronounced effect on impressions, workers naturally work harder to demonstrate good performance in the three months just prior to reviews than in the previous nine months.

Jaws anchor

Vivid events can become strong anchors. When the movie "Jaws" opened at theaters across the U.S., the number of swimmers visiting California beaches dropped significantly. Sharks do inhabit the California coast, but the risk of a swimmer actually being attacked by a shark is, for most people, much less than the risk of dying in a car accident while driving to the beach. Studies show that people overestimate the likelihood of dying from dramatic, well-publicized risk events, such as botulism, tornadoes, auto accidents, homicides, and cancer, but underestimate the risks of unremarkable or less dramatic events, such as asthma, diabetes, and emphysema.

Project proponents can use anchoring to help win approval for their projects. Assuming that organizational decision makers use the common approach of considering and making decisions for each proposal in turn, proponents should seek to have their preferred projects placed on the agenda immediately following a request to approve a much larger project. The more expensive project proposal will create an anchor against which the proponent's project will seem more reasonable.

Motivational Biases

Motivational biases can affect estimates and forecasts whenever estimators believe that the quantities expressed may affect them personally. For example, managers may have an incentive to overstate productivity forecasts to reduce the risk that the capital dollars allocated to their business units will be reduced. More subtle biases also affect estimates provided by managers, and the effect can depend on the individual. For example, project managers who are anxious to be perceived as successful may pad cost and schedule estimates to reduce the likelihood that they fail to achieve expectations. On the other hand, project managers who want (consciously or unconsciously) to be regarded as high performers may underestimate the required work and set unrealistic goals. Most managers are overly optimistic. When companies collect data on the financial returns from projects, they almost always find that actual returns are well below forecasted returns.

The so-called principal-agent problem, which arises whenever someone engages another to act on his or her behalf, invariably creates the potential for motivational biases due to the differing incentives and preferences of the parties involved. As an example, the most senior executives of a firm are often subject to motivational biases when dealing with very large projects. The CEO and other "C-level" executives, as agents for shareholders, have a duty to truthfully disclose to the board of directors the costs, benefits, and risks of very large projects. However, most C-level executives earn their full compensation when projects succeed but only share responsibility for losses or underperformance. Thus, there is a built-in incentive for senior executives to understate project risks and costs while overstating benefits. The time delay between the decision to commit to a large project and the point when the project outcome becomes known also contributes to the bias; executives know that they may be recruited by other companies after a landmark project is approved but before it is complete. Likewise, contractors have an incentive to provide information that pleases top executives. In order to win contracts, bidders may deliberately underestimate costs. They know that recontracting based on changing scope is often possible and that, unless the contract is at a fixed price, delays will typically be tolerated. Again, natural incentives tend to emphasize the positives and play down or hide the negatives.

Motivational biases can also cause people to minimize the uncertainty associated with the estimates that they provide. I have found, for example, that managers sometimes become defensive when asked to estimate the potential risks associated with a proposed project, even in environments where it is well known that projects can fail. Perhaps they feel that admitting to downside potential would suggest deficient risk management practices or the fallibility of their project management skills. Experts likewise face disincentives to fully acknowledge uncertainty. They may think that someone in their position is expected to know, with high certainty, what is likely to happen within their domains of expertise. We do, in fact, appear to place greater value on the opinions of highly confident individuals. Studies show that consultants and others who sell advice are able to charge more when they express great confidence in their opinions, even when their forecasts are more often proven wrong [5].

Poorly structured incentives, obviously, can distort decisions as well as estimates. For example, any company that rewards good outcomes rather than good decisions motivates a project manager to escalate commitment to a failing project, since the slim chance of turning the project around is better from the manager's perspective than the certainty of project failure.

Base-Rate and Small Sample Bias

Base-rate bias refers to the tendency people have to ignore relevant statistical data when estimating likelihoods. Small-sample bias is the tendency to draw conclusions from a small sample of observations despite the fact that random variation means such samples have little real predictive power. For example, suppose you are planning to buy a car. You read a Consumer Reports review that ranks a model highly based on uniformly positive assessments from a survey of 1,000 owners. You mention the article to a friend who recently bought this model. She swears that it is the worst car she has ever owned and that she will never buy another as long as she lives. Your friend's experience has increased the statistical sample from 1,000 to 1,001, but your faith in the Consumer Reports study has been destroyed. People are moved more by one powerful, vivid example than by a mass of statistics.
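
The statistics confirm how little the anecdote should matter. Assuming, for illustration, that 90% of the 1,000 surveyed owners rated the car positively (a made-up figure, not one from the example above), one additional dissenting owner barely moves the estimate:

```python
import math

def standard_error(p: float, n: int) -> float:
    """Standard error of a sample proportion."""
    return math.sqrt(p * (1 - p) / n)

p = 0.90  # assumed share of satisfied owners, for illustration
for n in (1000, 1001):
    print(f"n = {n}: estimate ~ {p:.3f}, standard error = {standard_error(p, n):.4f}")
# Adding one negative report shifts the satisfaction estimate from 0.900
# to roughly 0.899 -- statistically negligible, however vivid the story.
```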

The tendency to underestimate the effort needed to complete a complex task has been attributed to base-rate bias. Instead of basing estimates mostly on the amount of time it has taken to do previous, similar projects, managers typically take an "internal view" of the current project, thinking only about the tasks and scenarios leading to successful completion. This almost always leads to overly optimistic forecasts. Under this error, known as the planning fallacy, managers ignore past experience, envision success scenarios, and overlook the potential for mistakes and delays. To compensate, one manager I know says he always multiplies the time his programmers say will be required to complete new software by a factor of two, because "that's what usually happens."

Variations of base-rate bias are often important in business environments, including the tendency people have to be insufficiently conservative (or "regressive") when making predictions based on events that are partially random. Investors, for example, often expect a company that has just experienced record profits to earn as much or more the next year, even if there have been no changes in products or other elements of the business that would explain the recent, better-than-anticipated performance.

Conjunctive Bias

Trip insurance illustrates conjunctive bias

Conjunctive events bias refers to the tendency to judge a conjunction of events as more likely than its individual components. A conjunction (a combination of two or more events occurring together) cannot be more probable than any one of its components. Yet stringing together component events can create a compelling vision that appears more likely. For example, the possibility that you may die during a vacation (due to any cause) must be more likely than the possibility that you will die on vacation as a result of a terrorist attack. Yet one study showed that people are willing to pay more for an insurance policy that awards benefits in the event of death due to terrorism than for one that awards benefits in the event of death due to any cause.

Conjunctive events bias plays an important role in project planning. Any complex project has the character of a conjunctive event, since each component part needs to go as planned for the whole to succeed. Even if each individual part is very likely to succeed, the overall probability of success can be low, as the arithmetic below illustrates. Thus, conjunctive events bias can contribute to time and cost overruns in real projects.
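
The arithmetic is easy to check. With an assumed ten tasks, each given an illustrative 95% chance of going as planned (and treating the tasks as independent), overall success is far from assured:

```python
# Conjunctive events: overall success requires every task to succeed.
# Task count, per-task probability, and independence are all assumptions
# made for this illustration.

n_tasks = 10
p_each = 0.95

p_overall = p_each ** n_tasks
print(f"{n_tasks} tasks at {p_each:.0%} each -> {p_overall:.0%} overall")
# Prints: 10 tasks at 95% each -> 60% overall
```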

Lack of Feedback

Forecasting errors are often attributed to the fact that most people don't get good feedback about the accuracy of their forecasts. We are all fairly good at estimating physical characteristics like volume, distance, and weight because we frequently make such estimates and get feedback about our accuracy. We are less experienced (and get less verification) when making forecasts for things that are more uncertain. Weather forecasters and bookmakers have opportunities and incentives to maintain records of their judgments and see when they have been inaccurate. Studies show that they do well in estimating the accuracy of their predictions.
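
One common way to keep the kind of records that weather forecasters and bookmakers rely on is to score probability forecasts against outcomes, for example with the Brier score (the mean squared error between stated probabilities and what actually happened). A minimal sketch, using made-up data:

```python
def brier_score(forecasts, outcomes):
    """Mean squared error between forecast probabilities and 0/1 outcomes.
    0.0 is perfect; an always-say-50% forecaster scores 0.25."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# Hypothetical forecast record: "70% chance of rain" style calls vs. outcomes.
forecasts = [0.9, 0.7, 0.3, 0.1]
outcomes = [1, 1, 0, 1]  # 1 = the event occurred
print(f"Brier score: {brier_score(forecasts, outcomes):.3f}")
```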

Advice for improving forecasts and estimates includes:

  1. Think about the problem on your own before consulting others and getting anchored to their biases.
  2. Be open-minded and receptive. Seek opinions from multiple and diverse sources. Tell them as little as possible about your own ideas beforehand.
  3. Tell people you want "realistic" estimates. Ask about implicit assumptions.
  4. Expect and acknowledge that contractors and project champions will be overly optimistic. Apply adjustments to what you are told based on historical performance and consideration of the motivational biases of those involved.
  5. Consciously evaluate options using multiple criteria that reflect non-overlapping objectives and perspectives, and try not to let your overall intuitive impression influence your component assessments.
  6. Look for anchors that might be biasing estimates. Are the numbers unsubstantiated, extrapolated from history without sufficient adjustment?
  7. Encourage the estimation of a range of possibilities instead of just a point estimate. Ask for low and high values first (rather than for a middle or best-guess value) so as to create extreme-valued anchors that counteract the tendency toward overconfidence around a middle value.
  8. Look for overconfidence and overoptimism. Have the team build a case based on an outside view. Consider war gaming.
  9. Require project proponents to identify reasons why what they propose might fail. Conduct a "premortem": imagine that at some future point in time it is apparent that the project turned out horribly, then write a history of how it went wrong and why.
  10. Give people who provide you with estimates knowledge of results as quickly as possible.
  11. Use network diagrams, scenario building, and similar techniques to identify and define the sequencing of component activities. A major value of such techniques is that they help ensure that necessary but less visible activities, such as procurement and training, aren't overlooked.
  12. Routinely use logic to check estimates. As a simple example, if you have 2 months to complete a project estimated to require 2,000 hours, verify that you have a sufficient number of FTEs available (see the sketch following this list).
  13. Consider establishing incentives to counteract natural motivational biases, for example, financial or non-financial rewards for planners whose estimates prove accurate and penalties for seriously misleading forecasts.
  14. Don't encourage the winning contractor to be the one that most underestimates the true cost; make bidders bear financial penalties for delays and cost overruns.
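
As an illustration of the logic check in item 12, the sketch below works out the required staffing. The hours-per-FTE-month figure is an assumption; adjust it for your organization:

```python
# Logic check from item 12: can the available staff deliver the hours?
required_hours = 2000
months_available = 2
hours_per_fte_month = 160  # assumed: ~40 hours/week * 4 weeks

ftes_needed = required_hours / (months_available * hours_per_fte_month)
print(f"FTEs needed: {ftes_needed:.2f}")  # 6.25 -- are that many available?
```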