
A Priority System for Allocating an O&M Budget

Elsie Myers Martin

Lee Merkhofer

Myers Martin Consulting, LLC

Lee Merkhofer Consulting

The primary author of this paper, Ms. Elsie Martin, worked at the client organization, Northern States Power (NSP), during the period when the described priority system was developed and used. She adds an insider's perspective and provides “lessons learned” based on interviews with company executives and staff.

The data contained herein has been approved for public release by NSP. The information was originally prepared in support of NSP's submittal of the described priority system to the Institute for Operations Research and Management Sciences (INFORMS) “2000 Competition for Best Application of Decision Analysis.” The priority system was selected as a finalist and presented for judging at the November 2000 INFORMS meeting in San Antonio, Texas, where it was awarded “runner up” status.

This paper was originally published under a different title in the abstracts of the First Annual Power Delivery Asset Management Workshop, New York, June 3-5, 2003. [1]


In the late 1990s, Northern States Power (Electric) began exploring formal tools for improving the allocation of its annual operating budget of over $250 million. The goal was to establish a fair, efficient, and effective process for allocating funds across territories and departmental functions. After trying several approaches, NSP created a resource allocation system based on multi-attribute utility analysis. The system was used until NSP became part of Xcel Energy in 2000. Characteristics of the system and lessons learned are presented.


Northern States Power (now called Xcel Energy) is an electric utility serving over one million customers in Minnesota, North and South Dakota, Wisconsin and Michigan. In 1997, the electric business unit, NSP Electric, created a centralized Asset Management department with responsibility for the annual operating budget of more than $250 million. A major challenge was establishing a fair, efficient, and effective process for allocating operating funds across territories and departmental functions. Despite previous attempts to use priority-setting tools, key managers invariably made decisions after lengthy discussions, based largely on intuition. The process was painful, and departments with the most eloquent speakers often came out ahead. As issues became more complex (e.g., invest in information technology vs. system hardware) and as stakes continued to grow (corporate pressure to lower costs regardless of external cost drivers such as storm damage and rapid customer growth), the task became even more daunting. A better process was needed.

NSP used the resulting priority system for three years (the 1998, 1999, and 2000 budgeting cycles) until the company was merged into Xcel Energy. The purpose of the system was to identify allocations of the operating budget that would increase system reliability and customer service while at the same time eliminating budget gaps. More specifically, the system was used to evaluate and recommend allocations of the operating budget across 10 broadly defined delivery products (called “portfolios”).

Methods Used and How They Were Applied

Starting in 1997, the Asset Management department undertook an effort to create a formal resource allocation system. The development effort was assisted by my coauthor, Dr. Merkhofer (to help ensure a logical, defensible design), and stakeholders from the affected departments (to improve buy-in and to ensure that all concerns and issues would be addressed). The selected approach was based on multi-attribute utility analysis [2]. Multi-attribute utility analysis, also called multi-objective or multi-criteria decision analysis, is a formal theory for making complex decisions that impact multiple objectives. Graduate programs at many universities teach this method. It has been used to support numerous high-level government and business decisions and is recommended by a Department of Energy Standard on prioritization [3].

The system operated as follows. To provide alternative allocation scenarios, each portfolio team proposed multiple five-year funding cases. Three cases were required. Case 1 assumed a “bare bones” funding level sufficient only to protect health and safety. Case 5 held funding to 1999 levels for five years. Case 9 was a “maximum” funding level encompassing every activity that could be undertaken effectively. In addition, portfolio teams created intermediate funding cases so that increments between cases were no more than $2 million (Figure 1).

Figure 1: Funding cases

To provide a basis for evaluating the funding cases, a team of senior managers established 7 fundamental objectives. The objectives included both financial goals (e.g., revenue and cost impacts) and non-financial goals, such as improving customer service (e.g., reliability, meeting commitments, and billing) and creating a platform for future success (e.g., enhancing corporate reputation and increasing employee effectiveness) (Figure 2).


Note that safety was not explicitly represented as an objective. Safety was viewed as a constraint for decision making: regulations and NSP policy (and the definition of the minimum funding case) ensured that an adequate level of public and worker protection was provided under all funding cases.

Management and staff developed the measures and scales for evaluating funding cases against each objective using a formal process involving influence diagrams. An influence diagram [4] is a graphic device for identifying key influencing factors (Figure 3).


The senior management team then assigned relative priority, or weight, to the objectives using a “poker chip” exercise based on methods recommended in the multi-attribute utility analysis literature [5]. This exercise helped establish a consensus priority and a range of weightings for sensitivity analysis. Technical experts “scored” the alternative funding cases using the measures and scales. Spreadsheet-based system software then used a mathematical algorithm to translate the experts' “scores” and the executives' “weights” into estimates of total value, expressed both in equivalent dollars and as a unitless figure-of-merit called “utility.”
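The core calculation in a system of this kind can be sketched in a few lines. The following is a minimal illustration of a weighted-additive multi-attribute value calculation, not NSP's actual software; the objective names, weights, and scores are invented for the example.

```python
# Minimal sketch of a weighted-additive multi-attribute utility
# calculation. Objectives, weights, and scores are illustrative
# placeholders, not NSP's actual data.

def total_utility(scores, weights):
    """Combine normalized (0-1) scores using weights that sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[obj] * scores[obj] for obj in weights)

# Hypothetical weights from a "poker chip" style weighting exercise.
weights = {"reliability": 0.40, "customer_service": 0.35, "cost": 0.25}

# Hypothetical expert scores for one funding case.
case_scores = {"reliability": 0.7, "customer_service": 0.5, "cost": 0.9}

utility = total_utility(case_scores, weights)  # 0.28 + 0.175 + 0.225 = 0.68
```

In practice each objective's raw measure would first be mapped onto a common scale through a scoring function, and sensitivity analysis would rerun the calculation across the range of weights elicited from management.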

System Outputs

System outputs included a benefit vs. cost curve for each portfolio (Figure 4). The graph illustrates how benefit improves as funding for the portfolio increases. The vertical axis shows the value produced (based on the degree to which the funding case achieves objectives), and the horizontal axis shows the cost (in this case, five-year cost, although similar plots were created to compare total benefit with budget-year cost). Each point on the graph represents a funding case.

Figure 4: Portfolio benefit vs. cost

Optimal allocations for any total budget level were obtained from a table, read from top to bottom (Figure 5). To construct the table, the funding-case increments from all portfolios were ranked in order of declining overall benefit-to-cost ratio. Funding levels 21 through 31 are shaded to illustrate funding requirements above the budget guideline.

Figure 5: Optimal allocations
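The ranking logic behind such a table can be sketched as a greedy procedure: sort funding-case increments by benefit-to-cost ratio, then fund them top to bottom until the budget is exhausted. The portfolios and numbers below are invented for illustration, and the sketch ignores the within-portfolio ordering of funding cases that the actual system had to respect.

```python
# Sketch of a greedy benefit-to-cost ranking. Each tuple is
# (portfolio, incremental cost in $M, incremental benefit);
# all values are illustrative, not NSP data.
increments = [
    ("Distribution", 2.0, 10.0),
    ("Transmission", 1.5, 9.0),
    ("Metering", 2.0, 4.0),
    ("Distribution", 2.0, 3.0),
]

# Rank increments by declining benefit-to-cost ratio.
ranked = sorted(increments, key=lambda inc: inc[2] / inc[1], reverse=True)

# Read the table top to bottom, funding increments until the budget runs out.
budget = 5.5  # $M
spent = benefit = 0.0
funded = []
for portfolio, cost, value in ranked:
    if spent + cost <= budget:
        spent += cost
        benefit += value
        funded.append(portfolio)
```

Each row of `ranked` corresponds to one funding level in the table, and the cumulative (spent, benefit) pairs produced along the way trace out a total benefit vs. total cost curve of the kind described next.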

One of the most interesting outputs of the system was the total benefit vs. total cost curve (Figure 6). Each point on the curve corresponds to a row in the optimal allocation table. The curve shows the benefits obtained when available funds are optimally allocated across portfolios.

Figure 6: Total benefit vs. cost

Indicated on the curve is the point corresponding to the allocation that produces the largest total benefit obtained from any allocation that does not exceed the budget-year guideline (this is funding level 20 in the table). For comparison, the triangle in Figure 6 shows the "benefit" of keeping all portfolios at 1999 funding levels. As indicated by the fact that the budget curve falls well above the triangle, the system was able to identify funding shifts that produce much more benefit (roughly 40% more) than would be derived from a “status quo” funding allocation that left all portfolios at their 1999 funding levels. Similar outputs were generated under alternative weights and other assumptions for sensitivity analyses.

Another output of interest was a bar graph showing the components of total value under a specified allocation (Figure 7). By computing the graph under different total budget assumptions, it was possible to see what objectives would be improved or degraded if the total O&M funds were increased or decreased. Because the system was tied to a database of activities by portfolio, it was also possible to indicate the specific work items that would be added or cut that caused the changes in total benefit.

Figure 7: Sources of O&M value
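The decomposition behind a bar graph of this kind can be sketched by summing each objective's weighted contribution across the funded cases. As before, the objective names, weights, and scores are invented placeholders, not NSP data.

```python
# Sketch: decomposing total value into per-objective components for a
# funded set of cases. Weights and scores are illustrative placeholders.

weights = {"reliability": 0.40, "customer_service": 0.35, "cost": 0.25}

# Hypothetical scores for the funding cases selected by an allocation.
funded_case_scores = [
    {"reliability": 0.7, "customer_service": 0.5, "cost": 0.9},
    {"reliability": 0.6, "customer_service": 0.8, "cost": 0.5},
]

# Each objective's contribution to total value across the funded cases.
components = {
    obj: weights[obj] * sum(case[obj] for case in funded_case_scores)
    for obj in weights
}
total_value = sum(components.values())
```

Recomputing `components` under different total budget assumptions shows which objectives gain or lose as funding rises or falls, which is exactly the comparison the bar graph supported.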

Scoring Process

A notable characteristic of NSP's system was the detail and rigor of the scoring process. For example, there were five different measures for estimating impacts of funding on reliability. Data on previous year performance was provided to improve forecasts and reduce gaming. Scorers were trained and supplied with a detailed manual of scoring instructions. A team was available to answer questions and provide assistance. Scores were formally audited in group reviews and quality assurance sessions.

Continuous Learning and Improvement

Over the three years that the system was in use, it was refined and expanded with each application as employees became more comfortable with the methodology and saw ways to improve it. For example, the application for the 2001 budget added the multi-year perspective to track five-year funding needs and to account for the longer-term impacts of some funding decisions.

What Was Learned and How the Organization Used the Results

Interviews with stakeholders indicated that there were unanticipated benefits from using the system. Although everyone expected that the system would improve resource allocations, participants saw additional benefits. The system:

  • Improved mutual understanding by efficiently communicating needs and the resources required to address those needs.
  • Promoted accountability and responsibility by forcing managers to go on record regarding what they expected to accomplish with their resources.
  • Clarified objectives and values. When defining funding alternatives, managers frequently redefined and shifted dollars among activities to get the highest possible performance. And, it pointed out where operational measures needed improvement.
  • Helped change company culture regarding customer commitments. The relatively high weights on objectives related to customer service and the need to forecast performance created an increased awareness of the importance of interactions with customers.

What participants seemed to like most was that the system provided a framework for evaluating funding requests that was fair, consistent, and promoted a complete and thorough consideration of relevant issues.

Identified Areas for Improvement

Interviews conducted after the 2000 application indicated that there were still changes needed to improve the resource allocation process. The key remaining deficiencies were:

  • Lack of Data. The transmission area lacked measures to adequately represent reliability drivers for bulk transmission. Very little analysis tied maintenance practices to equipment failure. No analyses were available to confirm the scoring judgments of prior years.
  • Lack of Incentives to Prevent Gaming. Some funding cases were too large and generic, allowing the distribution maintenance areas to group activities rather than consider their benefits individually. Tension continued between “engineering” areas and “marketing” areas.
  • Budgeting Disconnects. Frustrations with the budgeting processes for capital and operating as well as the lack of direct tie between departmental budgets and portfolios continued to be an issue.

These issues were to be addressed in the 2001 application and tested in a pilot project to allocate resources for tree trimming across Xcel Energy's 12-state service territory. This pilot, however, was not approved, and the system was canceled.

Lessons Learned Three Years Later

Although this system provided many benefits to NSP Electric, the merger brought significant organizational changes that led to the departure of the NSP executives most familiar with the system. Incoming Xcel Energy senior management wanted their own management systems. Focused on merger activities, they lacked understanding and ownership of the multi-attribute utility analysis methodology. They wanted a system that relied primarily on analysis of historical data and statistical risk and less on the professional judgment of those closest to the problem.

In order for this or any portfolio optimization tool to be successful, senior management must understand and sponsor it. In addition, department managers - those responsible for day-to-day spending decisions - must support the process and be willing to give up some autonomy for the good of the whole, trusting that the method is fair. No tool solves all organizational issues and inefficiencies. Being clear about scope - what's in and what's out - as well as having firm sponsorship is important to success.

When applied properly, Operations Research and Management Science tools, such as multi-attribute utility analysis, are mathematically sound and provide accurate recommendations. They can combine “hard” data - data obtained directly from customers, through benchmarking surveys or from operating statistics - with “soft” data - judgments and critical insights that are only available from field and other knowledgeable personnel. These tools include methods that counter and reduce biases associated with subjective judgments. The structure helps participants be disciplined and rigorous, giving format and direction to a complicated process. Computer software provides a convenient means for collecting data, documenting assumptions, performing calculations, conducting sensitivity analysis and reporting results.

In summary, key benefits of the priority system included:

  • It forced senior management to articulate clearly what is important and why.
  • It captured all benefits from proposed work, not just financial benefits.
  • It provided a defensible process for determining the dollar value of project benefits.
  • It captured technical judgments of front-line supervisors regarding needs and the effectiveness of proposed projects.

By doing these things, it helped participants spend more time evaluating alternatives and less time arguing that their funding should not be decreased.

Although not a tool for all problems, a priority system based on multi-attribute utility analysis can be an excellent tool for improving resource allocation and optimally managing a portfolio of business investments.


  1. E. Martin and M. W. Merkhofer, “Lessons Learned - Resource Allocation based on Multi-Objective Decision Analysis”, Proceedings of the First Annual Power Delivery Asset Management Workshop, New York, June 3-5, 2003.
  2. See R. Keeney and H. Raiffa, Decisions with Multiple Objectives, Wiley, New York, 1976. A detailed reference emphasizing applications to government decision making appears on the United Kingdom ODPM website. See Multi-Criteria Analysis Manual.
  3. See DOE Limited Standard: Guidelines for Risk-Based Prioritization of DOE Activities.
  4. M. W. Merkhofer, “Using Influence Diagrams in Multi-attribute Utility Analysis - Improving Effectiveness through Improving Communication,” Chapter 13 in R. M. Oliver and J. Q. Smith (eds.) Influence Diagrams, Belief Nets and Decision Analysis, Wiley, New York, 1988.
  5. See, for example, W. Edwards and J.R. Newman, “Multiattribute Evaluation,” in H.R. Arkes and K.R. Hammond (eds.), Judgment and Decision Making, Cambridge University Press, Cambridge, England, 1986, 13-37 and C. Kirkwood, Strategic Decision Making: Multiobjective Decision Analysis with Spreadsheets, Duxbury Press, 1997, 53-61.