
E




e (Euler's number)

The letter e is a symbol representing a constant number, the first few digits of which are
2.71828182, that appears frequently in mathematics, especially in
formulas dealing with continuous growth. For example, if you invest $1 at an interest rate of 100% a year with
interest compounded continuously, you will have e = $2.71828... at the end of the year.
The number e is the base of the
natural logarithm.
It is sometimes called Euler's number after the Swiss mathematician Leonhard Euler, who is credited as the
first to use the symbol, in a manuscript written in 1727.
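The compounding example can be checked numerically; a minimal sketch (compounding $1 at 100% annual interest n times per year):

```python
# As n grows, compounding $1 at 100%/year n times per year, i.e. (1 + 1/n)^n,
# approaches Euler's number e.
import math

for n in (1, 12, 365, 1_000_000):
    print(n, (1 + 1/n) ** n)

print(math.e)   # 2.718281828459045
```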


earned value management (EVM)

An acronym-rich method for measuring progress on projects and indicating any variances in
planned accomplishments, schedule, and cost expenditures. EVM, also called
earned value analysis (EVA, not to be confused with economic value added), is primarily used as a way
of reporting project progress to stakeholders, and government regulations
often require that contractors providing services to government agencies
comply with standards for using EVM. In the context of project portfolio management (PPM), EVM may be
incorporated as a means for reporting progress on individual projects and
for demonstrating compliance with government requirements for EVM.
The basic concept of EVM is that project work be
planned, budgeted, and scheduled in time-phased, "planned value"
increments. Typically, these work increments are defined in a hierarchical
fashion as a work breakdown structure, but
for a smaller project the work elements might simply be individual project
tasks. The work elements define a schedule and cost/value baseline for the
project. As project work is conducted, project value is "earned." Various
indices are computed to summarize project status based on comparing earned
value with planned and actual costs.
With EVM, the value that is assigned to each work element
is termed its planned value (PV). The PV is meant to be a quantity or
weighting factor that indicates the portion of the project value that,
according to the plan, will be contributed by that work element at a
specified time. Usually, the PV for a work element is set equal to its
cost; however, the PV might alternatively be defined as the number of labor
hours required or as a subjectively assigned number of "points."
The value of the work element is earned as the work is
completed. For example, the earning rule might be that 25% of the value is
earned when the task is started, and the remaining 75% is earned upon
completion.
Progress against the plan is reported on a regular basis
(e.g., weekly or monthly) by accumulating earned value (EV) based on the
earning rules. By subtracting the value of the work planned (PV) from the
value of the work performed (EV), a schedule variance (SV) may be computed
at any point in time during the project:
SV = EV - PV.
(Some EVM documents alternatively define SV = PV - EV.
With the definition given above, negative numbers are "unfavorable," and
positive numbers are "favorable.")
Similarly, a schedule performance index (SPI) may be
computed by dividing the EV by the PV:
SPI = EV/PV.
If the SV is greater than zero (SPI is greater than 1),
the work is ahead of schedule. If the SV is less than zero (SPI is less
than 1), the work is behind schedule. Schedule variances can be rolled up
to any level in the work breakdown structure to provide higher-level
indicators of schedule compliance.
Since a work element's PV is traditionally chosen to be
the scheduled cost of the work, the traditional term for a work element's
planned value is the budgeted cost for work scheduled (BCWS). The
traditional term for earned value is the budgeted cost for work performed
(BCWP). The actual cost of conducting each work element is termed the
actual cost of work performed (ACWP). In this context, where value and cost
are both measured in dollars, a cost variance (CV) can be computed by
subtracting the actual cost of work performed (ACWP) from the budgeted cost
of work performed (BCWP):
CV = BCWP - ACWP = EV - AC.
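As a minimal sketch with hypothetical figures (not taken from any real project), the variances and indices defined above can be computed directly:

```python
# Hypothetical EVM figures for one reporting period (all in dollars).
PV = 100_000   # planned value (BCWS): budgeted cost of work scheduled
EV = 90_000    # earned value (BCWP): budgeted cost of work performed
AC = 95_000    # actual cost (ACWP): actual cost of work performed

SV = EV - PV   # schedule variance: negative is unfavorable (behind schedule)
SPI = EV / PV  # schedule performance index: < 1 means behind schedule
CV = EV - AC   # cost variance: negative means over budget
CPI = EV / AC  # cost performance index (one of EVM's additional indicators)

print(SV, SPI, CV, CPI)
```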
EVM defines many additional indicators of technical,
schedule, and cost performance that can also be calculated, and guidance is
available for interpreting and addressing the various discrepancies that
the indicators may reveal. As you can no doubt appreciate, EVM can be
confusing because of the many acronyms that are used.
Although EVM is a well-established and effective means
for managing the completion of complex projects, its major limitation from
the standpoint of PPM is that it does not provide indicators for tracking
the anticipated ability of the project to deliver benefits to the
organization. Because EVM is unconcerned with project changes that might
impact the ultimate value derived from a project, it provides no signals
that might suggest that the project plan should be reconsidered. EVM might,
for example, indicate that a project is under budget, ahead of schedule,
and within scope, but that project could nevertheless be in trouble with
regard to achieving the benefits that
motivated the decision to fund it.


economic value added (EVA®)

A financial project valuation metric and related management framework
developed by Joel Stern and G. Bennett Stewart III, founders of the
consulting firm Stern Stewart & Co. (EVA® is a registered trademark of Stern Stewart).
The EVA® of a project is
calculated by taking net operating profit and subtracting a charge for the
capital or assets deployed. The deducted amount is the "cost of
capital"—what shareholders and lenders could obtain by investing in
securities of comparable risk.
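The calculation just described can be sketched in a few lines; the profit, capital, and cost-of-capital figures below are illustrative assumptions:

```python
# EVA = net operating profit minus a charge for the capital employed.
# All figures are hypothetical.
nopat = 1_200_000            # net operating profit after taxes, $
capital_employed = 8_000_000 # capital or assets deployed, $
cost_of_capital = 0.10       # what investors could earn at comparable risk

capital_charge = capital_employed * cost_of_capital  # the "rent" on capital
eva = nopat - capital_charge
print(round(eva, 2))  # positive: the project earns more than its capital costs
```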
EVA®, also sometimes termed economic
profit, provides a useful input for prioritizing projects because it
quantifies the direct financial component of project value. However, other
techniques are needed to account for the indirect or nonfinancial
components of project value. Also, depending on the characteristics of
projects, it may be more convenient to account for the cost of capital
using the more traditional calculation of net
present value (NPV).
While there are other financial metrics that likewise
account for the cost of capital, the appeal of EVA® is that it does so
in a conceptually simple and intuitive way that is easy for nonfinancial
managers to understand. Since EVA® starts with familiar operating
profits and then deducts a charge for the capital employed, it can be
interpreted simply as "net profit minus the rent."
EVA® has become popular because it highlights the
importance of the cost of capital when financially evaluating projects.
EVA® may show, for example, that despite increasing earnings, a
project is destroying shareholder
value because the cost of capital associated with the required
investment is too high. By assessing a charge for using capital, EVA®
forces managers to think about managing assets as well as income.
As indicated above, a major weakness of EVA® is that
it fails to account for nonfinancial project impacts (such as improved
employee knowledge) that are difficult to express in terms of incremental
cash flows. Also, accounting for opportunity costs by subtracting a capital
charge is conceptually simple only if project start times, durations, and
spending rates aren't very important (if they are, then the NPV approach of
discounting cash flows using hurdle rates is computationally and
conceptually simpler). Like classic NPV, EVA® does not explicitly
address cash flow uncertainties, and it can be very difficult to determine
the appropriate charge for the capital used by a project.


efficient frontier

In the context of modern
portfolio theory, the efficient frontier is the bounding curve obtained
when portfolios of possible investments are plotted based on risk and
expected return. The efficient frontier shows the investment combinations
that produce the highest expected return for each level of risk. A portfolio
that is not on the efficient frontier is said to be "inefficient" because
another portfolio exists that has lower risk for the same return.
Efficient frontier as defined by modern portfolio
theory
In the context of project
portfolio management, the efficient frontier typically refers to the
bounding curve that is obtained when portfolios of projects (or sometimes individual projects)
are plotted based on cost and some quantity that is intended to represent
portfolio (or project) attractiveness (ideally, the y-axis should represent
the value or worth of the portfolio to the organization). In this context,
a portfolio that is not on the efficient frontier is inefficient because
another portfolio exists with greater value for the same cost. For more
explanation, see the chapter on finding the efficient frontier.
Efficient frontier as defined by project portfolio
management


eigenvalue, eigenvector

Concepts from linear algebra relevant to prioritization because of their use in AHP,
a multicriteria decision making approach. A scalar (number) λ is called an eigenvalue
of the n × n matrix A if there is a nonzero vector x such that Ax = λx. In this case, x is
called an eigenvector corresponding to the eigenvalue λ.
An eigenvector for a matrix is scaled when multiplied by the matrix
In the language of matrix algebra, a vector can be multiplied by
a matrix to produce another vector. An eigenvector of a matrix is a vector that when multiplied by that
matrix simply scales the entries in the vector; in other words, the vector doesn't change direction, it merely changes size.
The eigenvalue associated with the eigenvector is the amount by which the vector entries are scaled.
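For illustration, the principal eigenvalue and eigenvector of a matrix can be approximated by power iteration, the computation underlying AHP priority weights; the 3 × 3 comparison matrix below is a hypothetical example:

```python
# Sketch: power iteration to approximate the principal eigenvector of a
# hypothetical 3x3 AHP pairwise-comparison matrix (pure Python, no NumPy).

def power_iteration(A, iterations=100):
    n = len(A)
    x = [1.0] * n
    for _ in range(iterations):
        # multiply: y = A x
        y = [sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
        s = sum(y)
        x = [v / s for v in y]   # normalize so entries sum to 1
    # estimate the eigenvalue as the average ratio (A x)_i / x_i
    Ax = [sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
    lam = sum(Ax[i] / x[i] for i in range(n)) / n
    return lam, x

# A consistent comparison matrix: criterion 1 is judged twice as important
# as criterion 2 and four times as important as criterion 3.
A = [[1, 2, 4],
     [1/2, 1, 2],
     [1/4, 1/2, 1]]
lam, w = power_iteration(A)
print(round(lam, 6), [round(v, 4) for v in w])  # eigenvalue ~ 3 (= n for a consistent matrix)
```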


ELECTRE

A decision aid that involves comparing pairs of potential
actions based on multiple criteria. ELECTRE, like PROMETHEE, is a so-called outranking
method, representative of what has been referred to as the "European
school" of multicriteria methods. Unlike multicriteria analysis methods such as multiattribute utility analysis (MUA) and the
analytic hierarchy process (AHP) (the
so-called "American school"), outranking methods do not involve developing
or assessing a utility function (see decision theory) from decision makers for quantifying
decision-maker preferences. Instead, with ELECTRE and other outranking
methods, preferences are determined indirectly by having decision makers
express relative preferences between pairs of options.
The results of the comparisons are organized into a
matrix of values that show the "concordance" and/or the "discordance"
between the candidate actions. The matrix is analyzed to produce various
results and to choose, rank, or sort the alternatives.
The ELECTRE method was developed in France in the late
1960s; the name is an acronym for ELimination Et Choix Traduisant la
REalité (elimination and choice reflecting reality). Like
PROMETHEE, ELECTRE comes in various "versions" that indicate refinements
and whether the version is meant for selecting or classifying options.
A strength of ELECTRE is its ability to account for
uncertainty and vagueness. One reported weakness is that, due to the way
preferences are incorporated, the lowest performances on individual
criteria may not be apparent. Outranking results also do not directly show the
strengths and weaknesses of the alternatives, nor do they allow results and
impacts to be easily verified. ELECTRE has been used in energy, economics, environmental, water
management, and transportation problems.
Outranking methods, like ELECTRE and PROMETHEE, which
were prevalent early on in the development of MCA methods, have been
largely overtaken by the use of value measurement approaches such as MUA
and AHP. Still, there are a few Europebased project portfolio management tools that
advertise their use of ELECTRE.


elimination by aspects

A multicriteria method wherein
alternatives are evaluated with respect to each criterion, one criterion at a time,
and all alternatives that fail to reach a minimum level of performance
with respect to that criterion are eliminated. The approach
is noncompensatory in that it does not require specifying weights
or willingness to make tradeoffs. It typically does not result
in either a "best" alternative or a prioritization of alternatives.
However, the approach can be an efficient method for screening projects
provided that minimum levels of performance on each criterion can
be specified. After screening, a compensatory approach is
required to prioritize the remaining projects.
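A minimal sketch of screening by aspects (the project names, criteria, and minimum levels below are illustrative assumptions):

```python
# Noncompensatory screening: taking one criterion at a time, eliminate every
# project that fails the minimum level on that criterion. All data hypothetical.

projects = {
    "A": {"strategic_fit": 4, "npv_musd": 2.0, "risk_score": 3},
    "B": {"strategic_fit": 2, "npv_musd": 5.0, "risk_score": 1},
    "C": {"strategic_fit": 5, "npv_musd": 0.5, "risk_score": 2},
}
minimums = {"strategic_fit": 3, "npv_musd": 1.0, "risk_score": 2}

remaining = dict(projects)
for criterion, minimum in minimums.items():
    remaining = {name: scores for name, scores in remaining.items()
                 if scores[criterion] >= minimum}

print(sorted(remaining))  # ['A']
```

Survivors of the screen would then go to a compensatory approach for prioritization, as the entry notes.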


enterprise management

A collection of management principles and techniques
focused on helping the organization achieve its highest-level objectives, such as increasing shareholder value. Enterprise
management typically includes strategic planning, long-term investment
strategy, organizing and resourcing, performance assessment, and leading
and directing the organization.


enterprise project
management (EPM)

A broad term that refers to processes for improving the
conduct and coordination of projects
across an enterprise. The term predates project portfolio management (PPM), and, like
project management and program management, EPM is mainly
focused on "doing projects right," not on doing the "right projects." Thus,
though tools to support EPM often include dashboards that can show work progress at
various levels of detail, project prioritization and portfolio optimization capabilities are generally not
included.


enterprise project portfolio management
(EPPM)

Project portfolio
management applied at the enterprise level; that is, to all projects and programs conducted by the organization. EPPM
may be implemented by establishing a hierarchy of project portfolios that
are managed both individually and collectively to maximize the value
derived by the enterprise.


enterprise resource planning (ERP)

Refers to a process and/or comprehensive software system
aimed at centrally managing and coordinating the broad set of activities
needed to successfully run a business enterprise, including product
planning, material purchasing, inventory control, distribution, accounting,
marketing, finance, and HR. Most ERP software products are composed of
modules, with each module focused on one business process — some ERP
systems include a module for project portfolio
management. Customers can purchase as many modules as they require. A
single, central repository contains the information needed to support
planning and decision making across the various modules and associated
business functions. ERP systems typically use or are integrated with a
relational database.
An oft-cited example illustrating the ERP concept is a
system wherein, when the sales department records a new customer order,
information is routed to the inventory and warehousing department to
retrieve and package the order, to the finance department to prepare an
invoice for billing, and to the manufacturing department to replace the
purchased product.


event tree

A graphical representation, similar to a decision tree, showing the sequence of
events that might occur following some initial event. The difference
between a decision tree and an event tree is that an event tree does not
contain nodes representing decisions. Like decision trees, event trees are
typically used to understand risk and to identify actions for improving
performance.
Historically, event trees have been used most often for
accident analysis. In such applications, each event is represented by two
branches corresponding to the possibility that an event either does or does
not occur (e.g., a safety system works or fails). The figure below provides
an example.
Event tree for assessing the consequences of loss of
coolant to a nuclear reactor
If they are quantified, event trees, like decision trees,
can be used for quantitative risk analysis. Suppose, for example, that a
model has been constructed to estimate some quantity or quantities of
interest. An event tree can then be constructed with each event in the tree
representing one of the inputs to the model that is uncertain. If an
uncertain input could take on any value within some range, it is discretized, for example, by specifying
high, medium and low possibilities. Some decision tree tools can create an
event tree automatically from an influence diagram and link it to a
model constructed with Excel.
Each path through the event tree defines a set of inputs
for the model, so model outputs can be computed for each path. To quantify
uncertainty over the model outputs, probabilities must be assigned to each
branch in the tree. By convention, as in the case of decision trees,
probabilities of the event outcomes are placed under the corresponding
branches and the model outcomes are placed at the ends of the tree. The
probability of each path can be calculated by multiplying the branch
probabilities along the path. The tree thus provides a probability
distribution describing the uncertain outcomes at the ends of the tree.
Some project portfolio management
tools, including some for prioritizing R&D projects, use event
trees in this way to quantify project risks.
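As a sketch of the path-probability calculation, the hypothetical two-event tree below enumerates all paths and multiplies the branch probabilities along each:

```python
# Enumerate paths through a small event tree. Each event has branch
# probabilities; a path probability is the product of its branch
# probabilities. Events and probabilities are illustrative assumptions.
from itertools import product

events = {
    "pump_fails": {"yes": 0.01, "no": 0.99},
    "backup_fails": {"yes": 0.05, "no": 0.95},
}

paths = {}
for outcomes in product(*(events[e].items() for e in events)):
    label = tuple(name for name, _ in outcomes)
    p = 1.0
    for _, branch_p in outcomes:
        p *= branch_p
    paths[label] = p

print(paths)                           # four paths, one per branch combination
print(round(sum(paths.values()), 10))  # path probabilities sum to 1
```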


evidential reasoning (ER)

A multicriteria analysis
(MCA) decision-making approach based on a theory for reasoning with
evidence. ER incorporates concepts from decision theory, artificial intelligence,
statistics, expert systems and
fuzzy logic. The approach differs from
conventional MCA both in the way that uncertainty is represented and in the
method used to combine assessments of alternatives against individual
criteria in order to draw overall
conclusions regarding the relative desirability of those alternatives.
ER is based on the concept of belief structures as
defined by Dempster-Shafer (DS)
theory. A belief function is a generalization of a probability function
that allows distinguishing different types of uncertainty. For example,
uncertainty associated with the random behavior of a wellunderstood system
may be viewed as being different than uncertainty due to insufficient
knowledge. Such distinctions are retained through the analysis, and the
different types of information are interpreted as bits of evidence relevant
to revising beliefs. The use of belief functions gives ER the ability to
quantify ignorance and the impact of incomplete, potentially conflicting,
and unreliable information in a way that is more complete and explicit than
what can be accomplished with classical probability theory.
One way to contrast ER with more conventional MCA methods
has to do with the way the multicriteria analysis is commonly represented.
With conventional MCA, the analysis can be viewed as a decision matrix where numbers are
assigned to each alternative to measure the alternative's performance with
regard to each of some number of criteria. If, for example, there are
M alternatives and N criteria, the decision matrix will have
M rows and N columns and a total of M × N
cells. Each cell contains a number indicating the evaluation of the
corresponding alternative relative to the indicated criterion. Depending on
the MCA method, the numbers in the cells might be scores, values, or utilities. The individual evaluations are
combined via an aggregation
equation, for example, by weighting and adding the individual
assessments. The numbers produced by a conventional MCA approach can be
regarded as summarizing an average performance metric that, according to
the supporters of ER, fails to communicate the diversity of performance and
quality of information on which conclusions are based.
With ER, the decision problem is represented by a
belief decision matrix. A belief decision matrix is similar to a
conventional decision matrix except that the assessment of
each alternative against each criterion is represented by a two-dimensional
variable defined as a belief structure. The first element indicates the
value or "grade" assigned by the assessment. For example, the grade
assigned to an alternative's performance with respect to a criterion might
be "excellent," "good," "average," or "poor." The second element indicates
the degree of belief underlying the grade assessment. The degree of belief
is a number, which, like a probability, lies between zero and one. Thus,
for example, an alternative might be assigned a grade of "good" with a
degree of belief of 0.4. With ER, each criterion can have its own set of
evaluation grades and the criteria can be arranged into a hierarchy. To
integrate the belief structures established for the different criteria, ER
uses rules for evidence pooling and belief updating. Instead of simply
aggregating average scores, the ER approach employs an evidential reasoning
algorithm to aggregate belief degrees according to the evidence combination
rules of DS theory.
The evidence combination rules of DS theory employ
"fusion operators" that apply specific rules for integrating different
sorts of evidence. For example, one kind of evidence would imply a
constraint for the set of possibilities containing the belief. If there are
multiple such constraints, the fusion operator would be the intersection of
the constrained sets dictated by independent belief sources. The
application of the fusion operators can be interpreted as being analogous
to, but more general than, Bayes theorem
for updating probability distributions based on new data.
When the degree of belief is such that it is not possible
to characterize uncertainty with a precise measure such as a probability
number assigned to a point in the space of possibilities, ER views
probability as being assigned or spread across an interval or subset of
the set of all possibilities. As with fuzzy logic, the theory of belief
functions is addressed through a setmembership approach. The idea is that
partial knowledge about some variable X is described by a set of possible
values E. The set defines a constraint over the variable. As a result,
utility is spread across some interval or set. The ER approach is,
therefore, characterized as a distributed modeling framework capable of
representing both precise data and ignorance, with the use of interval
utility for characterizing incomplete assessments and incomplete knowledge.
The ER approach is in this way a hierarchical evaluation model with rules
for synthesizing evidence specified by the DS theory of evidence.
The major advantage of ER is its ability to handle
uncertainties associated with quantitative and qualitative data. It
provides a straightforward way of quantifying ignorance and is therefore a
useful framework for handling incomplete, uncertain information with
varying levels of support or credibility. Computer programs have been
developed to facilitate the application of ER to realworld problems.
Applications include cargo ship design selection, marine system safety
analysis, organizational selfassessment, supplier prequalification
assessment, and environmental assessment. The applications demonstrate that
ER is able to handle deterministic and random systems with incomplete,
missing, or vague (fuzzy) information; large numbers (hundreds)
of criteria arranged in a hierarchy; and many alternatives.
ER's main disadvantage is its complexity and the
corresponding large computational requirements needed to apply the
combination rules of DS theory. Because the belief structure requires
two-dimensional values, the calculations required for the aggregation process
are naturally more involved than with traditional methods using an additive utility function. ER
has also been criticized for a tendency to produce counterintuitive results
in cases where there is conflicting evidence, and researchers continue to
develop modifications to the method to address such problems. Also, even
though ER has been applied to a wide variety of problems, the specific
types of problems that warrant ER's more intense level of computation
remain unclear.
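To make the evidence-combination step concrete, here is a minimal sketch of Dempster's rule applied to two hypothetical belief structures (mass functions) defined over the grades "good" and "average"; the masses are illustrative assumptions:

```python
# Dempster's rule of combination from D-S theory. Frozenset keys represent
# subsets of the frame of discernment; masses are illustrative.

def combine(m1, m2):
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb   # mass falling on the empty set
    # Dempster's normalization: redistribute the conflicting mass
    return {s: m / (1.0 - conflict) for s, m in combined.items()}

good, avg = frozenset({"good"}), frozenset({"average"})
either = good | avg                   # "good or average" (ignorance)
m1 = {good: 0.6, either: 0.4}         # source 1: leans "good"
m2 = {avg: 0.3, either: 0.7}          # source 2: leans "average"

result = combine(m1, m2)
print({tuple(sorted(s)): round(m, 4) for s, m in result.items()})
```

The residual mass on the set {good, average} is how ER retains explicit ignorance rather than forcing a single average score.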


expected commercial value (ECV)

A method often used to assign a value to a project that is intended to create a new
product. Also called estimated commercial value, ECV represents an
application of expected net present value
(ENPV). Scenarios are defined to
represent possible project outcomes. Each scenario is assigned a
probability to indicate its likelihood, and a project value is estimated
for each scenario. The expected commercial value is obtained by multiplying
each scenario's value by the scenario probability and adding the results.
ECV is the prioritization metric most often used by project portfolio management tools aimed at new
product development projects.
Typically, the scenarios defined for computing ECV are
highly simplified. In particular, it is common to represent the product
development project as having two stages, which may be represented in a
simple decision tree.
Decision tree for computing ECV
The first stage is the product development stage.
Recognizing uncertainty, the probability of the project being technically
successful is P_{ts}. The second stage is the product launch, the
success of which is likewise uncertain. The probability of commercial
success (assuming the project is technically successful) is P_{cs}.
If D is the development cost, C is the cost of commercially launching the
project, and PV is the present value of future earnings for a commercially
successful project, then ECV may be computed using the formula:
ECV = [(PV*P_{cs} - C)*P_{ts}] - D.
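With hypothetical inputs, the formula can be evaluated directly:

```python
# Two-stage ECV with illustrative figures ($ millions, all assumptions).
PV = 50.0    # present value of future earnings if commercially successful
P_ts = 0.6   # probability of technical success
P_cs = 0.5   # probability of commercial success given technical success
C = 10.0     # commercialization (launch) cost
D = 5.0      # development cost

ECV = (PV * P_cs - C) * P_ts - D
print(round(ECV, 2))  # (50*0.5 - 10)*0.6 - 5 = 4.0
```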
In reality, of course, technical and commercial success
are not yes/no outcomes. There may be varying degrees of technical success
and, assuming the product is launched, commercial sales could be anywhere
within a range of possibilities. Thus, the simplifications typically used
for the calculation of ECV may lead to inaccurate project valuations. Also,
because ECV is a simplified version of ENPV, it has the limitations of the
more general approach (including the potential for omitting nonfinancial
sources of project value and inadequate accounting of risk and
organizational risk tolerance).
On the other hand, depending on the application, the simple ECV formula may
provide a reasonably adequate method for ranking product development
projects.


expected internal rate of return (EIRR)

A modification of the internal rate of return (IRR) sometimes used to
prioritize projects (such as new
product development projects) whose costs and future cash flows are highly
uncertain. In the formula for computing IRR, project costs are replaced by
the expected value of initial-year project
costs, and project cash flows are replaced by the year-by-year expected
value of project cash flows. Thus, the EIRR is the solution to the
equation:
E[Cost] = E[CF_1]/(1 + EIRR) + E[CF_2]/(1 + EIRR)^2 + ... + E[CF_N]/(1 + EIRR)^N,
where E[Cost] is the expected initial-year project cost and E[CF_t] is the
expected net cash flow in year t.
In other words, to use the EIRR, alternative project cost
and future cash-flow scenarios are
defined. For example, the various stages and associated cash flows for the
project (such as development, testing, and commercialization) may be
represented in a decision tree.
Probabilities are assigned to each scenario. The expected value of project
costs and expected value of each year's net cash flow are computed by
multiplying probabilities by cash flows and adding. The EIRR is then
computed as the discount rate that equates the discounted value of expected
future cash flows with the expected project cost.
When applied to multistage, high-risk projects, the EIRR
behaves in an intuitive way. For early-stage projects with low
probabilities of ultimate success, expected cash flows tend to be low, so
the EIRR tends to be low. However, if such a project is funded, its EIRR
tends to grow (assuming initial project outcomes are successful) as project
costs are sunk and early-stage failure scenarios are avoided. A late-stage
project (one that has successfully avoided early and middle stage risks)
tends to have a very high EIRR. Because of the strong influence of project
stage on the EIRR, typical advice is that project-by-project comparisons
using EIRR be conducted only for projects at the same stage of development
and that separate budgets be established for funding projects within the
different stages.
As a project prioritization metric, the EIRR has the
advantages and disadvantages described for the IRR, plus the advantages and
difficulties associated with assigning probabilities to alternative
scenarios.
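As a sketch, the EIRR can be found numerically (here by bisection) as the rate that equates discounted expected cash flows with the expected cost; the figures are illustrative assumptions:

```python
# Solve for the EIRR by bisection. Expected cost and expected yearly net
# cash flows are hypothetical.

expected_cost = 100.0                            # expected initial-year cost
expected_cash_flows = [20.0, 40.0, 60.0, 50.0]   # expected net cash flow, years 1-4

def npv_gap(r):
    # discounted expected cash flows minus expected cost; zero at the EIRR
    discounted = sum(cf / (1 + r) ** t
                     for t, cf in enumerate(expected_cash_flows, start=1))
    return discounted - expected_cost

lo, hi = 0.0, 1.0        # assume the root lies between 0% and 100%
for _ in range(100):
    mid = (lo + hi) / 2
    if npv_gap(mid) > 0:
        lo = mid         # discounted flows still exceed cost: rate too low
    else:
        hi = mid
eirr = (lo + hi) / 2
print(round(eirr, 4))
```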


expected net present value (ENPV)

An enhancement of the net
present value approach that explicitly addresses uncertainty. Depending
on how it is applied, ENPV can produce estimates of uncertainty in the
value of the overall project portfolio and adjust project value to account
for risk. It can also be coupled with
methods for quantifying the nonfinancial or indirect components of project
value. It is, therefore, a useful tool for computing project and project
portfolio value. However, the computations necessary for ENPV can be
difficult, and the method is often best reserved for very large and risky
projects.
With ENPV, rather than calculate a single time stream of
project cash flows and other project impacts, alternative scenarios are defined representing the range
of possibilities. Simulation
techniques are often used to generate the alternative scenarios, which may
be represented as a decision
tree (a graphic structure wherein alternative sequences of choices and
outcomes are displayed as branches in the tree and the various paths
through the tree represent the alternative scenarios) or event tree
(similar to a decision tree, but without nodes and branches representing
alternative choices). Probabilities are assigned to each scenario in the
tree. A project NPV is computed for each
scenario, and the ENPV is the probabilityweighted sum of the values.
As described under net present value, selecting discount rates is often problematic. If
risk is important, risk-adjusted discount rates
are often used, with different risk-adjusted rates being appropriate for
different scenarios. Alternatively, techniques based on risk tolerance can be used to account
for risk (these techniques generally involve using a risk-free discount rate for computing
ENPV).
In addition to the difficulties mentioned above related
to selecting the discount rate, another limitation of ENPV is that
historical data is generally unavailable for estimating probabilities.
Thus, probabilities must typically be assigned subjectively.
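A minimal sketch of the ENPV calculation (the scenario probabilities, cash flows, and risk-free discount rate below are illustrative assumptions):

```python
# ENPV as the probability-weighted sum of scenario NPVs. All inputs hypothetical.
rate = 0.05   # risk-free discount rate, per the risk-tolerance approach

def npv(cash_flows, r=rate):
    # cash_flows[0] occurs now; later entries are one year apart
    return sum(cf / (1 + r) ** t for t, cf in enumerate(cash_flows))

scenarios = {                  # probability, cash-flow stream ($ thousands)
    "success": (0.5, [-200, 150, 150, 150]),
    "partial": (0.3, [-200, 80, 80, 80]),
    "failure": (0.2, [-200, 0, 0, 0]),
}

enpv = sum(p * npv(cfs) for p, cfs in scenarios.values())
print(round(enpv, 2))
```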


expected value

The probability-weighted average of the possible outcomes of an uncertain
quantity. Suppose there is an uncertain
(random) variable X that may produce various "payoffs" (values).
Suppose the possible payoffs are denoted x_{1},
x_{2},..., x_{N}, and suppose that these
alternative payoffs occur with probabilities p_{1},
p_{2},..., p_{N}, respectively. The expected
value of the variable is the sum of each possible payoff multiplied by its
probability:
E[X] = p_{1}x_{1} + p_{2}x_{2} + ... + p_{N}x_{N}.
If instead of there being a finite number of payoffs, the
uncertain variable can take on a continuum of possible values (e.g., any
value between 0 and 1), then its expected value is computed by weighting
the possible values using the variable's probability density function
and using integral calculus.
The expected value may be interpreted as the average
return one would expect over many "trials" or opportunities for the
uncertainty to occur. See expected commercial
value and expected net present value
for examples of measures based on expected value.
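A one-line sketch of the discrete case (payoffs and probabilities are illustrative):

```python
# Expected value of a discrete uncertain payoff: sum of payoff * probability.
payoffs = [100.0, 50.0, -20.0]
probs = [0.2, 0.5, 0.3]
assert abs(sum(probs) - 1.0) < 1e-9   # probabilities must sum to 1

expected = sum(p * x for p, x in zip(probs, payoffs))
print(expected)   # 0.2*100 + 0.5*50 + 0.3*(-20)
```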


expert

A person recognized by others as having superior knowledge or skill in a specific area. As
used on this website, the term refers to a person who, due to great experience and understanding of an
organization's projects, or a subset
of its projects, is tasked with providing estimates needed as inputs by a
project selection decision model. Because the process of
providing projectspecific inputs to a project selection model is frequently called scoring, an
expert who provides scores may be referred to as a scorer.


expert system

A computer system programmed to behave like a human with
expertise in a particular field or problem area—it uses human
knowledge and reasoning techniques to provide advice for solving problems.
Expert systems represent an application or subfield of artificial
intelligence. Although various methods can be used to simulate the
performance of an expert, most expert systems consist of two components:
(1) a knowledge base that contains subject matter expertise and (2) an
inference engine that applies heuristics or reasoning rules similar to
those used by experts in the given field. Expert systems are typically used
as an aid to human workers or to supplement some information system. Some
project portfolio management tools are
advertised as including components that operate as expert systems.


exponential function

A mathematical function of the form f(x) =
a^{x} (a raised to the power x), where x is a
variable and a is a constant. The most commonly encountered version
of the exponential function is the one in which the constant a is Euler's
number, denoted e, approximately equal to 2.71828. In that
case the function is often expressed as f(x) = exp(x).


exponential utility function

Also called the constant risk aversion function and the negative exponential utility function, an
often used and very practical utility function
for valuing uncertain projects. A decision maker's utility function will be exponential if the
delta property holds, and it may be written:
U(V) = -e^{-V/R},
where U is utility, V is value, and e denotes Euler's number (~2.71828).
The parameter R, often denoted by the Greek letter rho, is the decision maker's
risk tolerance.
With the above form of the exponential utility function, the utility
numbers are all negative, but this should be of no concern since the relative utilities are what matter. If you would like
to scale the utility numbers to go from zero to one, use this version:
U(V) = [1 - e^{-(V - Low)/R}] / [1 - e^{-(High - Low)/R}].
In this form of the equation, Low is the lowest value of V that can be obtained and High is the highest value.
As with the previous equation, this equation applies assuming that the utility of V is
monotonically increasing (more
value is better) and that R is positive (the decision maker is
risk averse) and finite.
The figure below shows a plot of the exponential utility function with utility and value both scaled to go from
zero to one. As shown, the lower R is (the more risk averse the decision maker), the more the utility function curves.
As suggested by the figure, accounting for risk tolerance matters most in situations where the uncertainty in possible
project values approaches the decision maker's risk tolerance.
The reason that the exponential
utility function is so often used is that it leads to a simple
equation for computing the risk-adjusted value of projects. If the decision maker has an
exponential utility function, the certain equivalent CE
of a project having an uncertain value V is given by the equation:
CE = -R ln(-E[U(V)]),
where E[U(V)] is the expected utility of the project (computed using the first, negative form of the utility function),
R is the decision maker's risk tolerance, and ln is the natural logarithm. Thus, the exponential
utility function provides an easy way to compute the risk-adjusted value of a project. Simply
compute the probability distribution for the uncertain project values (e.g., by using a
decision tree, with
project values being the possible discounted net present values of the project), convert the
possible values to utilities using the exponential utility function, and then use the above formula to compute the
project's risk-adjusted value.
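A minimal sketch of this certain-equivalent calculation (the risk tolerance, project values, and probabilities below are illustrative assumptions):

```python
# Certain equivalent under an exponential utility function:
# CE = -R * ln( E[ exp(-V/R) ] ). All inputs are hypothetical.
import math

R = 100.0                  # risk tolerance (same units as value)
values = [200.0, -50.0]    # possible project NPVs
probs = [0.6, 0.4]         # their probabilities

expected_disutility = sum(p * math.exp(-v / R) for p, v in zip(probs, values))
certain_equivalent = -R * math.log(expected_disutility)

expected_value = sum(p * v for p, v in zip(probs, values))
print(round(expected_value, 2), round(certain_equivalent, 2))
# The certain equivalent falls below the expected value; the gap is the
# risk discount a risk-averse decision maker applies.
```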


externality

An undesired (or desired) impact associated with the production or consumption
of a good that affects the welfare of a third party without any compensating payment being made.
In project analysis, an externality is an effect of a project not reflected in its financial
accounts and consequently not included
in the valuation of the project.

