CHAPTER 1-4
ALLOWING FOR UNCERTAINTY
Uncertainty, in the presence of vivid hopes and fears, is
painful, but must be endured if we wish to live without the
support of comforting fairy tales.
Bertrand Russell, A History of Western Philosophy
(New York: Simon and Schuster, 1945), p. xiv.
Will Chip Lohmiller's kick from the 45-yard line go through
the uprights? How much oil can you expect from the next well you
drill, and what value should you assign to that prospect? Will
you be the first one to discover a workable system for converting
speech into computer-typed output?
Today's actions often continue to affect events many years
later. This persistence of consequences constitutes a
difficulty in decision-making. Chapter 1-2 showed how the
mechanism of time-discounting and present-value calculation deals
nicely with that difficulty.
Inter-relatedness of activities is another difficulty in
making decisions. However, the mechanism of tabular analysis and
the consideration of each combination of activities handles the
difficulty of inter-relatedness with ease, as we saw in Chapter
1-1.
Now we come to uncertainty, a third major difficulty in
decision-making. When reading the business examples in previous
chapters, you certainly realized that you usually cannot know
with reasonable certainty just how many sales you will make at
each possible price. And often the expenditures you must make at
each possible level of sales are quite uncertain, too.
This chapter presents to you the intellectual machinery to
deal with uncertainty in a systematic fashion when valuing and
comparing alternatives. The estimation of probabilities is
discussed in Chapter 00, and the combination of probabilities in
complex situations is discussed in Chapter 4-2.
The central concept for dealing with uncertainty is
probability. Philosophers have wrestled long and hard with the
nature of probability and its proper interpretation. For
decision-making, however, the following uncontroversial
interpretation suffices. A probability statement is always about
the future. To say that an event has a high or low probability
is simply to make a forecast. But one does not know what the
likelihoods really are for future events, except in the case of a
gambler playing black on an honest roulette wheel, or an
insurance company issuing a policy on an event with which it has
had a lot of experience, such as a life insurance policy.
Therefore, we must make guesses about the likelihoods, using
various commonsense gimmicks.
All the gimmicks used to estimate probabilities should be
thought of as "proxies" for the actual probability. For example,
if NASA Mission Control simulates what will probably happen if a
valve is turned aboard an Apollo spacecraft, the result on the
ground is not the real probability that it will happen in space,
but rather a proxy for the real probability. If a manager looks
at the last two Decembers' sales of radios, and on that basis
guesses the likelihood that he will run out of stock if he orders
200 radios, then the last two years' experience is serving as a
proxy for future experience. If a sales manager just "intuits"
that the odds are 3 to 1 (a probability of .75) that the main
competitor will not meet a price cut, then all his past
experience summed into his intuition is a proxy for the
probability that it will really happen. Whether any proxy is a
good or bad one depends on the wisdom of the person choosing the
proxy and making the probability estimates.
A probability is stated as an arbitrary weight between 0 and
1. Zero means you estimate that there is no chance of the event
happening, and 1 means you are sure it will happen. A
probability estimate of .2 means that you think the chances are 1
in 5 (odds of 1 to 4) that the event will happen. A probability
estimate of .2 indicates that you think there is twice as great a
chance of the event happening as if you had estimated a
probability of .1.
There is no logical difference between the sort of
probability that the life insurance company estimates on the
basis of its "frequency series" of past death rates, and the
salesman's seat-of-the-pants estimate of what the competitor will
do. No frequency series speaks for itself in a perfectly
objective manner. Many judgments necessarily enter into
compiling every frequency series -- in deciding which frequency
series to use for an estimate, and choosing which part of the
frequency series to use. For example, should the insurance
company use only its records from last year, which will be too
few to provide as much data as one would like, or should it also use
death records from years further back, when conditions were
somewhat different?
In view of the inevitably subjective nature of probability
estimates, you may prefer to talk about "degrees of belief"
instead of probabilities. That's fine, just as long as it is
understood that we operate with degrees of belief in exactly the
same way as we operate with probabilities. The two terms are
working synonyms.
A probability estimate for an event that occurs many times --
such as the likelihood of death of a man in the U. S. during
his fiftieth year -- is easy to interpret. But the probability
of a one-time or first-time event, such as the likelihood of
success of the first mission to Mars, is harder to interpret. I
view the latter as a representative of the category of events
that have some similarity to the event that is to be forecast,
with the extent of similarity judged on the basis of analogy and
theoretical reasoning.
ALLOWING FOR UNCERTAINTY WHEN COMPARING ALTERNATIVES
The Concept of Expected Value
Consider these two alternatives: a) a thousand-dollar bill
in hand, or b) a 1/2 chance of two thousand-dollar bills and a
1/2 chance of nothing. It is intuitively clear that if you were
to be given a choice between these two alternatives on (say) 5000
occasions, you would be equally well off whichever you
consistently choose. The concept of expected value enables us to
evaluate and compare the two alternatives formally, leaving aside
(for now) any feeling of pleasantness or unpleasantness about the
certain and the uncertain choices.
The expected value is the combination of the value of each
outcome weighted by the probability that the outcome will take
place. That is, the expected value is the weighted average
obtained by first multiplying the value of each outcome by its
conditional probability, and then summing. It is the same as the
present value in the single-period context where no discounting
need be done. An example: Conditional on the fact that someone
offers to gamble double-or-nothing for a dozen apples, using a
fair coin, the expected value is:
(1)            (2)             (1) x (2)
Outcome        Probability     Expected Payoff
____________________________________________
No apples      .50             0 apples
12 apples      .50             6 apples
                               ________
               Expected value = 6 apples
____________________________________________
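The weighted-average calculation in the table can be sketched in a few lines of Python (the helper function name is mine, not the text's):

```python
def expected_value(outcomes):
    """Weighted average: each payoff multiplied by its probability, then summed."""
    return sum(prob * payoff for payoff, prob in outcomes)

# Double-or-nothing for a dozen apples with a fair coin:
apples = expected_value([(0, 0.50), (12, 0.50)])
print(apples)  # 6.0 apples
```

The same function handles the singing-contest example below: a one-in-ten chance of $100 gives expected_value([(100, .1), (0, .9)]), or $10.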
An expected value can be calculated meaningfully for payoffs
measured in apples, dollars, happiness points, or whatever. But
please notice that expected value is not synonymous with worth.
Twice as much money does not necessarily mean twice as much
pleasure or utility to you. For example, a 50-50 chance of
$1,000 may be worth less to you than a sure $500. We'll deal
later with that complication.
If you wish to know the present value of an expected value
of a set of outcomes at some future time, you may discount the
expected values just as if they were sums of money, just like any
other present-value calculation discussed earlier. But of course
this does not take into account the fact that there is
uncertainty and risk in the expected value of a set of possible
outcomes, as compared to a sum for sure; this matter will be
handled later.
The value of a business opportunity is the sum of all the
possible outcomes of an alternative choice, each weighted by the
probability that it will occur. For example, the expected value
of a one-in-ten chance that you'll get $100 if you sing in a
contest, plus a nine-in-ten chance of getting nothing, equals
(.1 x $100 + .9 x $0 = $10).
This does not mean that the value to you of this contest
opportunity is $10. If you need money badly, and this will be
your last day on earth, a one-in-ten chance of $100 may not be
worth ten times as much to you as a sure prospect of $10. Or
the chance of $100 may be worth more than ten times the sure
$10, if you desperately need $100 for a ticket out of hell. But
over the long run of a good many alternatives in the operation
of a business or a life, the expected value is a reasonable way
to compute the values of opportunities. Later, we will see how
to modify the expected value to take into account the special
disadvantages of risk for an individual (and for firms, too).
We operate on the basis of expected value literally all the
time. When you decide whether to take an umbrella in case of
rain, you are implicitly taking into account the probability of
rain, together with the costs of carrying the umbrella and of
getting wet if it rains and you do not take the umbrella.
Without an explicit calculation, your implicit intuitive solution
will often be in error. For example, suppose that the chance of
rain is 1 in 50, your valuation of getting wet is -$10, and your
valuation of carrying the umbrella is -$1. The expected value of
carrying the umbrella is -$1, and the expected value of not
carrying the umbrella is .02 x -$10 + .98 x $0 = -$.20. So the
expected value of carrying the umbrella is much more negative
(less positive) than that of not carrying it. If you do this
calculation for yourself, you may well find yourself not carrying
an umbrella in many situations where, for lack of thinking
clearly about the matter, you would otherwise have carried it.
(Indeed, an analysis is only useful if it often leads to
conclusions other than those you would have reached in the
absence of the analysis.)
Another way to do the same calculation: State your values
of carrying the umbrella and of getting wet if it rains if you
have no umbrella, then figure backwards to the probability of
rain that would make it worthwhile to take the umbrella -- that
is, the probability that would balance the expected values of the
two alternatives. I will leave it to you to check that the
probability must be 10 percent (.1) or greater for carrying it to
be worthwhile, given the valuations in the paragraph above.
Figure of Umbrella Calculation Here
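The umbrella comparison, including the break-even probability, can be checked with a few lines of Python (the variable names are mine):

```python
P_RAIN = 1 / 50        # estimated chance of rain
COST_WET = -10.0       # your dollar valuation of getting soaked
COST_CARRY = -1.0      # your dollar valuation of lugging the umbrella

# Carrying costs -$1 whatever the weather; leaving it home costs
# -$10 only if it rains.
ev_carry = COST_CARRY
ev_leave = P_RAIN * COST_WET + (1 - P_RAIN) * 0.0   # -$0.20

# Break-even: the rain probability at which the two choices are equal,
# i.e. p * COST_WET = COST_CARRY.
p_breakeven = COST_CARRY / COST_WET                 # 0.10
```

With these valuations, ev_carry (-$1.00) is more negative than ev_leave (-$0.20), so leaving the umbrella home has the higher expected value unless the chance of rain reaches 10 percent.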
Expected value is at the heart of all insurance. The
insurer estimates the probability of the insured-against event --
say, the probability of death of a man during the year he is aged
55 in the U. S. at present -- and then multiplies that
probability by the value of the insurance to obtain the expected
value of the loss. That expected value, plus operating
expenses, forms the basis on which the insurance company
calculates the price it will charge for that insurance policy. The
expected-value concept is also at the heart of all prices of
wagers with bookmakers (in the states and countries where that is
legal, of course!).
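As a sketch of that pricing logic, with purely hypothetical figures (none of the numbers below come from the text):

```python
# Hypothetical inputs: an assumed death probability for the insured
# year, a $100,000 policy, and assumed per-policy operating expenses.
p_event = 0.008            # assumed probability of the insured-against event
payout = 100_000           # face value of the policy
operating_expenses = 150   # assumed per-policy expenses

expected_loss = p_event * payout                  # $800.00
minimum_premium = expected_loss + operating_expenses
```

Any premium below minimum_premium would, on average, lose the insurer money over many such policies.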
Another example: The concept of expected value underlies
the decision to accept an offer of a settlement in a law suit
about a patent of yours. Assume that the company you are suing
offers you a $200,000 settlement. You figure that you have a .6
chance of winning $1 million in court. (Leave aside for now the
complication that you do not know for sure how much you would be
awarded if you do win.) The expected value of continuing the suit
to a trial is $600,000, and you should therefore turn down the
offer unless you are willing to pay a lot to avoid the risk of
losing (a matter that will be discussed in the next chapter).
In such a case, unfortunately, the expected-value calculation
of a lawyer working on a contingency basis will differ from the
client's, because the lawyer will take into account the costs of
her time if the settlement is not accepted; the client, in
contrast, does not pay those costs, and the client's calculation
therefore does not include them. Hence the lawyer sometimes has
a stake in the client's accepting a settlement even when it is
not in the client's interest. (This
discrepancy between interests of people on the same side of the
table occurs in many circumstances. For example, it is usually
to the interest of a publisher to set a higher price for a book
than is best for the author.)
The choice of a price to bid in a closed auction is another
important application of the concept of expected value. The
decision hinges on the probabilities of winning the auction at
various prices you might bid; the higher your bid, the more you
would gain if you win the auction, but the lower the chance of
winning because some competitor is more likely to underbid you.
You should evaluate alternative bids according to their expected
values, which you calculate as the probability of winning
multiplied by the gain if the bid is won at that price.
Consider for example that you are in the painting business,
and your town calls for bids on painting the town hall. You
figure that the work will cost you $16,000 if you get the job.
The bid prices you are considering, and the probabilities you
estimate for winning the auction at each of those prices, are
shown in columns 1 and 2 in Table 1-4-1. The expected value for
each price is calculated by multiplying the probability of
winning by the difference between revenue and expenditures
(column 4) if you do get the bid. (For completeness, we also
show the probability of losing multiplied by the payoff of $0 if
you do not win the auction.) The bid price with the highest
expected value in column X is the best alternative.
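Since Table 1-4-1 is not reproduced here, the sketch below uses made-up bid prices and win probabilities to show the shape of the calculation; only the $16,000 cost comes from the text:

```python
COST = 16_000  # your estimated cost of doing the painting job

# Hypothetical bid prices and your estimated probabilities of
# winning the auction at each price (stand-ins for Table 1-4-1).
win_prob = {17_000: 0.9, 18_000: 0.6, 19_000: 0.3, 20_000: 0.1}

# Expected value of each bid: P(win) x profit, plus P(lose) x $0.
ev = {price: p * (price - COST) + (1 - p) * 0
      for price, p in win_prob.items()}

best_bid = max(ev, key=ev.get)
```

Under these guessed probabilities the $18,000 bid wins: a lower bid nearly always wins but earns little, and a higher bid earns more but rarely wins.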
When risk is ignored in a present-value calculation, an
expected value in a future period may be treated just like a
certain income or outgo. In that fashion the complications of
both futurity and uncertainty may be dealt with at once, as long
as no decisions need be made in the future. (If they will be, we
must resort to the more complex machinery of the decision tree,
which we tackle below.)
The Decision Tree
The situation is more complex when there will be a sequence
of choices rather than only one choice. Consider calculating the
expected value of this gamble: You flip a nickel. If it falls
on its head, I'll give you $240, and you will also get a chance
to flip a dime. If the nickel does fall heads and you do get a
chance to flip the dime -- a chance you may reject, of course --
I'll give you $250 if the dime falls on its head, but you must
give me $300 if it falls on its tail. If the original nickel
flip falls on its tail, you get $150 from me plus a chance to
flip a quarter. If you get the chance and choose to flip the
quarter and it falls on its head, I'll give you $150, but if it
falls on its tail, you must give me $100.
Would you pay $200 for this gamble, which is diagrammed in
Figure 1-4-1? It looks easy to evaluate at first, but you soon
see how puzzling it is. The heart of the difficulty is that you
cannot evaluate the choice of taking the deal now unless you know
what you will choose to do after you see whether the coin falls
on its head or tail.
Figure 1-4-1
Curiously, even though all the necessary elements of
knowledge to solve this problem were available 300 years ago, it
was only in the past half-century that the solution was
discovered, a powerful mathematical technique known as "decision-
tree analysis" or "backward induction" -- or more frighteningly,
"dynamic programming".
The way out of the impasse is to start at the farthest-away
point in time and figure the expected values of the farthest-away
sets of outcomes. Then you decide which alternatives you would
choose in the next-to-last period if you reach those points, and
so on, all the way backward to the present. When this process is
complete, and only then, you can choose a first-period
alternative.
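Applied to the nickel-dime-quarter gamble of Figure 1-4-1, the backward-induction steps look like this in Python (a sketch; the function name is mine):

```python
def ev(outcomes):
    """Sum of each payoff times its probability."""
    return sum(p * v for v, p in outcomes)

# Start at the farthest-away choices: each second flip is optional,
# so its value to you is max(EV of flipping, 0 for declining).
dime_flip = ev([(250, 0.5), (-300, 0.5)])      # -25: decline the dime
quarter_flip = ev([(150, 0.5), (-100, 0.5)])   # +25: take the quarter

# Fold those decisions back into the first flip's outcomes.
heads_value = 240 + max(dime_flip, 0)          # 240
tails_value = 150 + max(quarter_flip, 0)       # 175

gamble_value = ev([(heads_value, 0.5), (tails_value, 0.5)])  # 207.5
```

Since the gamble's value works out to $207.50, paying $200 for it leaves you $7.50 ahead in expected value, provided you decline the dime flip and accept the quarter flip.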
The steps in a decision-tree analysis require only simple
arithmetic, and can be easily learned when you need to do so. In
perhaps 9 of 10 cases, the greatest value of the decision-tree
analysis is not the formal calculation, but rather the exercise
of forcing yourself to clarify your thinking on paper.1
Consider, for example, the picture of the decision about choosing
a college major (Figure 1-4-2). You will find it very difficult
to decide on the probabilities, costs, and benefits to put into
the picture. You can avoid making these quantities explicit if
you avoid putting the analysis on paper. But a sound decision
requires that you do make these quantities explicit. And in most
cases, the process of making your best guesses about these
quantities reveals the best decision without formal analysis.2
Figure 1-4-2
(The value of paper, pencil, and picture-making is brought
out by this famous puzzle: A man points to the image of a person
and says, "Brothers or sisters have I none. That man's father is
my father's son." The puzzle is hard for most of us to solve in
our heads. But drawing simple pictures usually reveals the
answer immediately. Try it.)
Before you can assess an event's value to you, you must know the
chance that the event will occur. But the relevant probability is
not obvious in many circumstances. First there is the problem of
assessing even the "simplest" likelihood -- for example, the
likelihood that it will rain on Sunday. (Estimation of
probabilities is discussed in Chapter 00.) Then there is the
complication that several probabilities may interact, such as the
probability of rain on Sunday and your favorite football team
winning the game, a complex probability. Complex probabilities
are dealt with in Chapter 4-2.