2. There is really no such thing as permanent security in anything on earth. Not taking risks is really not more secure than taking them, for your present state can always be changed without action on your part. If you don't take the risk of dying by driving to the store, your house could collapse on you and kill you anyway.
3. You are supposed to be afraid when you take a risk. Admit your fears--of loss, of rejection, of failure.
4. Risking normally involves a degree of separation anxiety--the anxiety you feel whenever you are removed from something that makes you feel secure. Many children feel this when they first leave their parents for school. Some college students feel this when they go off to college. Travelers sometimes feel it when they get homesick. The way to overcome separation anxiety is to build a bridge between the familiar and secure and the new. Find out what the new place--school or country--is like and how its elements compare to familiar and secure things at home. Take familiar things with you--books, teddy bear, popcorn popper, whatever.
The same is true of all risks. Make the opportunity as familiar as possible and learn as much about it as you can before you release the security of the old. Find out about the new job, its location, the lifestyle of those who live there, and so on.
Let's take a typical state lottery, for example. The investment for a ticket is a dollar. The usual prize is about $6,500,000 and the chance of winning is about one in 14,800,000. By discounting the possible outcome by the chance of winning (dividing $6.5 million by 14.8 million), we discover that the expected value of the lottery ticket is about 43.9 cents. Since a ticket costs $1.00 (more than twice as much as its expected value), we would conclude that this is a poor risk. Only when the expected value meets or exceeds the required expense is the risk considered worth taking, according to this theory.
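The same calculation can be checked in a few lines of Python, using the figures from the text:

```python
# Expected value of a lottery ticket: discount the prize by the
# chance of winning, then compare the result with the ticket price.
prize = 6_500_000
odds = 14_800_000          # 1 chance in 14,800,000
ticket = 1.00

expected_value = prize / odds            # about $0.439
good_risk = expected_value >= ticket     # worth taking only if EV >= cost

print(f"Expected value: ${expected_value:.3f}")   # $0.439
print("Worth taking?", good_risk)                 # False
```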
1. Sweepstakes. Ticket: $2.50 Prize: $100,000,000 Chance: 1 in
2. State Lottery. Ticket: $1.00 Prize: $42,300,000 Chance: 1 in 14,800,000 (Note: Calculate the expected value for just a single winner and for the number of winners you'd expect based on 80,000,000 entries.)
3. Reader's Digest Sweepstakes. Ticket: 32 cent stamp to return the entry Prize: $6,000,000 Chance: 1 in 256,000,000
4. Publisher's Clearinghouse Sweepstakes. Ticket: 32 cent stamp to return the entry Prize: $10,000,000 Chance: 1 in 140,000,000
5. Charity Raffle. Ticket: $5.00 Prize: $12,400 (new car) Chance: 1 in 3,000
6. Vegas Roulette #1. Ticket: $20 bet Prize: $380 Chance: 1 in 35
7. Reno Roulette #2 Ticket: $25 bet Prize: $975 Chance: 1 in 35
8. Pearl in Oyster Ticket: $10 Prize: $50 Chance: 1 in 8
9. Extended Warranty Ticket (Price of Extended Warranty): $45 Prize (Cost of average covered warranty repair): $180 Chance: 1 in 12
10. In Your Dreams Ticket: $1.00 Prize: $500,000 Chance: 1 in 250
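If you want to check your answers, a short Python sketch computes each expected value the same way as the lottery example. Item 1 is omitted because its odds are not given, and item 2 is computed for a single winner only (the 80,000,000-entry variant is left to you):

```python
# EV = prize discounted by the chance of winning, compared to ticket price.
gambles = [
    # (name, ticket price, prize, "1 in N" odds)
    ("State Lottery",      1.00, 42_300_000,  14_800_000),
    ("Reader's Digest",    0.32,  6_000_000, 256_000_000),
    ("Publisher's C.H.",   0.32, 10_000_000, 140_000_000),
    ("Charity Raffle",     5.00,     12_400,       3_000),
    ("Vegas Roulette #1", 20.00,        380,          35),
    ("Reno Roulette #2",  25.00,        975,          35),
    ("Pearl in Oyster",   10.00,         50,           8),
    ("Extended Warranty", 45.00,        180,          12),
    ("In Your Dreams",     1.00,    500_000,         250),
]

for name, ticket, prize, odds in gambles:
    ev = prize / odds
    verdict = "worth it" if ev >= ticket else "poor risk"
    print(f"{name:18s} EV = ${ev:>10,.2f}  ({verdict})")
```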
Many risks have multiple possible outcomes, each outcome with its own probability of occurrence and its own value. The expected value of a given decision in such cases is the sum of all the values of each outcome, each diminished by its individual probability. The formula is
EV = Σ (p_n x r_n), summed over all outcomes n,
where p is the probability and r is the reward or value of the risk. Note in the following examples that the value of an outcome is represented numerically, but it does not need to represent dollars, or even physical units. A 10 could be units of happiness, pleasure, pain, embarrassment, and so forth, as well as dollars.
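The formula translates directly into a few lines of Python (the function name `expected_value` is just an illustrative choice):

```python
# EV = sum over all outcomes n of (p_n x r_n),
# where p_n is the probability and r_n the reward of outcome n.
def expected_value(outcomes):
    """outcomes is a list of (probability, reward) pairs;
    the probabilities for one decision should sum to 1."""
    return sum(p * r for p, r in outcomes)
```

For instance, `expected_value([(0.9, 10), (0.1, -20)])` evaluates to 7.0.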
Example: Should I go scuba diving this weekend? If I do, there is a ninety percent probability that I will have a lot of fun. I quantify this great fun as 10 fun points. There is also a ten percent probability that I will get hurt, which I quantify as minus 20 fun points. If I make the other decision, to stay home, there is a ninety-nine percent probability that I will be bored, represented by a minus 2 fun points, and a one percent probability that something exciting will happen, which I represent as five fun points (half as exciting as going scuba diving). Our expected value worksheet looks like this:
Go scuba diving:  .90 x +10 = +9.00
                  .10 x -20 = -2.00
                       Total = +7.00

Stay home:        .99 x  -2 = -1.98
                  .01 x  +5 = +0.05
                       Total = -1.93
Here we see that the expected value of going diving is 7, which is much higher than the expected value of staying home, which is a negative 1.93.
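The worksheet can be reproduced in a few lines of Python, with fun points as the reward units:

```python
# Scuba weekend decision; rewards are in "fun points", not dollars.
go_diving = [(0.90, +10), (0.10, -20)]   # great fun vs. getting hurt
stay_home = [(0.99,  -2), (0.01,  +5)]   # boredom vs. a nice surprise

def ev(outcomes):
    return sum(p * r for p, r in outcomes)

print(f"Go diving: {ev(go_diving):+.2f}")   # +7.00
print(f"Stay home: {ev(stay_home):+.2f}")   # -1.93
```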
As another example, suppose I'm trying to decide whether to attempt a repair on my computer or to have a dealer fix it. If I attempt the repair, there are three possible outcomes. One is that I'll succeed, which I value financially, experientially, and egotistically, so I give that a +10. Second is that I will increase the cost of repairing the computer by damaging something. This I rate at -8. The third possibility is that I will ruin the computer and be totally humiliated. This I rate at -20. The probability I see for each of these possibilities is, in order, 50%, 30%, and 20%. Do note that for any given decision, the probabilities of all possible outcomes must add up to 100%.
On the other hand, if I have a dealer repair the computer, there are two possibilities. One is that it will cost a modest amount of money, which I rate at a -2, since I will be out a few bucks and will have to haul the computer into the shop and back. The other possibility is that the repair will cost major money, which I rate at a -9. The probabilities for each of these I predict at 20% for a cheap repair and 80% for an expensive one. Our EV worksheet would then look like this:
Fix it myself:    .50 x +10 = +5.00
                  .30 x  -8 = -2.40
                  .20 x -20 = -4.00
                       Total = -1.40

Have dealer fix:  .80 x  -9 = -7.20
                  .20 x  -2 = -0.40
                       Total = -7.60
Here we see that both expected values are negative, meaning that this decision will probably result in discomfort either way. However, the expected value for doing the repair is "higher" (less negative) than that for having the dealer do it, so that is the way our calculations tell us to go. Note, of course, that if we decide our probabilities are different, or if we decide that our rewards are different, the expected values will change.
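As a check on the arithmetic, here is the same worksheet in Python:

```python
# Computer repair decision; rewards mix money, experience, and ego.
me_fix = [(0.50, +10), (0.30, -8), (0.20, -20)]
dealer = [(0.20,  -2), (0.80, -9)]

def ev(outcomes):
    return sum(p * r for p, r in outcomes)

print(f"Fix it myself:   {ev(me_fix):+.2f}")   # -1.40
print(f"Have dealer fix: {ev(dealer):+.2f}")   # -7.60
```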
1. You want to decide whether or not to take the freeway home from an event you've attended. From experience, you calculate that if you take the freeway, you will either speed home, which you rate at a +8 on the happiness scale, or you will get into a traffic jam, which you rate at a -6 on the happiness scale. If you take the side streets, you will either get home okay, which is a +4 on the happiness scale (since it's only half as fast as the freeway) or you will hit another traffic jam, which you rate as a -7, slightly worse than a jam on the freeway. The probability of a freeway jam is 60% (you'll have to figure out the probability for speeding home). The probability for getting home okay on the side streets is 30%.
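One way to check your answer to this exercise; the probability of speeding home is taken as the remaining 40%, since each decision's probabilities must sum to 100%:

```python
# Exercise 1: freeway vs. side streets, rewards on the happiness scale.
p_speed = 1 - 0.60                 # remaining probability: 40%
freeway = [(p_speed, +8), (0.60, -6)]

p_side_jam = 1 - 0.30              # remaining probability: 70%
side_streets = [(0.30, +4), (p_side_jam, -7)]

def ev(outcomes):
    return sum(p * r for p, r in outcomes)

print(f"Freeway:      {ev(freeway):+.2f}")        # -0.40
print(f"Side streets: {ev(side_streets):+.2f}")   # -3.70
```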
2. Your crop of cotton is infested with insects and you want to decide which pesticide to use. Some of the insects are probably resistant to different available pesticides, so you sit down and figure out the probabilities. If you use ToxiBug, there is a 22% probability that it will kill 95% of the bugs on your crops. There is a 49% probability that it will kill only 71% of the bugs, and a 29% probability that it will kill only 43% of the bugs. (Use the bug percentages as reward numbers, so 95% is a reward of 95.)
If you use Bug-O-Kill, there is a 71% probability that it will kill 90% of the bugs, a 24% probability that it will kill 11% of the bugs, and a 5% probability that it will kill 19% of the bugs.
If you use MegaDeath Bug Viability Terminator, there is a 90% chance that 60% of the bugs will be killed, and a 10% chance that 5% will be killed.
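A sketch for checking exercise 2, with the kill percentages used directly as reward numbers, as the exercise instructs:

```python
# Exercise 2: expected kill rate for each pesticide.
pesticides = {
    "ToxiBug":    [(0.22, 95), (0.49, 71), (0.29, 43)],
    "Bug-O-Kill": [(0.71, 90), (0.24, 11), (0.05, 19)],
    "MegaDeath":  [(0.90, 60), (0.10,  5)],
}

for name, outcomes in pesticides.items():
    total = sum(p * r for p, r in outcomes)
    print(f"{name:10s} EV = {total:.2f}")
# ToxiBug 68.16, Bug-O-Kill 67.49, MegaDeath 54.50
```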
3. You have been retained by Amalgamated Pencil Sharpeners to help determine whether the company should export its new sharpener XT-S to Brazil. If APS does export, there are three foreseen outcomes. First, there is a 25% probability that the product will sell well, netting the company (after startup costs) $280,000. Second, there is a 40% probability that the sharpeners will have only modest performance, earning the company a net of only $15,000. Lastly, the product might be rejected, causing the company to lose its startup costs of $175,000. The probability for this outcome is the remaining 35%.
If the company decides not to export the sharpeners, it could invest the startup money with a 90% probability of making a net of $18,000 and a 10% probability of losing a net of $27,000.
What should the company do?
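A check for exercise 3; the rejection probability is computed as the remainder, since the three export outcomes must total 100%:

```python
# Exercise 3: export the XT-S to Brazil, or invest the startup money?
p_reject = 1 - 0.25 - 0.40         # remaining probability: 35%
export = [(0.25, 280_000), (0.40, 15_000), (p_reject, -175_000)]
invest = [(0.90,  18_000), (0.10, -27_000)]

def ev(outcomes):
    return sum(p * r for p, r in outcomes)

print(f"Export: ${ev(export):,.0f}")   # $14,750
print(f"Invest: ${ev(invest):,.0f}")   # $13,500
```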
4. As a diving buff, you have been asked to help salvage a sunken treasure ship off the coast of Florida. Your only problem is an abundance of riches: There are three ships to choose from. And, to make life interesting, there is a little uncertainty about whether each has already been salvaged. (If the ship has already been salvaged, there will be no treasure at all left, and the attempt will result in a net loss equal to the cost of mounting the expedition.) Judging by the records of each ship's inventory and the probability of previous salvaging, you have this information:
If you salvage the Jacques D'Ambois, there is a 60% probability of finding the $20,000,000 in gold and silver bars on board. Cost of salvaging this wreck is $5,000,000.
If you salvage the Acana, there is a 75% probability of finding $11,000,000 in doubloons and jewels. Cost of salvaging this wreck is $3,000,000.
If you salvage the Princess Avanti, there is a 20% probability of finding $30,000,000 in gold and a 25% probability of finding only $15,000,000. Cost of salvaging this wreck is $4,000,000.
Subtract the cost of salvaging from the hoped-for return in each case. Subtract the probability of success from 100% to find the probability of failure. Failure results in the expense of salvaging (a net loss).
Which ship should be salvaged?
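One way to set up exercise 4: on success the net reward is the treasure minus the salvage cost, and on failure the cost is simply lost. The helper function here is an illustrative choice, not part of the exercise:

```python
# Exercise 4: three wrecks; amounts are in millions of dollars.
def salvage_ev(p_success, treasure_outcomes, cost):
    """treasure_outcomes: (probability, treasure) pairs whose
    probabilities together equal p_success; failure loses the cost."""
    ev = sum(p * (t - cost) for p, t in treasure_outcomes)
    ev += (1 - p_success) * (-cost)     # failure branch: cost is lost
    return ev

jacques  = salvage_ev(0.60, [(0.60, 20)], 5)
acana    = salvage_ev(0.75, [(0.75, 11)], 3)
princess = salvage_ev(0.45, [(0.20, 30), (0.25, 15)], 4)

print(f"{jacques:.2f} {acana:.2f} {princess:.2f}")   # 7.00 5.25 5.75
```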
5. Penelope has a serious illness for which doctors have recommended surgery. If she has the operation, there is a 60% chance she will recover and live another 50 years. There is a 20% chance she will live only 20 more years. And there is a 20% chance that she will die on the operating table or shortly thereafter. If she does not have the operation, there is a 60% chance that she will live only five years. There is a 15 percent chance that she will live 15 years. And there is a 25% chance she will spontaneously recover and live 50 years. For each case, let the number of years to live equal the possible reward. For the possibility of dying on the operating table, make that equal to the negative of the value of not having the operation at all (so calculate the not-having-the-operation EV first).
Should she have the operation?
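For exercise 5, the no-operation EV must be computed first, because the exercise defines the reward for dying on the table as its negative. The 15 percent figure is the remainder needed to make the no-operation probabilities total 100%:

```python
# Exercise 5: rewards are years of life.
p_live_15 = 1 - 0.60 - 0.25               # remaining probability: 15%
no_op = [(0.60, 5), (p_live_15, 15), (0.25, 50)]

def ev(outcomes):
    return sum(p * r for p, r in outcomes)

ev_no_op = ev(no_op)                       # 17.75 expected years
operate = [(0.60, 50), (0.20, 20), (0.20, -ev_no_op)]
ev_op = ev(operate)                        # 30.45 expected years

print(f"Operate:      {ev_op:.2f}")
print(f"No operation: {ev_no_op:.2f}")
```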
6. You have $250,000 to invest for a year. If you put it in stocks, there is a 50 percent chance that you will net a return of $40,000. There is, however, a 20 percent chance that you'll lose $2,000, and a 30 percent chance that the market will really decline and you'll lose $50,000.
If you put the money in the bank, there is a 95 percent chance that you'll earn $17,500 in interest. There is, however, a small chance--the remaining 5 percent--that the bank will go broke, and since the FDIC insurance covers only $100,000, you would lose $150,000.
Which investment has the highest expected value?
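A quick check for exercise 6; the bank-failure chance is taken as the remaining 5 percent:

```python
# Exercise 6: $250,000 for a year, stocks vs. the bank.
stocks = [(0.50, 40_000), (0.20, -2_000), (0.30, -50_000)]
p_bank_fails = 1 - 0.95                   # remaining probability: 5%
bank = [(0.95, 17_500), (p_bank_fails, -150_000)]

def ev(outcomes):
    return sum(p * r for p, r in outcomes)

print(f"Stocks: ${ev(stocks):,.0f}")   # $4,600
print(f"Bank:   ${ev(bank):,.0f}")     # $9,125
```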
7. You are trying to decide between three used cars, all of which are priced the same. If you buy used car number one, there is a 70 percent probability that you'll have to spend $400 to get the engine back in shape. However, there is a 30 percent probability that the engine will have to be replaced, which will cost you $2,000.
If you choose car number 2, there is a 50 percent probability that you won't have to spend any money at all, a 30 percent probability that emission repairs will cost only $200, but there is a 20 percent chance that the car will require a California smog conversion (since it may be a European import that hasn't been built for California). This will cost you $5,000.
If you choose car number three, you will face a 60 percent probability of an $800 transmission repair, a 35 percent probability of a small adjustment, and a 5 percent possibility that you'll need to spend $1,600 to fix the engine and the transmission.
Which car should you buy? Consider the costs as negative values and choose the one with the lowest negative total.
8. Your true love comes up to you and says, "Darling, I can't decide whether we should go to the beach or to a movie, because while the beach would be twice as much fun if it doesn't rain, there is a 30 percent chance of rain today. And if it rains, the beach would be no fun at all." You smile knowingly and reply, "Well, sweetheart, I just happen to know how to calculate expected values. I'll solve the problem for us." If the fun you would have at the beach is a 10, what should your decision be?
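The dialogue works out like this in code; the movie is taken to be half the beach's fun (5 points), rain or shine, which is one reading of "twice as much fun":

```python
# Exercise 8: beach vs. movie, rewards in fun points.
beach = [(0.70, 10), (0.30, 0)]   # no fun at all if it rains
movie = [(1.00, 5)]               # half the beach's fun, rain or shine

def ev(outcomes):
    return sum(p * r for p, r in outcomes)

print(f"Beach: {ev(beach):.1f}")  # 7.0
print(f"Movie: {ev(movie):.1f}")  # 5.0
```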
1. Dismiss extremely remote or unrealistic possibilities. For example, in the decision, Shall I go to the store? there are risks like dying on the freeway, being shot by robbers, buying poisoned food, and so forth, but these should not normally enter into the risk evaluation because they are extremely improbable. Remember that all life is accompanied by risk: ten thousand television sets catch fire each year, a hundred thousand people walk through plate glass, 125,000 do-it-yourselfers injure themselves with power tools, 70,000 children are injured by toys, and ten thousand people are poisoned by aspirin. But what are we willing to give up? Some risks are not really remote, yet we willingly take them: automobile deaths, for example, come at about 1 chance in 4,000 of dying each year.
And of course whenever you trust someone, you risk betrayal; when you open yourself, you risk exploitation or ridicule; whenever you hand over a dollar, you risk being defrauded.
2. Insofar as possible, avoid catastrophes. If there is a small but significant chance for catastrophe, then the regular expected value calculations may not apply.
A major principle of risk management is to avoid any real risk of catastrophe at any reasonable cost. The difficulty of applying this principle comes from the uncertainty of what is a real risk and what is a reasonable cost.
3. Recognize the tradeoffs. Remember that every action of life has some risk to it. Even when we don't take the risk upon ourselves, risk is often put upon us by the nature of life and society. Eating, you risk food poisoning or choking, but you have to eat or you'll die. Socializing, you risk disease; driving or flying, you risk crashing; but in some sense you have to socialize and travel. Lying in the sun, you risk skin cancer; smoking, you risk lung cancer; eating French fries, you risk heart disease.
Don't deny the risks involved in living and don't worry excessively about the consequences of modern life.
4. Maximize Expected Values. Normally, the expected value of each alternative shows its relative preferability. That is, you are opting for the greatest probability of the greatest good. Remember, though, that these calculations are guides, and are based on what may be very subjective probabilities and rewards. You are not "required by law" to choose any particular alternative. If you believe that the alternative with the highest EV is a poor choice, you should reconsider the probabilities and rewards you have assigned to all the alternatives.
With these ideas in mind, you'll better understand why some people pursue dangerous sports like skiing, sky diving, race car driving and so forth. The risk/benefit ratio is acceptable to them. It may be useful to note here, too, that most people are not rational risk takers. They take some risks all out of proportion to any expected return and avoid other risks that have a large expected value compared to the risk.
Copyright 1998, 2009, 2012 by Robert Harris | How to cite this page