September 15, 2004
Knightian Risk and Uncertainty: Two-Card Bets
This post is notes, mainly for myself (or for others who have heard about this subject), on risk. I've started reading Roger Lowenstein's When Genius Failed, about the collapse of Long-Term Capital Management in 1998. Besides the lessons for finance, thinking, and life generally, the book interests me because Professors Merton and Scholes were involved. On page 62 Lowenstein talks about risk, and that started me thinking.
(1) Consider various bets.
Bet 1-A. I have two cards, the 3 of spades and the 4 of spades. I win $1 if a 3 turns up. Value: $.50 (if you're risk neutral). This is the situation some people (Knight) call "risk", though I hate that terminology since it contradicts the usual meaning of the word in economics. "Definite Priors" is better, or "Certainty about Uncertainty".
Bet 1-B. I have two cards. Both are either 3's, or 4's. I win $1 if a 3 turns up. Value: $.50 (if you're risk neutral). This is the situation some people (Knight) call "uncertainty", though I hate that terminology since it contradicts the usual meaning of the word in economics. "Indefinite Priors" is better, or "Uncertainty about Uncertainty".
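The two valuations can be written out explicitly. A minimal sketch, assuming (as the text suggests a Savage-style investor would) a uniform prior over the two possible decks in Bet 1-B; the 50-50 prior is my assumption, since Knight's whole point is that no prior is handed to you:

```python
from fractions import Fraction

# Bet 1-A: the deck is known to be {3, 4}; I win $1 if a 3 turns up.
p_win_1a = Fraction(1, 2)

# Bet 1-B: the deck is either {3, 3} or {4, 4}.  With an assumed
# uniform prior over the two decks:
#   P(win) = P(deck is all 3s) * 1 + P(deck is all 4s) * 0
prior_all_threes = Fraction(1, 2)   # assumed prior, not given by the bet
p_win_1b = prior_all_threes * 1 + (1 - prior_all_threes) * 0

print(p_win_1a, p_win_1b)   # 1/2 1/2 -- same $.50 value either way
```

Any prior other than 50-50 would change the value of 1-B, which is exactly the "wiggle room" the Ellsberg discussion below turns on.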
Some people think it is impossible to value Bet 1-B, because the investor needs to estimate probabilities of probabilities. I've never understood the problem, though, and am left with the impression that those people are just confused in their thinking. Situations with Indefinite Priors make one's head hurt more: it's harder to think outside the box, one *does* have to think up some priors, and not thinking hard enough leads to bigger mistakes. But all that is different from saying that a person can't, or won't, come up with an estimate of the probabilities. I side with Savage against Knight.
The Ellsberg Paradox is essentially that people prefer 1-A to 1-B. The explanation for it that I like is that we don't trust the person on the other side of the bet, so the more wiggle room they have, the less we like it. Maybe they know that, for whatever reason, more people like to bet on 3 than on 4, so they always choose a deck with two 4's instead of one with two 3's. Even more simply, as in the way I specified Bet 1-A, they pick a deck of two 4's and only let me bet on 3. Either way, I'm safer with a deck I know has exactly one 3 and one 4.
(2) Now suppose that I draw a card 100 times, reshuffling after each draw but keeping the same two-card deck throughout. The expected value of each bet will be $50.
Bet 100-A will very likely yield me close to $50, though I might get as little as $0 or as much as $100.
Bet 100-B will yield me either $100 or $0, with zero probability of anything else happening.
Bet 100-A will be preferred to Bet 100-B by any risk-averse investor.
(An investor will be indifferent between 1-A and 1-B unless he suspects the bet is rigged.)
(If the investor is allowed to change what he bets on-- shifting from 3 to 4 and back-- then Bet 100-B is clearly the best, yielding either $100 or $99 to an investor playing the rational strategy. When learning is possible, starting with bad information is, other things equal, good.)
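The rational learning strategy in the parenthetical can be sketched directly: guess one card on the first draw, then switch to whatever the deck actually revealed. The "guess 3 first" convention is arbitrary, my choice for the sketch:

```python
import random

random.seed(1)

def bet_100b_with_learning():
    # Nature fixes the deck: all 3s or all 4s.
    deck_card = random.choice([3, 4])
    guess = 3            # arbitrary first guess
    winnings = 0
    for _ in range(100):
        if guess == deck_card:
            winnings += 1
        guess = deck_card   # the first draw reveals the deck for good
    return winnings

outcomes = {bet_100b_with_learning() for _ in range(1000)}
print(sorted(outcomes))   # only 99 and 100 are possible
```

One draw resolves all the uncertainty, so the indefinite-prior bet dominates once mid-course switching is allowed.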
Although Bet 100-B is riskier than 100-A, it is much simpler. If I wanted to make precise plans for what to do in the various possible outcomes of Bet 100-B, I would only have to make two plans, for $0 and $100, whereas for Bet 100-A I would need 101 plans.
(3) If I were betting on the outcome of the bets, it would be much easier to bet on Bet 100-B. I could bet that the outcome would be either $0 or $100, and I'd win my bet for sure. In this sense, it is quite possible for a situation with Indefinite Priors to be simpler, safer, and more certain.
(4) Suppose I had to pay $0.40 for Bet 1-A or $40 for Bet 100-A. Which would I prefer? Most risk-averse investors would prefer 100-A, because it very likely yields around $50, and is very unlikely to yield less than $40. An old result (due to Samuelson) in finance, though, is that to think Bet 100-A is better for *any* risk-averse investor is The Fallacy of Large Numbers, because even independent repetitions of a gamble do not reduce risk, if by risk we mean the standard definition in terms of mean-preserving spreads. Bet 100-A, at a cost of $40, incurs the risk of a net payoff of -$40. The worst that can happen with Bet 1-A at a cost of $0.40 is a net payoff of -$0.40. Thus, someone who is heavily averse to big down-side risks would prefer Bet 1-A.
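The two sides of the trade-off can be put in numbers: the 100-draw bet's loss is very unlikely but much larger. A sketch using the exact binomial distribution for Bet 100-A:

```python
from math import comb

# Bet 100-A at cost $40: net payoff = (number of 3s in 100 fair draws) - 40.
# A net loss means fewer than 40 wins out of 100.
p_loss_100a = sum(comb(100, k) for k in range(40)) / 2**100

worst_100a = 0 - 40      # lose the entire $40 stake
worst_1a = 0 - 0.40      # lose only $0.40

print(p_loss_100a)       # small -- a loss is rare, but it can be -$40
print(worst_100a, worst_1a)
```

Repetition shrinks the *probability* of loss but multiplies its *size*, which is exactly why the mean-preserving-spread notion of risk does not rank 100-A above 1-A for every risk-averse investor.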
The Fallacy of Large Numbers comes up most often in the context of insurance. Insurance companies work not by collecting risks so they average out, but by dividing risks among many small investors. That is why insurance against big earthquakes and hurricanes-- which do *not* average out-- makes sense (or would, if we didn't all know that the government will bail out even the uninsured, so it is pointless to pay for insurance).
If I get round to reading more of the book, I may discover how these examples fit into LTCM's collapse. I am hoping this will fit into my research program on consumer and bargaining uncertainty over values, as in "Explaining Incomplete Contracts as the Result of Contract-Reading Costs," "Getting Carried Away in Auctions as Imperfect Value Discovery," and "Strategic Implications of Uncertainty Over One's Own Private Value in Auctions."
Posted by erasmuse at September 15, 2004 01:22 PM
It should read "...then Bet 100-B is clearly the best, yielding either $100 or $99 to an investor playing the rational strategy".
Please keep on posting your notes!
Posted by: Michael Stastny at October 3, 2004 02:48 PM
Thanks! I've fixed that mistake, and an odd glitch in the Bet 100-A description, October 3, 2004.
Posted by: Eric Rasmusen at October 3, 2004 07:24 PM