03.29b Misestimates of Risk As a Reason for Regulation. I have two ideas I want to jot down on this. (1) If some people overestimate a risk and some underestimate it, then government provision of information is a better solution than requiring safety precautions (or banning precautions). (2) Ambiguity aversion in experiments might be based on fear of trickery or of showing low ability; in either case, it gives the government good reason to ignore such aversion in the case of natural risks. For an example to use for both, suppose that scientists all agree that the risk of mad cow disease from eating organic cow brains is 100 per million meals.

(1) Heterogeneity.
Suppose half of consumers think the risk is 0 and half think it is 150. On average, consumers estimate the risk at 75, an underestimate of the true risk of 100. But this does not imply that the government should make it more costly for consumers to eat cow brains -- say, by a tax. That is the correct policy for half of consumers, but it will skew the behavior of the other half even further away from the optimum. A better policy would be to provide information on the true risk, without any attempt to shade it one way or the other.
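To make the arithmetic explicit (the specific tax figure below is my illustration, not something the example pins down): the average perceived risk is

$$\tfrac{1}{2}(0) + \tfrac{1}{2}(150) = 75 < 100,$$

so the average consumer underestimates the risk by 25 per million meals. Now suppose the government imposes a tax on cow-brain meals whose perceived burden is equivalent to the full risk of 100 per million. The consumers who think the risk is 0 behave as if it were $0 + 100 = 100$ -- exactly right -- but the consumers who think it is 150 behave as if it were $150 + 100 = 250$, further from the truth than before. Information that moves both groups' estimates toward 100 distorts neither group.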

Will people respond to such information? One problem is the Two-Armed Bandit/Pseudo-Kripke Paradox I discussed on March 1. The people whose estimates are 0 might rationally put the problem out of their minds and ignore all new information.

(2) Ambiguity Aversion. The idea of the Ellsberg Paradox is that people prefer a single probability to a compound probability. Here, suppose scientists and consumers alike believe that there is actually a 99% probability that the risk is 50.5 per million and a 1% probability that it is 5,000 per million, for an overall risk of .99 (50.5) + .01 (5,000) = 100 per million (approximately). Consumers are more reluctant to eat the cow brains than if the risk were a sure 100 per million. Ought the government to be?
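Written out, the arithmetic behind the "approximately" is

$$.99 \times 50.5 + .01 \times 5{,}000 = 49.995 + 50 = 99.995 \approx 100 \text{ per million}.$$

In expected-risk terms the compound situation is the same gamble as the sure 100; the question is whether the 1% chance of a 5,000-per-million risk deserves extra weight.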

I was going to say that the government should not, but I changed my mind. Here is how I was reasoning. In experiments, consumers dislike ambiguity. In the Ellsberg Paradox, as I just discussed in G406, if Urn 1 contains 50 red balls and 50 white balls, Urn 2 contains an unknown mixture of red and white balls, and you win $10 if a white ball is drawn, you probably would pick Urn 1. The explanation I like best is that you are suspicious of experimenters and don't take sucker bets. Even if conditions are set up so that the experimenter can prove he has not rigged Urn 2, our rule of thumb is to choose the simple over the complicated. (Another application of the idea is in my "Explaining Incomplete Contracts as the Result of Contract-Reading Costs".) Another explanation is that we don't like to make choices that might turn out wrong and make people think we are stupid, an idea akin to "regret theory". Since the mad cow disease example is a gamble against Nature rather than against an experimenter, I thought that our fear of ambiguity was irrational there.
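To see why the Urn 1 preference is a puzzle for standard expected-utility theory (a textbook point, not anything special to this example): assume for concreteness that Urn 2 also holds 100 balls, $k$ of them white, and that your beliefs about $k$ are symmetric, so that $k$ and $100-k$ are equally likely. Then

$$P(\text{white from Urn 2}) = \sum_k p_k \tfrac{k}{100} = \tfrac{1}{2} = P(\text{white from Urn 1}),$$

and you should be indifferent between the urns. Picking Urn 1 anyway is what the suspicion and regret stories are trying to explain.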

But I neglected something. Can we really trust the estimates of the scientists? No, we can't. When someone tells me there is a 99% chance of a good outcome and a 1% chance of a bad outcome, I rightly worry that he is hedging his bets, and maybe trying to fool me into thinking the bad outcome is unlikely while covering himself in case it really does occur. Thus, there is a big difference between 1% and 0%.

Taking this idea a bit further, connect it with the idea of "boiling-in-oil" contracts. If someone tells me the probability of a bad outcome is zero -- whether because his high effort makes it zero or just because he has exerted effort in figuring that out -- I can make him put his money where his mouth is by asking him to forfeit his entire wealth if the bad outcome then occurs. If he tells me the probability is 1%, I can't ask him to take that bet, because he is probably too risk-averse to accept it, given that there is indeed some probability of his losing all his wealth. This makes getting accurate information from an agent much harder.
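A minimal sketch of why the 1% kills the forfeiture contract, assuming (my assumption, not in the original) that the agent has logarithmic utility over wealth $W$ and is paid a bonus $b$ for accepting the clause: he accepts only if

$$(1-p)\,\ln(W+b) + p\,\ln(0) \ge \ln(W).$$

If he truly believes $p = 0$, the left side is just $\ln(W+b) \ge \ln(W)$ and he accepts for any $b \ge 0$. If $p = .01$, then $\ln(0) = -\infty$ and no finite bonus gets him to sign; even with a less extreme utility function, or a forfeit short of his entire wealth, the required compensation can dwarf the value of the information being elicited.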

I thought of these subjects because I am teaching risk in G406, because Rick Harbaugh talked about his research on ambiguity aversion in our game theory lunch, and because I've been reading Kip Viscusi's chapter on "Regulation of Health, Safety, and Environmental Risks" for the Handbook of Law and Economics that will appear in a year or two.

[in full at 04.03.29b.htm ]
