Sunday, August 10, 2003

BANS ON LIE DETECTOR TESTS are a great example of not just foolish but wasteful and dangerous government regulation. But to see how useful lie detector tests are, one must not ask scientists or lawyers; one must think about the data like an economist or businessman. (I don't mean to insult scientists and lawyers--but the fact that all the smart scientists and lawyers agree on something is not a reliable guide to whether it is good policy.) Law professor Instapundit's August 5 posting and the recent National Academy of Sciences report attacking lie detectors are prime examples of this mistake. Using just these two sources and the anti-lie-detector August 3, 2003 Boston Globe story that Instapundit cites (now gone from the free site), we can see that lie detectors actually are useful and effective, even without going to pro-lie-detector sources.

Big ideas that are useful here are:

  1. If lots of competitive businesses use a practice for a long time, it probably works.
  2. A practice must be judged in comparison to other practices; it must be better than the alternatives, not good in some absolute sense, and certainly not perfect.
  3. A practice should be judged in the way it is ordinarily used, not in the ways it could be misused. In particular, if it is always used in combination with other practices, don't test it in isolation.
  4. A practice ought to pass a cost-benefit analysis. A crude but cheap practice can be better than a more effective but expensive one.
Now, to lie detectors. The Boston Globe says:
Congress outlawed its use on employees of private businesses with the Employee Polygraph Protection Act of 1988, a law based on doubts about the machine's reliability and spurred by revelations that more than 400,000 people a year were being regularly polygraphed in the private sector. But the act specifically exempted various government agencies from its provisions, allowing the use of the polygraph in the public sector to explode in the past 15 years. For example, the polygraph is now used to screen applicants for 62 percent of the nation's police departments, compared with 19 percent 40 years ago. The federal government alone runs 20 polygraph programs and employs more than 500 examiners.
Thus, we learn that the private sector used lie detectors extensively, no doubt at great cost. If the tests didn't work, those companies were not only wasting that money but also making bad decisions based on the results. Note, too, that the government uses them extensively despite banning them for other people. All this points to lie detectors being unpopular with certain employee groups, perhaps because they are effective. Note further:

In 2002, when the FBI was looking for who might have leaked information regarding what the government might have known before the attacks of September 11, 2001, it asked members of the Senate Select Committee on Intelligence and their staffs to submit to polygraphs. Out of the question, the legislators replied. "They're not even admissible in court," scoffed Senator Richard C. Shelby, Republican of Alabama, whose support for widespread polygraph testing within the federal government was otherwise unbounded.
Why did the Senators object, and Shelby in particular? Because if there did turn out to be a spy on his staff, he'd rather not know-- it could cost him re-election.

That was where matters stood until 1999, when Congress began looking closely at security at the Los Alamos and Sandia labs in the midst of the Wen Ho Lee fiasco. The uproar was immediate and loud. After a series of stormy public meetings in New Mexico, Congress mandated the testing of the 20,000 employees at both labs. But New Mexico senator Jeff Bingaman, for whom this was a constituent matter, forced into the bill the funding for the National Academy of Sciences report on the reliability of the polygraph when used for security screening. When it was released late last year, the study proved the most significant critique of the polygraph since the Frye decision.
Here we have another example of the same thing. Those laboratories are national disgraces as far as security is concerned, as the Wen Ho Lee case showed. If I remember rightly, Berkeley, the long-time administrator, was just fired because of that. Scientists hate security, of course. And nobody at the labs wants any more spies to be caught.

Then, the anti-polygraph article makes an amazing revelation:

It is in this area that the faith placed in the polygraph seems most fragile and in which the machine seems to run most directly on faith. Both Aldrich Ames and Robert Hanssen, Soviet megaspies in the [CIA] and the FBI, respectively, passed screening polygraphs, Ames by persuading the examiner that he'd misread the chart. The case of Wen Ho Lee is similarly instructive. Accused of leaking nuclear weapons data to the Chinese government, Lee passed a polygraph. Seeking to elicit a confession, however, the examiner told Lee that he'd failed. The best use of the polygraph was conflated with the worst, and Lee wound up in jail until he was cleared of all the major charges against him. Ames, from his cell in a federal prison, is leading a campaign against the use of polygraph screening.
So we learn that in one out of three cases, the polygraph would have caught a major spy: the machine flagged Ames, and he escaped only by talking the examiner out of the reading. Aldrich Ames's treason, recall, led to the deaths of a number of U.S. agents in the Soviet Union. Taken seriously, the polygraph would have saved their lives. Hanssen passed his test--too bad. Lee passed too. The article implies that Lee was not a spy, which I think is wrong, but on the article's assumption, the test worked there too. Whether it did or not, catching Ames would have been worth testing 50,000 people, I'm sure (suppose he caused $500 million worth of damage--that would be $10,000 per person tested in benefit, which easily passes any cost-benefit test).
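The back-of-envelope arithmetic above can be checked with a short script. The $500 million damage figure and the 50,000-person testing pool are the hypothetical numbers from the paragraph, not measured quantities:

```python
# Back-of-envelope cost-benefit check for the Ames example.
# Both figures are hypothetical, taken from the text above.
damage_prevented = 500_000_000   # dollars of damage avoided (assumed)
employees_tested = 50_000        # size of the testing pool (assumed)

benefit_per_test = damage_prevented / employees_tested
print(f"Benefit per person tested: ${benefit_per_test:,.0f}")
```

Any polygraph exam costing well under $10,000 per person would pass this test, which is why the conclusion is robust to even large errors in the assumed damage figure.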

Then the Globe tells us:

In 1997, an article in a psychological journal put the machine's reliability at only 61 percent. (Between 1994 and 1997, more than 3,000 FBI applicants were judged to have been "withholding information" during their pre-employment polygraph examinations.)
Only 61 percent? That sounds stupendously high (though I don't know what "reliability" means here). Compare that with any other means of detecting lies that you like--and, in particular, interviews.

And finally we come to the National Academy of Sciences report. From the Globe article and the report's executive summary:

The study determined that not only was the polygraph useless for security screening but that its use might actually be detrimental to the work of keeping the labs secure. It argued that the test was so vague that, to catch one spy, nearly 100 other employees might have to have their security clearances lifted. "Polygraph testing," the report concluded, "yields an unacceptable choice . . . between too many loyal employees falsely judged deceptive and too many . . . threats left undetected." [Globe]

Table S-1 illustrates the unpleasant tradeoffs facing policy makers who use a screening technique in a hypothetical population of 10,000 government employees that includes 10 spies, even when an accuracy is assumed that is greater than can be expected of polygraph testing on the basis of available research. If the test were set sensitively enough to detect about 80 percent or more of deceivers, about 1,606 employees or more would be expected to "fail" the test; further investigation would be needed to separate the 8 spies from the 1,598 loyal employees caught in the screen. [NAS report]

This seems to contradict the 1997 study the Globe cited with its 61%. But suppose the NAS is correct. Then the polygraph, by itself, will clear 84% of the tested employees. You tell those people to go back to work. What happens next? Do you fire the 16% who fail? No, of course not. You look and see whether they have taken lots of trips to China, whether they've bought new cars, whether they work on sensitive subjects, and so forth. And I bet none of these other criteria--which sound quite reasonable--have anything like the same degree of accuracy. We use them because we use every criterion that has at least slight value and is cheap.
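The NAS Table S-1 arithmetic can be reproduced in a few lines. The 80% detection rate comes from the report's hypothetical; the roughly 16% false-positive rate is not stated directly but is implied by the 1,598 loyal employees flagged out of 9,990:

```python
# Reproducing the NAS report's hypothetical screening scenario:
# 10,000 employees, 10 of them spies, 80% detection rate, and a
# false-positive rate of about 16% (implied by the 1,598 figure).
population = 10_000
spies = 10
loyal = population - spies            # 9,990 loyal employees

sensitivity = 0.80                    # fraction of spies the test flags
false_positive_rate = 0.16            # fraction of loyal employees flagged (implied)

spies_flagged = round(spies * sensitivity)
loyal_flagged = round(loyal * false_positive_rate)
total_flagged = spies_flagged + loyal_flagged
cleared = population - total_flagged

print(f"Spies caught: {spies_flagged}")
print(f"Loyal employees flagged: {loyal_flagged:,}")
print(f"Total failing the test: {total_flagged:,}")
print(f"Cleared by the test alone: {cleared / population:.0%}")
```

This matches the report's figures: 8 spies and 1,598 loyal employees among the roughly 1,606 failures, with 84% of the workforce cleared by the polygraph alone--which is the point of the paragraph above: the test narrows the pool for the other, cruder criteria.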

Now go back to my four big ideas and you'll see how they apply.

[ http://php.indiana.edu/~erasmuse/w/03.08.10a.htm ]

 
