Monday, June 30, 2008

Confirmation bias and the death penalty

So we're continuing our death penalty theme. Don't worry, soon we'll move on to happier subjects!

Justin Wolfers posted this piece today explaining how the Supreme Court misinterpreted his research on the deterrent effect of capital punishment. In the post (which is a follow-up to an op-ed in the Washington Post), he links to some background information he put together on the empirical evidence in the debate.

The relevance of the deterrence debate is that many people have taken the position that the death penalty is justifiable only as a means of preventing murder. The question of deterrence thus cuts to the heart of the death penalty's reason for existence.

Wolfers critiques both Justice John Paul Stevens and Justice Antonin Scalia, who wrote separate concurring opinions. Stevens wrote that "In the absence of such evidence, deterrence cannot serve as a sufficient penological justification for this uniquely severe and irrevocable punishment." To this, Wolfers appropriately responds that "the absence of evidence should not be confused with the evidence of absence." Just because we cannot find a deterrent effect doesn't necessarily mean it doesn't exist.
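
Wolfers' point here is really about statistical power. Below is a minimal simulation sketch, in Python with entirely made-up numbers, of how a real but small deterrent effect can routinely go undetected when murder rates are noisy; the effect size, noise level, and sample size are hypothetical choices for illustration, not estimates from the literature.

```python
# Illustration of "absence of evidence is not evidence of absence":
# if a true deterrent effect is small relative to the noise in murder
# rates, a regression will usually fail to find it even though it exists.
# All numbers below are hypothetical, chosen only to show low power.

import numpy as np

rng = np.random.default_rng(0)

n_states = 50          # cross-sectional units per simulated dataset
true_effect = -0.05    # hypothetical: each execution lowers the murder rate slightly
noise_sd = 1.0         # year-to-year noise in murder rates swamps the effect
n_sims = 5_000

significant = 0
for _ in range(n_sims):
    executions = rng.poisson(2, size=n_states).astype(float)
    murder_rate = 5.0 + true_effect * executions + rng.normal(0, noise_sd, n_states)

    # Ordinary least squares slope and its standard error, computed by hand.
    x = executions - executions.mean()
    y = murder_rate - murder_rate.mean()
    beta_hat = (x @ y) / (x @ x)
    resid = y - beta_hat * x
    se = np.sqrt(resid @ resid / (n_states - 2) / (x @ x))
    t_stat = beta_hat / se

    if abs(t_stat) > 1.96:           # roughly a 5% two-sided test
        significant += 1

print(f"True effect = {true_effect}, but only {significant / n_sims:.0%} "
      "of simulated studies detect it.")
```

With these invented numbers, only a small share of simulated studies turn up a statistically significant effect, even though the effect exists by construction.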

The more consequential critique is of Scalia's opinion, in which the Justice points to "a significant body of recent evidence that capital punishment may well have a deterrent effect, possibly a quite powerful one." Wolfers responds that Justice Scalia may want to update his reading list to include the large body of research contradicting his claim.

Since my last post on this topic discussed the statistical reasons why an answer to the deterrence question remains elusive, I thought it was important to follow up with the other factor hindering the debate: ideology. Justices Stevens and Scalia differ on the death penalty for reasons that go beyond deterrence, and these differences affect how they filter new evidence. Maybe Stevens and Scalia were legitimately swayed by the merits of different econometric techniques, but my guess is that this is the result of "confirmation bias": people have a tendency to seek out information that supports their preexisting worldview. In statistical analysis, confirmation bias shows up when researchers dismiss contradictory findings on methodological grounds, or report only the versions of their results that worked and not the ones that didn't.
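
To see how much damage the report-only-what-worked habit can do, here is another minimal sketch with invented numbers. It assumes there is no deterrent effect at all and that an analyst examines 20 separate, independent slices of the data, writing up whichever slices come out "significant." Real specification searches are messier and the analyses are usually correlated, so treat this only as an illustration of the mechanism.

```python
# Sketch of selective reporting: with no true relationship anywhere,
# an analyst who tries 20 independent slices of the data and reports
# only the "significant" ones will almost always have something to report.
# All numbers are made up for illustration.

import numpy as np

rng = np.random.default_rng(42)

n_slices = 20        # analyses tried per "study"
n_per_slice = 100    # observations in each slice
n_sims = 2_000       # simulated studies

at_least_one_hit = 0
for _ in range(n_sims):
    hits = 0
    for _ in range(n_slices):
        executions = rng.normal(size=n_per_slice)
        murders = rng.normal(size=n_per_slice)   # no true relationship
        r = np.corrcoef(executions, murders)[0, 1]
        t = r * np.sqrt((n_per_slice - 2) / (1 - r**2))
        if abs(t) > 1.96:                        # ~5% two-sided test
            hits += 1
    if hits >= 1:
        at_least_one_hit += 1

print(f"With no true effect, {at_least_one_hit / n_sims:.0%} of simulated "
      "studies still have at least one 'significant' slice to report.")
```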

Supreme Court justices aren't the only ones who fall prey to this bias. It is often hard to find an academic who's willing to change his/her position because of contradictory empirical evidence. In his 2006 testimony to Congress, economist (and deterrence proponent) Paul Rubin stated:

“Another recent paper by Lawrence Katz, Steven D. Levitt, and Ellen Shustorovich uses state-level panel data covering the period 1950 to 1990 to measure the relationship between prison conditions, capital punishment, and crime rates….In several estimations, both the prison death rate and the execution rate are found to have significant, negative relationships with murder rates…”


However, one of the paper's authors, Steven Levitt, disagrees with this characterization of his work. According to Levitt, he and his co-authors ran 20 different models (containing different mixes of variables) and found "significant, negative relationships" in only 3. Generally, if the results of a statistical model are that sensitive to its specification, the underlying relationship is weak or may not exist at all.
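
A quick back-of-envelope calculation puts that 3-out-of-20 figure in perspective. If, hypothetically, there were no true relationship and the 20 models behaved like independent tests at the 5 percent level (which real, overlapping specifications do not), chance alone would be expected to produce about one significant result, and three or more would still turn up under the null a non-trivial share of the time.

```python
# Rough check on the "3 significant results out of 20 models" pattern:
# if there were no true relationship and the 20 models were independent
# tests at the 5% level, how often would chance alone produce 3 or more
# "significant" results? Real specifications share data and are
# correlated, so this is only a heuristic.

from math import comb

n_models, alpha = 20, 0.05
p_at_least_3 = 1 - sum(
    comb(n_models, k) * alpha**k * (1 - alpha) ** (n_models - k)
    for k in range(3)
)
print(f"Expected significant models by chance: {n_models * alpha:.1f}")
print(f"P(3 or more significant under the null): {p_at_least_3:.1%}")
```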

Unfortunately, there's not much we can do about confirmation bias except to be open about it. It takes a lot of effort to approach issues with a clear and open mind, but it's important to try; sometimes it's a matter of life and death.
