"The mathematical mistakes that could be undermining justice"
They failed, though, to convince the jury of the value of the Bayesian approach, and Adams was convicted. He appealed twice unsuccessfully, with an appeal judge eventually ruling that the jury's job was "to evaluate evidence not by means of a formula... but by the joint application of their individual common sense."
But what if common sense runs counter to justice? For David Lucy, a mathematician at Lancaster University in the UK, the Adams judgment indicates a cultural tradition that needs changing. "In some cases, statistical analysis is the only way to evaluate evidence, because intuition can lead to outcomes based upon fallacies," he says.
Norman Fenton, a computer scientist at Queen Mary, University of London, who has worked for defence teams in criminal trials, has just come up with a possible solution. With his colleague Martin Neil, he has developed a system of step-by-step pictures and decision trees to help jurors grasp Bayesian reasoning (bit.ly/1c3tgj). Once a jury has been convinced that the method works, the duo argue, experts should be allowed to apply Bayes's theorem to the facts of the case as a kind of "black box" that calculates how the probability of innocence or guilt changes as each piece of evidence is presented. "You wouldn't question the steps of an electronic calculator, so why here?" Fenton asks.
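To make the "black box" idea concrete, here is a minimal sketch (in Python, with invented numbers and evidence labels) of the kind of calculation such a system would perform: start from prior odds of guilt and multiply in a likelihood ratio for each piece of evidence in turn. This is not Fenton and Neil's actual system, which uses Bayesian networks to handle dependencies between pieces of evidence; it only shows the mechanical, calculator-like step Fenton is alluding to.

    # Sequential Bayesian updating of the odds of guilt.
    # Each likelihood ratio is P(evidence | guilty) / P(evidence | innocent).
    # All numbers and evidence labels below are made up for illustration.

    def update_odds(prior_odds, likelihood_ratios):
        """Multiply prior odds by each likelihood ratio in turn."""
        odds = prior_odds
        for name, lr in likelihood_ratios:
            odds *= lr
            prob = odds / (1 + odds)
            print(f"after {name}: odds = {odds:.4g}, P(guilty) = {prob:.3f}")
        return odds

    evidence = [
        ("blood type shared by 10% of population", 1 / 0.10),
        ("resembles eyewitness description", 2.0),
        ("alibi from a family member", 0.5),
    ]

    # Prior: treat the defendant as one of roughly 1,000 plausible suspects.
    update_odds(prior_odds=1 / 999, likelihood_ratios=evidence)

Once the prior and the likelihood ratios are agreed, the rest really is as mechanical as a calculator, which is the force of Fenton's analogy; the contested part is supplying those inputs, which is Donnelly's objection.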
It is a controversial suggestion. Taken to its logical conclusion, it might see the outcome of a trial balance on a single calculation. Working out Bayesian probabilities with DNA and blood matches is all very well, but quantifying incriminating factors such as appearance and behaviour is more difficult. "Different jurors will interpret different bits of evidence differently. It's not the job of a mathematician to do it for them," says Donnelly.
The linked paper is "Avoiding Probabilistic Reasoning Fallacies in Legal Practice using Bayesian Networks" by Norman Fenton and Martin Neil. The interesting parts, IMO, begin on page 9, where they argue for using the likelihood ratio as the key piece of information for evidence rather than raw probabilities; page 17, where a DNA example is worked out; and pages 21-25, on the key piece of evidence in the Bellfield trial: no one claiming a lost possession (nearly worthless evidence).
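The likelihood-ratio point is easy to see with a toy DNA calculation (my own illustrative numbers, not the worked example from the paper). A one-in-a-million random-match probability is a statement about the evidence, P(match | innocent); how much it should move a juror depends entirely on the prior, which is exactly what quoting a raw probability obscures:

    # Why the likelihood ratio matters more than the raw match statistic.
    # Illustrative numbers only; not the example worked out in the paper.

    random_match_probability = 1e-6                    # P(DNA match | innocent person)
    likelihood_ratio = 1 / random_match_probability    # assuming P(match | guilty) = 1

    for pool_size in (100, 10_000, 1_000_000):
        prior_odds = 1 / (pool_size - 1)               # defendant is one of pool_size possible sources
        posterior_odds = prior_odds * likelihood_ratio
        p_guilty = posterior_odds / (1 + posterior_odds)
        print(f"suspect pool {pool_size:>9,}: P(guilty | match) = {p_guilty:.4f}")

The same match statistic supports anything from near-certainty to a coin flip depending on how many people could plausibly have left the trace, which is the argument for handing the court a likelihood ratio rather than a bare probability.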
Related reading: Inherited Improbabilities: Transferring the Burden of Proof, on Amanda Knox.
Applying Bayesian methods to evidence in criminal trials would make explicit a conflict that currently goes unstated. Unlike the standard "a preponderance of the evidence," for which there is both folk and professional consensus that a probability of 0.51 is required, "beyond a reasonable doubt" does not, to my knowledge, have an associated mathematical probability. At a folk level, people in the US claim to believe "It is better to let ten guilty men go free than to send one innocent man to prison," but there is ample circumstantial evidence that this is not always a true preference, and I highly doubt there would be anything remotely approaching consensus for a standard of 0.9.
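For what it's worth, the "ten guilty men" proverb does imply a number under a simple expected-cost reading (my gloss, not anything from the article): count a wrongful conviction as 10 units of harm and a wrongful acquittal as 1, and with p the probability of guilt, convicting is the lower-expected-harm option only when

    10 × (1 − p) < 1 × p,   i.e.   p > 10/11 ≈ 0.91,

which is essentially the 0.9 standard I doubt people would actually endorse.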
Trying to establish a numerical standard would devolve into mind-killing politics fairly quickly, I suspect. It might break along party lines, or it might break along lines of "people more likely to know someone who was the victim of a previously-acquitted criminal" vs. "people more likely to know someone who was wrongfully prosecuted", but either way, it would just be something new to argue about.