I think Bayesian justice would result in a larger percentage of defendants being found guilty at trial, because instead of "guilty beyond a reasonable doubt", the prosecution would only have to prove "expected value of conviction > expected value of no conviction".
EDIT: On the other hand, if someone committed an awful crime, but can convince you that they won't do it again; or if they might, but they pay a lot of taxes; let them go.
If the standard used is value to society, then if the defendant is judged to have no value to society, and executions are cheap, convict and execute whenever p(defendant will commit more crime) > 0. If the defendant has a net cost to society, execute regardless.
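The "expected value of conviction > expected value of no conviction" rule from the top comment can be sketched as a simple comparison. All utility numbers below are invented placeholders, not anything proposed in the thread:

```python
# Hedged sketch of the expected-value conviction rule described above.
# The utility values are hypothetical; a real system would have to
# estimate them somehow, which is most of the problem.

def should_convict(p_guilty: float,
                   value_convict_guilty: float,
                   value_convict_innocent: float,
                   value_acquit_guilty: float,
                   value_acquit_innocent: float) -> bool:
    """Convict iff EV(conviction) > EV(no conviction)."""
    ev_convict = (p_guilty * value_convict_guilty
                  + (1 - p_guilty) * value_convict_innocent)
    ev_acquit = (p_guilty * value_acquit_guilty
                 + (1 - p_guilty) * value_acquit_innocent)
    return ev_convict > ev_acquit

# Made-up utilities: convicting an innocent person is very costly,
# acquitting a guilty one moderately costly.
print(should_convict(0.9, 10, -100, -20, 0))  # → True
print(should_convict(0.1, 10, -100, -20, 0))  # → False
```

Note that with these placeholder utilities the rule convicts at a much lower probability threshold than "beyond a reasonable doubt" once the cost of acquitting the guilty is large, which is the asymmetry the comment is pointing at.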
If government functions via redistribution of taxation, then most people have a negative value to society, since most of the government's income comes from the top 10% or so. Therefore, execute the bottom 90%. Tax, and redistribute among the survivors. Again, the bottom 90% has negative value. Execute. Repeat. You eventually converge on a single citizen, whose expected contribution to society (minus his cost to society) is zero by some measures. At that point, flip a coin.
The precedential value must be taken into account in these EVs. If someone who can convince you they won't do it again gets off, others who are similarly convincing will be encouraged to cash in their one-free-murder card. That's bad policy.
In 2004, the State of Texas executed Cameron Todd Willingham by lethal injection for the crime of murdering his young children by setting fire to his house.
In 2009, David Grann wrote an extended examination of the evidence in the Willingham case for The New Yorker, which called Willingham's guilt into question. One of the prosecutors in the Willingham case, John Jackson, wrote a response summarizing the evidence from his current perspective. I am not summarizing the evidence here so as not to give the impression of selectively choosing it.
A prior probability estimate for Willingham's guilt (certainly not close to an optimal prior) is the probability that a fire resulting in the deaths of children was intentionally set. The US Fire Administration puts this probability at 13%. The prior could be made more accurate by breaking down that 13% of intentionally set fires across demographic sets, or by looking at correlations with other data, such as life insurance records.
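Starting from that 13% prior, each piece of case evidence would update the estimate via Bayes' rule, most conveniently in odds form. The likelihood ratios below are invented purely for illustration; assigning real ones is exactly the hard part of the question:

```python
# Sketch of updating the 13% prior (the US Fire Administration figure
# cited above) with likelihood ratios for individual pieces of evidence.
# The example likelihood ratios are hypothetical.

def update_odds(prior_prob: float, likelihood_ratios) -> float:
    """Bayes' rule in odds form: posterior odds = prior odds x product of LRs."""
    odds = prior_prob / (1 - prior_prob)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1 + odds)

prior = 0.13
# Hypothetical LRs: one item of evidence favoring guilt (LR > 1),
# one discrediting review favoring innocence (LR < 1).
posterior = update_odds(prior, [3.0, 0.2])
print(round(posterior, 3))  # → 0.082
```

The odds form makes it easy to see that evidence combines multiplicatively, so one strongly discredited piece of forensic testimony (a small likelihood ratio) can outweigh several weakly incriminating ones.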
My question for Less Wrong: Just how innocent is Cameron Todd Willingham? Intuitively, it seems to me that the evidence for Willingham's innocence is stronger than the evidence for Amanda Knox's innocence. But the prior probability that Willingham was guilty, given that his children died in a fire in his home, is higher than the prior probability that Amanda Knox committed murder, given that a murder occurred in Knox's house.
Challenge question: What does an idealized form of Bayesian Justice look like? I suspect as a start that it would result in a smaller percentage of defendants being found guilty at trial. This article has some examples of the failures to apply Bayesian statistics in existing justice systems.