It's illegal for the prosecution or defense to do so in court. Apologies for the lack of context.
The 1971 paper that reports the .70-.74 numbers leads me to believe either that the participants were unbelievably bad at quantification, or that the flaws the 2006 paper points out in the 1971 paper are sufficient to destroy the value of that finding, or that this is one of many studies with fatal flaws. I expect there are very few jurors indeed who would convict while believing the defendant had a 25% chance of being innocent.
I wonder if quantification interferes with analysis for some large group of people? Perhaps the mere mention of math interferes with efficient analysis. I don't know; I can say that in math- or physics-intensive cases, both sides try to simplify for the jury.
In fact, some types of cases have fact patterns that give us fairly narrow confidence ranges. If there's a case where I'm 75% certain the guy did it, and no likely evidence or investigation will improve that number, the case is either not issued or, if that point is reached post-issuance, dismissed.
In 2004, the State of Texas executed Cameron Todd Willingham via lethal injection for the crime of murdering his young children by setting fire to his house.
In 2009, David Grann wrote an extended examination of the evidence in the Willingham case for The New Yorker, which called Willingham's guilt into question. One of the prosecutors in the Willingham case, John Jackson, wrote a response summarizing the evidence from his current perspective. I am not summarizing the evidence here so as not to give the impression of selectively choosing it.
A prior probability estimate for Willingham's guilt (certainly not close to an optimal prior) is the probability that a fire resulting in the deaths of children was intentionally set. The US Fire Administration puts this probability at 13%. The prior could be made more accurate by breaking that 13% of intentionally set fires down by demographic set, or by looking at correlations with other data, such as life insurance records.
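To illustrate how such a prior would be used, here is a minimal sketch of a Bayesian update in odds form, starting from the 13% base rate above. The likelihood ratios are purely hypothetical placeholders for pieces of case evidence, not estimates drawn from the actual Willingham record.

```python
def update(prior, likelihood_ratio):
    """Update a probability by one piece of evidence, using the odds form of Bayes' rule."""
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# Prior: probability that a fatal child fire was intentionally set (USFA figure).
p = 0.13

# Hypothetical evidence items, each expressed as
# P(evidence | guilty) / P(evidence | innocent) -- invented for illustration.
for lr in [2.0, 0.5, 3.0]:
    p = update(p, lr)

print(round(p, 3))  # posterior after the three hypothetical updates
```

The odds form makes the structure clear: each independent piece of evidence simply multiplies the running odds by its likelihood ratio, and evidence with a ratio below 1 (like the 0.5 above) pushes toward innocence.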
My question for Less Wrong: Just how innocent is Cameron Todd Willingham? Intuitively, the evidence for Willingham's innocence seems stronger to me than the evidence for Amanda Knox's innocence. But the prior probability that Willingham was guilty, given that his children died in a fire in his home, is higher than the prior probability that Amanda Knox committed murder, given that a murder occurred in her house.
Challenge question: What does an idealized form of Bayesian Justice look like? I suspect as a start that it would result in a smaller percentage of defendants being found guilty at trial. This article has some examples of the failures to apply Bayesian statistics in existing justice systems.