Interesting Talking Machines episode quote about Bayesian statistics being used at Bletchley Park and at GCHQ (its successor). It sounds like Britain held on to a possibly significant advantage (people in cryptography would be better placed to comment on this) for years, owing largely to Turing. (The rest of the episode is about AI safety and is also interesting.)

Source:

http://www.thetalkingmachines.com/blog/2016/2/26/ai-safety-and-the-legacy-of-bletchley-park

GCHQ in the ’70s, we thought of ourselves as completely Bayesian statisticians. All our data analysis was completely Bayesian, and that was a direct inheritance from Alan Turing. I’m not sure this has ever really been published, but Turing, almost as a sideline during his cryptanalytic work, reinvented Bayesian statistics for himself. The work against Enigma and other German ciphers was fully Bayesian. …

Bayesian statistics was an extreme minority discipline in the ’70s. In academia, I only really know of two people who were working majorly in the field, Jimmy Savage … in the States and Dennis Lindley in Britain. And they were regarded as fringe figures in the statistics community. It’s extremely different now. The reason is that Bayesian statistics works. So eventually truth wins out. There are many, many problems where Bayesian methods are obviously the right thing to do. But in the ’70s we understood that already in Britain in the classified environment.

Transcription Source:

https://www.johndcook.com/blog/2017/07/25/bayesian-methods-at-bletchley-park/
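
For concreteness: what "fully Bayesian" cryptanalysis meant in practice, as I. J. Good later described it, was that Turing scored each piece of evidence in log-odds units he called "bans" and "decibans" and summed the scores clue by clue. Here is a minimal sketch of that style of sequential updating; the hypothesis, prior, and clue strengths below are made-up numbers for illustration, not historical ones.

```python
import math

# Sequential Bayesian updating in log-odds, roughly the style Turing
# used against Enigma. Units are decibans: 10 * log10 of a Bayes factor.
# All numbers here are illustrative, not historical.

def to_decibans(bayes_factor: float) -> float:
    """Score one piece of evidence: 10 * log10(P(E|H) / P(E|not H))."""
    return 10.0 * math.log10(bayes_factor)

def posterior_probability(prior: float, scores_db: list[float]) -> float:
    """Add evidence scores to the prior log-odds, then convert back."""
    prior_db = 10.0 * math.log10(prior / (1.0 - prior))
    posterior_db = prior_db + sum(scores_db)  # independent clues just add
    posterior_odds = 10.0 ** (posterior_db / 10.0)
    return posterior_odds / (1.0 + posterior_odds)

# Hypothetical hypothesis H: two intercepted messages were enciphered
# at overlapping machine settings. Three independent clues favor H with
# likelihood ratios 3, 1.5, and 4 -- about 12.6 decibans in total.
scores = [to_decibans(3.0), to_decibans(1.5), to_decibans(4.0)]
print(posterior_probability(prior=0.01, scores_db=scores))  # ~0.15, up from 0.01
```

The appeal of decibans is that scores for independent pieces of evidence simply add, which is what made this kind of updating feasible by hand at Bletchley.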

Comments:

“So eventually truth wins out.”

Eh, no, apparently. Although the article is comforting.

“Eh, no, apparently. Although the article is comforting.”

Things tend to converge to the truth when market forces make it advantageous. This is probably correct for Bayes.