alex_zag_al comments on Beautiful Probability - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
And in a way, you do, even doing Bayesian statistics. The messiness is just in the actual numerical calculations, not in the definitions of the rules.
Suppose you're trying to find a good model for some part of the real world, and you have a set of candidate models. When you see data and use Bayes' Theorem to compute posterior probabilities, the expression will contain P(data|m) for each model m, as well as P(data). And if your models are messy - as they must be to represent a messy real world - these become complicated expressions that make for a really awful calculation.
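In symbols, the update being described is just Bayes' Theorem over the set of candidate models, with P(data) expanded as a sum over them (assuming, for simplicity, a discrete model set):

```latex
P(m \mid \text{data}) = \frac{P(\text{data} \mid m)\, P(m)}{P(\text{data})},
\qquad
P(\text{data}) = \sum_{m'} P(\text{data} \mid m')\, P(m').
```

The rule itself is clean; the messiness lives inside the terms P(data|m) and the sum (or integral) defining P(data).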
So Bayesian statisticians have a toolbox of methods for approximating posterior probabilities - the Laplace approximation, Gibbs sampling, and so on - to deal with that messiness.
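To make one of those tools concrete, here is a minimal sketch of the Laplace approximation, not taken from the comment: fit a Gaussian to the posterior by finding its mode and using the curvature (second derivative of the log posterior) there. The coin-flip example, the flat prior, and the function names are all illustrative assumptions; for this problem the exact posterior is a Beta distribution, so we can check the approximation against known values.

```python
import math

def log_posterior_unnorm(theta, heads, tails):
    # Log of the unnormalized posterior for a coin's bias theta,
    # under a flat prior on (0, 1): log P(data | theta) + const.
    return heads * math.log(theta) + tails * math.log(1 - theta)

def laplace_approximation(logp, lo=1e-6, hi=1 - 1e-6, steps=10000):
    # Locate the posterior mode by grid search (crude but dependency-free),
    # then fit a Gaussian: variance = -1 / (second derivative at the mode).
    grid = [lo + (hi - lo) * i / steps for i in range(steps + 1)]
    mode = max(grid, key=logp)
    h = 1e-5  # finite-difference step for the curvature
    second = (logp(mode + h) - 2 * logp(mode) + logp(mode - h)) / h**2
    return mode, -1.0 / second

heads, tails = 60, 40
mode, var = laplace_approximation(
    lambda t: log_posterior_unnorm(t, heads, tails)
)
# Exact posterior here is Beta(61, 41); the Laplace fit should land near
# theta = heads/n = 0.6 with variance about heads*tails/n**3 = 0.0024.
print(mode, var)
```

The Gaussian fit is only good when the posterior is unimodal and roughly bell-shaped; for multimodal or heavy-tailed posteriors, sampling methods like Gibbs do better.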
(these are my impressions from a bit of reading in the Bayesian statistics literature - I've never actually done these things, and I don't know what the expressions really look like)