cousin_it comments on Bayes' rule =/= Bayesian inference - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
I like this post; there's still a lot of confusion around Bayesian methods.
Two things that would have helped me while I was learning Bayesianism were that:
and
I might write these into a post sometime.
*This is what's going wrong in the heads of people who say things like "The probability is either 1 or 0, but I don't know which."
I don't understand how you can hold a position like that and still enjoy the post. How do you parse the phrase "my prior for the probability of heads" in the second example?
In the second example the person was speaking informally, but there is nothing wrong with specifying a probability distribution for an unknown parameter (and that parameter could be a probability of heads).
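A minimal sketch of what this comment describes, assuming the standard conjugate setup (function names are my own): put a Beta prior on the unknown heads-frequency of a coin, then update it with observed flips via Bayes' rule, which for Beta-Bernoulli reduces to adding counts.

```python
def update_beta(a, b, heads, tails):
    """Posterior Beta(a', b') parameters after observing
    `heads` heads and `tails` tails, starting from a Beta(a, b) prior."""
    return a + heads, b + tails

def beta_mean(a, b):
    """Mean of a Beta(a, b) distribution: the point estimate
    of the unknown heads-frequency."""
    return a / (a + b)

# Start from a uniform Beta(1, 1) prior over the heads-frequency,
# then observe 7 heads and 3 tails.
a, b = update_beta(1, 1, heads=7, tails=3)
print(beta_mean(a, b))  # posterior mean = 8/12 ≈ 0.667
```

Note that the prior here is over a *parameter* (a long-run frequency), not a "probability of a probability" in the regress-inducing sense discussed below: the distribution and the thing it ranges over live at different levels.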
I hadn't seen that, but you're right that that sentence is wrong. "Probability" should have been replaced with "frequency" or something. A prior on a probability would be a set of probabilities of probabilities, and would soon lead to infinite regress.
only if you keep specifying hyper-priors, which there is no reason to do
Exactly. There's no point in the first meta-prior either.