cousin_it comments on Bayes' rule =/= Bayesian inference - Less Wrong

37 Post author: neq1 16 September 2010 06:34AM




Comment author: Oscar_Cunningham 16 September 2010 08:37:31AM 4 points

I like this post, there's still a lot of confusion around Bayesian methods.

Two things that would have helped me while I was learning Bayesianism:

The frequency with which a coin comes up heads isn't a probability, no matter how much it looks like one.*

and

Bayes' theorem doesn't do parameter estimation.

I might write these into a post sometime.

*This is what's going wrong in the heads of people who say things like "The probability is either 1 or 0, but I don't know which."

Comment author: cousin_it 16 September 2010 09:44:20AM 1 point

I don't understand how you can hold a position like that and still enjoy the post. How do you parse the phrase "my prior for the probability of heads" in the second example?

Comment author: neq1 16 September 2010 01:31:36PM 2 points

In the second example the person was speaking informally, but there is nothing wrong with specifying a probability distribution for an unknown parameter (and that parameter could be the probability of heads).
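To make this concrete, here is a minimal sketch (my own illustration, not from the thread) of putting a distribution on an unknown heads parameter: a Beta(a, b) prior over the coin's bias is conjugate to the Bernoulli likelihood, so Bayes' rule reduces to adding the observed counts to the prior parameters.

```python
# Sketch: a Beta(a, b) prior over the unknown heads parameter theta,
# updated by Bayes' rule after observing coin flips. Because Beta is
# conjugate to Bernoulli, the posterior is Beta(a + heads, b + tails).

def update_beta(a, b, heads, tails):
    """Return the posterior Beta parameters after observing flips."""
    return a + heads, b + tails

def posterior_mean(a, b):
    """Mean of a Beta(a, b) distribution."""
    return a / (a + b)

# Start from a uniform Beta(1, 1) prior and observe 7 heads, 3 tails.
a, b = update_beta(1, 1, 7, 3)
print(a, b)                  # posterior is Beta(8, 4)
print(posterior_mean(a, b))  # point estimate of theta: 8/12
```

Note that the prior here is over a *parameter* of the coin (what Oscar_Cunningham would call its frequency), which is why no regress of probabilities-of-probabilities arises.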

Comment author: Oscar_Cunningham 16 September 2010 11:28:11AM 1 point

I hadn't seen that, but you're right that that sentence is wrong. "Probability" should have been replaced with "frequency" or something. A prior on a probability would be a set of probabilities of probabilities, and would soon lead to infinite regress.

Comment author: neq1 16 September 2010 01:33:14PM 1 point

Only if you keep specifying hyper-priors, which there is no reason to do.

Comment author: Oscar_Cunningham 16 September 2010 02:50:26PM 0 points

Exactly. There's no point in the first meta-prior either.