alexflint comments on Bayesianism in the face of unknowns - Less Wrong Discussion

1 Post author: rstarkov 12 March 2011 08:54PM

Comment author: alexflint 13 March 2011 12:25:41PM 0 points [-]

I see the point you're making about observation selection effects, but surely in this case it doesn't flatten the posterior very much. Of all the times you see a coin come up heads 100 times in a row, most of them will be for coins with p(heads) close to 1, even if you are discarding all other runs. That's assuming you select coins independently for each run.
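The claim is easy to check numerically. With a uniform Beta(1, 1) prior over p(heads), observing 100 heads in 100 flips gives a Beta(101, 1) posterior, whose mass sits almost entirely near 1. A minimal sketch using the closed forms (function names are illustrative, not from the post):

```python
# Posterior over p(heads) after observing all-heads runs, under a
# uniform Beta(1, 1) prior. With h heads out of h flips the posterior
# is Beta(h + 1, 1), whose density is (h + 1) * p**h.

def posterior_mean(heads):
    # Mean of Beta(heads + 1, 1) is (heads + 1) / (heads + 2).
    return (heads + 1) / (heads + 2)

def posterior_prob_above(threshold, heads):
    # P(p > t) under Beta(heads + 1, 1) is 1 - t**(heads + 1).
    return 1 - threshold ** (heads + 1)

mean = posterior_mean(100)             # ~0.990: centred very close to 1
tail = posterior_prob_above(0.9, 100)  # nearly all mass lies above 0.9
```

So conditioning on a run of 100 heads leaves almost no posterior mass on coins that are anywhere near fair, consistent with the point above.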

Comment author: alexflint 13 March 2011 12:28:00PM 0 points [-]

Hmm, perhaps I misread the post. I was assuming he was picking a single coin and flipping it 100 times.

Comment author: othercriteria 13 March 2011 02:38:09PM 0 points [-]

The description of the coin flips as having a Binomial(n=?, p) distribution, instead of a Bernoulli(p) distribution, might be a cause of the misreading.

Comment author: rstarkov 13 March 2011 08:07:10PM *  0 points [-]

Perhaps. To be clear, each coin is flipped just once, i.e. Binomial(n=1, p), which is the same thing as Bernoulli(p). I was trying to point out that for any other n the coin would behave just like a normal coin, if someone were to keep flipping it.
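That equivalence can be verified directly from the standard pmf formulas: for n = 1, C(1, k) p^k (1-p)^(1-k) reduces to p for k = 1 and 1-p for k = 0. A quick check, assuming nothing beyond those textbook definitions:

```python
from math import comb, isclose

def binomial_pmf(k, n, p):
    # Standard binomial pmf: C(n, k) * p^k * (1 - p)^(n - k).
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

def bernoulli_pmf(k, p):
    # Bernoulli pmf: p for k = 1, (1 - p) for k = 0.
    return p if k == 1 else 1 - p

# For n = 1 the two distributions agree at both possible outcomes.
for p in (0.1, 0.5, 0.99):
    for k in (0, 1):
        assert isclose(binomial_pmf(k, 1, p), bernoulli_pmf(k, p))
```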