Tyrrell_McAllister comments on The Joys of Conjugate Priors - Less Wrong Discussion
You are viewing a comment permalink. View the original post to see all comments and the full post content.
This is an excellent article. However, I had the same philosophical problem that Cyan raised in this bullet point:
You seem to suggest that conjugate prior distributions are "smart" because they update in a computationally tractable way. Certainly, as a concession to practical necessity, we have to take computational tractability into account. But it is controversial to think of doing this as part of the ideal epistemology that we are trying to approximate.
Also, I found myself confused at a few points near the beginning. You write
At first, I misread you as saying, in effect, "Given that x occurs, what should be your updated probability that x occurs?" But of course your updated probability that x occurs, conditioned on x's occurring, should be 1.
I also misunderstood you to be proposing to consider the probability of the probability of a given event being such-and-such. That is, I thought that you were proposing to consider a probability of the form P(P(x | y) = p | z), where x, y, and z are events, and p is a number in [0,1]. But, as I understand it, this is not a well-formed notion in Bayesian epistemology.
I think that my confusion arose from your calling \beta an "internal parameter". But, from the subsequent discussion, it seems better to think of \beta as an unknown parameter fed into whatever physical process generated x. For example, \beta could be an unknown parameter fed into a pseudo-random number generator that was observed to output the number x.
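To make this reading concrete, here is a minimal sketch (my own illustration, not from the original post) of the well-formed version of the idea: instead of a probability of a probability, we place a Beta prior over the unknown parameter \beta of the Bernoulli process that generates x, and the conjugate update on an observation is just a pair of additions.

```python
def beta_bernoulli_update(a, b, x):
    """Conjugate update: given a Beta(a, b) prior over the unknown
    Bernoulli parameter beta, observing x in {0, 1} yields a
    Beta(a + x, b + 1 - x) posterior."""
    return (a + x, b + (1 - x))

# Start from a uniform Beta(1, 1) prior over the unknown parameter.
a, b = 1, 1

# Update on a hypothetical sequence of observed outputs of the process.
for x in [1, 0, 1, 1]:
    a, b = beta_bernoulli_update(a, b, x)

# Posterior is Beta(4, 2); its mean is the updated estimate of beta.
posterior_mean = a / (a + b)  # 4 / 6
```

The point of the sketch is that uncertainty lives in the distribution over \beta itself, not in any statement of the form P(P(x | y) = p | z); the observations x are data, and \beta is a latent parameter of the process that produced them.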