DanielLC comments on Help with a (potentially Bayesian) statistics / set theory problem? - Less Wrong

Post author: joshkaufman | 10 November 2011 10:30PM


Comment author: DanielLC | 13 November 2011 09:30:54PM | 0 points

As I stated at the beginning, I don't know the standard meaning of "maximum entropy prior."

This time when I looked it up, I found a simpler definition for finite cases; I'm not sure why I missed it before. I think I can figure out where the confusion is. I was thinking of every possible combination of opinions as a separate possibility. In that case, the maximum-entropy distribution makes all of the opinions independent of each other. If, on the other hand, you only look at the correlation, and treat H(80) = 50 as a single case, then maximum entropy would seem to make H(n) uniformly distributed (see the sketch below).
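
To make the two readings concrete, here is a small sketch in Python. This is purely my own illustration, not from the original discussion, and N = 4 is an arbitrary toy size: the maximum-entropy distribution over the joint combinations of N binary opinions is uniform over all 2^N of them (i.e., the opinions are independent), and the count it induces is binomial, not uniform; treating each count as one case instead gives the uniform distribution over counts, which has the higher entropy on that smaller space.

    import math

    # Toy illustration (my own, not from the thread): N people each hold or
    # reject an opinion, so there are 2**N joint combinations of opinions.
    N = 4

    # Maximum entropy over the JOINT combinations is the uniform distribution
    # on all 2**N of them, i.e. independent fair coins. The distribution this
    # induces on the count k (how many hold the opinion) is Binomial(N, 1/2).
    induced_counts = [math.comb(N, k) / 2**N for k in range(N + 1)]

    # Maximum entropy when each COUNT is treated as one case is instead the
    # uniform distribution over {0, ..., N}.
    uniform_counts = [1 / (N + 1)] * (N + 1)

    def entropy_bits(p):
        # Shannon entropy in bits, skipping zero-probability cases.
        return -sum(q * math.log2(q) for q in p if q > 0)

    print("counts induced by independence:", induced_counts)
    print("uniform over counts:           ", uniform_counts)
    print("entropy over counts:", entropy_bits(induced_counts),
          "vs", entropy_bits(uniform_counts))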

I don't think that's quite right either. I suspect it has something to do with H(n) being continuous rather than discrete. I know the Jeffreys prior for that case is Beta(1/2, 1/2), as opposed to Beta(1, 1), which is the uniform distribution.
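
To see how the two priors differ, here is a minimal comparison, again my own sketch rather than part of the original comment, and it assumes SciPy is available: the Jeffreys density piles up near 0 and 1, while the uniform density is flat.

    from scipy.stats import beta  # assumes SciPy is installed

    jeffreys = beta(0.5, 0.5)  # Jeffreys prior for a binomial proportion
    uniform = beta(1.0, 1.0)   # uniform prior on [0, 1]

    # The Jeffreys density grows toward the endpoints; the uniform density
    # is 1 everywhere on [0, 1].
    for p in (0.05, 0.50, 0.95):
        print(f"p = {p}: Jeffreys pdf = {jeffreys.pdf(p):.3f}, "
              f"uniform pdf = {uniform.pdf(p):.3f}")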