khafra comments on Rationality Quotes January 2013 - Less Wrong

6 Post author: katydee 02 January 2013 05:23PM


Comment author: khafra 17 January 2013 07:09:28PM 3 points [-]

Fat tailed distributions make the rockin' world go round.

Comment author: gwern 17 January 2013 07:21:13PM 2 points [-]

They don't even have to be fat-tailed; in very simple examples you can know that on the next observation, your posterior will move either up or down, but cannot remain the same.

Here's an example: flipping a coin of unknown bias, with a uniform (Beta(1,1)) prior over the bias, and trying to infer the bias/frequency. Obviously, when I flip the coin, I will either get a heads or a tails, so I know that after my first flip, my posterior will either favor heads or favor tails, but not remain unchanged! There is no landing-on-its-edge intermediate 0.5 coin. Indeed, I know in advance that I will be able to rule out 1 of 2 hypotheses: 100% heads and 100% tails.
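A minimal numeric sketch of this point (not from the thread), using the standard Beta-binomial conjugate update, where the posterior after h heads and t tails from a Beta(a, b) prior is Beta(a + h, b + t):

```python
# Uniform prior Beta(1, 1) over the coin's heads-probability,
# updated by conjugacy after a single observation.

def beta_mean(a, b):
    """Mean of a Beta(a, b) distribution."""
    return a / (a + b)

prior_mean = beta_mean(1, 1)        # 0.5 before any flips
after_heads = beta_mean(1 + 1, 1)   # if the first flip is heads
after_tails = beta_mean(1, 1 + 1)   # if the first flip is tails

# Whichever outcome occurs, the posterior mean moves away from 0.5;
# there is no observation that leaves the prior unchanged.
assert after_heads != prior_mean and after_tails != prior_mean
```

A heads observation also assigns likelihood 0 to the "100% tails" hypothesis (and vice versa), which is the sense in which one extreme hypothesis is ruled out immediately.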

But this isn't just true of the first observation. Suppose I flip twice and get heads then tails; the single most likely frequency is 1/2, since that's what I've observed to date. But now we're back in the same situation as at the beginning: we've accumulated evidence against the most extreme biases like 99% heads, so we have learned something from the 2 flips, but we again expect the posterior to shift in one of 2 specific directions without knowing which: after the next flip, the observed frequency of heads will be either 2/3 or 1/3. Hence, I can tell you - even before flipping - that 1/2 must be dethroned in favor of 1/3 or 2/3!
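The same arithmetic, spelled out with exact fractions (an illustrative sketch, not from the thread):

```python
from fractions import Fraction

# After flipping H then T, the observed frequency is 1/2. The next
# flip takes the count to 2 heads of 3 or 1 head of 3, so the
# observed frequency is forced off 1/2 no matter what happens.
flips, heads = 2, 1
next_freqs = {
    Fraction(heads + 1, flips + 1),  # next flip is heads -> 2/3
    Fraction(heads, flips + 1),      # next flip is tails -> 1/3
}
assert Fraction(1, 2) not in next_freqs
```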

Comment author: [deleted] 17 January 2013 08:30:45PM *  2 points [-]

And yet if you add those two posterior distributions, weighted by your current probability of ending up with each, you get your prior back. Magic!
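This "conservation of expected evidence" can be checked directly for the uniform-prior coin, writing the three relevant Beta densities out by hand (a sketch, not from the thread):

```python
# Mixture of the two possible posteriors, weighted by the prior
# predictive probability of each outcome, recovers the prior density.

def prior_pdf(p):        # Beta(1, 1): uniform on [0, 1]
    return 1.0

def post_heads_pdf(p):   # Beta(2, 1), the posterior after one heads
    return 2 * p

def post_tails_pdf(p):   # Beta(1, 2), the posterior after one tails
    return 2 * (1 - p)

p_heads = 0.5  # prior predictive probability of heads under Beta(1, 1)

for p in [0.0, 0.1, 0.25, 0.5, 0.9, 1.0]:
    mixture = p_heads * post_heads_pdf(p) + (1 - p_heads) * post_tails_pdf(p)
    assert abs(mixture - prior_pdf(p)) < 1e-12
```

The two posteriors lean in opposite directions, and the weighted average of the leans is exactly zero, which is why no observation can be guaranteed in advance to move you toward a particular conclusion.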

(Witch burners don't get their prior back when they do this because they expect to update in the direction of "she's a witch" in either case, so when they sum over probable posteriors, they get back their real prior which says "I already know that she's a witch", the implication being "the trial has low value of information, let's just burn her now".)

Comment author: gwern 17 January 2013 08:34:53PM 1 point [-]

Yup, sure does. Which is a step toward the right idea Kindly was gesturing at.

Comment author: shminux 17 January 2013 08:56:19PM *  -1 points [-]

For estimating a coin's bias, as for most other things, the self-consistent updating procedure follows maximum likelihood.

Comment author: [deleted] 17 January 2013 09:10:07PM 2 points [-]

Max likelihood tells you which hypothesis is most likely, which is mostly meaningless without further assumptions. For example, if you wanted to bet on what the next flip would be, a max likelihood method won't give you the right probability.
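A hypothetical betting case makes the gap concrete (the numbers here are illustrative, not from the thread). After 3 heads in 3 flips, maximum likelihood estimates p = 1, i.e. bet as if tails were impossible; the posterior predictive under a uniform prior, Laplace's rule of succession (h + 1)/(n + 2), still hedges:

```python
from fractions import Fraction

h, n = 3, 3                        # 3 heads in 3 flips
mle = Fraction(h, n)               # maximum likelihood: certain of heads
laplace = Fraction(h + 1, n + 2)   # rule of succession: 4/5 on heads
assert mle == 1
assert laplace < 1                 # tails still gets positive probability
```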

Comment author: [deleted] 18 January 2013 04:13:38PM *  1 point [-]

Yes.

OTOH, the expected value of the beta distribution with parameters a and b happens to equal the mode of the beta distribution with parameters a + 1 and b + 1, so maximum likelihood does give the right answer (i.e. the expected value of the posterior) if you start from the improper prior B(0, 0).
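This identity is easy to verify with exact fractions, using the textbook formulas mean(Beta(a, b)) = a/(a + b) and mode(Beta(a, b)) = (a − 1)/(a + b − 2) (a sketch, not from the thread):

```python
from fractions import Fraction

def beta_mean(a, b):
    return Fraction(a, a + b)

def beta_mode(a, b):  # defined for a, b > 1
    return Fraction(a - 1, a + b - 2)

# After h heads and t tails, starting from the improper Haldane prior
# B(0, 0), the posterior is Beta(h, t); its mean equals the MLE
# h / (h + t), which is also the mode of Beta(h + 1, t + 1).
h, t = 7, 3
mle = Fraction(h, h + t)
assert beta_mean(h, t) == mle
assert beta_mode(h + 1, t + 1) == mle
```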

(IIRC, the same thing happens with other types of distributions, if you pick the ‘right’ improper prior (i.e. the one Jaynes argues for in conditions of total ignorance for totally unrelated reasons) for each. I wonder if this has some particular relevance.)