
Academian comments on What is Bayesianism? - Less Wrong

81 Post author: Kaj_Sotala 26 February 2010 07:43AM


Comment author: Academian 17 March 2010 01:46:03PM *  4 points [-]

Log odds of independent events do not add up, just as the odds of independent events do not multiply. The odds of flipping heads are 1:1, but the odds of flipping heads twice are 1:3, not 1:1 (you have to multiply odds by likelihood ratios, not odds by odds, and likewise you don't add log odds and log odds, but log odds and log likelihood-ratios). So calling log odds themselves "evidence" doesn't fit the way people use the word "evidence" as something that "adds up".
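
A quick numeric sketch of the point, in Python (the 0.75-biased coin and the variable names are my own illustration, not from the comment):

    from math import log

    def odds(p):
        """Convert a probability to odds in favor, expressed as a single ratio."""
        return p / (1 - p)

    # Odds of independent events do not multiply:
    odds_one_head = odds(0.5)     # 1:1  -> 1.0
    odds_two_heads = odds(0.25)   # 1:3  -> 0.333...
    assert odds_two_heads != odds_one_head * odds_one_head   # 1:1 * 1:1 is not 1:3

    # What does combine multiplicatively is prior odds with a likelihood ratio.
    # Toy update: H = "coin is heads-biased, P(heads) = 0.75" vs. not-H = "coin is fair".
    prior_odds = 1.0               # 1:1 prior odds on H vs. not-H
    lr_per_head = 0.75 / 0.5       # likelihood ratio contributed by observing one head

    posterior_odds = prior_odds * lr_per_head * lr_per_head    # after two observed heads
    log_posterior = log(prior_odds) + 2 * log(lr_per_head)     # the same update in log space
    assert abs(log(posterior_odds) - log_posterior) < 1e-12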

This terminology may have originated here:

http://causalityrelay.wordpress.com/2008/06/23/odds-and-intuitive-bayes/

I'm voting your comment up, because I think it's a great example of how terminology should be chosen and used carefully. If you decide to edit it, I think it would be most helpful if you left your original words as a warning to others :)

Comment author: JGWeissman 17 March 2010 04:53:32PM 0 points [-]

By "evidence", I refer to events that change an agent's strength of belief in a theory, and the measure of evidence is the measure of this change in belief, that is, the likelihood-ratio and log likelihood-ratio you refer to.

I never meant for "evidence" to refer to the posterior strength of belief. "Log odds" was only meant to specify a particular measurement of strength in belief.

Comment author: ciphergoth 17 March 2010 02:44:00PM *  0 points [-]

Can you be clearer? Log likelihood ratios do add up, so long as the independence criterion is satisfied (i.e. so long as P(E_2|H_x) = P(E_2|E_1,H_x) for each H_x).
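
A minimal check of that additivity for the two-hypothesis case, with invented likelihoods (the names are my own):

    from math import log

    # Two hypotheses H and not-H; two pieces of evidence E1, E2 that are
    # conditionally independent given each hypothesis (made-up numbers).
    p_e1 = {"H": 0.8, "notH": 0.3}
    p_e2 = {"H": 0.6, "notH": 0.2}

    llr_e1 = log(p_e1["H"] / p_e1["notH"])
    llr_e2 = log(p_e2["H"] / p_e2["notH"])

    # Under conditional independence the joint likelihoods factor, so the joint
    # log likelihood ratio is exactly the sum of the individual ones.
    llr_joint = log((p_e1["H"] * p_e2["H"]) / (p_e1["notH"] * p_e2["notH"]))
    assert abs(llr_joint - (llr_e1 + llr_e2)) < 1e-12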

Comment author: Academian 17 March 2010 02:56:52PM 2 points [-]

Sure, just edited in the clarification: "you have to multiply odds by likelihood ratios, not odds by odds, and likewise you don't add log odds and log odds, but log odds and log likelihood-ratios".

Comment author: Morendil 17 March 2010 02:55:09PM 1 point [-]

As long as there are only two H_x, mind you. They no longer add up when you have three hypotheses or more.
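
A sketch of how it breaks with a third hypothesis (all numbers invented): the likelihood of the evidence under "not H_1" is a prior-weighted mixture over H_2 and H_3, and that mixture does not factor across independent observations, so the joint log likelihood ratio against H_1 is no longer the sum of the individual ones.

    from math import log

    # Three hypotheses with prior weights; two pieces of evidence that are
    # conditionally independent given each individual hypothesis (made-up numbers).
    prior = {"H1": 0.5, "H2": 0.3, "H3": 0.2}
    p_e1 = {"H1": 0.8, "H2": 0.3, "H3": 0.6}
    p_e2 = {"H1": 0.7, "H2": 0.6, "H3": 0.1}

    def p_given_not_h1(likelihoods):
        """Likelihood under 'not H1': a prior-weighted mixture of H2 and H3."""
        w = prior["H2"] + prior["H3"]
        return (prior["H2"] * likelihoods["H2"] + prior["H3"] * likelihoods["H3"]) / w

    llr_e1 = log(p_e1["H1"] / p_given_not_h1(p_e1))
    llr_e2 = log(p_e2["H1"] / p_given_not_h1(p_e2))

    # The joint likelihood under "not H1" mixes the per-hypothesis *products*,
    # which is not the product of the mixtures, so additivity fails.
    joint = {h: p_e1[h] * p_e2[h] for h in prior}
    llr_joint = log(joint["H1"] / p_given_not_h1(joint))

    print(llr_e1 + llr_e2, llr_joint)   # ~1.204 vs. ~1.445: the sum is not the joint update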

Comment author: ciphergoth 17 March 2010 02:59:42PM 0 points [-]

Indeed - though I find it very hard to hang on to my intuitive grasp of this!

Comment author: Academian 20 March 2010 12:51:28AM 1 point [-]

Here is the post on information theory I said I would write:

http://lesswrong.com/lw/1y9/information_theory_and_the_symmetry_of_updating/

It explains "mutual information", i.e. "informational evidence", which can be added up over as many independent events as you like. Hopefully this will have restorative effects for your intuition!

Comment author: Academian 17 March 2010 03:08:38PM 0 points [-]

Don't worry, I have an information theory post coming up that will fix all of this :)