roystgnr comments on What Bayesianism taught me - LessWrong

62 points | Post author: Tyrrell_McAllister | 12 August 2013 06:59AM


Comment author: roystgnr 11 August 2013 04:21:55AM 0 points

Occam's razor should be on your list. Not in the "Solomonoff had the right definition of complexity" sense, but in the sense that any proper probability distribution has to integrate to 1, and so for any definition of complexity satisfying a few common sense axioms the limit of your prior probability has to go to zero as the complexity goes to infinity.

I think you've oversimplified the phrasing of 6 (not your fault, though; more the fault of the English language). Although your expected value for your future estimate of P(H) should be the same as your current estimate of P(H), that doesn't imply symmetry of expected future evidence. For example, I have a very high expectation that future evidence will very slightly increase my already very strong belief that aliens are not visiting Earth; this is mostly balanced out by a very tiny expectation that future evidence will strongly decrease that belief.
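The asymmetry roystgnr describes is compatible with conservation of expected evidence, and a toy calculation makes that concrete. All numbers below are invented for illustration; the point is only that E[posterior] = prior can hold even when one branch moves the estimate far more than the other:

```python
# Toy sketch (all numbers hypothetical) of roystgnr's point:
# E[posterior P(H)] equals the prior, yet expected evidence is asymmetric.
# H = "aliens are not visiting Earth", held with high confidence.
prior = 0.999

# Two possible future observations:
#  - "nothing unusual" (overwhelmingly likely): nudges P(H) up slightly
#  - "credible visitation evidence" (very unlikely): drops P(H) sharply
p_nothing = 0.9999
posterior_if_nothing = 0.99903          # a very slight increase over 0.999
p_visit = 1 - p_nothing

# Conservation of expected evidence pins down the other posterior:
# prior = p_nothing * posterior_if_nothing + p_visit * posterior_if_visit
posterior_if_visit = (prior - p_nothing * posterior_if_nothing) / p_visit

expected_posterior = (p_nothing * posterior_if_nothing
                      + p_visit * posterior_if_visit)

print(round(expected_posterior, 6))     # equals the prior: 0.999
print(round(posterior_if_visit, 3))     # a large drop, to about 0.699
```

So a tiny expected upward nudge in the likely branch is balanced by a rare but large drop in the unlikely one; the distribution over posteriors is far from symmetric even though its mean is fixed.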

Comment author: Tyrrell_McAllister 11 August 2013 04:47:13AM 2 points

for any definition of complexity satisfying a few common sense axioms the limit of your prior probability has to go to zero as the complexity goes to infinity.

What are these axioms?

Although your expected value for your future estimate of P(H) should be the same as your current estimate of P(H), that doesn't imply symmetry of expected future evidence.

Right. In general, the distribution for your posterior probability is by no means symmetric about your prior probability.

Comment author: ciphergoth 11 August 2013 09:19:36PM 0 points

Assuming you think only in terms of discrete options, I think the only axiom you need is that for any level of complexity k there is at least one option that is that complex.

EDIT: I'm wrong, you don't even need this.

Comment author: Tyrrell_McAllister 13 August 2013 10:30:22PM 1 point

Does this give one any reason to believe that, if two hypotheses are under consideration, the simpler one is a priori more likely? If not, it seems to me to be missing something too crucial to be called a formalization of Occam's razor.

Comment author: ciphergoth 14 August 2013 07:01:08AM 0 points

Right, you'd need more than that one axiom before you could really say you had a formulation of Occam's razor. I'm just making a more specific point: whatever formulation of complexity you come up with, so long as it satisfies the axiom above, will have the property that any probability distribution over discrete outcomes must assign diminishing probability to increasingly complex hypotheses in the limit.

EDIT: actually, even without that axiom, so long as you consider only discrete hypotheses and your definition of complexity maps each hypothesis to a positive real number, the total probability mass assigned to hypotheses more complex than x falls to zero as x goes to infinity.
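This tail-mass claim is easy to check numerically. The sketch below uses an arbitrary toy prior and an arbitrary complexity assignment (deliberately not monotone in probability); both are invented for illustration, since the argument works for any such choices:

```python
# Sketch of the point above: for a probability distribution over a
# discrete set of hypotheses, and ANY map from hypotheses to positive
# real "complexities", the probability mass on hypotheses more complex
# than x must shrink to zero as x grows (it's the tail of a convergent
# sum). The prior and complexity function here are arbitrary toys.

def tail_mass(probs, complexities, x):
    """Total probability assigned to hypotheses with complexity > x."""
    return sum(p for p, c in zip(probs, complexities) if c > x)

n = 1000
# A toy prior over n hypotheses (geometric weights, normalized)...
weights = [2.0 ** -k for k in range(1, n + 1)]
total = sum(weights)
probs = [w / total for w in weights]
# ...with a complexity assignment that is NOT monotone in probability:
complexities = [((7 * k) % n) + 1 for k in range(n)]

for x in (10, 100, 500, 900):
    print(x, tail_mass(probs, complexities, x))  # non-increasing in x
```

The tail mass is non-increasing in x by construction (raising x only removes hypotheses from the sum), and it hits zero once x exceeds every assigned complexity, which is the discrete version of the limit claim.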

Comment author: Eugine_Nier 15 August 2013 02:33:48AM -1 points

Not in the "Solomonoff had the right definition of complexity" sense, but in the sense that any proper probability distribution has to integrate to 1, and so for any definition of complexity satisfying a few common sense axioms the limit of your prior probability has to go to zero as the complexity goes to infinity.

Assuming you restrict to discrete probability distributions.