Warrigal comments on Forcing Anthropics: Boltzmann Brains - Less Wrong

17 Post author: Eliezer_Yudkowsky 07 September 2009 07:02PM


Comment author: pengvado 10 November 2009 02:46:59AM * 2 points

"My observations are the output at [...] by a high-volume random experience generator"

"My observations are [...], which were output by a high-volume random experience generator". Since the task is to explain my observations, not to predict where I am. This way also makes it more clear that that suffix is strictly superfluous from a Kolmogorov perspective.

In order for one's priors to be well-defined, then for large N, all hypotheses of length N+1 together must be more likely than all hypotheses of length N together.

You mean less likely. i.e. there is no nonnegative monotonic-increasing infinite series whose sum is finite. Also, it need not happen for all large N, just some of them. So I would clarify it as: ∀L ∃N>L ∀M>N (((sum of probabilities of hypotheses of length M) < (sum of probabilities of hypotheses of length N)) or (both are zero)).

But you shouldn't take that into account for your example. The theorem applies to infinite sequences of hypotheses, but not to any one finite hypothesis such as the disjunction of a billion green rooms. To get conclusions about a particular hypothesis, you need more than "any prior is Occam's razor with respect to a sufficiently perverse complexity metric".
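The convergence constraint above can be made concrete with a toy prior. The following sketch is illustrative only (the weighting scheme is an assumption, not something from this thread): assume binary hypotheses, so there are 2^n hypotheses of length n, and give each one prior weight 2^(-2n). Then the total mass at length n is 2^n · 2^(-2n) = 2^(-n), which decreases in n and sums to a finite total, satisfying the condition pengvado states.

```python
# Toy sketch of a well-defined prior over hypothesis lengths.
# Assumptions (not from the original thread): binary hypothesis alphabet,
# per-hypothesis weight 2^(-2n) for a hypothesis of length n.

def mass_at_length(n: int) -> float:
    """Total prior probability assigned to all length-n hypotheses."""
    num_hypotheses = 2 ** n          # binary strings of length n
    weight_each = 2.0 ** (-2 * n)    # per-hypothesis prior weight
    return num_hypotheses * weight_each  # = 2^(-n)

masses = [mass_at_length(n) for n in range(1, 30)]

# The per-length masses are monotonically decreasing, as the theorem
# requires along some subsequence (here, along every n)...
assert all(a > b for a, b in zip(masses, masses[1:]))

# ...and the total mass is finite (it converges to 1 as n -> infinity).
print(sum(masses))
```

Note that the theorem only forces the decrease along *some* subsequence of lengths; this toy prior happens to decrease at every length, which is stronger than necessary.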

Comment author: [deleted] 10 November 2009 03:47:41AM 1 point

"My observations are [...], which were output by a high-volume random experience generator". Since the task is to explain my observations, not to predict where I am. This way also makes it more clear that that suffix is strictly superfluous from a Kolmogorov perspective.

You are correct, though I believe your statement is equivalent to mine.

You mean less likely. i.e. there is no nonnegative monotonic-increasing infinite series whose sum is finite. Also, it need not happen for all large N, just some of them. So I would clarify it as: ∀L ∃N>L ∀M>N (((sum of probabilities of hypotheses of length M) < (sum of probabilities of hypotheses of length N)) or (both are zero)).

Right again; I'll fix my post.