Warrigal comments on Forcing Anthropics: Boltzmann Brains - Less Wrong
"My observations are [...], which were output by a high-volume random experience generator". Since the task is to explain my observations, not to predict where I am. This way also makes it more clear that that suffix is strictly superfluous from a Kolmogorov perspective.
You mean less likely; that is, there is no nonnegative, monotonically increasing infinite series whose sum is finite. Also, it need not happen for all large N, just for some of them. So I would clarify it as: ∀L ∃N>L ∀M>N ((sum of probabilities of hypotheses of length M) < (sum of probabilities of hypotheses of length N), or both sums are zero).
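A minimal sketch of the quantified claim, assuming a toy prior p(h) ∝ 4^(−|h|) over binary hypothesis strings (this prior and the helper names below are illustrative, not from the thread). Under it the total mass at length n is 2^n · 4^(−n) = 2^(−n), strictly decreasing, so a witness N exists beyond every L:

```python
# Toy check: for every L there is an N > L whose per-length probability mass
# exceeds that of every longer length M > N (up to a finite search horizon).
# Illustrative prior: p(h) proportional to 4**(-len(h)) over binary strings.

def mass_at_length(n: int) -> float:
    """Total (unnormalized) prior mass of all binary hypotheses of length n."""
    return (2 ** n) * (4.0 ** -n)  # = 2**(-n): strictly decreasing in n

def witness_N(L: int, horizon: int = 60) -> int:
    """Find some N > L with mass(M) < mass(N) for every M in (N, horizon]."""
    for N in range(L + 1, horizon):
        if all(mass_at_length(M) < mass_at_length(N)
               for M in range(N + 1, horizon + 1)):
            return N
    raise AssertionError("no witness found within horizon")

for L in (0, 5, 10):
    N = witness_N(L)
    print(f"L={L}: N={N}, mass(N)={mass_at_length(N):.3e}")
```

Here N = L + 1 always works because the per-length mass is strictly decreasing; the finite horizon is only a bound for the brute-force check, not part of the claim.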
But you shouldn't take that into account for your example. The theorem applies to infinite sequences of hypotheses, but not to any one finite hypothesis such as the disjunction of a billion green rooms. To get conclusions about a particular hypothesis, you need more than "any prior is Occam's razor with respect to a sufficiently perverse complexity metric".
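A small sketch of that last point (the hypothesis names and probabilities below are invented for illustration): given any prior p whatsoever, defining the complexity metric K(h) = −log2 p(h) makes the prior exactly an Occam prior, p(h) = 2^(−K(h)). The convergence theorem constrains the tail of an infinite hypothesis sequence, but puts no bound on how much mass a single finite hypothesis, like the billion-room disjunction, can carry.

```python
import math

# Illustrative prior that heavily favors one long disjunctive hypothesis.
# (Hypothesis names and probabilities are made up for this sketch.)
prior = {
    "I am in a green room": 0.01,
    "disjunction of a billion green rooms": 0.98,  # huge but finite hypothesis
    "everything else": 0.01,
}

# The induced "complexity metric": K(h) = -log2 p(h).
K = {h: -math.log2(p) for h, p in prior.items()}

# Under this metric the prior is exactly an Occam prior, p(h) = 2**(-K(h)),
# even though K rates the billion-room disjunction as the "simplest" hypothesis.
for h, p in prior.items():
    assert math.isclose(p, 2 ** -K[h])
    print(f"K={K[h]:6.2f} bits  p={p:.2f}  {h}")
```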
You are correct, though I believe your statement is equivalent to mine.
Right again; I'll fix my post.