jimrandomh comments on Frequentist Magic vs. Bayesian Magic - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
I don't understand why your universal prior passes the tape through the Turing machine. If you have a source of genuinely random tape inputs in the first place, why not just use the probability that a tape has the first four bits '1100'? And in any case, what's the advantage of invoking Turing machinery in place of the good old fair coins?
The trick, and the reason for using Turing machines, is that '0000' and '1111' ought to be considered more likely than '0101' and '1010', because they're simpler; and the way this model represents simplicity, a sequence is simple if it's the output of a short program (where 'program' in this case means 'prefix to a Turing machine tape'). If you used the tape directly, then '0000000000' and '0100011011' would be equally likely, which, generally speaking, they're not.
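To make that concrete, here's a toy sketch of the idea. The "machine" below is made up purely for illustration (it's nowhere near a universal Turing machine): it has a prefix-free program set where '0' prints all zeros, '10' prints all ones, and '11' followed by n bits prints those bits verbatim. Feeding it random tape bits and summing 2^-len(program) over every program that produces a given output shows why '0000' ends up more probable than '0100' even though both are four bits long.

```python
from fractions import Fraction

def toy_machine(program, n):
    """Toy 'Turing machine' (illustrative only, not universal).
    Maps a bit-string program to an n-bit output, or None if invalid:
      '0'            -> n zeros
      '10'           -> n ones
      '11' + n bits  -> those n bits verbatim
    The program set is prefix-free, so a random tape begins with
    at most one valid program.
    """
    if program == '0':
        return '0' * n
    if program == '10':
        return '1' * n
    if program.startswith('11') and len(program) == 2 + n:
        return program[2:]
    return None

def prior(sequence):
    """Probability that a uniformly random tape starts with a program
    whose output is `sequence`: sum of 2^-len(p) over such programs."""
    n = len(sequence)
    total = Fraction(0)
    # Enumerate every candidate program up to the longest valid length.
    for length in range(1, n + 3):
        for i in range(2 ** length):
            p = format(i, '0{}b'.format(length))
            if toy_machine(p, n) == sequence:
                total += Fraction(1, 2 ** length)
    return total

print(prior('0000'))  # 33/64: the short program '0' (1/2) plus the literal '110000' (1/64)
print(prior('0100'))  # 1/64: only the literal encoding '110100'
```

Using the tape directly would assign both sequences probability 1/16; routing the tape through a machine is what lets the compressible sequence pick up extra probability mass from its short program.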