Daniel_Burfoot comments on Does Solomonoff always win? - Less Wrong

Post author: cousin_it 23 February 2011 08:42PM


Comment author: Daniel_Burfoot 24 February 2011 12:26:42AM 3 points

"First let's set the input sequence to be very malevolent toward Omega: it will always say the reverse of what Omega is projected to say based on the previous bits."

I'm not sure I buy this, but let me make sure I understand correctly. There is some super-super-intelligence generating bits, which is smarter than Omega and also antagonistic to it. That intelligence basically simulates Omega, figures out what Omega will guess, and sends the opposite value.

First, can't Omega just figure out that it's being simulated, and then find a source of pure randomness, and just guess according to that?

Second, can't Omega at least make sure it remains level with the pitiful human, by figuring out what the human will guess and copying that?

Third, how do you know there even exists a stream of bits that will make Omega lose constantly? It's not at all obvious to me that such strings exist. It seems like you're asserting the existence of N-bit strings that have Kolmogorov complexity substantially larger than N.

Comment author: cousin_it 24 February 2011 06:13:08AM *  4 points

My Omega uses Solomonoff induction. Its beliefs about the next bit are defined mathematically and depend only on the previous bits.

Comment author: mkehrt 25 February 2011 05:08:05AM *  0 points

I'm suspicious of this. My understanding is that true Solomonoff induction is incomputable (because it requires ordering by Kolmogorov complexity). Thus, you can't just create an algorithm to predict its next move.

edit: That is, just because it is "defined mathematically" doesn't mean we can predict its next move.

Comment author: cousin_it 25 February 2011 05:54:27AM *  0 points

The input sequence doesn't have to be computable, it only has to exist.

Comment author: Vladimir_M 24 February 2011 05:21:18AM *  0 points

From what I understand, Omega is defined simply as a function F : L -> {0,1}, where L is the set of all strings over {0,1}, and F((b_1,...,b_n)) is interpreted as its prediction of what the bit b_{n+1} will be given the initial bits b_1,...,b_n. Then you can always define the "malevolent" bit sequence simply as b_{n+1} = ~F((b_1,...,b_n)). F can be wildly uncomputable, of course, so it makes no sense to ask about the Kolmogorov complexity of these bit sequences.
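The diagonal construction is mechanical once F is fixed. A minimal sketch (Solomonoff induction itself is uncomputable, so a toy majority-vote predictor stands in for F here; the function names are illustrative, and the argument only needs F to be a deterministic function of the prefix):

```python
def F(bits):
    """Toy stand-in predictor: guess the majority bit seen so far (ties -> 0)."""
    return 1 if 2 * sum(bits) > len(bits) else 0

def malevolent(n, predictor):
    """Build b_1..b_n with b_{k+1} = NOT predictor(b_1..b_k)."""
    bits = []
    for _ in range(n):
        bits.append(1 - predictor(bits))
    return bits

seq = malevolent(20, F)
# By construction, the predictor is wrong on every single bit:
errors = sum(F(seq[:k]) != seq[k] for k in range(len(seq)))
```

Any deterministic predictor loses every round against the sequence built from its own outputs; swapping in an uncomputable F (like Solomonoff induction) only makes the adversarial sequence itself uncomputable, not nonexistent.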

The generalization where Omega also assigns probabilities rather than just making binary guesses seems obvious.

Comment author: ciphergoth 24 February 2011 10:32:57AM 3 points

"it makes no sense to ask about the Kolmogorov complexity of these bit sequences."

The bit sequences are finite length at each step, so they do have a Kolmogorov complexity.
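This can be made precise with a standard bound (a textbook fact, not from the thread): for any binary string of length n,

```latex
K(b_1 \dots b_n) \le n + O(1),
```

since a program can always hard-code the n bits and print them. So each finite prefix of the malevolent sequence has a well-defined complexity of at most roughly n bits, and no prefix ever needs complexity substantially larger than N; it is only the infinite sequence as a whole that fails to be computable.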