The uniform measure on binary strings is a very simple measure, so Solomonoff induction will converge on it with high probability.
Can you explain why? What result says that the Solomonoff distribution "as a whole" often converges on the uniform measure?
It's not clear to me from this description whether the SI predictor is also conditioned. In any case, if the universal prior is not conditioned, then convergence is easy, since the uniform distribution has very low complexity. If it is conditioned, then over your life you will no doubt have observed many processes well modelled by a uniform distribution -- flipping a coin is a good example -- so the estimated probability of encountering a uniform distribution in a new situation won't be all that low.
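The unconditioned case can be illustrated with a toy Bayesian mixture (a hypothetical stand-in for the universal mixture, not SI itself): two Bernoulli hypotheses, uniform (p = 0.5) and biased (p = 0.9), with made-up prior weights playing the role of 2^-K complexity penalties. Because the data is actually uniform, the posterior mass on the uniform hypothesis goes to 1:

```python
import math
import random

random.seed(0)

# Hypothetical hypotheses and complexity-style log-priors (in bits);
# these numbers are illustrative assumptions, not real K-complexities.
hyps = {"uniform": 0.5, "biased": 0.9}
log_post = {"uniform": -1.0, "biased": -5.0}

# Data drawn from the uniform measure: 1000 fair coin flips.
for _ in range(1000):
    bit = random.random() < 0.5
    for name, p in hyps.items():
        log_post[name] += math.log2(p if bit else 1 - p)

# Normalize to get the posterior weight on the uniform hypothesis.
z = max(log_post.values())
weights = {n: 2 ** (lp - z) for n, lp in log_post.items()}
post_uniform = weights["uniform"] / sum(weights.values())
print(post_uniform)  # very close to 1: the mixture converges on uniform
```

The same dominance argument applies in the real universal mixture: a low-complexity hypothesis that matches the data can only lose a bounded amount of posterior weight.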
Indeed, with so much data SI will have built a model of language...
You're about to flip a quantum coin a million times (these days you can even do it on the internet). What's your estimate of the K-complexity of the resulting string, conditional on everything else you've observed in your life so far? The Born rule, combined with the usual counting argument, implies you should say "about 1 million". The universal prior implies you should say "substantially less than 1 million". Which will it be?
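The counting argument behind the "about 1 million" answer: there are fewer than 2^m programs shorter than m bits, so at most a 2^-k fraction of n-bit strings can have K-complexity below n - k. A quick empirical companion (using `os.urandom` as a stand-in for a real quantum coin, and `zlib` as a crude stand-in for a universal compressor) shows that a random million-bit string doesn't compress at all:

```python
import os
import zlib

# Counting bound: at most a 2**-k fraction of n-bit strings
# have K-complexity below n - k.
n, k = 1_000_000, 20
print(f"fraction of {n}-bit strings with K < {n - k}: under {2.0 ** -k:.1e}")

# Empirical illustration: n random bits from the OS entropy source
# do not compress, consistent with K-complexity close to n.
data = os.urandom(n // 8)
compressed = zlib.compress(data, 9)
print(len(compressed), len(data))  # compressed output is no shorter
```

Of course zlib only upper-bounds K-complexity, but the counting argument says no compressor, universal or otherwise, can beat it on more than a vanishing fraction of strings.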
EDIT: Wei Dai's comment explains why this post is wrong.