John_Faben comments on The Weighted Majority Algorithm - Less Wrong

Post author: Eliezer_Yudkowsky 12 November 2008 11:19PM


Comment author: John_Faben 14 November 2008 09:37:11PM 1 point

Don - the "sleight of hand", as you put it, is that (as I think has been said before) we're examining the worst case. We explicitly *are* allowing the string of 1s and 0s to be picked by an adversarial superintelligence who knows our strategy. Scott's algorithm only needs to sample 4 bits on average to answer the question *even when the universe is out to get him* - the deterministic version will need to sample exactly n/4+1.
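As an illustration of the principle (a toy sketch, not the exact problem Scott poses - here I assume the promise is that the string is either all 0s or contains a 1/4 fraction of 1s, and the question is whether it is all 0s):

```python
import random

def randomized_samples_needed(bits):
    """Sample uniformly random positions until a 1 turns up.

    One hit certifies "not all zeros". Under the assumed promise
    (exactly a 1/4 fraction of 1s when not all-zero), each sample
    succeeds with probability 1/4, so the expected count is 4 -
    regardless of where the adversary places the 1s, since uniform
    sampling is oblivious to position.
    """
    count = 0
    while True:  # terminates because `bits` below contains 1s
        count += 1
        if bits[random.randrange(len(bits))] == 1:
            return count

# Adversarial string: n bits with exactly n/4 ones. A deterministic
# algorithm querying a fixed sequence of positions can be fed 0s
# until it has exhausted the promise; random sampling cannot.
n = 1000
bits = [1] * (n // 4) + [0] * (3 * n // 4)

trials = 20000
avg = sum(randomized_samples_needed(bits) for _ in range(trials)) / trials
print(round(avg, 1))  # concentrates near 4 (geometric, success prob 1/4)
```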

Basically, Eliezer seems to be claiming that BPP = P (or possibly something stronger), which most people think is probably true but, as has been said, no one can prove. I, for one, accept his intuitive arguments, as I think do most people who've thought about it, but proving this intuition rigorously is a major open problem in computational complexity.

My supervisor's intuitive argument for why you can never gain anything from randomness is that in any particular case where randomness appears to help, you can just pick a pseudo-random source which is "random enough" (presumably a formalisation of this intuition is what Scott is talking about when he says that BPP = P if "really good pseudorandom generators exist").