
Viliam_Bur comments on Open Thread, March 16-31, 2012 - Less Wrong Discussion

Post author: OpenThreadGuy, 16 March 2012 04:53AM




Comment author: Viliam_Bur, 16 March 2012 04:48:59PM, 1 point

It seems that humans, even groups of humans, are not capable of fast recursive self-improvement. What is missing that prevents one of them from prevailing?

I would guess that the reason is that people don't work with exact numbers, only with approximations. If you build a very long chain of reasoning, the noise drowns out the signal. In mathematics, if you know "A = B" and "B = C" and "C = D", you can conclude that "A = D". In real life your knowledge is more like "so far it seems to me that, under usual conditions, A is very similar to B". A hypothetical perfect Bayesian could perhaps assign a probability to each step and work with it, but even our estimates of probabilities are noisy. Also, the world is complex, and effects do not combine linearly.
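The compounding of "very similar" is easy to see with a toy calculation. This is a minimal sketch, not anything from the comment itself; the 5% per-link tolerance is an arbitrary assumption for illustration.

```python
# Sketch: suppose each "A is very similar to B" link permits up to 5%
# relative error. Chained links compound multiplicatively, so the
# worst-case end-to-end error grows far beyond any single link's.
per_link_error = 0.05

for n_links in (1, 5, 10, 20):
    # Worst case: every link errs in the same direction.
    worst_case = (1 + per_link_error) ** n_links - 1
    print(n_links, round(worst_case, 3))
```

By ten links the worst-case discrepancy is already over 60%, even though each individual link looked tight.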

I suspect that when one tries to generalize, one gets a lot of general rules that each hold with maybe 90% probability. Chain a dozen of them together, and the result is pathetic. It is like saying "give me a fixed point and a lever and I will move the world", only to realize that your lever is too floppy to move anything far away and heavy.
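The arithmetic behind "pathetic" is worth spelling out. A minimal sketch, assuming the dozen rules are independent and the conclusion needs every one of them to hold:

```python
# Sketch: chaining a dozen independent rules, each true with
# probability 0.9. The final conclusion holds only if every link does.
p_link = 0.9
n_links = 12

p_conclusion = p_link ** n_links
print(round(p_conclusion, 3))  # about 0.28
```

So a chain built entirely of individually reliable 90% rules supports its conclusion with well under a third probability, which is why long informal derivations so rarely move the world.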