Erik2 comments on The Wonder of Evolution - Less Wrong

Post author: Eliezer_Yudkowsky, 02 November 2007 08:49PM

Comment author: Erik2 04 November 2007 09:59:20AM 3 points

"Only in AI would people design algorithms that are literally stupider than a bag of bricks, boost the results back towards maximum entropy, and then argue for the healing power of noise."

I do not have the time to go through it now (which probably means I will never remember to do it), but I can offer a small observation.

When training neural networks, there is a very good reason why adding a random element improves performance: it avoids getting stuck in suboptimal local minima. Training a network can be seen as minimizing the error over a surface in weight space. This surface is usually littered with local minima of various sizes, so a deterministic training rule gets stuck in them, while a stochastic one can get kicked out. Of course, one has to be careful not to add too much randomness; this is usually done by keeping the training steps small.
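A toy sketch of what I mean (my own illustrative example, not an actual network; the surface, learning rate, and noise schedule are all arbitrary choices): plain gradient descent on a one-dimensional error surface with a shallow and a deep minimum, with and without Gaussian noise added to the gradient. Started in the shallow basin, the deterministic rule stays there, while the noisy rule can usually get kicked over the barrier into the deeper minimum.

    # Illustrative sketch only: 1-D "error surface" with a shallow minimum
    # near w = +1 and a deeper one near w = -1 (a tilted double well).
    import random

    def loss(w):
        return (w * w - 1.0) ** 2 + 0.3 * w

    def grad(w):
        # Derivative of loss(w).
        return 4.0 * w * (w * w - 1.0) + 0.3

    def train(w, steps=6000, lr=0.05, sigma=3.5, seed=0):
        """Gradient descent with Gaussian noise added to each gradient.

        The noise is annealed linearly to zero over the first 80% of the
        run (the "not too much randomness" part), after which the rule is
        purely deterministic and settles into the nearest minimum.
        """
        rng = random.Random(seed)
        for t in range(steps):
            scale = sigma * max(0.0, 1.0 - t / (0.8 * steps))
            w -= lr * (grad(w) + rng.gauss(0.0, scale))
            w = max(-3.0, min(3.0, w))  # keep the iterate bounded, for safety
        return w

    if __name__ == "__main__":
        start = 1.2  # inside the basin of the shallow local minimum

        w_det = train(start, sigma=0.0)  # deterministic: no noise at all
        print(f"deterministic: w = {w_det:+.3f}, loss = {loss(w_det):.3f}")

        escaped, runs = 0, 20
        for seed in range(runs):
            w_sto = train(start, seed=seed)
            if w_sto < 0.0:  # ended up in the deeper basin near w = -1
                escaped += 1
        print(f"noisy runs reaching the deeper minimum: {escaped}/{runs}")

The deterministic run converges to the shallow minimum every time; with these (hand-picked) settings, most of the noisy runs typically end up in the deeper one.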

I do not know if this adds anything, since once the training is complete the net constitutes a deterministic algorithm. The point, however, is that optimization methods that (necessarily) rely on local information usually perform better with an element of noise.