Houshalter comments on Worse Than Random - Less Wrong

Post author: Eliezer_Yudkowsky 11 November 2008 07:01PM

Comment author: Houshalter 03 June 2015 03:18:48AM

What if the lock has multiple combinations that are close to each other? In that case brute force does much worse than a random search, because the correct combinations could be, say, 98, 99, and 100: a contiguous cluster acts like a single target for a sequential sweep, while a random search still gets three chances on every guess. This is true of real combination locks, by the way; they usually aren't sensitive to being off by a single number, so several adjacent combinations will open them.
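
To make that concrete, here is a rough Python sketch. The lock size, the cluster position, and the trial count are all made up for illustration; it pits a sequential sweep against a memoryless random guesser when the cluster sits at the very end of the sweep's order.

    import random

    # Illustrative numbers only: a 5-digit lock with three adjacent
    # combinations that all open it, parked at the end of the sweep's order.
    N = 100_000
    CLUSTER = {99_997, 99_998, 99_999}

    def sweep_tries(targets):
        """Brute force: try 0, 1, 2, ... in order, counting attempts."""
        for i in range(N):
            if i in targets:
                return i + 1

    def random_tries(targets):
        """Memoryless random search: guess uniformly, repeats allowed."""
        tries = 0
        while True:
            tries += 1
            if random.randrange(N) in targets:
                return tries

    TRIALS = 200
    avg = sum(random_tries(CLUSTER) for _ in range(TRIALS)) / TRIALS
    print("sweep:", sweep_tries(CLUSTER))                            # 99,998 tries
    print("random search, mean of %d runs: %.0f" % (TRIALS, avg))   # ~N/3

Even averaging over all possible cluster positions, the sweep expects about N/2 tries, since the contiguous cluster behaves like a single target; the random search expects about N/3 no matter where the cluster sits.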

Or an even simpler case: you need to create a single algorithm to break a large number of locks, and all the locks have the same combination. And you can't save memory or modify the algorithm after it's finished, so you have to pick the best one up front.

If you pick a deterministic algorithm, there is a significant chance that the combination is something like 99999, the very last one your algorithm tries. Then you hit the worst-case time on every single lock. Whereas the total time for the random algorithm, summed over all the locks, will be approximately a Gaussian distribution around the average case (by the central limit theorem), no matter which combination was chosen.

The expected completion time of each is the same (averaged over possible combinations), but the deterministic algorithm has a significantly higher chance of hitting a worst-case scenario, since its running time is entirely determined by which combination was chosen.

Now, a random search is bad because it can forget what it's already tried, but you can just add memory, or some other way of creating a random ordering of the combinations, so that nothing gets tried twice.
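
Here is a scaled-down sketch of that fix (the 3-digit lock, the lock count, and the adversarial combination are all assumptions for the demo), which also makes the distribution claim above concrete: the ascending sweep hits its worst case on every lock, while a fresh random ordering per lock lands near the average with high probability.

    import random

    # Scaled down so it runs quickly: a 3-digit lock, 1,000 identical locks,
    # and the combination an ascending sweep tries last.
    N = 1_000
    COMBO = N - 1
    LOCKS = 1_000

    def sweep_time(combo):
        """Deterministic ascending sweep: always combo + 1 attempts."""
        return combo + 1

    def shuffled_time(combo):
        """Random ordering with memory: a fresh shuffle, nothing tried twice."""
        order = list(range(N))
        random.shuffle(order)
        return order.index(combo) + 1

    det_total = sum(sweep_time(COMBO) for _ in range(LOCKS))
    rand_total = sum(shuffled_time(COMBO) for _ in range(LOCKS))
    print("deterministic total:", det_total)  # 1,000,000: worst case every time
    print("randomized total:", rand_total)    # ~500,500, concentrated by the CLT

Per lock, the shuffled search takes anywhere from 1 to N tries with equal probability, so the total over many locks concentrates tightly around LOCKS * (N + 1) / 2.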

And in general, for any deterministic algorithm there exists a pattern that will disrupt it and give it much worse performance: whatever order it tries things in, the input its order reaches last forces the worst case. And these aren't rare pathological cases either.
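
A tiny sketch of why no fixed ordering escapes this (the scrambled ordering below is just an arbitrary stand-in for "any deterministic algorithm"): whatever sequence the algorithm tries, the adversary picks the combination that sequence reaches last.

    N = 1_000

    def worst_combo_for(order):
        """The adversary's move: choose the combination tried last."""
        return order[-1]

    ascending = list(range(N))
    scrambled = sorted(range(N), key=lambda c: (c * 7919) % N)  # some other fixed order

    for order in (ascending, scrambled):
        combo = worst_combo_for(order)
        print(order.index(combo) + 1)  # always N: the full worst case, by construction

A randomized ordering has no such single bad input: for every combination, the expected number of tries is the same.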