endoself comments on Useful Concepts Repository - Less Wrong

Post author: Qiaochu_Yuan 10 June 2013 06:12AM

Comment author: endoself 10 June 2013 08:47:51PM 0 points

Your second paragraph could benefit from the concept of simulated annealing.

Comment author: Vaniver 11 June 2013 12:51:21AM 2 points

There are lots of metaheuristic optimization methods; simulated annealing is the easiest one to explain and implement, but consequently it's also the dumbest of them.

Comment author: Houshalter 16 July 2013 07:06:02AM 0 points

What are the best ones?

Comment author: Vaniver 16 July 2013 03:15:39PM 0 points

I'm personally a fan of tabu search, which prevents cycling and getting stuck in local optima by remembering features of previously seen solutions, and not visiting any solution with those features for some set length of time. "Best" depends on the particular problem, though; there are situations when the easy implementation of simulated annealing makes it a superior solution to a cleverer but longer implementation of something else.
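The mechanism described here can be sketched in a few lines of Python. This is a toy illustration, not code from the thread: the objective, the neighbourhood, and the tenure length are invented for the example, and for simplicity the tabu list remembers whole solutions rather than solution features.

```python
from collections import deque

def tabu_search(f, x0, neighbours, steps=200, tenure=10):
    """Minimize f. Recently visited solutions are tabu for `tenure` steps,
    so the search cannot cycle back into a local optimum it just left."""
    x = x0
    best, best_f = x, f(x)
    tabu = deque(maxlen=tenure)  # old entries expire automatically
    for _ in range(steps):
        candidates = [n for n in neighbours(x) if n not in tabu]
        if not candidates:
            break
        # Move to the best non-tabu neighbour, even if it is uphill --
        # this is what lets the search escape local optima
        x = min(candidates, key=f)
        tabu.append(x)
        if f(x) < best_f:
            best, best_f = x, f(x)
    return best, best_f

# Toy discrete landscape: a local optimum at x = -2 hides the
# global optimum at x = 7 behind an uphill stretch
f = lambda x: (x - 7) ** 2 if x >= 0 else (x + 2) ** 2 + 5
neighbours = lambda x: [x - 1, x + 1]
best, best_f = tabu_search(f, x0=-4, neighbours=neighbours)
```

Started at x = -4, plain hill climbing would stall at the local optimum x = -2; the tabu list forbids retreating, so the search climbs through the bad region and reaches the global optimum at x = 7.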

Comment author: Qiaochu_Yuan 10 June 2013 08:51:09PM 2 points

I thought about mentioning simulated annealing, but then it seemed to me like simulated annealing is more involved than the basic concept I wanted to get across (e.g. the cooling phase is an extra complication).
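The two-part structure under discussion here, random local moves plus a cooling phase, looks roughly like the following sketch. The objective function and all parameters are invented for the example.

```python
import math
import random

def simulated_annealing(f, x0, steps=10000, t0=1.0, cooling=0.999):
    """Minimize f: always accept improvements, and accept uphill moves
    with probability exp(-delta / t), where the temperature t shrinks."""
    x, fx = x0, f(x0)
    best, best_f = x, fx
    t = t0
    for _ in range(steps):
        candidate = x + random.uniform(-0.5, 0.5)  # random local move
        fc = f(candidate)
        delta = fc - fx
        if delta < 0 or random.random() < math.exp(-delta / t):
            x, fx = candidate, fc
            if fx < best_f:
                best, best_f = x, fx
        t *= cooling  # the cooling phase: uphill moves become rarer
    return best, best_f

# A bumpy 1-D objective with many local minima (global minimum at x = 0)
f = lambda x: x * x + 3 * abs(math.sin(5 * x))
random.seed(0)
x, fx = simulated_annealing(f, x0=4.0)
```

The cooling schedule is indeed the extra complication: early on, high temperature lets the search hop between basins; as t shrinks, it settles into whichever basin it currently occupies.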

Comment author: endoself 10 June 2013 09:20:44PM 1 point

But it's actually important to the example. If someone intends to allocate their time searching for small and large improvements to their life, then simulated annealing suggests that they should make more of the big ones first. (The person you describe may not have done this, since they've settled into a local optimum but have now decided to find a completely different point on the fitness landscape, though without more details it's entirely possible they've decided correctly here.)

Comment author: malcolmocean 11 June 2013 06:57:38PM 1 point

I propose making an analogy to Split & Merge Expectation Maximization (SMEM) instead.

It's a very efficient algorithm for modelling that operates as follows:
1) perform EM to find the local optimum
2) examine the clusters to determine which two are most similar, and combine them
3) examine the clusters to determine which one is least representative of the data it's supposed to describe, and break it into two
4) perform EM on the three adjusted clusters
5) repeat 1-4 until the change in likelihood between iterations drops below some epsilon

I think this is actually quite isomorphic to Goal Factoring (from Geoff Anders / CFAR) in that you're trying to combine things that are similar and then break up things that are inefficient. At least, I spent an entire summer working on an SMEM clustering program (though some of that was UI) and they feel similar to me.
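The merge-and-split loop described above can be sketched in a stripped-down form. Two loud simplifications, neither part of SMEM proper: Lloyd's k-means stands in for full EM, and within-cluster variance stands in for the likelihood-based split criterion. The 1-D toy data is also invented for the example.

```python
import statistics

def assign(points, centers):
    """Hard-assign each 1-D point to its nearest center."""
    clusters = [[] for _ in centers]
    for p in points:
        i = min(range(len(centers)), key=lambda i: abs(p - centers[i]))
        clusters[i].append(p)
    return clusters

def lloyd(points, centers, iters=25):
    """Lloyd's k-means: a hard-assignment stand-in for the EM step."""
    for _ in range(iters):
        clusters = assign(points, centers)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers

def smem_like_step(points, centers):
    # 1) optimize to a local optimum
    centers = lloyd(points, centers)
    # 2) merge: replace the two most similar (closest) centers by their midpoint
    _, i, j = min((abs(centers[i] - centers[j]), i, j)
                  for i in range(len(centers))
                  for j in range(i + 1, len(centers)))
    merged = (centers[i] + centers[j]) / 2
    centers = [c for k, c in enumerate(centers) if k not in (i, j)] + [merged]
    # 3) split: the cluster that fits its data worst (highest spread)
    clusters = assign(points, centers)
    w = max(range(len(centers)),
            key=lambda k: statistics.pvariance(clusters[k])
            if len(clusters[k]) > 1 else 0)
    spread = statistics.pstdev(clusters[w]) if len(clusters[w]) > 1 else 1.0
    centers[w:w + 1] = [centers[w] - spread, centers[w] + spread]
    # 4) re-optimize the adjusted centers
    return lloyd(points, centers)

# Three well-separated groups; all centers start stuck inside the first one,
# so plain k-means converges to a bad local optimum
points = [0.0, 0.1, 0.2, 10.0, 10.1, 10.2, 20.0, 20.1, 20.2]
centers = smem_like_step(points, centers=[0.0, 0.1, 0.2])
```

From the bad initialization, plain k-means leaves two centers crowding the first group while one center straddles the other two; a single merge-and-split pass redistributes the centers so each group gets its own.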

Comment author: AspiringRationalist 11 June 2013 04:59:23AM 0 points

Simulated annealing is one of many techniques for optimizing in search spaces with a lot of local maxima. Is there a reason why you're emphasizing that one in particular?

Comment author: endoself 11 June 2013 09:40:00AM 0 points

It provides a useful concept, which can be carried over into other domains. I suppose there are other techniques that use a temperature, but I'm much less familiar with them and they are more complicated. Is understanding other metaheuristics any more useful than just understanding simulated annealing to people who aren't actually writing a program that performs some optimization?