TheOtherDave comments on [META] 'Rational' vs 'Optimized' - Less Wrong

30 points | Post author: TheOtherDave 04 January 2012 06:58PM

Comment author: TheOtherDave 04 January 2012 09:27:52PM 1 point

The closest equivalent to "rationalist" I can think of is "optimizer," which sounds pretty good to me in most of the contexts I find myself using the word on this site.

"Aspiring rationalist" is not a phrase I can readily imagine myself using, but I guess if I wanted to keep the modesty signalling I'd probably say something like "I'm a mediocre optimizer" or "I'm getting better at optimizing" or some such thing.

Comment author: Vladimir_Nesov 04 January 2012 09:55:40PM 7 points

Becoming a better optimizer is not at all clearly the best marginal improvement, and clearly not an exclusive terminal goal. You are a human being. You want yourself optimized in some ways, but not necessarily with a focus on making yourself a better optimizer. So far, rationality is not that great for most purposes: you get a much clearer "big picture" understanding of the world, you correct some grievous mistakes, you see more freedom for finding ways of making life more fun. Perhaps you get a chance of producing a useful idea for the project of FAI. But this is not something best characterized as "being a better optimizer".

Comment author: TheOtherDave 04 January 2012 10:10:31PM 0 points

If I'm understanding you correctly, I would agree with you that "rationality" as you're using it in this comment doesn't map particularly well to any form of optimization, and I endorse using "rationality" to refer to what I think you're talking about.

I would also say that "rationality" as it is frequently used on this site doesn't map particularly well to "rationality" as you're using it in this comment.

Comment author: TheOtherDave 04 January 2012 10:42:32PM 0 points

Huh. If the folks downvoting this can explain to me how to map the uses of "rationality" in "rationality is not that great for most purposes" and "rationalists should win" (for example) to one another, I'd appreciate it... they sure do seem like different things to me. They are both valuable, but they are different, and seem mostly incommensurable.

Comment author: Zetetic 05 January 2012 03:45:19AM -1 points

I didn't down-vote, but my two cents:

I think you're misunderstanding Nesov's comment. Becoming a better optimizer loses a useful distinction; to see this, you need to take an outside view of optimization in general. Rationalists want to optimize their behaviors relative to their values/goals, which include only a very narrow slice of things to optimize for (generally not the number of paperclips in the universe, or any other process that might be encoded in a utility function) and a specific set of heuristics to master. Hence the claim that rationality isn't that great for many purposes: there are relatively few purposes we actually wish to pursue currently.

Even though becoming a sufficiently strong optimizer-in-general will help you achieve your narrow range of goals, it's not optimal relative to your actual utility function to do so unless you specifically work towards optimizing for your value set. An optimizer-in-general, strictly speaking, will on average be just as good at optimizing for the number of paperclips in the universe as you will be at managing your relationships. The useful distinction is lost here.

Comment author: wedrifid 05 January 2012 03:52:05AM 0 points

I think you're misunderstanding Nessov's comment. Becoming a better optimizer looses a useful distinction

One s. One o.

Comment author: Zetetic 05 January 2012 10:13:21AM 2 points

One s. One o.

Opss. Fixed.

Comment author: TheOtherDave 05 January 2012 03:32:11PM 0 points

I agree that a sufficiently general optimizer can optimize its environment for a wide range of values, the vast majority of which aren't mine, and a significant number of which are opposed to mine. As you say, an optimizer-in-general is as good at paperclips as it is at anything else (though of course a human optimizer is not, because humans are the result of a lot of evolutionary fine-tuning for specific functions).

I would say that a sufficiently general rationalist can do exactly the same thing. That is, a rationalist-in-general (at least, as the term is frequently used here) is as good at paperclips as it is at anything else (though of course a human rationalist is not, as above).

I would also say that the symmetry is not a coincidence.

I agree that if this is what Nesov meant, then I completely misunderstood his comment. I'm somewhat skeptical that this is what Nesov meant.

Comment author: Zetetic 05 January 2012 10:50:26PM 0 points

I was thinking about whether telling someone I'm an aspiring optimizer would result in less confusion than telling them I'm an aspiring rationalist. I think the term 'optimizer' needs a little more specification to work; how about 'decision optimization'? If I tell someone I'm working on decision optimization, I pretty effectively convey what I'm doing: learning and practicing heuristics in order to make better decisions.

Comment author: TheOtherDave 05 January 2012 11:27:14PM 0 points

I probably agree that "I'm working on decision optimization" conveys more information in that case than "I'm working on rationality" but I suspect that neither is really what I'd want to say in a similar situation... I'd probably say instead that "I'm working on making more consistent probability estimates," or "I'm working on updating my beliefs based on contradictory evidence rather than rejecting it," or whatever it was. (Conversely, if I didn't know what I was working on more specifically, I would question how I knew I was working on it.)