Will_Newsome comments on SotW: Check Consequentialism - Less Wrong

38 Post author: Eliezer_Yudkowsky 29 March 2012 01:35AM




Comment author: Eliezer_Yudkowsky 24 March 2012 12:02:45AM 13 points

Cleverness-related failure mode (that actually came up in the trial unit):

One shouldn't try too hard to rescue non-consequentialist reasons. This probably has to be emphasized especially with new audiences who associate "rationality" with Spock and university professors, or audiences who've studied pre-behavioral economics, and who think they score extra points if they come up with amazingly clever ways to rescue bad ideas.

Any decision-making algorithm, no matter how stupid, can be made to look like expected utility maximization through the transform "Assign infinite negative utility to departing from decision algorithm X". This in essence is what somebody is doing when they say, "Aha! But if I stop my PhD program now, I'll have the negative consequence of having abandoned a sunk cost!" (Sometimes I feel like hitting people with a wooden stick when they do this, but that act just expresses an emotion rather than having any discernible positive consequences.) This is Cleverly Failing to Get the Point if "not wanting to abandon a sunk cost", i.e., the counterintuitive feel of departing from the brain's previous decision algorithm, is treated as an overriding consideration, i.e., an infinite negative utility.
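The degenerate transform described above can be made concrete. A minimal sketch in Python (the utilities, options, and `stubborn_algorithm` here are hypothetical illustrations, not anything from the original comment):

```python
import math

def stubborn_algorithm(options):
    """An arbitrary, possibly stupid decision rule:
    always pick the first option offered."""
    return options[0]

def as_expected_utility(options, algorithm):
    """Recast any decision rule as 'expected utility maximization'
    by assigning -infinity utility to every departure from it."""
    chosen = algorithm(options)
    utility = {opt: (0.0 if opt == chosen else -math.inf) for opt in options}
    # The 'maximizer' now trivially agrees with the original algorithm,
    # whatever that algorithm was.
    return max(options, key=lambda opt: utility[opt])

options = ["finish PhD", "quit PhD", "take a sabbatical"]
assert as_expected_utility(options, stubborn_algorithm) == stubborn_algorithm(options)
```

The utility assignment carries no information beyond the original rule, which is why "infinite negative utility for abandoning a sunk cost" fails to engage with consequentialism at all.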

It's a legitimate future consequence only if the person says, "The sense of having abandoned a sunk cost will make me feel sick to my stomach for around three days, after which I would start to adjust and adapt à la the hedonic treadmill". In this case they have weighed the intensity and the duration of the future hedonic consequence, rather than treating it as an instantaneous infinite negative penalty, and are now ready to trade that off against other and probably larger considerations like the total amount of work required to get a PhD.
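The legitimate version of the calculation, weighing a finite, time-bounded hedonic cost against the remaining cost of the program, might be sketched like this (all numbers are invented purely for illustration):

```python
# Hypothetical numbers, purely illustrative.
DISUTILITY_PER_MISERABLE_DAY = -1.0   # feeling sick about the abandoned sunk cost
MISERY_DURATION_DAYS = 3              # after which the hedonic treadmill kicks in
DISUTILITY_PER_PHD_WORKDAY = -0.5     # grind of continuing an unwanted program
REMAINING_PHD_WORKDAYS = 500

cost_of_quitting = DISUTILITY_PER_MISERABLE_DAY * MISERY_DURATION_DAYS
cost_of_staying = DISUTILITY_PER_PHD_WORKDAY * REMAINING_PHD_WORKDAYS

# A finite penalty can be traded off against other consequences;
# an infinite one overrides everything and ends the analysis.
decision = "quit" if cost_of_quitting > cost_of_staying else "stay"
```

With these made-up numbers, three days of misery is dwarfed by the remaining workload, so `decision` comes out `"quit"`; the point is only that a finite cost enters the comparison instead of vetoing it.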

Comment author: Will_Newsome 24 March 2012 11:52:57AM * 3 points

Rationalization is an important skill and should be rewarded, not punished. If you never try to rationalize others' decisions then you won't notice when they actually do have a good justification, and if you never practice rationalization then you'll never get good enough at it to find their justifications when they exist. The result is gross overconfidence in the stupidity of the opposing side and thus gross overconfidence in one's own rationality. That leads to tragedies and atrocities, both personal and societal.

Comment author: Alicorn 24 March 2012 05:27:39PM 6 points

Perspective-taking is a separate "skill" from rationalizing one's own behavior.

Comment author: Will_Newsome 24 March 2012 06:28:56PM 4 points

Hm, is perspective-taking the same skill that I was talking about? I can't tell. Also I thought that Eliezer's examples were phrased in the hypothetical, and thus it'd be rationalizing others' beliefs/behavior, not one's own. I'm not sure to what extent rationalizing a conclusion and rationalizing one's own behavior are related. Introspectively, the defensiveness and self-justifying-ness inherent in the latter make it a rather different animal.

Comment author: handoflixue 29 March 2012 09:01:42PM 3 points

"Coming up with explanations" is a good skill.

"Coming up with a single, stupid explanation, failing to realize it is stupid, and then using it as an excuse to cease all further thought" is a very, very bad skill.

Thinking "well, but abandoning a sunk cost actually IS a negative future event" is smart IFF you then go "I'd be miserable for three days. How does that weigh against years spent in the program?"

It's very, very bad, however, if you stop there and continue to spend 2 years on a PhD just because you don't want to even THINK about those three days of misery.

I think understanding this dichotomy is critical. If you stop even thinking "well, but abandoning a sunk cost IS a negative future event" because you're afraid of falling into the trap of then avoiding all sunk costs, then you're ignoring real negative consequences of your decisions.