lessdazed comments on Some potential dangers of rationality training - Less Wrong

18 Post author: lukeprog 21 January 2012 04:50AM


Comment author: lessdazed 23 January 2012 04:47:33AM 3 points [-]

> Perhaps the sunk cost fallacy is useful because without it...

This sounds like a fake justification. For every justification of a thing that points to its positive consequences, one can ask how much better that thing is than the alternatives would be.

I expect evolution to produce beings at local optima according to its criteria, which often yields something similar to what would be the best solution according to human criteria. But it is often significantly different, and rarely the same.

For every systematic tendency I have to deviate from the truth, I can ask myself the leading question "How does this help me?" and expect to find a good evolutionary answer. More than that, I would expect to actually be prone to justifying the status quo according to my personal criteria, rather than evolution's. So each time I discover that what I already do habitually is best, according to my criteria, in the modern (not evolutionary) environment, I should count it as an amazing coincidence, since the algorithms that produced my behavior were not optimized for that.