Lumifer comments on Dark Arts of Rationality - LessWrong
No problem. I'm also aiming for a non-confrontational tone; that's sometimes difficult in text.
I don't know. I haven't pinpointed the higher-order differences that you're trying to articulate.
I do stand by my point that regardless of your definition of "terminal goal", I can construct a game in which the optimal move is to change it. I readily admit that under certain definitions of "terminal goal" such games are uncommon.
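The kind of game I have in mind can be sketched in a few lines. This is a hypothetical toy setup of my own, not anything from the post: a predictor inspects the agent's actual terminal goal before play and pays out based on what it reads, so an agent that starts out valuing money does strictly better by genuinely rewriting its goal first.

```python
# Toy game (hypothetical, for illustration): a predictor reads the
# agent's terminal goal directly and pays a bonus only to agents whose
# goal really is "cooperate". Professing cooperation while keeping the
# old goal doesn't help, because the goal itself is what gets inspected.

def payoff(terminal_goal: str) -> int:
    """Predictor inspects the agent's goal and pays accordingly."""
    return 100 if terminal_goal == "cooperate" else 10

keep_goal = payoff("maximize_money")   # agent keeps its original goal
change_goal = payoff("cooperate")      # agent genuinely rewrites its goal

# By the lights of the ORIGINAL goal (more money), self-modification wins:
assert change_goal > keep_goal
print(keep_goal, change_goal)  # prints: 10 100
```

The point of the sketch is that the comparison is made by the agent's original values: even an agent that only cares about money, before modifying, prefers the world where it has modified.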
If it's the branding that's annoying you, see this comment -- it seems my idea of what qualifies as "dark arts" may differ from the consensus.
I'm not entirely sure what you mean by getting the same effects without the "darkness". I am quite confident that there are mental states you can only access via first-order self-deception, and that it is instrumentally rational to do so. Michael Blume provides another crisp example of this. I am skeptical that there are ways to attain these gains without self-deception.
Without involving Omega-like agents? In a realistic setting?