DSimon comments on Not for the Sake of Happiness (Alone) - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
This subject is too complicated to cover in detail in this comment thread, and it's discussed at much greater length elsewhere, so I'll just bring up two things.
1) In the last month I've been thinking pretty darned carefully about this, and I'm now really unsure whether I'd accept the Superhappies' deal; frankly, I'm glad I'll never have to make that choice.
2) Some of my own desires are bad, and if I could take a pill that completely eliminated those desires, I would. The idea that what humanity wants right now is what it really wants is far from certain, about as uncertain as uncertainties get. So the real question is: why does our utility function act the way it does? There was no purpose behind it, and if we can agree on a way to change it, we should change it, even if that means going extinct.
Strongly agreed! But that's why the gloss for CEV talks about things like what we would ideally want if we were smarter and knew more.