DSimon comments on Not for the Sake of Happiness (Alone) - Less Wrong

48 Post author: Eliezer_Yudkowsky 22 November 2007 03:19AM


Comment author: Grognor 25 October 2011 04:46:05AM *  1 point [-]

This subject is too complicated to cover in detail in this comment thread, and it is discussed at much greater length elsewhere, so I'll just bring up two things.

1) Over the last month I've been thinking pretty darned carefully about this, and I'm now genuinely unsure whether I'd accept the Superhappies' deal; frankly, I'm glad I'll never have to make that choice.

2) Some of my own desires are bad, and if there were a pill that completely eliminated those desires, I would take it. The idea that what humanity wants right now is what it really wants is definitely not certain; it's about as uncertain as uncertainties get. So the real question is: why does our utility function act the way it does? There was no purpose behind it, and if we can agree on a way to change it, we should change it, even if that means other types of utilon go extinct.

Comment author: DSimon 25 October 2011 01:59:03PM 0 points [-]

The idea that what humanity wants right now is what it really wants is definitely not certain

Strongly agreed! But that's why the gloss for CEV talks about things like what we would ideally want if we were smarter and knew more.