Unknown3 comments on What Would You Do Without Morality? - Less Wrong

Post author: Eliezer_Yudkowsky, 29 June 2008 05:07AM

Comment author: Unknown3, 30 June 2008 03:21:00AM, 1 point

I wonder if Eliezer is planning to say that morality is just an extrapolation of our own desires. If so, then my morality would be an extrapolation of my desires, and your morality would be an extrapolation of yours. This is disturbing, because if our extrapolated desires don't turn out to be EXACTLY the same, something might be immoral for me to do yet moral for you to do, or moral for me and immoral for you.

If this is so, then if I programmed an AI, I would be morally obligated to program it to extrapolate my personal desires, not the desires of the human race as a whole. So Eliezer would be deceiving us about FAI: his real intention would be to extrapolate his own personal desires, since he is morally obligated to do so. Maybe someone should stop him before it's too late?