Stuart_Armstrong comments on Welcome to Heaven - Less Wrong

23 Post author: denisbider 25 January 2010 11:22PM




Comment author: Wei_Dai 26 January 2010 01:02:21AM 13 points

denis, most utilitarians here are preference utilitarians, who believe in satisfying people's preferences, rather than maximizing happiness or pleasure.

To those who say they don't want to be wireheaded, how do you really know that, when you haven't tried wireheading? An FAI might reason the same way, and try to extrapolate what your preferences would be if you knew what it felt like to be wireheaded, in which case it might conclude that your true preferences are in favor of being wireheaded.

Comment author: Stuart_Armstrong 26 January 2010 12:55:50PM 4 points

To those who say they don't want to be wireheaded, how do you really know that, when you haven't tried wireheading?

Same reason I don't try heroin. Wireheading (as generally conceived) imposes a predictable change on the user's utility function, one that is huge and irreversible. Gathering this information is not without cost.

Comment author: Wei_Dai 26 January 2010 01:20:46PM 4 points

I'm not suggesting that you try wireheading now, I'm saying that an FAI can obtain this information without a high cost, and when it does, it may turn out that you actually do prefer to be wireheaded.

Comment author: Stuart_Armstrong 26 January 2010 02:05:44PM 3 points

That's possible (especially the non-addictive type of wireheading).

Though this does touch upon issues of autonomy - I'd like the AI to run it by me first, even though it will have correctly predicted that I'd accept.