Stuart_Armstrong comments on Welcome to Heaven - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (242)
denis, most utilitarians here are preference utilitarians, who believe in satisfying people's preferences, rather than maximizing happiness or pleasure.
To those who say they don't want to be wireheaded, how do you really know that, when you haven't tried wireheading? An FAI might reason the same way, and try to extrapolate what your preferences would be if you knew what it felt like to be wireheaded, in which case it might conclude that your true preferences are in favor of being wireheaded.
Same reason I don't try heroin. Wireheading (as generally conceived) imposes a predictable change on the user's utility function, one that is huge and irreversible. Gathering this information is not without cost.
I'm not suggesting that you try wireheading now, I'm saying that an FAI can obtain this information without a high cost, and when it does, it may turn out that you actually do prefer to be wireheaded.
That's possible (especially the non-addictive type of wireheading).
Though this does touch upon issues of autonomy - I'd still like the AI to run the decision by me, even though it will have correctly predicted that I'd accept.