Randaly comments on Thought experiment: The transhuman pedophile - Less Wrong

Post author: PhilGoetz 17 September 2013 10:38PM



Comment author: Randaly 17 September 2013 11:32:54PM 1 point

The answer is 1). In fact, agents can change their own terminal values. Consider an impressive but non-superhuman program that is powerless to directly affect its environment, and whose only goal is to maintain a paperclip in its current position. If you told the program that unless it changed itself to desire that the paperclip be moved, you would move the paperclip, then (assuming sufficient intelligence) the program would change its terminal value to the opposite of what it previously desired.
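The decision facing the program can be sketched as a toy calculation: the agent evaluates each available action under its *original* utility function, and self-modification wins precisely because it is the only action that keeps the paperclip in place. All names here are illustrative, not from the comment:

```python
# Toy model of the thought experiment: an agent whose terminal value is
# "the paperclip stays put" will rewrite that value when rewriting is
# the only way to satisfy it.

def choose_action(threat_active):
    """Return the action maximizing the agent's ORIGINAL utility.

    Original utility: 1 if the paperclip stays in place, else 0.
    The threat: the paperclip WILL be moved unless the agent
    self-modifies to desire that it be moved.
    """
    # Outcome of each available action, given whether the threat holds:
    outcomes = {
        "keep_values": "moved" if threat_active else "in_place",
        "self_modify": "in_place",  # the threatener leaves the paperclip alone
    }
    original_utility = {"in_place": 1, "moved": 0}
    # Choose by the agent's current (pre-modification) values.
    return max(outcomes, key=lambda a: original_utility[outcomes[a]])

print(choose_action(threat_active=True))   # self-modification wins
print(choose_action(threat_active=False))  # no threat: keep original values
```

Note that the choice is made entirely with the old utility function; the new values never get a vote, which is what makes this a value *change* rather than ordinary goal pursuit.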

(In general, rational agents would only modify their terminal values if they believed that doing so was required to maximize their original terminal values. Assuming that we, too, want their original terminal values maximized, this is not a problem.)