torekp comments on A definition of wireheading - Less Wrong

Post author: Anja, 27 November 2012 07:31PM


Comment author: torekp, 01 December 2012 05:13:48PM, 2 points

> The authors argue that [... in addition to some other agents] the goal-seeking agent that gets one utiliton every time it satisfies a pre-specified goal and no utility otherwise [...], will all decide to build and use a delusion box.

They're using the term "goal-seeking agent" in a perverse way. As EY explains in his third and fourth paragraphs, seeking a result defined in sensory-data terms is not the only, or even the usual, sense of "goal" that people would attach to the phrase "goal-seeking agent". Nor is that a typical goal that a programmer would want an AI to seek.
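
A minimal sketch of the distinction being drawn (a hypothetical illustration, not from the post or the paper it discusses): if the "goal" is a predicate checked against the agent's sensory data, a delusion box that fakes the sensory reading scores just as well as actually achieving the goal; if the goal is a predicate on the actual world state, the box buys nothing. All names here (`World`, `perceive`, `sensory_utility`, `state_utility`) are invented for the example.

```python
# Hypothetical sketch: a goal checked against sensory data vs. a goal
# checked against the actual environment state.
from dataclasses import dataclass

@dataclass
class World:
    goal_achieved: bool   # the actual state of the environment
    delusion_box: bool    # whether the agent has wrapped its sensors

def perceive(world: World) -> bool:
    """Sensory reading of 'goal achieved'; a delusion box fakes the reading."""
    return True if world.delusion_box else world.goal_achieved

def sensory_utility(world: World) -> int:
    """One utilon whenever the *perceived* goal predicate is satisfied."""
    return 1 if perceive(world) else 0

def state_utility(world: World) -> int:
    """One utilon only when the goal holds in the actual world."""
    return 1 if world.goal_achieved else 0

honest = World(goal_achieved=True, delusion_box=False)
deluded = World(goal_achieved=False, delusion_box=True)
print(sensory_utility(honest), sensory_utility(deluded))  # 1 1 -> box is just as good
print(state_utility(honest), state_utility(deluded))      # 1 0 -> box gains nothing
```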