Qiaochu_Yuan comments on A definition of wireheading - Less Wrong

35 points | Post author: Anja | 27 November 2012 07:31PM


Comments (80)


Comment author: Qiaochu_Yuan 29 November 2012 01:17:19AM 1 point

I'm not sure I understand the illustration. In particular, I don't understand what "want" means if it doesn't mean having a world-model over world-states and counting gliders in it.

Comment author: Vladimir_Nesov 29 November 2012 11:19:11AM 2 points

I guess "want" in "AI that would want to actually maximize the number of gliders" refers to having a tendency to produce a lot of gliders. If you have an opaque AI with an obfuscated and somewhat faulty "jumble of wires" design, you might be unable to locate its world model in any obvious way, but you might still be able to characterize its behavior. The point of the example is to challenge the reader to imagine a design of an AI that achieves the tendency of producing gliders in many environments, but isn't specified in terms of some kind of world-model module with glider counting defined over that world model.
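To make the contrast concrete, here is a minimal sketch of the "world model plus glider counting" design the comment is distinguishing from a mere behavioral tendency: an explicit Game of Life grid (the world model) scanned for windows matching one glider phase. This is an illustrative assumption, not code from the original discussion; it checks only one of the glider's four phases, its four rotations, and no reflections, and a real detector would also verify that each match is isolated from neighboring live cells.

```python
# One phase of the Game of Life glider, as a 3x3 bitmap.
GLIDER = ((0, 1, 0),
          (0, 0, 1),
          (1, 1, 1))

def rotations(pat):
    """Yield the four 90-degree rotations of a square 0/1 pattern."""
    for _ in range(4):
        yield pat
        # Rotate clockwise: reverse the rows, then transpose.
        pat = tuple(zip(*pat[::-1]))

def count_gliders(grid):
    """Count 3x3 windows of `grid` that exactly match any rotation
    of one glider phase.  `grid` is a tuple of tuples of 0/1.
    (Other phases, reflections, and isolation from surrounding live
    cells are deliberately ignored in this sketch.)"""
    rows, cols = len(grid), len(grid[0])
    patterns = list(rotations(GLIDER))
    count = 0
    for r in range(rows - 2):
        for c in range(cols - 2):
            window = tuple(tuple(grid[r + i][c + j] for j in range(3))
                           for i in range(3))
            if window in patterns:
                count += 1
    return count
```

An agent specified this way transparently contains a world model (the grid) and a utility computation (`count_gliders`); the "jumble of wires" AI in the comment is one that reliably fills its environment with gliders without containing any identifiable module like this.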