cousin_it comments on Example decision theory problem: "Agent simulates predictor" - Less Wrong Discussion

Post author: cousin_it 19 May 2011 03:16PM

Comment author: cousin_it 19 May 2011 07:10:52PM (3 points)

Yeah, you can't control arbitrary world programs that way, but remember that our world really is like that: a human mind runs within the same universe that it's trying to control. One idea would be to define a class of "fair" world programs (perhaps those that can be factorized in appropriate ways) and care only about solving those. But I guess I'd better wait for your writeup, because I don't understand what kind of formalism you prefer.
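One way to picture the "fair" factorization idea is a world program that receives the agent's action as an explicit argument, so the outcome depends only on what the agent outputs, not on how it is written. The comment doesn't give a formalism, so the names and payoffs below are invented purely for illustration:

```python
def agent():
    # A toy decision procedure; here it simply outputs a fixed action.
    return "cooperate"

def fair_world(action):
    # "Fair" in the sense sketched above: the world factorizes as
    # fair_world(agent()), so utility depends only on the action.
    return 10 if action == "cooperate" else 5

def unfair_world(agent_source_code):
    # An "unfair" world reads the agent's source directly, so there is
    # no factorization through the agent's action at all.
    return 0 if "cooperate" in agent_source_code else 10

print(fair_world(agent()))  # prints 10
```

Under this toy reading, restricting attention to worlds shaped like `fair_world` is what makes the control problem tractable: the agent only has to reason about its output, not about every way the world might depend on its internals.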