
tailcalled comments on Versions of AIXI can be arbitrarily stupid - Less Wrong Discussion

Post author: Stuart_Armstrong | 10 August 2015 01:23PM | 15 points

Comments (59)

Comment author: tailcalled 11 August 2015 06:00:45PM 2 points

Do you have a better proposal for how to act if you know that you are in either heaven or hell, but don't know which?

Comment author: Stuart_Armstrong 11 August 2015 09:20:48PM 0 points

Follow your priors. The problem here is that the prior for Hell has been constructed "artificially" to have unnaturally high probability.

Comment author: tailcalled 12 August 2015 09:35:34PM 1 point

I think my claim was that your example was kinda weak, since it's not obvious that the AI is doing anything wrong; but on reflection I realized that it doesn't really matter, since I can easily generate a better example.