tailcalled comments on Versions of AIXI can be arbitrarily stupid - Less Wrong Discussion
Do you have a better proposal for how to act if you know that you are in either heaven or hell, but don't know which?
Follow your priors. The problem here is that the prior for Hell has been constructed "artificially" to have an unnaturally high probability.
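"Follow your priors" here amounts to ordinary expected-utility maximization over the competing hypotheses. A minimal sketch, where the hypotheses, actions, and payoff numbers are all invented for illustration:

```python
# Hypothetical sketch: pick the action that maximizes expected
# utility under the prior over hypotheses. All names and numbers
# below are illustrative assumptions, not from the discussion.

def expected_utility(action, prior, utility):
    """Expected utility of an action under a prior over hypotheses."""
    return sum(p * utility[(hypothesis, action)]
               for hypothesis, p in prior.items())

def best_action(actions, prior, utility):
    """The action with the highest expected utility."""
    return max(actions, key=lambda a: expected_utility(a, prior, utility))

# Two hypotheses about which world we are in, with an assumed prior.
prior = {"heaven": 0.5, "hell": 0.5}

# Assumed payoffs: the same action has very different consequences
# depending on which world we are actually in.
utility = {
    ("heaven", "open_door"): 10, ("heaven", "stay_put"): 0,
    ("hell", "open_door"): -100, ("hell", "stay_put"): 0,
}

print(best_action(["open_door", "stay_put"], prior, utility))
# With these payoffs the downside in Hell dominates, so the
# expected-utility maximizer plays it safe.
```

An agent whose prior on Hell has been artificially inflated would, by this same calculation, choose the overly cautious action far more often than the true odds warrant.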
I think my claim was that your example was kinda bad, since it's not obvious that the AI is doing anything wrong. On reflection, though, I realized that it doesn't really matter, since I can easily generate a better example.