eli_sennesh comments on Versions of AIXI can be arbitrarily stupid - Less Wrong Discussion
Uncomputable AIXI can be approximated almost arbitrarily well by computable versions. But the general problem is that "Hell" is possible in any world: take a computable version of AIXI in our world, and give it a prior under which every action looks catastrophically bad, so that it never does anything...
This means that "pick a complexity prior" does not solve the problem of priors for active agents (though it does for passive agents), because which complexity prior we pick matters.
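To make the point concrete, here is a minimal toy sketch (not AIXI itself, and all the environments and reward numbers are invented for illustration): a Bayesian expected-reward maximizer choosing between two hypotheses about the world. Under a prior dominated by a "Hell" hypothesis, the agent never acts; under a prior dominated by a benign hypothesis, it does. Same agent, same world, different complexity prior, opposite behavior.

```python
# Toy illustration: a one-step Bayesian expected-reward maximizer whose
# behavior is determined entirely by its prior over environments.
# (Hypothetical example; not Hutter's AIXI, just the prior-sensitivity point.)

def expected_reward(action, prior):
    """Expected reward of `action` under `prior`, a dict mapping
    hypothesis name -> (probability, reward_function)."""
    return sum(p * reward_fn(action) for p, reward_fn in prior.values())

def choose(actions, prior):
    """Pick the action maximizing expected reward under the prior."""
    return max(actions, key=lambda a: expected_reward(a, prior))

# Two hypotheses about the world:
#  - "benign": exploring yields reward 1, doing nothing yields 0
#  - "hell":   exploring yields a large penalty, doing nothing yields 0
benign = lambda a: 1.0 if a == "explore" else 0.0
hell = lambda a: -10.0 if a == "explore" else 0.0

actions = ["explore", "do_nothing"]

# A prior dominated by the Hell hypothesis paralyzes the agent...
hell_prior = {"benign": (0.01, benign), "hell": (0.99, hell)}
# ...while a prior dominated by the benign hypothesis does not.
benign_prior = {"benign": (0.99, benign), "hell": (0.01, hell)}

print(choose(actions, hell_prior))    # -> do_nothing (never acts)
print(choose(actions, benign_prior))  # -> explore
```

Both priors are perfectly ordinary probability distributions; nothing in the formalism singles out one as "the" complexity prior, which is exactly why the choice matters for active agents.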
Provided you have access to unbounded computing power and don't give half a damn about non-asymptotic tractability, yes.