What do you think? There might be a theoretical limitation on how much data an AI could collect without influencing the data itself, thereby making its predictions self-defeating. Would this negate the idea of a 'God' AI and cause it to make suboptimal choices even with near-limitless processing power?