Amanojack comments on The AI in a box boxes you - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Indeed, a similar point seems to apply to the whole anti-boxing argument. Are we really prepared to say that super-intelligence implies being able to extrapolate anything from a tiny number of data points?
It sounds a bit too much like the claim that a sufficiently intelligent being could "make A = ~A" or other such meaninglessness.
Hyperintelligence != magic
Yes, but the AI could take over the world, and given a Singularity, it should be possible to recreate perfect simulations.
So really this example makes more sense if the AI is issuing a threat about the future.