Viliam_Bur comments on Holden Karnofsky's Singularity Institute Objection 3 - Less Wrong
If it tries to self-improve, and as a side effect turns the universe into computronium.
If it gains general intelligence, and as part of trying to provide better search results, it realizes that self-modification could bring much faster search results.
This whole idea of a harmless general intelligence is just imagining a general intelligence that is not general enough to be dangerous: one that will be able to think generally, and yet this ability will somehow always reliably stop before thinking something that might end badly.
Thanks, I completely missed that. Explains a lot.