Similar to the monthly Rationality Quotes threads, this is a thread for memorable quotes about Artificial General Intelligence.
- Please post all quotes separately, so that they can be voted up/down separately. (If they are strongly related, reply to your own comments. If strongly ordered, then go ahead and post them together.)
- Do not quote yourself.
- Do not quote comments/posts on LW/OB.
... I wonder how "alone" I am in the notion that AGI causing human extinction may not be a net negative, in that so long as it is a sentient product of human endeavors it is essentially a "continuation" of humanity.
Two problems: an obnoxious optimizing process isn't necessarily sentient. And how much would you really want such a continuation if it, say, tried to turn everything in its future lightcone into little smiley faces?
If it helps, ask yourself how you would feel about a human empire that expands through its lightcone, preemptively destroying every single alien species before they can do anything, under the motto "In the Prisoners' Dilemma, Humanity Defects!" That sounds pretty bad, doesn't it? Now note that the AGI expansion is probably worse than that.