Similar to the monthly Rationality Quotes threads, this is a thread for memorable quotes about Artificial General Intelligence.
- Please post all quotes separately, so that they can be voted up/down separately. (If they are strongly related, reply to your own comments. If strongly ordered, then go ahead and post them together.)
- Do not quote yourself.
- Do not quote comments/posts on LW/OB.
Looking more closely, this much-duplicated "quote" seems to be a paraphrase of something he wrote in a letter to Heinrich Zangger in the context of the First World War: "Our entire much-praised technological progress, and civilization generally, could be compared to an axe in the hand of a pathological criminal."
I do think about the AGI problem in much the same way, though. E.g. in Just Babies, Paul Bloom wrote:
I think our current civilization is like a two-year-old. The reason we haven't destroyed ourselves yet, but have merely bitten some fingers and ruined some carpets, is that we haven't had any civilization-lethal weapons. We've had nuclear weapons for a few decades now and haven't blown ourselves up yet, though there have been some close calls. In the latter half of the 21st century we'll acquire additional means of destroying our civilization. Will we have grown up by then? I doubt it. Civilizational maturity progresses more slowly than technological power.