Similar to the monthly Rationality Quotes threads, this is a thread for memorable quotes about Artificial General Intelligence.
- Please post all quotes separately, so that they can be voted up/down separately. (If they are strongly related, reply to your own comments. If strongly ordered, then go ahead and post them together.)
- Do not quote yourself.
- Do not quote comments/posts on LW/OB.
Human values change in part because we aren't optimizers in any substantial sense. We're giant mechas for moving DNA around (after the RNA's replication process got hijacked), built blindly by evolution for an environment where the primary dangers were large predators and other humans. But then something went wrong and the mechas got too smart from runaway sexual selection. This narrative may be slightly wrong, but something close to it is correct. More to the point, for much of human history, having values too different from one's peers was a good way to fail at reproductive success. Humans were selected for having incoherent, inconsistent, fluid value systems.
There's no reason to think that an AGI will fall into that category. Moreover, note that even powerful humans prefer to impose their values on others rather than alter their own. A sufficiently powerful AGI would likely do the same.
Regarding the empire, I may need to apologize; I think I attach more negative connotations to the word "empire" than I stated explicitly in my remark, and those connotations may not be shared. Here's a slightly different analogy that may help: if you had to choose between a future with the United Federation of Planets from Star Trek or the Imperium from Warhammer 40K, which would you choose?
I was assuming the latter. As to the former, again: hence my caveat. I don't much care how large the space of possible AGI minds is; I've already arbitrarily limited the kinds I'm talking about to a very narrow window.
So objecting to my evaluative statement regarding that narrow window with the statement, "But there's ...