ygert comments on Harry Potter and the Methods of Rationality discussion thread, part 20, chapter 90 - Less Wrong
I've been meaning to post this for a while, not because it's a novel idea, but just so it's recorded somewhere.
I think there is a good chance that the story finishes with a superintelligence of sorts. Furthermore, I think that if a superintelligence is actually brought into the story, there is at least a 50% chance that it will be one built/cast with good intentions which nonetheless destroys (in some sense) humanity and/or the universe.
Eliezer has specifically and publicly said that this will not happen: there will be no superintelligent AI in HPMOR. I see no reason to doubt Eliezer's word on the matter.
Fair enough.
I'm pretty sure what he actually said was that nothing in the story was intended as an allegory -- or maybe a metaphor, or something of the sort -- for an artificial superintelligence.
Somebody has the link, I expect.