ygert comments on Harry Potter and the Methods of Rationality discussion thread, part 20, chapter 90 - Less Wrong

Post author: palladias 02 July 2013 02:13AM


Comment author: Tenoke 02 July 2013 02:49:11PM 1 point

I've been meaning to post this for a while, not because it is a novel idea, but just so it is recorded somewhere.

I think there is a good chance that the story finishes with a superintelligence of sorts. Furthermore, I think that if a superintelligence is actually brought into the story, there is at least a 50% chance that it will be one built/cast with good intentions which nonetheless destroys (in a way) humanity and/or the universe.

Comment author: ygert 03 July 2013 10:17:10AM 2 points

Eliezer specifically and publicly said that this will not happen. There will be no superintelligent AI in HPMOR. I see no reason to doubt Eliezer's word on the matter.

Comment author: Tenoke 03 July 2013 05:58:06PM 2 points

Fair enough.

Comment author: loserthree 03 July 2013 12:42:59PM 3 points

I'm pretty sure that what he said was that nothing was intended as an allegory (or maybe a metaphor, or something of the sort) for an artificial super-intelligence.

Somebody has the link, I expect.