This is a new thread to discuss Eliezer Yudkowsky’s Harry Potter and the Methods of Rationality and anything related to it. This thread is intended for discussing chapters 91 & 92. The previous thread has passed 500 comments.
There is now a site dedicated to the story at hpmor.com, which is now the place to go to find the author’s notes and all sorts of other goodies. AdeleneDawner has kept an archive of Author’s Notes. (This goes up to the notes for chapter 76, and is no longer updating. The author’s notes from chapter 77 onwards are on hpmor.com.)
The first 5 discussion threads are on the main page under the harry_potter tag. Threads 6 and on (including this one) are in the discussion section using its separate tag system. Also: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20.
Spoiler Warning: this thread is full of spoilers. With few exceptions, spoilers for MOR and canon are fair game to post, without warning or rot13. More specifically:
You do not need to rot13 anything about HP:MoR or the original Harry Potter series unless you are posting insider information from Eliezer Yudkowsky which is not supposed to be publicly available (which includes public statements by Eliezer that have been retracted).
If there is evidence for X in MOR and/or canon then it’s fine to post about X without rot13, even if you also have heard privately from Eliezer that X is true. But you should not post that “Eliezer said X is true” unless you use rot13.
I'm expecting a positive ending for a few reasons, one of which is that, since this is rationality propaganda, I doubt Eliezer wants to portray Harry's super-rationality as ultimately having bad results.
It's rationality propaganda that isn't meant to encourage people to accelerate the path to unfriendly AI, but rather to prevent unfriendly AI from happening.
As far as the story goes, Harry is very rational, but Quirrell is on a level where he outplays Harry. In the end, Harry is partly a projection of how Eliezer sees his own childhood self.
On the one hand, Eliezer was very smart and rational. On the other hand, he was delusional because he didn't take unfriendly AI seriously as something that could actually happen in reality.