Well, this isn't a story for the sake of being a story. It's meant to educate, not just entertain. Sometimes the moral is: invent the spells Becomus Godus and Fixus Everythingus and use them. Keeping in mind why Eliezer is spending his presumably valuable time writing Harry Potter fanfiction, it seems inevitable to me that some sort of overpowered ending is required. Eliezer's quest is to reshape the world in a positive way through friendly AGI. Harry's goal is the same (substituting magic for AGI).
If the above theory is correct, we're in for a story arc about outcome pumps, wish-granting machines, extrapolated morality, and hard takeoffs. After all, the point of this is to make readers want (a) to be more rational, and then (b) to help MIRI on its FAI quest, no?
New chapter!
This is a new thread to discuss Eliezer Yudkowsky’s Harry Potter and the Methods of Rationality and anything related to it. This thread is intended for discussing chapter 102.
There is a site dedicated to the story at hpmor.com, which is now the place to go to find the author's notes and all sorts of other goodies. AdeleneDawner has kept an archive of Author's Notes. (This goes up to the notes for chapter 76 and is no longer updating. The author's notes from chapter 77 onwards are on hpmor.com.)
Spoiler Warning: this thread is full of spoilers. With few exceptions, spoilers for MOR and canon are fair game to post, without warning or rot13. More specifically: