This thread has run its course. You will find newer threads in the Discussion section.
Another discussion thread - the fourth - has reached the (arbitrary?) 500-comment threshold, so it's time for a new thread for Eliezer Yudkowsky's widely praised Harry Potter fanfic.
Most of the paratext and fan-made resources are listed on Mr. LessWrong's author page. There is also AdeleneDawner's collection of most of the previously published Author's Notes.
Older threads: one, two, three, four. By tag.
Newer threads are in the Discussion section, starting from Part 6.
Spoiler policy as suggested by Unnamed and approved by Eliezer, me, and at least three other upmodders:
You do not need to rot13 anything about HP:MoR or the original Harry Potter series unless you are posting insider information from Eliezer Yudkowsky that is not supposed to be publicly available (this includes public statements by Eliezer that have since been retracted).
If there is evidence for X in MoR and/or canon, then it's fine to post about X without rot13, even if you have also heard privately from Eliezer that X is true. But you should not post that "Eliezer said X is true" unless you use rot13.
It would also be sensible and welcome to continue the practice of declaring at the top of your post which chapters you are about to discuss, especially newly published ones, so that people who haven't read them yet can stop in time.
I believe Harry considers some punishments completely out of bounds, too severe for anyone. Certainly I do. The following may have no connection to his real reasons, but even without Many-Worlds you have a non-zero probability of personally suffering any possible punishment. Legally allowing a given punishment for anyone seems to produce a non-zero increase in this probability (even in a world without Polyjuice). Some punishments may carry such enormous negative utility for you that a course of action which avoids that increase, but which almost certainly leads to your death, would still have higher expected utility. Azkaban seems like a good candidate for such a punishment.
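To make that comparison concrete, here's a minimal expected-utility sketch. Every number is an assumption I invented for illustration; the argument only needs the product of the probability increase and the punishment's disutility to exceed the expected cost of the near-fatal alternative.

```python
# Toy expected-utility comparison. All magnitudes below are assumptions
# chosen purely for illustration, not claims about real values.

U_DEATH = -1e6       # assumed disutility of dying
U_AZKABAN = -1e12    # assumed disutility of suffering the punishment
DELTA_P = 1e-4       # assumed increase in your chance of suffering it
                     # once it is legally allowed for anyone
P_DEATH = 0.999      # "almost certainly leads to your death"

eu_allow = DELTA_P * U_AZKABAN   # expected cost to you of allowing it
eu_avoid = P_DEATH * U_DEATH     # expected cost of the near-fatal course

print(eu_avoid > eu_allow)  # True: accepting near-certain death comes out
                            # ahead whenever DELTA_P * |U_AZKABAN| exceeds
                            # P_DEATH * |U_DEATH|
```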
Furthermore, even a pure consequentialist may have a case for acting like a deontologist. While a perfectly rational entity can properly weigh costs and benefits, people can't. Chances are that if a person's moral code says "it's a good idea to subject some people to mind rape for decades", that person has made a mistake, and one should account for that possibility.
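One way to formalize that correction (my framing, not necessarily the argument above): discount the claimed benefit of an extreme conclusion by an assumed probability that the calculation behind it is right at all. Again, all numbers are invented for illustration.

```python
# Sketch of discounting an extreme moral conclusion by your own error rate.
# Every number is an assumption chosen only to illustrate the shape of the
# correction.

B_CLAIMED = 100.0      # net benefit the calculation claims for the policy
COST_IF_WRONG = -1e9   # harm done if the calculation was mistaken
P_CORRECT = 0.01       # assumed: conclusions like "decades of mind rape are
                       # good" are overwhelmingly likely to be reasoning errors

adjusted = P_CORRECT * B_CLAIMED + (1 - P_CORRECT) * COST_IF_WRONG
print(adjusted < 0)  # True: after the correction the policy looks terrible,
                     # so the blanket deontological rule wins in expectation
```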