The new thread, discussion 13, is here.
This is a new thread to discuss Eliezer Yudkowsky's Harry Potter and the Methods of Rationality and anything related to it. With three chapters posted recently, the previous thread has very quickly reached 1000 comments. The latest chapter as of 25 March 2012 is Ch. 80.
There is now a site dedicated to the story at hpmor.com, which is the place to go to find the author's notes and all sorts of other goodies. AdeleneDawner has kept an archive of Author's Notes. (This archive goes up to the notes for chapter 76 and is no longer updating; the author's notes from chapter 77 onwards are on hpmor.com.)
The first 5 discussion threads are on the main page under the harry_potter tag. Threads 6 and later (including this one) are in the discussion section, which uses its own separate tag system. Direct links to the previous threads: one, two, three, four, five, six, seven, eight, nine, ten, eleven.
As a reminder, it's often useful to start your comment by indicating which chapter you are commenting on.
Spoiler Warning: this thread is full of spoilers. With few exceptions, spoilers for MOR and canon are fair game to post, without warning or rot13. More specifically:
You do not need to rot13 anything about HP:MoR or the original Harry Potter series unless you are posting insider information from Eliezer Yudkowsky that is not supposed to be publicly available (this includes public statements by Eliezer that have since been retracted).
If there is evidence for X in MOR and/or canon, then it's fine to post about X without rot13, even if you have also heard privately from Eliezer that X is true. But you should not post that "Eliezer said X is true" unless you use rot13.
Thanks, fixed.
Of course, after you make the precommitment you are no longer a strict consequentialist.
Fair enough. Rather than talking about precommitments to X, I ought to have talked about assertions that I will X in the future, made in such a way that the benefits of actually Xing (which derive from my having made that assertion, in terms of reputation and the associated credibility boosts and so forth) and the costs of failing to X (ibid.) are sufficiently high that I will X even in situations where Xing incurs significant costs. Correction duly noted.
Boy would I like a convenient way of referring to that second thing, though.
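For what it's worth, that second thing admits a rough formalization. This is only a sketch of the condition being described, and the symbols B, R, and C are my own labels, not anything from the thread:

```latex
% A rough formalization (notation mine, not from the comment above).
% Let C be the cost of doing X when the time comes, B the direct
% benefit of X, and R the reputational stake created by publicly
% asserting "I will X" (credibility gained by following through,
% lost by failing to).
\documentclass{article}
\usepackage{amsmath}
\begin{document}
The assertion does the work of a precommitment exactly when
\[
  B + R > C ,
\]
i.e.\ it carries me through cases where $C > B$ alone, so long as
$R > C - B$.
\end{document}
```

On this reading, the mechanism differs from a strict precommitment in that nothing binds me; the reputational stake R simply changes the consequentialist calculation at the later time.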