Less Wrong is a community blog devoted to refining the art of human rationality. Please visit our About page for more information.

Harry Potter and the Methods of Rationality discussion thread, March 2015, chapter 114 + chapter 115

2 Gondolinian 03 March 2015 06:02PM

This is a new thread to discuss Eliezer Yudkowsky’s Harry Potter and the Methods of Rationality and anything related to it. This thread is intended for discussing chapter 114, and also, as a special case due to the exceptionally close posting times, chapter 115.

There is a site dedicated to the story at hpmor.com, which is now the place to go to find the author’s notes and all sorts of other goodies. AdeleneDawner has kept an archive of Author’s Notes. (This goes up to the notes for chapter 76 and is no longer updating. The author’s notes from chapter 77 onwards are on hpmor.com.)

Spoiler Warning: this thread is full of spoilers. With few exceptions, spoilers for MOR and canon are fair game to post, without warning or rot13. More specifically:

You do not need to rot13 anything about HP:MoR or the original Harry Potter series unless you are posting insider information from Eliezer Yudkowsky which is not supposed to be publicly available (which includes public statements by Eliezer that have been retracted).

If there is evidence for X in MOR and/or canon then it’s fine to post about X without rot13, even if you also have heard privately from Eliezer that X is true. But you should not post that “Eliezer said X is true” unless you use rot13.
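For readers who haven't used it, rot13 is just a 13-place rotation of the alphabet, so applying it twice recovers the original text. A minimal sketch of encoding a spoiler in Python (my own illustration; the example string is hypothetical, not an actual statement by Eliezer):

```python
import codecs

spoiler = "Eliezer said X is true"          # hypothetical spoiler text
encoded = codecs.encode(spoiler, "rot_13")  # rotate each letter 13 places
decoded = codecs.decode(encoded, "rot_13")  # rot13 is its own inverse

print(encoded)  # Ryvrmre fnvq K vf gehr
print(decoded)  # Eliezer said X is true
```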

Harry Potter and the Methods of Rationality discussion thread, February 2015, chapter 113

8 Gondolinian 28 February 2015 08:23PM

This is a new thread to discuss Eliezer Yudkowsky’s Harry Potter and the Methods of Rationality and anything related to it. This thread is intended for discussing chapter 113.

There is a site dedicated to the story at hpmor.com, which is now the place to go to find the author’s notes and all sorts of other goodies. AdeleneDawner has kept an archive of Author’s Notes. (This goes up to the notes for chapter 76 and is no longer updating. The author’s notes from chapter 77 onwards are on hpmor.com.)

Spoiler Warning: this thread is full of spoilers. With few exceptions, spoilers for MOR and canon are fair game to post, without warning or rot13. More specifically:

You do not need to rot13 anything about HP:MoR or the original Harry Potter series unless you are posting insider information from Eliezer Yudkowsky which is not supposed to be publicly available (which includes public statements by Eliezer that have been retracted).

If there is evidence for X in MOR and/or canon then it’s fine to post about X without rot13, even if you also have heard privately from Eliezer that X is true. But you should not post that “Eliezer said X is true” unless you use rot13.


IMPORTANT -- From the end of chapter 113:

This is your final exam.

You have 60 hours.

Your solution must at least allow Harry to evade immediate death,
despite being naked, holding only his wand, facing 36 Death Eaters
plus the fully resurrected Lord Voldemort.

If a viable solution is posted before
*12:01AM Pacific Time* (8:01AM UTC) on Tuesday, March 3rd, 2015,
the story will continue to Ch. 121.

Otherwise you will get a shorter and sadder ending.

Keep in mind the following:

1. Harry must succeed via his own efforts. The cavalry is not coming.
Everyone who might want to help Harry thinks he is at a Quidditch game.

2. Harry may only use capabilities the story has already shown him to have;
he cannot develop wordless wandless Legilimency in the next 60 seconds.

3. Voldemort is evil and cannot be persuaded to be good;
the Dark Lord's utility function cannot be changed by talking to him.

4. If Harry raises his wand or speaks in anything except Parseltongue,
the Death Eaters will fire on him immediately.

5. If the simplest timeline is otherwise one where Harry dies -
if Harry cannot reach his Time-Turner without Time-Turned help -
then the Time-Turner will not come into play.

6. It is impossible to tell lies in Parseltongue.

Within these constraints,
Harry is allowed to attain his full potential as a rationalist,
now in this moment or never,
regardless of his previous flaws.

Of course 'the rational solution',
if you are using the word 'rational' correctly,
is just a needlessly fancy way of saying 'the best solution'
or 'the solution I like' or 'the solution I think we should use',
and you should usually say one of the latter instead.
(We only need the word 'rational' to talk about ways of thinking,
considered apart from any particular solutions.)

And by Vinge's Principle,
if you know exactly what a smart mind would do,
you must be at least that smart yourself.
Asking someone "What would an optimal player think is the best move?"
should produce answers no better than "What do you think is best?"

So what I mean in practice,
when I say Harry is allowed to attain his full potential as a rationalist,
is that Harry is allowed to solve this problem
the way YOU would solve it.
If you can tell me exactly how to do something,
Harry is allowed to think of it.

But it does not serve as a solution to say, for example,
"Harry should persuade Voldemort to let him out of the box"
if you can't yourself figure out how.

The rules on Fanfiction dot Net allow at most one review per chapter.
Please submit *ONLY ONE* review of Ch. 113,
to submit one suggested solution.

For the best experience, if you have not already been following
Internet conversations about recent chapters, I suggest not doing so,
trying to complete this exam on your own,
not looking at other reviews,
and waiting for Ch. 114 to see how you did.

I wish you all the best of luck, or rather the best of skill.

Ch. 114 will post at 10AM Pacific (6PM UTC) on Tuesday, March 3rd, 2015.


ADDED:

If you have pending exams,
then even though the bystander effect is a thing,
I expect that the collective effect of
'everyone with more urgent life
issues stays out of the effort'
shifts the probabilities very little

(because diminishing marginal returns on more eyes
and an already-huge population that is participating).

So if you can't take the time, then please don't.
Like any author, I enjoy the delicious taste of my readers' suffering,
finer than any chocolate; but I don't want to *hurt* you.

Likewise, if you hate hate hate this sort of thing, then don't participate!
Other people ARE enjoying it. Just come back in a few days.
I shouldn't even need to point this out.

I remind you again that you have hours to think.
Use the Hold Off On Proposing Solutions, Luke.

And really truly, I do mean it,
Harry cannot develop any new magical powers
or transcend previously stated constraints on them
in the next sixty seconds.

Harry Potter and the Methods of Rationality discussion thread, February 2015, chapter 112

4 Gondolinian 25 February 2015 09:00PM

This is a new thread to discuss Eliezer Yudkowsky’s Harry Potter and the Methods of Rationality and anything related to it. This thread is intended for discussing chapter 112.

There is a site dedicated to the story at hpmor.com, which is now the place to go to find the author’s notes and all sorts of other goodies. AdeleneDawner has kept an archive of Author’s Notes. (This goes up to the notes for chapter 76 and is no longer updating. The author’s notes from chapter 77 onwards are on hpmor.com.)

Spoiler Warning: this thread is full of spoilers. With few exceptions, spoilers for MOR and canon are fair game to post, without warning or rot13. More specifically:

You do not need to rot13 anything about HP:MoR or the original Harry Potter series unless you are posting insider information from Eliezer Yudkowsky which is not supposed to be publicly available (which includes public statements by Eliezer that have been retracted).

If there is evidence for X in MOR and/or canon then it’s fine to post about X without rot13, even if you also have heard privately from Eliezer that X is true. But you should not post that “Eliezer said X is true” unless you use rot13.

Harry Potter and the Methods of Rationality discussion thread, February 2015, chapter 111

3 b_sen 25 February 2015 06:52PM

This is a new thread to discuss Eliezer Yudkowsky’s Harry Potter and the Methods of Rationality and anything related to it. This thread is intended for discussing chapter 111.

There is a site dedicated to the story at hpmor.com, which is now the place to go to find the author’s notes and all sorts of other goodies. AdeleneDawner has kept an archive of Author’s Notes. (This goes up to the notes for chapter 76 and is no longer updating. The author’s notes from chapter 77 onwards are on hpmor.com.)

Spoiler Warning: this thread is full of spoilers. With few exceptions, spoilers for MOR and canon are fair game to post, without warning or rot13. More specifically:

You do not need to rot13 anything about HP:MoR or the original Harry Potter series unless you are posting insider information from Eliezer Yudkowsky which is not supposed to be publicly available (which includes public statements by Eliezer that have been retracted).

If there is evidence for X in MOR and/or canon then it’s fine to post about X without rot13, even if you also have heard privately from Eliezer that X is true. But you should not post that “Eliezer said X is true” unless you use rot13.

Harry Potter and the Methods of Rationality discussion thread, February 2015, chapter 110

3 Gondolinian 24 February 2015 08:01PM

This is a new thread to discuss Eliezer Yudkowsky’s Harry Potter and the Methods of Rationality and anything related to it. This thread is intended for discussing chapter 110.

There is a site dedicated to the story at hpmor.com, which is now the place to go to find the author’s notes and all sorts of other goodies. AdeleneDawner has kept an archive of Author’s Notes. (This goes up to the notes for chapter 76 and is no longer updating. The author’s notes from chapter 77 onwards are on hpmor.com.)

Spoiler Warning: this thread is full of spoilers. With few exceptions, spoilers for MOR and canon are fair game to post, without warning or rot13. More specifically:

You do not need to rot13 anything about HP:MoR or the original Harry Potter series unless you are posting insider information from Eliezer Yudkowsky which is not supposed to be publicly available (which includes public statements by Eliezer that have been retracted).

If there is evidence for X in MOR and/or canon then it’s fine to post about X without rot13, even if you also have heard privately from Eliezer that X is true. But you should not post that “Eliezer said X is true” unless you use rot13.

Harry Potter and the Methods of Rationality discussion thread, February 2015, chapter 109

5 Gondolinian 23 February 2015 08:05PM

This is a new thread to discuss Eliezer Yudkowsky’s Harry Potter and the Methods of Rationality and anything related to it. This thread is intended for discussing chapter 109.

There is a site dedicated to the story at hpmor.com, which is now the place to go to find the author’s notes and all sorts of other goodies. AdeleneDawner has kept an archive of Author’s Notes. (This goes up to the notes for chapter 76 and is no longer updating. The author’s notes from chapter 77 onwards are on hpmor.com.)

Spoiler Warning: this thread is full of spoilers. With few exceptions, spoilers for MOR and canon are fair game to post, without warning or rot13. More specifically:

You do not need to rot13 anything about HP:MoR or the original Harry Potter series unless you are posting insider information from Eliezer Yudkowsky which is not supposed to be publicly available (which includes public statements by Eliezer that have been retracted).

If there is evidence for X in MOR and/or canon then it’s fine to post about X without rot13, even if you also have heard privately from Eliezer that X is true. But you should not post that “Eliezer said X is true” unless you use rot13.

Harry Potter and the Methods of Rationality discussion thread, February 2015, chapter 108

5 b_sen 20 February 2015 09:53PM

New long chapter! Since I expect its discussion to generate more than 160 comments (which would push the previous thread over the 500 comment limit) before the next chapter is posted, here is a new thread.

This is a new thread to discuss Eliezer Yudkowsky’s Harry Potter and the Methods of Rationality and anything related to it. This thread is intended for discussing chapter 108 (and chapter 109, once it comes out on Monday).

EDIT: There have now been two separate calls for having one thread per chapter, along with a poll in this thread. If the poll in this thread indicates a majority preference for one thread per chapter by Monday, I will edit this post to make it for chapter 108 only. In that case a new thread for chapter 109 should be posted by whoever gets a chance and wants to after the chapter is released.

EDIT 2: The poll indicates a large majority (currently 78%) in favor of one thread per chapter. This post has been edited accordingly.

There is a site dedicated to the story at hpmor.com, which is now the place to go to find the author’s notes and all sorts of other goodies. AdeleneDawner has kept an archive of Author’s Notes. (This goes up to the notes for chapter 76 and is no longer updating. The author’s notes from chapter 77 onwards are on hpmor.com.)

Spoiler Warning: this thread is full of spoilers. With few exceptions, spoilers for MOR and canon are fair game to post, without warning or rot13. More specifically:

You do not need to rot13 anything about HP:MoR or the original Harry Potter series unless you are posting insider information from Eliezer Yudkowsky which is not supposed to be publicly available (which includes public statements by Eliezer that have been retracted).

If there is evidence for X in MOR and/or canon then it’s fine to post about X without rot13, even if you also have heard privately from Eliezer that X is true. But you should not post that “Eliezer said X is true” unless you use rot13.

Harry Potter and the Methods of Rationality discussion thread, February 2015, chapters 105-107

6 b_sen 17 February 2015 01:17AM

Two new short chapters! Since the next one is coming tomorrow and we know it'll be short, let's use one thread for both.

This is a new thread to discuss Eliezer Yudkowsky’s Harry Potter and the Methods of Rationality and anything related to it. This thread is intended for discussing chapter 105 (and chapter 106, once it comes out tomorrow).  EDIT: based on Alsadius' comment about thread creation for MOR chapters, let's also use this thread for chapter 107 (and future chapters until this nears 500 comments) unless someone objects to doing so.  Given that this is the final arc we're talking about, thread titles should be updated to indicate chapters covered.

There is a site dedicated to the story at hpmor.com, which is now the place to go to find the author’s notes and all sorts of other goodies. AdeleneDawner has kept an archive of Author’s Notes. (This goes up to the notes for chapter 76 and is no longer updating. The author’s notes from chapter 77 onwards are on hpmor.com.)

Spoiler Warning: this thread is full of spoilers. With few exceptions, spoilers for MOR and canon are fair game to post, without warning or rot13. More specifically:

You do not need to rot13 anything about HP:MoR or the original Harry Potter series unless you are posting insider information from Eliezer Yudkowsky which is not supposed to be publicly available (which includes public statements by Eliezer that have been retracted).

If there is evidence for X in MOR and/or canon then it’s fine to post about X without rot13, even if you also have heard privately from Eliezer that X is true. But you should not post that “Eliezer said X is true” unless you use rot13.

Harry Potter and the Methods of Rationality discussion thread, February 2015, chapter 104

8 b_sen 16 February 2015 01:24AM

New chapter!

This is a new thread to discuss Eliezer Yudkowsky’s Harry Potter and the Methods of Rationality and anything related to it. This thread is intended for discussing chapter 104.

There is a site dedicated to the story at hpmor.com, which is now the place to go to find the author’s notes and all sorts of other goodies. AdeleneDawner has kept an archive of Author’s Notes. (This goes up to the notes for chapter 76 and is no longer updating. The author’s notes from chapter 77 onwards are on hpmor.com.)

Spoiler Warning: this thread is full of spoilers. With few exceptions, spoilers for MOR and canon are fair game to post, without warning or rot13. More specifically:

You do not need to rot13 anything about HP:MoR or the original Harry Potter series unless you are posting insider information from Eliezer Yudkowsky which is not supposed to be publicly available (which includes public statements by Eliezer that have been retracted).

If there is evidence for X in MOR and/or canon then it’s fine to post about X without rot13, even if you also have heard privately from Eliezer that X is true. But you should not post that “Eliezer said X is true” unless you use rot13.

Harry Potter and the Methods of Rationality discussion thread, January 2015, chapter 103

7 b_sen 29 January 2015 01:44AM

New chapter, and the end is now in sight!

This is a new thread to discuss Eliezer Yudkowsky’s Harry Potter and the Methods of Rationality and anything related to it. This thread is intended for discussing chapter 103.

There is a site dedicated to the story at hpmor.com, which is now the place to go to find the author’s notes and all sorts of other goodies. AdeleneDawner has kept an archive of Author’s Notes. (This goes up to the notes for chapter 76 and is no longer updating. The author’s notes from chapter 77 onwards are on hpmor.com.)

Spoiler Warning: this thread is full of spoilers. With few exceptions, spoilers for MOR and canon are fair game to post, without warning or rot13. More specifically:

You do not need to rot13 anything about HP:MoR or the original Harry Potter series unless you are posting insider information from Eliezer Yudkowsky which is not supposed to be publicly available (which includes public statements by Eliezer that have been retracted).

If there is evidence for X in MOR and/or canon then it’s fine to post about X without rot13, even if you also have heard privately from Eliezer that X is true. But you should not post that “Eliezer said X is true” unless you use rot13.

Harry Potter and the Methods of Rationality discussion thread, July 2014, chapter 102

7 David_Gerard 26 July 2014 11:26AM

New chapter!

This is a new thread to discuss Eliezer Yudkowsky’s Harry Potter and the Methods of Rationality and anything related to it. This thread is intended for discussing chapter 102.

There is a site dedicated to the story at hpmor.com, which is now the place to go to find the author’s notes and all sorts of other goodies. AdeleneDawner has kept an archive of Author’s Notes. (This goes up to the notes for chapter 76 and is no longer updating. The author’s notes from chapter 77 onwards are on hpmor.com.)

Spoiler Warning: this thread is full of spoilers. With few exceptions, spoilers for MOR and canon are fair game to post, without warning or rot13. More specifically:

You do not need to rot13 anything about HP:MoR or the original Harry Potter series unless you are posting insider information from Eliezer Yudkowsky which is not supposed to be publicly available (which includes public statements by Eliezer that have been retracted).

If there is evidence for X in MOR and/or canon then it’s fine to post about X without rot13, even if you also have heard privately from Eliezer that X is true. But you should not post that “Eliezer said X is true” unless you use rot13.

Fifty Shades of Self-Fulfilling Prophecy

18 PhilGoetz 24 July 2014 12:17AM

The official story: "Fifty Shades of Grey" was a Twilight fan-fiction that had over two million downloads online. The publishing giant Vintage Press saw that number and realized there was a huge, previously-unrealized demand for stories like this. They filed off the Twilight serial numbers, put it in print, marketed it like hell, and now it's sold 60 million copies.

The reality is quite different.

continue reading »

How Tim O'Brien gets around the logical fallacy of generalization from fictional evidence

9 mszegedy 24 April 2014 09:41PM

It took me until my third reading of The Things They Carried to realize that it contained something very valuable to rationalists. In "The Logical Fallacy of Generalization from Fictional Evidence," EY explains how using fiction as evidence is bad not only because it's deliberately wrong in particular ways to make it more interesting, but more importantly because it does not provide a probabilistic model of what happened, and gives at best a bit or two of evidence that looks like a hundred or more bits of evidence.

Some background: The Things They Carried is a book by Tim O'Brien that reads as an autobiography in which he recollects various stories from being a soldier in the Vietnam War. However, O'Brien often repeats himself, writing the same story over again, but with details or entire events that change. It is actually a fictional autobiography; O'Brien was in the Vietnam War, but all the stories are fictional.

In The Things They Carried, Tim O'Brien not only explains how generalization from fictional evidence is bad, but also has his own solution to the problem that actually works, i.e. gives the reader a useful probabilistic model of what happened in such a way that actually interests the reader. He does this by telling his stories many times, changing significant things about them. Literally; he contradicts himself, writing out the same story but with things changed. The best illustration of the principle in the book is the chapter "How to Tell a True War Story," found here (PDF warning, and bad typesetting warning).

A reader is not inclined to read a list of probabilities, but they are inclined to read a bunch of short stories. He talks about this practice a lot in the book itself, writing, "All you can do is tell it one more time, patiently, adding and subtracting, making up a few things to get at the real truth. … You can tell a true war story if you just keep on telling it." He always says war story, but the principle generalizes. At one point, he has a character represent the forces that act on conventional writing, telling a storyteller that he cannot say that he doesn't know what happened, and that he cannot insert any analysis.

O'Brien also writes about a lot of other things I don't want to mention more than briefly here, such as the specific ways in which the model that conventional war stories give of war is wrong, and specific ways in which the audience misinterprets stories. I recommend the book very much, especially if you think writing "tell multiple short stories" fiction is a great idea and want to do it.

I apologize if this post has been made before.

EDIT: Tried to clarify the idea better. I added an example with an excerpt.

EDIT 2: Added a better excerpt.

EDIT 3: Added a paragraph about background.

Rationalist fiction: a Slice of Life IN HELL

7 Ritalin 25 March 2014 05:02PM

"If you're sent to Hell for that, you wouldn't have liked it in Heaven anyway." 

This phrase inspired in me the idea of a Slice of Life IN HELL story. Basically, the strictest interpretation of the Abrahamic God turns out to be true, and, after Judgment Day, all the sinners (again, by the strictest standards), the pagans, the atheists, the gays, the heretics and so on end up in Hell, which is to say, most of humanity. Rather than a Fire and Brimstone torture chamber, this Hell is very much like earthly life, except it runs on Murphy's Law turned Up To Eleven ("everything that can go wrong, will go wrong"), you can't die permanently, and it goes on forever. It's basically Life as a videogame, set to Maximum Difficulty, with real pain and suffering.

Our stories would focus on actually decent, sympathetic people, who are there for things like following the wrong religion, having sex outside missionary-man-on-woman, lack of observance of the daily little rituals, or even just being lazy. They manage to live more-or-less decently because they're extremely cautious, rational, and methodical. Given that reality is out to get them, this is a constant uphill battle, and even the slightest negligence can have a terrible cost. Thankfully, they have all the time in eternity to learn from their mistakes.

This could be an interesting way to showcase rationalist principles, especially those regarding safety and planning, in a perpetual Worst Case Scenario environment. There's ample potential for constant conflict, and for sympathetic characters who the audience can feel really didn't deserve their fate. The central concept also seems classically strong to me: defying the Status Quo and cruel authorities by striving to be as excellent as one can be, even in the face of certain doom.

What do you guys think? There are lots of little details to specify, and there are many things that I believe should be marked as "must NOT be specified". Any help, ideas, or thoughts are very welcome.

Fiction: Written on the Body as love versus reason

-11 PhilGoetz 08 September 2013 06:13AM

In 1992, Jeanette Winterson, one of the hottest young authors of the early 1990s, published Written on the Body. Critics loved it, but none of them seem to have picked up on what I thought the book was about: The question of whether reason in love is good for you.

continue reading »

Harry Potter and the Methods of Rationality discussion thread, part 20, chapter 90

9 palladias 02 July 2013 02:13AM

This is a new thread to discuss Eliezer Yudkowsky’s Harry Potter and the Methods of Rationality and anything related to it. This thread is intended for discussing chapter 90. The previous thread has passed 750 comments.

There is now a site dedicated to the story at hpmor.com, which is the place to go to find the author’s notes and all sorts of other goodies. AdeleneDawner has kept an archive of Author’s Notes. (This goes up to the notes for chapter 76 and is no longer updating. The author’s notes from chapter 77 onwards are on hpmor.com.)

The first 5 discussion threads are on the main page under the harry_potter tag.  Threads 6 and on (including this one) are in the discussion section using its separate tag system.  Also: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19.

Spoiler Warning: this thread is full of spoilers. With few exceptions, spoilers for MOR and canon are fair game to post, without warning or rot13. More specifically:

You do not need to rot13 anything about HP:MoR or the original Harry Potter series unless you are posting insider information from Eliezer Yudkowsky which is not supposed to be publicly available (which includes public statements by Eliezer that have been retracted).

If there is evidence for X in MOR and/or canon then it’s fine to post about X without rot13, even if you also have heard privately from Eliezer that X is true. But you should not post that “Eliezer said X is true” unless you use rot13.

Harry Potter and the Methods of Rationality discussion thread, part 19, chapter 88-89

12 Vaniver 30 June 2013 01:22AM

This is a new thread to discuss Eliezer Yudkowsky’s Harry Potter and the Methods of Rationality and anything related to it. This thread is intended for discussing chapters 88-89. The previous thread has passed 500 comments.

There is now a site dedicated to the story at hpmor.com, which is the place to go to find the author’s notes and all sorts of other goodies. AdeleneDawner has kept an archive of Author’s Notes. (This goes up to the notes for chapter 76 and is no longer updating. The author’s notes from chapter 77 onwards are on hpmor.com.)

The first 5 discussion threads are on the main page under the harry_potter tag.  Threads 6 and on (including this one) are in the discussion section using its separate tag system.  Also: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18.

Spoiler Warning: this thread is full of spoilers. With few exceptions, spoilers for MOR and canon are fair game to post, without warning or rot13. More specifically:

You do not need to rot13 anything about HP:MoR or the original Harry Potter series unless you are posting insider information from Eliezer Yudkowsky which is not supposed to be publicly available (which includes public statements by Eliezer that have been retracted).

If there is evidence for X in MOR and/or canon then it’s fine to post about X without rot13, even if you also have heard privately from Eliezer that X is true. But you should not post that “Eliezer said X is true” unless you use rot13.

The Classic Literature Workshop

2 Ritalin 16 June 2013 09:54AM

From EY's Facebook page, there were two posts that got me thinking about fiction and how to work it better and make it stronger:

It would have been trivial to fix _Revenge of the Sith_'s inadequate motivation of Anakin's dark turn; have Padme already in the hospital slowly dying as her children come to term, not just some nebulous "visions". (Bonus points if you have Yoda lecture Anakin about the inevitability of death, but I'd understand if they didn't go there.) At the end, Anakin doesn't try to choke Padme; he watches the ship with her fly out of his reach, away from his ability to use his unnatural Sith powers to save her. Now Anakin's motives are 320% more sympathetic and the movie makes 170% more sense. If I'd put some serious work in, I'm pretty sure I could've had the movie audience in tears.

I still feel a sense of genuine puzzlement on how such disastrous writing happens in movies and TV shows. Are the viewers who care about this such a tiny percentage that it's not worth trying to sell to them? Are there really so few writers who could read over the script and see in 30 seconds how to fix something like this? (If option 2 is really the problem and people know it's the problem, I'd happily do it for $10,000 a shot.) Is it Graham's Design Paradox - can Hollywood moguls just not tell the difference between competent writers making such an offer, and fakers who'll take the money and run? Are the producers' egos so grotesque that they can't ask a writer for help? Is there some twisted sense of superiority bound up with believing that the audience is too dumb to care about this kind of thing, even though it looks to me like they do? I don't understand how a >$100M movie ends up with flaws that I could fix at the script stage with 30 seconds of advice.

A helpful key to understanding the art and technique of character in storytelling, is to consider the folk-psychological notion from Internal Family Systems of people being composed of different 'parts' embodying different drives or goals. A shallow character is then a character with only one 'part'.

A good rule of thumb is that to create a 3D character, that person must contain at least two different 2D characters who come into conflict. Contrary to the first thought that crosses your mind, three-dimensional good people are constructed by combining at least two different good people with two different ideals, not by combining a good person and a bad person. Deep sympathetic characters have two sympathetic parts in conflict, not a sympathetic part in conflict with an unsympathetic part. Deep smart characters are created by combining at least two different people who are geniuses.

E.g. HPMOR!Hermione contains both a sensible young girl who tries to keep herself and her friends out of trouble, and a starry-eyed heroine, neither of whom are stupid. (Actually, since HPMOR!Hermione is also the one character who I created as close to her canon self as I could manage - she didn't *need* upgrading - I should credit this one to J. K. Rowling.) (Admittedly, I didn't actually follow that rule deliberately to construct Methods, I figured it out afterward when everyone was praising the characterization and I was like, "Wait, people are calling me a character author now? What the hell did I just do right?")

If instead you try to construct a genius character by having an emotionally impoverished 'genius' part in conflict with a warm nongenius part... ugh. Cliche. Don't write the first thing that pops into your head from watching Star Trek. This is not how real geniuses work. HPMOR!Harry, the primary protagonist, contains so many different people he has to give them names, and none of them are stupid, nor does any one of them contain his emotions set aside in a neat jar; they contain different mixtures of emotions and ideals. Combining two cliche characters won't be enough to build a deep character. Combining two different realistic people in that character's situation works much better. Two is not a limit, it's a minimum, but everyone involved still has to be recognizably the same person when combined.

Closely related is Orson Scott Card's observation that a conflict between Good and Evil can be interesting, but it's often not half as interesting as a conflict between Good and Good. All standard rules about cliches still apply, and a conflict between good and good which you've previously read about and to which the reader can already guess your correct approved answer, cannot carry the story. A good rule of thumb is that if you have a conflict between good and good which you feel unsure about yourself, or which you can remember feeling unsure about, or you're not sure where exactly to draw the line, you can build a story around it. I consider the most successful moral conflict in HPMOR to be the argument between Harry and Dumbledore in Ch. 77 because it almost perfectly divided the readers on who was in the right *and* about whose side the author was taking. (*This* was done by deliberately following Orson Scott Card's rule, not by accident. Likewise _Three Worlds Collide_, though it was only afterward that I realized how much of the praise for that story, which I hadn't dreamed would be considered literarily meritful by serious SF writers, stemmed from the sheer rarity of stories built around genuinely open moral arguments. Orson Scott Card: "Propaganda only works when the reader feels like you've been absolutely fair to other side", and writing about a moral dilemma where *you're* still trying to figure out the answer is an excellent way to achieve this.)

Character shallowness can be a symptom of moral shallowness if it reflects a conflict between Good and Evil drawn along lines too clear to bring two good parts of a good character into conflict. This is why it would've been hard for Lord of the Rings to contain conflicted characters without becoming an entirely different story, though as Robin Hanson has just remarked, LotR is a Milieu story, not a Character story. Conflicts between evil and evil are even shallower than conflicts between good and evil, which is why what passes for 'maturity' in some literature is so uninteresting. There's nothing to choose there, no decision to await with bated breath, just an author showing off their disillusionment as a claim of sophistication.

 

I was wondering if we could apply this process to older fiction, Great Literature that is historically praised, and excellent by its own time's standards, but which, if published by a modern author, would seem substandard or inappropriate in one way or another.

Given our community's propensity for challenging sacred cows, and the unique tool-set available to us, I am sure we could take some great works of the past and turn them into awesome works of the present.


Of course, it doesn't have to be a laboratory where we rewrite the whole damn thing. Just properly grounded suggestions on how to improve this or that work would be great.

 

P.S. This post is itself a work in progress, and will update and improve as comments come. It's been a long time since I last posted on LW, so advice is quite welcome. Our work is never over.

 

EDIT: Well, I like that this thread has turned out so lively, but I've got finals to prepare for and I can't afford to keep participating in the discussion to my satisfaction. I'll be back in July, and apologize in advance for being such a poor OP. That said, cheers!

Orwell and fictional evidence for dictatorship stability

16 Stuart_Armstrong 24 May 2013 12:19PM

"If you want a picture of the future, imagine a boot stamping on a human face—forever."
George Orwell (Eric Arthur Blair), Nineteen Eighty-Four

Orwell's Nineteen Eighty-Four is brilliant, terrifying and useful. It's been at its best fighting against governmental intrusions, and is often quoted by journalists and even judges. Its cultural impact has been immense. And, hey, it's well written.

But that doesn't mean it's accurate as a source of predictions or counterfactuals. Orwell's belief that "British democracy as it existed before 1939 would not survive the war" was wrong. Nineteen Eighty-Four did not predict the future course of communism. There is no evidence that anything like the world he envisaged could (or will) happen. Which isn't the same as saying that it couldn't, but we do require some evidence before accepting Orwell's world as realistic.

Yet from this book, a lot of implicit assumptions have seeped into our consciousness. The most important one (shared with many other dystopian novels) is that dictatorships are stable forms of government. Note the "forever" in the quote above - the society Orwell warned about would never change, never improve, never transform. In several conversations (about future governments, for instance), I've heard - and made - the argument that a dictatorship was inevitable, because it's an absorbing state. Democracies can become dictatorships, but dictatorships (barring revolutions) will endure for good. And so the idea is that if revolutions become impossible (because of ubiquitous surveillance, for instance), then we're stuck with Big Brother for life, and for our children's children's children's lives.
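To make the "absorbing state" intuition concrete, here is a minimal sketch (my own illustration with made-up transition probabilities, not anything from Orwell or from the post): if democracies can collapse but dictatorships never revert, the long-run probability of dictatorship approaches one no matter how small the per-period collapse chance is.

```python
# Toy two-state Markov chain: "democracy" and "dictatorship".
p_collapse = 0.01   # hypothetical per-decade chance a democracy becomes a dictatorship
p_revert = 0.0      # the Orwellian premise: a dictatorship never reverts

p_democracy = 1.0
for decade in range(1000):
    p_democracy = p_democracy * (1 - p_collapse) + (1 - p_democracy) * p_revert

print(f"P(dictatorship) after 1000 decades: {1 - p_democracy:.4f}")  # ~1.0000
# With any nonzero p_revert, the chain instead settles near
# p_collapse / (p_collapse + p_revert) rather than being absorbed.
```

The post's counterargument is, in effect, that history shows p_revert is not zero.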

But thinking about this in the context of history, it doesn't seem credible. The most stable forms of government are democracies and monarchies; nothing else endures that long. And leaving revolutions aside, there have been plenty of examples of even quite nasty governments improving themselves. Robespierre was deposed from within his own government - and so the Terror, for all its bloodshed, didn't even last a full year. The worst excesses of Stalinism ended with Stalin. Gorbachev voluntarily opened up his regime (to a certain extent). Mao would excoriate the China of today. Britain's leaders in the 19th and 20th century gradually opened up the franchise, without ever coming close to being deposed by force of arms. The dictatorships of Latin America have mostly fallen to democracies (though revolutions played a larger role there). Looking over the course of recent history, I see very little evidence that dictatorships have much lasting power at all - or that they are incapable of drastic internal change and even improvements.

Now, caveats abound. The future won't be like the past - maybe an Orwellian dictatorship will become possible with advanced surveillance technologies. Maybe a world government, with no neighbouring government doing a better job, will never feel compelled to improve the lot of its citizens. Maybe the threat of revolution remains necessary, even if revolts don't actually happen.

Still, we should refrain from assuming that dictatorships, whether party or individual, are somehow the default state, and conduct a much more evidence-based analysis of the matter.

Pascal's wager

-11 duckduckMOO 22 April 2013 04:41AM


I started this as a comment on "Being half wrong about pascal's wager is even worse", but it's really long, so I'm posting it in discussion instead.

 

Also, I illustrate here using negative examples (hell and equivalents) for the sake of followability, and I'm a little worried about inciting some paranoia, so I'm reminding you here that every negative example has an equal and opposite positive partner. For example, Pascal's wager has an opposite where accepting sends you to hell, and another where refusing sends you to heaven. I haven't mentioned any positive equivalents or opposites below. Also, all of these possibilities have probability effectively 0, so don't worry.

 

"For so long as I can remember, I have rejected Pascal's Wager in all its forms on sheerly practical grounds: anyone who tries to plan out their life by chasing a 1 in 10,000 chance of a huge pay-off is almost certainly doomed in practice.  This kind of clever reasoning never pays off in real life..."

 

Pascal's wager shouldn't be in the reference class of real life. It is a unique situation that would never crop up in real life as you're using the term. In the world in which Pascal's wager is correct, you would still see people who plan out their lives on a 1 in 10000 chance of a huge pay-off fail 9999 times out of 10000. Also, this doesn't work for actually excluding Pascal's wager: if Pascal's wager starts off excluded from the category "real life", you've already made up your mind, so this cannot quite be the actual order of events.

 

In this case, 9999 times out of 10000 you waste your Christianity, and 1 time in 10000 you avoid going to hell for eternity. Eternal hell is, to vastly understate it, much worse than 10000 times as bad as worshipping god, even counting the sanity it costs to force a change in belief, the damage it does to your psyche to live as a victim of self-inflicted Stockholm syndrome, and any other non-obvious cost. With these premises, choosing to believe in God produces infinitely better consequences on average.

 

Luckily the premises are wrong. 1/10000 is about 1/10000 too high for the relevant probability. Which is:

the probability that the wager or an equivalent (anything whose acceptance would prevent you from going to hell is equivalent) is true

MINUS

the probability that its opposite or an equivalent (anything which would send you to hell for accepting is equivalent) is true

 

1/10000 is also way too high even if you're not accounting for opposite possibilities.
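A minimal sketch of the expected-value arithmetic being corrected here (the 1/10000 figure is from the post; the utility numbers are purely illustrative assumptions): once the wager's opposite is assigned the same probability, the hell terms cancel and only the finite cost of forced belief remains.

```python
P_WAGER = 1 / 10000      # probability the wager (or an equivalent) is true
P_OPPOSITE = 1 / 10000   # probability its opposite (or an equivalent) is true
COST_OF_BELIEF = -1      # finite cost of forcing yourself to believe
HELL = -10**12           # stand-in for an effectively unbounded loss

ev_accept = COST_OF_BELIEF + P_OPPOSITE * HELL  # an anti-wager god punishes acceptance
ev_refuse = P_WAGER * HELL                      # a wager god punishes refusal

# The decision hinges on (P_WAGER - P_OPPOSITE) * HELL, which is zero here,
# so refusing wins by exactly the finite cost of belief.
print(ev_accept - ev_refuse)  # -1.0
```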

 

 

Equivalence here refers to what behaviours a wager punishes or rewards. I used hell because it features in the most popular wager, but this applies to all wagers. To illustrate: if it's true that there is one god, ANTIPASCAL GOD, who sends you to hell for accepting any Pascal's wager, then that's equivalent to any Pascal's wager you hear having an opposite (no more "or equivalent"s will be typed, but they still apply) which is true, because if you accept any Pascal's wager you go to hell. Conversely, if PASCAL GOD is the only god and he sends you to hell unless you accept any Pascal's wager, that's equivalent to any Pascal's wager you hear being true.

 

The real trick with Pascal's wagers is that they're generally no more likely than their opposites. For example, there are lots of good, fun reasons to assign the Christian Pascal's wager a lower probability than its opposite, even engaging on a Christian level:

 

Hell is a medieval invention/translation error: the eternal torture thing isn't even in the modern bibles.

The belief or hell rule is hella evil, and it gains credibility from the same source (Christians, not the bible) who also claim, as a more fundamental belief, that god is good, which directly contradicts the hell or belief rule.

The bible claims that God hates people eating shellfish, taking his name in vain, and jealousy. Apparently taking his name in vain is the only unforgivable sin. So if they're right about the evil stuff, you're probably going to hell anyway.

It makes no sense that god would care enough about your belief and worship to consign people to eternal torture but not enough to show up once in a while.

It makes no sense to reward people for dishonesty.

The evilness really can't be overstated. Eternal torture as a response to a mistake which is at worst due to stupidity (but actually not even that: just a stacked-deck scenario) outdoes pretty much everyone in terms of evilness. It's worse than pretty much every fucked up thing every other god is reputed to have done, put together. The psychopath in the bible doesn't come close to coming close.

 

The problem with the general case of religious Pascal's wagers is that people make stuff up (usually unintentionally), and which made-up stuff gains traction has nothing to do with what is true. When both Christianity and Hinduism are taken seriously by millions (as were the Roman/Greek gods, the Viking gods, the Aztec gods, and all sorts of other gods at different times, by large percentages of people), mass religious belief is 0 evidence. At most one religion set (e.g. Greek/Roman, Christian/Muslim/Jewish, etc.) is even close to right, so at least the rest are popular independently of truth.

 

The existence of a religion does not elevate the possibility that the god they describe exists above the possibility that the opposite exists because there is no evidence that religion has any accuracy in determining the features of a god, should one exist.

 

You might intuitively lean towards religions having better than 0 accuracy if a god exists, but remember there's a lot of fictional evidence out there to generalise from. It is a matter of judgement here. There's no logical proof of 0 or worse accuracy (other than it being the default and the lack of evidence), but negative accuracy is a possibility, and you've probably played priest classes in video games or just seen how respected religions are, and so been primed to overestimate religion's accuracy in that hypothetical. Also, if there is a god, it has not shown itself publicly in a very long time, or ever, so it seems to have a preference for not being revealed. Also, humans tend to be somewhat evil and read into others what they see in themselves, and I assume any high-tier god (one that had the power to create and maintain a hell, detect disbelief, preserve immortal souls and put people in hell) would not be evil. Being evil or totally unscrupulous has benefits among humans which a god would not get. I think without bad peers or parents there's no reason to be evil; I think people are mostly evil in relation to other people. So I give religions a slight positive accuracy in the scenario where there is a god, but it does not exceed the priors against Pascal's wager (another one being that the gods described are pettily human), or perhaps even the god's desire to stay hidden.

 

Even if God itself whispered Pascal's wager in your ear, there is no incentive for it to actually carry out the threat:

 

There is only one iteration.

AND

These threats aren't being made in person by the deity. They are either second hand or independently discovered so:

The deity has no use for making the threat true in order to claim it more believably, as it might if it were an imperfect liar (at a level detectable by humans) making the threats in person.

The deity has total plausible deniability.

All of which adds up to: the benefits of the threat have already been extracted by the time the punishment is due, and there is no possibility of a rep hit (which wouldn't matter anyway).

 

So, all else being equal, i.e. unless the god is the god of threats or of Pascal's wagers (whose opposites are equally likely):

 

If God is good (+EV on human happiness, -EV on human sadness, that sort of thing), actually carrying out the threats has negative value.

If god is scarily-doesn't-give-a-shit-neutral to humans, it still has no incentive to actually carry out the threat, and doing so has a non-zero energy cost.

If god gives the tiniest, most infinitesimal shit about humans, its incentive to actually carry out the threat is negative.

 

If God is evil you're fucked anyway:

The threat gains no power by being true, so the only incentive a God can have for following through is that it values human suffering. If it does, why would it not send you to hell if you believed in it? (remember that the god of commitments is as likely as the god of breaking commitments)

 

Despite the increased complexity of a human mind, I think the most likely motivational system for a god which would make it honour the wager (not saying it's at all likely, just that all the others are obviously wrong) is that the god thinks like a human and therefore would keep its commitment out of spite or gratitude or some other human reason. So here's why I think that one is wrong. It's generalizing from fictional evidence: humans aren't that homogeneous (and one without peers would be less so), and if a god gains likelihood of keeping a commitment from humanness, it also gains not-designed-to-be-evil-ness that would make it less likely to make evil wagers. It also has no source for spite or gratitude, having no peers. Finally, could you ever feel spite towards a bug? Or gratitude? We are not just ants compared to a god, we're ant-ant-ant-etc-ants.

 

Also, there are reasons that refusing can actually get you in trouble: bullies don't get nicer when their demands are met. It's often not the suffering they're after but the dominance, at which point the suffering becomes an enjoyable illustration of that dominance. As we are ant-ant-etc-ants this probability is lower, but the fact that we aren't all already in hell suggests that if god is evil, it is not raw suffering that it values. Hostages are often executed even when the ransom is paid. Even if it is evil, it could be any kind of evil: its preferences cannot have been homogenised by memes and consensus.

 

There's also the rather cool possibility that if a human-like god is sending people to hell, maybe it's for lack of understanding. If it wants belief it can take it more effectively than this. If it wants to hurt you it will hurt you anyway. Perhaps, peerless, it was never prompted to think through the consequences of making others suffer. Maybe god, in the absence of peers, just needs someone to explain that it's not nice to let people burn in hell for eternity. I, for one, remember suddenly realising that those other fleshbags hosted people. I figured it out for myself, but if I grew up alone as the master of the universe, maybe I would have needed someone to explain it to me.

 

[LINK] The power of fiction for moral instruction

11 David_Gerard 24 March 2013 09:19PM

From Medical Daily: Psychologists Discover How People Subconsciously Become Their Favorite Fictional Characters

Psychologists have discovered that while reading a book or story, people are prone to subconsciously adopt their behavior, thoughts, beliefs and internal responses to that of fictional characters as if they were their own.

Experts have dubbed this subconscious phenomenon ‘experience-taking,’ where people actually change their own behaviors and thoughts to match those of a fictional character that they can identify with.

Researchers from Ohio State University conducted a series of six different experiments on about 500 participants and, reporting in the Journal of Personality and Social Psychology, found that in the right situations 'experience-taking' may lead to temporary real-world changes in the lives of readers.

They found that stories written in the first-person can temporarily transform the way readers view the world, themselves and other social groups. 

I always wondered at how Christopher Hitchens (who, when he wasn't being a columnist, was a professor of English literature) went on and on about the power of fiction for revealing moral truths. This gives me a better idea of how people could imprint on well-written fiction. More so than, say, logically-reasoned philosophical tracts.

This article is, of course, a popularisation. Anyone have links to the original paper?

Edit: Gwern delivers (PDF): Kaufman, G. F., & Libby, L. K. (2012, March 26). "Changing Beliefs and Behavior Through Experience-Taking." Journal of Personality and Social Psychology. Advance online publication. doi: 10.1037/a0027525

Harry Potter and the Methods of Rationality discussion thread, part 18, chapter 87

4 Alsadius 22 December 2012 07:55AM

This is a new thread to discuss Eliezer Yudkowsky’s Harry Potter and the Methods of Rationality and anything related to it. This thread is intended for discussing chapter 87. The previous thread has passed 500 comments.

There is now a site dedicated to the story at hpmor.com, which is the place to go to find the author’s notes and all sorts of other goodies. AdeleneDawner has kept an archive of Author’s Notes. (This goes up to the notes for chapter 76 and is no longer updating. The author’s notes from chapter 77 onwards are on hpmor.com.)

The first 5 discussion threads are on the main page under the harry_potter tag.  Threads 6 and on (including this one) are in the discussion section using its separate tag system.  Also: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17.

Spoiler Warning: this thread is full of spoilers. With few exceptions, spoilers for MOR and canon are fair game to post, without warning or rot13. More specifically:

You do not need to rot13 anything about HP:MoR or the original Harry Potter series unless you are posting insider information from Eliezer Yudkowsky which is not supposed to be publicly available (which includes public statements by Eliezer that have been retracted).

If there is evidence for X in MOR and/or canon then it’s fine to post about X without rot13, even if you also have heard privately from Eliezer that X is true. But you should not post that “Eliezer said X is true” unless you use rot13.

Harry Potter and the Methods of Rationality discussion thread, part 17, chapter 86

9 Alsadius 17 December 2012 07:19AM

Edit: New thread posted here

This is a new thread to discuss Eliezer Yudkowsky’s Harry Potter and the Methods of Rationality and anything related to it. This thread is intended for discussing chapter 86. The previous thread has long passed 500 comments.

There is now a site dedicated to the story at hpmor.com, which is the place to go to find the author’s notes and all sorts of other goodies. AdeleneDawner has kept an archive of Author’s Notes. (This goes up to the notes for chapter 76 and is no longer updating. The author’s notes from chapter 77 onwards are on hpmor.com.)

The first 5 discussion threads are on the main page under the harry_potter tag.  Threads 6 and on (including this one) are in the discussion section using its separate tag system.  Also: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16.

As a reminder, it’s often useful to start your comment by indicating which chapter you are commenting on.

Spoiler Warning: this thread is full of spoilers. With few exceptions, spoilers for MOR and canon are fair game to post, without warning or rot13. More specifically:

You do not need to rot13 anything about HP:MoR or the original Harry Potter series unless you are posting insider information from Eliezer Yudkowsky which is not supposed to be publicly available (which includes public statements by Eliezer that have been retracted).

If there is evidence for X in MOR and/or canon then it’s fine to post about X without rot13, even if you also have heard privately from Eliezer that X is true. But you should not post that “Eliezer said X is true” unless you use rot13.

 

NKCDT: The Big Bang Theory

-12 [deleted] 10 November 2012 01:15PM

Hi, welcome to the first Non-Karmic Casual Discussion Thread.

This is a place for [purpose of thread goes here].

In order to create a casual, non-karmic environment for everyone, we ask that you:

-Do not upvote or downvote any zero karma posts

-If you see a post with positive karma, downvote it towards zero, even if it’s a good post

-If you see a post with negative karma, upvote it towards zero, even if it’s a weak post

-Please be polite and respectful to other users

-Have fun!

 

 

This is my first attempt at starting a casual conversation on LW where people don't have to worry about winning or losing points, and can just relax and have social fun together.

 

So, Big Bang Theory. That series got me wondering. It seems to be about "geeks", and not the basement-dwelling variety either; they're highly successful and accomplished professionals, each in their own field. One of them has been an astronaut, even. And yet, everything they ever accomplish amounts to absolutely nothing in terms of social recognition or even in terms of personal happiness. And the thing is, it doesn't even get better for their "normal" counterparts, who are just as miserable and petty.

 

Consider, then: how would being rationalists affect the characters on this show? The writing of the show relies a lot on laughing at people rather than with them; would rationalist characters subvert that? And how would that rationalist outlook express itself given their personalities? (After all, notice how amazingly different from each other Yudkowsky, Hanson, and Alicorn are, just to name a few; they emphasize rather different things, and take different approaches to both truth-testing and problem-solving).

Note: this discussion does not need to be about rationalism. It can be a casual, normal discussion about the series. Relax and enjoy yourselves.

 

But the reason I brought up that series is that its characters are excellent examples of high intelligence hampered by immense irrationality. The apex of this is represented by Dr. Sheldon Cooper, who is, essentially, a complete fundamentalist over every single thing in his life; he applies this attitude to everything, right down to people's favorite flavor of pudding: Raj is "axiomatically wrong" to prefer tapioca, because the best pudding is chocolate. Period. This attitude makes him a far, far worse scientist than he thinks, as he refuses to even consider any criticism of his methods or results. 

 

A My Little Pony fanfic allegedly but not mainly about immortality

9 PhilGoetz 10 September 2012 01:02AM

My Little Pony (generation 4) has 2 immortal characters, who get a lot of sympathy from the bronies.  "How sad!  Poor Celestia and Luna must see everyone they know grow old and die.  How much better to die yourself!"

I tried to write a fanfic saying that death was bad.  But I had to make it a story, and it ended up having other themes.  I don't know whether I like it or not, but it was very popular (now approaching 7000 views in 3 days on fimfiction).

I was pretty sure the message "death is bad" was still in there, because Celestia says things like "Death is bad" and "I'm afraid of dying."  So imagine my surprise when comment after comment said, "Yes, immortality is such a curse!"

continue reading »

fimfiction.net LessWrong group

5 PhilGoetz 09 September 2012 04:10PM

There is now a LessWrong group on fimfiction, to let LWers on fimfiction find each other and collect stories that might be of interest to them.  (That?  Which?  Grammar Nazis, help!)

Dragon Ball's Hyperbolic Time Chamber

35 gwern 02 September 2012 11:49PM

A time dilation tool from an anime is discussed for its practical use on Earth; there seem surprisingly few uses and none that will change the world, due to the severe penalties humans would incur while using it, and basic constraints like Amdahl's law limit the scientific uses. A comparison with the position of an Artificial Intelligence such as an emulated human brain seems fair, except most of the time dilation disadvantages do not apply or can be ameliorated and hence any speedups could be quite effectively exploited. I suggest that skeptics of the idea that speedups give advantages are implicitly working off the crippled time dilation tool and not making allowance for the disanalogies.

Master version on gwern.net
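The abstract leans on Amdahl's law: however fast the chamber runs, any work that must happen outside it caps the overall gain. A minimal illustration, assuming (purely for the example) that only the "thinking" share of a research project can be moved inside and that the canonical chamber runs at roughly 365x; the fractions below are invented, not taken from the essay:

```python
def amdahl_speedup(accelerable_fraction: float, chamber_factor: float) -> float:
    """Overall speedup when only part of the work can run inside the chamber.

    accelerable_fraction: share of total time spent on work the chamber can
        speed up (pure thinking, writing, whatever you carry in with you).
    chamber_factor: how much faster that share runs inside (roughly 365x for
        the canonical one-year-per-day chamber).
    """
    outside_fraction = 1.0 - accelerable_fraction
    return 1.0 / (outside_fraction + accelerable_fraction / chamber_factor)

# Illustrative numbers only: even if 90% of a project is accelerable thinking,
# the experiments and peer review left outside cap the overall gain near 10x.
print(amdahl_speedup(0.9, 365))   # ~9.76
print(amdahl_speedup(0.5, 365))   # ~1.99
```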

[Link] Short story by Yvain

31 CronoDAS 31 August 2012 04:33AM

Yvain isn't a big enough self-promoter to link to this, but I liked it a lot and I think you will too.

"The Last Temptation of Christ"

The Fiction Genome Project

12 [deleted] 29 June 2012 11:19AM

The Music Genome Project is what powers Pandora. According to Wikipedia:

 

The Music Genome Project was first conceived by Will Glaser and Tim Westergren in late 1999. In January 2000, they joined forces with Jon Kraft to found Pandora Media to bring their idea to market.[1] The Music Genome Project was an effort to "capture the essence of music at the fundamental level" using almost 400 attributes to describe songs and a complex mathematical algorithm to organize them. Under the direction of Nolan Gasser, the musical structure and implementation of the Music Genome Project, made up of 5 Genomes (Pop/Rock, Hip-Hop/Electronica, Jazz, World Music, and Classical), was advanced and codified.

 

A given song is represented by a vector (a list of attributes) containing approximately 400 "genes" (analogous to trait-determining genes for organisms in the field of genetics). Each gene corresponds to a characteristic of the music, for example, gender of lead vocalist, level of distortion on the electric guitar, type of background vocals, etc. Rock and pop songs have 150 genes, rap songs have 350, and jazz songs have approximately 400. Other genres of music, such as world and classical music, have 300–500 genes. The system depends on a sufficient number of genes to render useful results. Each gene is assigned a number between 1 and 5, in half-integer increments.[2]

 

Given the vector of one or more songs, a list of other similar songs is constructed using a distance function. Each song is analyzed by a musician in a process that takes 20 to 30 minutes per song.[3] Ten percent of songs are analyzed by more than one technician to ensure conformity with the in-house standards and statistical reliability. The technology is currently used by Pandora to play music for Internet users based on their preferences. Because of licensing restrictions, Pandora is available only to users whose location is reported to be in the USA by Pandora's geolocation software.[4]
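The quoted passage amounts to a simple data structure: each song is a vector of genes scored 1 to 5, and recommendations come from a distance function over those vectors. A minimal sketch, with the caveat that the real project's gene list, weights, and metric are proprietary; the gene names and the plain Euclidean distance below are illustrative stand-ins only:

```python
import math

# Hypothetical genes; the real project uses roughly 150-500 per song,
# scored from 1 to 5 in half-integer increments by human analysts.
GENES = ["vocal_gender", "guitar_distortion", "background_vocals", "tempo_feel"]

def distance(song_a: dict, song_b: dict) -> float:
    """Euclidean distance over shared genes (an illustrative choice;
    the actual metric is not public)."""
    return math.sqrt(sum((song_a[g] - song_b[g]) ** 2 for g in GENES))

def most_similar(seed: dict, catalog: dict) -> str:
    """Return the name of the catalog song closest to the seed vector."""
    return min(catalog, key=lambda name: distance(seed, catalog[name]))

seed = {"vocal_gender": 1.0, "guitar_distortion": 4.5,
        "background_vocals": 2.0, "tempo_feel": 3.5}
catalog = {
    "song_x": {"vocal_gender": 1.0, "guitar_distortion": 4.0,
               "background_vocals": 2.5, "tempo_feel": 3.0},
    "song_y": {"vocal_gender": 5.0, "guitar_distortion": 1.0,
               "background_vocals": 1.0, "tempo_feel": 2.0},
}
print(most_similar(seed, catalog))  # song_x
```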

 

 

Eminent lesswronger, strategist, and blogger Sebastian Marshall wonders:

 

Personally, I was thinking of doing a sort of “DNA analysis” of successful writing. Have you heard of the Music Genome Project? It powers Pandora.com.

 

So I was thinking, you could probably do something like that for writing, and then try to craft a written work with elements known to appeal to people. For instance, if you wished to write a best selling detective novel, you might do an analysis of when the antagonist(s) appear in the plot for the first time. You might find that 15% of bestsellers open with the primary antagonist committing their crime, 10% have the antagonist mixed in quickly into the plot, and 75% keep the primary antagonist a vague and shadowy figure until shortly before the climax.

 

I don’t know if the pattern fits that – I don’t read many detective novels – but it would be a bit of a surprise if it did. You might think, well, hey, I better either introduce the antagonist right away having them commit their crime, or keep him shadowy for a while.

 

 

Or, to use an easier example – perhaps you could wholesale adopt the use of engineering checklists into your chosen discipline? It seems to me like lots of fields don’t use checklists that could benefit tremendously from them. I run this through my mind again and again – what kind of checklist could be built here? I first came across the concept of checklists being adopted in surgery from engineering, and then having surgical accidents and mistakes go way down.
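The detective-novel analysis Marshall sketches above is easy to tally once the hard part, human annotation, is done. A toy sketch, assuming each bestseller has already been labeled with how far into the book its primary antagonist first clearly appears; the titles and numbers below are invented:

```python
from collections import Counter

# Hypothetical annotations: fraction of the way through the book at which
# the primary antagonist is first clearly identified (0.0 = opening page).
annotated_bestsellers = {
    "book_a": 0.02, "book_b": 0.05, "book_c": 0.85,
    "book_d": 0.90, "book_e": 0.15,
}

def bucket(position: float) -> str:
    if position <= 0.05:
        return "opens with the antagonist"
    if position <= 0.25:
        return "antagonist mixed in early"
    return "antagonist shadowy until late"

counts = Counter(bucket(p) for p in annotated_bestsellers.values())
total = sum(counts.values())
for label, n in counts.most_common():
    print(f"{label}: {100 * n / total:.0f}%")
```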

 

Some people at TV Tropes came across that article, and thought that their wiki's database might be a good starting point to make this project a reality. I came here to look for the savvy, intelligence, and level of technical expertise in all things AI and IT that I've come to expect of this site's user-base, hoping that some of you might be interested in having a look at the discussion, and, perhaps, would feel like joining in, or at least sharing some good advice.

Thank you. (Also, should I make this post "Discussion" or "Top Level"?)

"Where Am I?", by Daniel Dennett

8 [deleted] 04 June 2012 09:45AM

“Where Am I?” is a short story by Daniel Dennett from his book Brainstorms: Philosophical Essays on Mind and Psychology. Some of you might already be familiar with it.

The story is a humorous, semi-science-fiction one in which Dennett gets a job offer from the Pentagon that entails moving his brain into a vat without actually moving his point of view. Later on it brings up questions about uploading and what it would mean in terms of diverging perspectives and so on. Aside from being a joy to read, it offers solutions to a few hurdles about the nature of consciousness and personal identity.

Suppose, I argued to myself, I were now to fly to California, rob a bank, and be apprehended.  In which state would I be tried:  in California, where the robbery took place, or in Texas, where the brains of the outfit were located?  Would I be a California felon with an out-of-state brain, or a Texas felon remotely controlling an accomplice of sorts in California? It seemed possible that I might beat such a rap just on the undecidability of that jurisdictional question, though perhaps it would be deemed an interstate, and hence Federal, offense.

 

[Book Suggestions] Summer Reading for Younglings.

8 Karmakaiser 12 May 2012 04:57PM

I bought my niece a Kindle, which just arrived, and I'm about to load it up with books before giving it to her tomorrow for her birthday. I've decided to be a sneaky uncle and include good books that can teach her to think better, or at least to consider science cool and interesting. She is currently in the 4th grade, with 5th coming after the summer.

She reads basically at her own grade level so while I'm open to stuffing the Kindle with books to be read when she's ready, I'd like to focus on giving her books she can read now. Ender's Game will be on there most likely. Game of Thrones will not.

What books would you give a youngling? Her interests currently trend toward the young mystery section, Hardy Boys and the like, but in my experience she is very open to trying new books with particular interest in YA fantasy but not much interest in Sci Fi (if I'm doing any other optimizing this year, I'll try to change her opinion on Sci Fi).

Harry Potter and the Methods of Rationality discussion thread, part 16, chapter 85

9 FAWS 18 April 2012 02:30AM

The next discussion thread is here.

 

This is a new thread to discuss Eliezer Yudkowsky’s Harry Potter and the Methods of Rationality and anything related to it. This thread is intended for discussing chapter 85. The previous thread has long passed 500 comments. Comment in the 15th thread until you read chapter 85.

There is now a site dedicated to the story at hpmor.com, which is now the place to go to find the authors notes and all sorts of other goodies. AdeleneDawner has kept an archive of Author’s Notes. (This goes up to the notes for chapter 76, and is now not updating. The authors notes from chapter 77 onwards are on hpmor.com.) 

The first 5 discussion threads are on the main page under the harry_potter tag.  Threads 6 and on (including this one) are in the discussion section using its separate tag system.  Also: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15.

As a reminder, it’s often useful to start your comment by indicating which chapter you are commenting on.

Spoiler Warning: this thread is full of spoilers. With few exceptions, spoilers for MOR and canon are fair game to post, without warning or rot13. More specifically:

You do not need to rot13 anything about HP:MoR or the original Harry Potter series unless you are posting insider information from Eliezer Yudkowsky which is not supposed to be publicly available (which includes public statements by Eliezer that have been retracted).

If there is evidence for X in MOR and/or canon then it’s fine to post about X without rot13, even if you also have heard privately from Eliezer that X is true. But you should not post that “Eliezer said X is true” unless you use rot13.

Mental Clarity; or How to Read Reality Accurately

-10 Hicquodiam 12 April 2012 06:18AM

 

Hey all - I typed this out to help me understand, well... how to understand things:

 

Mental clarity is the ability to read reality accurately. 

 

I don't mean being able to look at the complete objective picture of an event, as you don't have any direct access to that. I'm talking about the ability to read the data presented by your subjective experience: thoughts, sights, sounds, etc. Once you get a clear picture of what that data is, you can then go on and use it to build or falsify your ideas about the world.


This post will focus on the "getting a clear picture" part.


I use the word "read" because it's no different than reading from a book, or from these words. When you read a book, you are actually curious as to what the words are saying. You wouldn't read anything into it that's not there, which would be counterproductive to your understanding.

 

You just look at the words plainly, and through this your mind automatically recognizes and presents the patterns: the meaning of the sentences, their relation to the topic, the visual imagery associated with them, all of that. If you want to know a truth about reality, just look at it and read what's there.


Want to know what the weather's like? Look outside - read what's going on.


Want to know if the Earth revolves around the Sun, or vice versa? Look at the movement of the planets, read what they're doing, see which theory fits better.


Want to check if your beliefs about the world are correct? Take one, read the reality that the belief tries to correspond to, and see how well they compare.


This is the root of all science and all epiphanies.


But if it's so simple and obvious, why am I talking about it?


It's not something that we as a species often do. For trivial matters, sure, for science too, but not for our strongly-held opinions. Not for the beliefs and positions that shape our self-image, make us feel good/comfortable, or get us approval. Not for our political opinions, religious ideas, moral judgements, and little white lies.


If you were utterly convinced that your wife was faithful, and all the more if you liked to think of her in that way, and your friend came along and said she was cheating on you, you'd be reluctant to read reality and check whether that's true. Doing this would challenge your comfort and throw you into an unknown world with some potentially massive changes. It would be much more comforting to rationalize why she still might be faithful than to take one easy look at the true information. It would also be more damaging.


Delusion is reading into reality things which aren't there. Telling yourself that everything's fine when it obviously isn't, for example. It's the equivalent of looking at a book about vampires and jumping to the conclusion that it's about wizards.


Sounds insane. You do it all the time. You'll catch yourself if you're willing to read the book of your own thoughts: flowing through your head, in plain view, is a whole mess of opinions and ideas of people, places, and positions you've never even encountered. Crikey!


That mess is incredibly dangerous to have. Being a host to unchecked or false beliefs about the world is like having a faulty map of a terrain: you're bound to get lost or fall off a cliff. Reading the terrain and re-drawing the map accordingly is the only way to accurately know where you're going. Having an accurate map is the only way to achieve your goals.



So you want to develop mental clarity? Be less confused, or more successful? Have a better understanding of the world, the structure of reality, or the accuracy of your ideas? 


Just practice the accurate reading of what's going on. Surrender the content of your beliefs to the data gathered by your reading of reality. It's that simple.
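The post keeps this qualitative, but for readers who want numbers, the same "surrender your belief to the data" move can be written as a plain Bayesian update; the probabilities below are invented purely for illustration:

```python
def bayes_update(prior: float, p_obs_if_true: float, p_obs_if_false: float) -> float:
    """Posterior probability of a belief after one observation."""
    numerator = p_obs_if_true * prior
    return numerator / (numerator + p_obs_if_false * (1.0 - prior))

# Illustrative numbers only: belief "it will rain today" with prior 0.3;
# you look outside, see dark clouds, and judge them 4x more likely
# if rain is coming than if it isn't.
posterior = bayes_update(prior=0.3, p_obs_if_true=0.8, p_obs_if_false=0.2)
print(round(posterior, 3))  # 0.632
```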

 

It can also be scary, especially when it comes to challenging your "personal" beliefs. It's well worth the fear, however, as a life built on truth won't crumble like one built on fiction.

 

Truth doesn't crumble.

 

Stay true.



Further reading:


Stepvhen from Burning true on truth vs. fantasy.


Kevin from Truth Strike on why this skill is important to develop.

 

Harry Potter and the Methods of Rationality discussion thread, part 15, chapter 84

3 FAWS 11 April 2012 03:39AM

The next discussion thread is here.

 

This is a new thread to discuss Eliezer Yudkowsky’s Harry Potter and the Methods of Rationality and anything related to it. This thread is intended for discussing chapter 84. The previous thread has passed 500 comments. Comment in the 14th thread until you read chapter 84.

There is now a site dedicated to the story at hpmor.com, which is now the place to go to find the authors notes and all sorts of other goodies. AdeleneDawner has kept an archive of Author’s Notes. (This goes up to the notes for chapter 76, and is now not updating. The authors notes from chapter 77 onwards are on hpmor.com.) 

The first 5 discussion threads are on the main page under the harry_potter tag.  Threads 6 and on (including this one) are in the discussion section using its separate tag system.  Also: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14.

As a reminder, it’s often useful to start your comment by indicating which chapter you are commenting on.

Spoiler Warning: this thread is full of spoilers. With few exceptions, spoilers for MOR and canon are fair game to post, without warning or rot13. More specifically:

You do not need to rot13 anything about HP:MoR or the original Harry Potter series unless you are posting insider information from Eliezer Yudkowsky which is not supposed to be publicly available (which includes public statements by Eliezer that have been retracted).

If there is evidence for X in MOR and/or canon then it’s fine to post about X without rot13, even if you also have heard privately from Eliezer that X is true. But you should not post that “Eliezer said X is true” unless you use rot13.

Harry Potter and the Methods of Rationality predictions

6 gwern 09 April 2012 09:49PM

The recent spate of updates has reminded me that while each chapter is enjoyable, the approaching end of MoR, as awesome as it no doubt will be, also means the end of our ability to learn from predicting the truth of the MoR-verse and its future.

With that in mind, I have compiled a page of predictions on sundry topics, much like my other page on predictions for Neon Genesis Evangelion; I encourage people to suggest plausible predictions that I've omitted, register their probabilities on PredictionBook.com, and come up with their own predictions. Then we can all look back when MoR finishes and reflect on what we (or Eliezer) did poorly or well.  

The page is currently up to >182 predictions.
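One simple way to do the looking-back described above, once the predictions resolve, is a calibration measure such as the Brier score; the entries below are invented placeholders, not actual PredictionBook predictions:

```python
def brier_score(predictions):
    """Mean squared error between stated probabilities and outcomes.
    0.0 is perfect; 0.25 is what always answering 50% would earn."""
    return sum((p - (1.0 if outcome else 0.0)) ** 2
               for p, outcome in predictions) / len(predictions)

# Placeholder resolutions (probability given, whether it came true),
# not real MoR predictions:
resolved = [(0.9, True), (0.7, False), (0.2, False), (0.6, True)]
print(round(brier_score(resolved), 3))  # 0.175
```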

Harry Potter and the Methods of Rationality discussion thread, part 14, chapter 82

7 FAWS 04 April 2012 02:53AM

The new discussion thread (part 15) is here


This is a new thread to discuss Eliezer Yudkowsky’s Harry Potter and the Methods of Rationality and anything related to it. This thread is intended for discussing chapter 82. The previous thread passed 1000 comments as of the time of this writing, and so has long passed 500. Comment in the 13th thread until you read chapter 82.

There is now a site dedicated to the story at hpmor.com, which is now the place to go to find the authors notes and all sorts of other goodies. AdeleneDawner has kept an archive of Author’s Notes. (This goes up to the notes for chapter 76, and is now not updating. The authors notes from chapter 77 onwards are on hpmor.com.) 

The first 5 discussion threads are on the main page under the harry_potter tag.  Threads 6 and on (including this one) are in the discussion section using its separate tag system.  Also: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13.

As a reminder, it’s often useful to start your comment by indicating which chapter you are commenting on.

Spoiler Warning: this thread is full of spoilers. With few exceptions, spoilers for MOR and canon are fair game to post, without warning or rot13. More specifically:

You do not need to rot13 anything about HP:MoR or the original Harry Potter series unless you are posting insider information from Eliezer Yudkowsky which is not supposed to be publicly available (which includes public statements by Eliezer that have been retracted).

If there is evidence for X in MOR and/or canon then it’s fine to post about X without rot13, even if you also have heard privately from Eliezer that X is true. But you should not post that “Eliezer said X is true” unless you use rot13.

Harry Potter and the Methods of Rationality discussion thread, part 12

5 Xachariah 25 March 2012 11:01AM

The new thread, discussion 13, is here.

 

This is a new thread to discuss Eliezer Yudkowsky's Harry Potter and the Methods of Rationality and anything related to it. With three chapters recently the previous thread has very quickly reached 1000 comments. The latest chapter as of 25th March 2012 is Ch 80.

There is now a site dedicated to the story at hpmor.com, which is now the place to go to find the authors notes and all sorts of other goodies. AdeleneDawner has kept an archive of Author's Notes. (This goes up to the notes for chapter 76, and is now not updating. The authors notes from chapter 77 onwards are on hpmor.com.)


The first 5 discussion threads are on the main page under the harry_potter tag.  Threads 6 and on (including this one) are in the discussion section using its separate tag system.  Also: one, two, three, four, five, six, seven, eight, nine, ten, eleven.

As a reminder, it's often useful to start your comment by indicating which chapter you are commenting on.

Spoiler Warning:  this thread is full of spoilers.  With few exceptions, spoilers for MOR and canon are fair game to post, without warning or rot13.  More specifically:

You do not need to rot13 anything about HP:MoR or the original Harry Potter series unless you are posting insider information from Eliezer Yudkowsky which is not supposed to be publicly available (which includes public statements by Eliezer that have been retracted).

If there is evidence for X in MOR and/or canon then it's fine to post about X without rot13, even if you also have heard privately from Eliezer that X is true. But you should not post that "Eliezer said X is true" unless you use rot13.

 

Harry Potter and the Methods of Rationality discussion thread, part 11

6 Oscar_Cunningham 17 March 2012 09:41AM

EDIT: New discussion thread here.

 

This is a new thread to discuss Eliezer Yudkowsky's Harry Potter and the Methods of Rationality and anything related to it. With two chapters recently the previous thread has very quickly reached 500 comments. The latest chapter as of 17th March 2012 is Ch. 79.

There is now a site dedicated to the story at hpmor.com, which is now the place to go to find the authors notes and all sorts of other goodies. AdeleneDawner has kept an archive of Author's Notes. (This goes up to the notes for chapter 76, and is now not updating. The authors notes from chapter 77 onwards are on hpmor.com.)


The first 5 discussion threads are on the main page under the harry_potter tag.  Threads 6 and on (including this one) are in the discussion section using its separate tag system.  Also: one, two, three, four, five, six, seven, eight, nine, ten.

As a reminder, it's often useful to start your comment by indicating which chapter you are commenting on.

Spoiler Warning:  this thread is full of spoilers.  With few exceptions, spoilers for MOR and canon are fair game to post, without warning or rot13.  More specifically:

You do not need to rot13 anything about HP:MoR or the original Harry Potter series unless you are posting insider information from Eliezer Yudkowsky which is not supposed to be publicly available (which includes public statements by Eliezer that have been retracted).

If there is evidence for X in MOR and/or canon then it's fine to post about X without rot13, even if you also have heard privately from Eliezer that X is true. But you should not post that "Eliezer said X is true" unless you use rot13.

Harry Potter and the Methods of Rationality discussion thread, part 10

11 Oscar_Cunningham 07 March 2012 04:46PM

(The HPMOR discussion thread after this one is here.)

This is a new thread to discuss Eliezer Yudkowsky's Harry Potter and the Methods of Rationality and anything related to it. There haven't been any chapters recently, but it looks like there are a bunch in the pipeline and the old thread is nearing 700 comments. The latest chapter as of 7th March 2012 is Ch. 77.

There is now a site dedicated to the story at hpmor.com, which is now the place to go to find the authors notes and all sorts of other goodies. AdeleneDawner has kept an archive of Author's Notes.


The first 5 discussion threads are on the main page under the harry_potter tag.  Threads 6 and on (including this one) are in the discussion section using its separate tag system.  Also: one, two, three, four, five, six, seven, eight, nine.

As a reminder, it's often useful to start your comment by indicating which chapter you are commenting on.

Spoiler Warning:  this thread is full of spoilers.  With few exceptions, spoilers for MOR and canon are fair game to post, without warning or rot13.  More specifically:

You do not need to rot13 anything about HP:MoR or the original Harry Potter series unless you are posting insider information from Eliezer Yudkowsky which is not supposed to be publicly available (which includes public statements by Eliezer that have been retracted).

If there is evidence for X in MOR and/or canon then it's fine to post about X without rot13, even if you also have heard privately from Eliezer that X is true. But you should not post that "Eliezer said X is true" unless you use rot13.

Writing about Singularity: needing help with references and bibliography

4 [deleted] 05 March 2012 01:27AM

 

It was Yudkowsky's Fun Theory sequence that inspired me to undertake the work of writing a novel on a singularitarian society... however, there are gaps I need to fill, and I need all the help I can get. It's mostly book recommendations that I'm asking for.

 

One of the things I'd like to tackle in it would be the interactions between the modern, geeky Singularitarianisms and Marxism, which I hold to be somewhat prototypical in that sense, as well as other utopianisms. And contrasting them with more down-to-earth ideologies and attitudes, by examining the seriously dangerous bumps of the technological point of transition between "baseline" and "singularity". But I need to do a lot of research before I'm able to write anything good: if I'm not going to have any original ideas, at least I'd like to serve my readers with a collection of well-researched, solid ones.

 

So I'd like to have everything that is worth reading about the Singularity, specifically the Revolution it entails (in one way or another) and the social aftermath. I'm particularly interested in the consequences of the lag in the spread of the technology from the wealthy to the baselines, and the potential for baseline oppression and other forms of continuation of current social imbalances, as well as suboptimal distribution of wealth. After all, according to many authors, we've had the means to end war, poverty, famine, and most infectious diseases since the sixties, and it's just our irrational methods of wealth distribution that have kept us from doing so. That is, supposing the commonly alleged ideal of total lifespan and material welfare maximization for all humanity is what actually drives the way things are done. But even with other, different premises and axioms, there's much that can be improved and isn't, thanks to basic human irrationality, which is what we combat here.

 

Also, yes, this post makes my political leanings fairly clear, but I'm open to alternative viewpoints and actively seek them. I also don't intend to write any propaganda, as such. Just to examine ideas, and scenarios, for the sake of writing a compelling story, with wide audience appeal. The idea is to raise awareness of the Singularity as something rather imminent ("Summer's Coming"), and cause (or at least help prepare) normal people to question the wonders and dangers thereof, rationally.

 

It's a frighteningly ambitious, long-term challenge, and I am terribly aware of that. The first thing I'll need to read is a style book, to correct my horrendous grasp of standard acceptable writing (and not seem arrogant by doing anything else), so please feel free to recommend as many books, blog articles, and other materials as you like. I'll take my time going through it all.

 
