RichardKennaway comments on Fiction: Written on the Body as love versus reason - Less Wrong Discussion

-11 Post author: PhilGoetz 08 September 2013 06:13AM

Comment author: RichardKennaway 09 September 2013 08:28:30AM 4 points

If there's a point to that plot-thread-summary, I guess it's in your final sentence:

The lesson is then that reason is best, and instinct will do, but something in-between, instinct plus a crippled reason that takes itself as seriously as if it were the real thing, leads to madness.

If this is just a proposed way of reading Winterson's text, it has no relevance to LessWrong and would be better directed to a literary magazine. If you think that it is a true statement about the proper relation between reason and instinct, then you need to actually say so, and say why, because at present you are neither asserting it, nor providing any reason to believe it.

If you were to do so, that would be relevant, although it amounts to Spock-style rationality, which I doubt will play any better.

Comment author: PhilGoetz 21 September 2013 08:47:26PM -1 points

LessWrong needs to deal with emotions as part of rationality. Strangely, people are eager to upvote Julia Galef's post on the importance of emotions in rationality, yet eager to downvote my attempt to deal with emotion and rationality on LessWrong.

Comment author: Wes_W 21 September 2013 10:10:44PM *  7 points

LessWrong needs to deal with emotions as part of rationality.

Certainly.

Strangely, people are eager to upvote Julia Galef's post on the importance of emotions in rationality, yet eager to downvote my attempt to deal with emotion and rationality on LessWrong.

Don't spend >90% of your word count summarizing a novel next time.

The last paragraph was interesting, and at least some of the setup was required for this specific point, but it felt like a very low signal-to-"why am I reading all these excerpts from a seemingly-arbitrarily-selected 1992 novel" ratio.

Basically, I finished the article feeling like I had a pretty good idea what happened in the novel, but very little new insight into the combination of love and reason, or even what PhilGoetz thinks about it.

Comment author: PhilGoetz 24 September 2013 01:32:43PM *  1 point

I appreciate your explanation, but I don't think you understand how novels work. They are not logical arguments that can be summarized in a one-paragraph conclusion. If you want to take emotions seriously, you need to speak their language. You can't do it all analytically.

Comment author: Desrtopa 24 September 2013 02:07:39PM 2 points

In that case, summarizing a novel may not be among the better ways to discuss emotions as part of rationality.

I think that if you want to raise the message of a book as a point in a discussion, it's better to determine whether you have reasons for taking its contentions seriously beyond their use in an engaging story, and then explain those.

Comment author: PhilGoetz 25 September 2013 03:02:24AM 0 points

Eliezer's article is about people taking scenarios from science fiction about artificial intelligence as evidence of what artificial intelligence is like. In a story like this one, the summary itself is the evidence, and I can't analyze it and explain it to you in anything shorter than a plot summary. If I could, it would be a bad novel. The purpose of this type of novel, as opposed to a Terminator action-adventure flick, is to explore things that are too complex for us to analyze. Any novel that could be analyzed in the way you're suggesting would be a bad novel.

Comment author: Desrtopa 25 September 2013 04:04:28AM 0 points

Just because that's the specific focus in the article doesn't mean that the point is so narrow. Just as it's incorrect to suppose that a sci-fi story gives us a useful picture of how society would be transformed by certain technologies, it's also a mistake to conclude, for instance, that a story about a bunch of young boys stranded on an island who devolve into barbarism is a useful case study in human nature. The contents of the book never happened; it's just something someone imagined, and to the extent that the author's belief that such a thing might happen constitutes evidence, we can do better by looking at what reasons a person would have to believe it in the first place.

Any novel whose experience could be replicated via the process I described would be a bad novel, but what you'd be leaving out would not actually be evidence for the truth of the points the novel is contending.

Comment author: Wes_W 24 September 2013 03:45:48PM *  0 points

Apologies, I should clarify.

I don't think a longish summary was inappropriate. I'm not even sure the specific amount of summary you used was inappropriate; if I were an editor, I'd have an eye out for parts you could get away with trimming, but that's just editing in general.

I DO think there was too little unpacking and exploration of your thesis. The 1800 words of summary aren't the problem; it's that 200 words of analysis is pretty sparse.

Comment author: RichardKennaway 21 September 2013 10:47:16PM 5 points

Strangely, people are eager to upvote Julia Galef's post on the importance of emotions in rationality, yet eager to downvote my attempt to deal with emotion and rationality on LessWrong.

Not strange at all. People presumably think her attempt is better than yours.