ChrisHallquist comments on According to Dale Carnegie, You Can't Win an Argument—and He Has a Point - LessWrong

61 Post author: ChrisHallquist 30 November 2013 06:23AM




Comment author: ChrisHallquist 30 November 2013 07:00:34AM 6 points [-]

Thanks. You may be interested to know that I originally considered titling this post "Being Told You're Wrong Is the Mind-Killer."

Comment author: somervta 30 November 2013 12:11:57PM 11 points [-]

Personally, I'm glad you decided not to.

Comment author: hyporational 30 November 2013 12:31:04PM *  10 points [-]

I agree, mind-killer is too much of an applause light is an applause light these days.

Comment author: ChrisHallquist 30 November 2013 09:51:54PM 7 points [-]

Actually, the reason for that title was because of a point I decided to leave out, but may as well spell out here: "Deciding to talk about politics, even though this may cause you to lose some of your audience" and "Deciding to tell people they're wrong, even though this may cause you to lose some of your audience" are both tradeoffs, and it's odd that LessWrong community norms go so far in one direction on one tradeoff and so far in the other direction on the other tradeoff (at least with regards to certain subjects).

I suspect the reason for this mostly has to do with Eliezer thinking politics are not very important, but also thinking that, say, telling certain people their AI projects are dangerously stupid is very important. But not everyone agrees, and the anti-politics norm is itself a barrier to talking about how important politics are. (Personally, I suspect government action will be important for the future of AI in large part because I expect large organizations in general to be important for the future of AI.)

Comment author: katydee 01 December 2013 04:33:56AM *  3 points [-]

"Deciding to talk about politics, even though this may cause you to lose some of your audience" and "Deciding to tell people they're wrong, even though this may cause you to lose some of your audience" are both tradeoffs, and it's odd that LessWrong community norms go so far in one direction on one tradeoff and so far in the other direction on the other tradeoff (at least with regards to certain subjects).

Yeah, I saw the parallel there. I more or less think that both talking about politics and explicitly telling people that they're wrong are usually undesirable and that LessWrong should do neither.

I also agree with you that government action could be important for the future of AI.

Comment author: hyporational 01 December 2013 05:08:08AM *  2 points [-]

it's odd that LessWrong community norms go so far in one direction on one tradeoff and so far in the other direction on the other tradeoff (at least with regards to certain subjects).

Telling people they're wrong is almost explicitly about rationality, though we should definitely think about how best to do it. If I'm wrong, I want to know, and there's a clear benefit in people telling me so.

I don't see any clear benefit in discussing politics here, so I'm not even sure what the tradeoff is. It's not that politics are not important, but that there's not much we can do about them.

I'd be very interested in a post explaining why discussing politics is more important than discussing other things for this rather small rationalist community, not just a post explaining why politics is important.

but also thinking that, say, telling certain people their AI projects are dangerously stupid is very important.

I'm not sure he has bluntly told that to anyone's face. I think he's saying these things to educate his audience, not to change his opponents' minds.

Personally, I suspect government action will be important for the future of AI

This I might agree with, but it doesn't justify talking about other political topics. This particular topic also wouldn't be a mind-killer, because it's not controversial here and any policies regarding it are still distant hypotheticals.

Comment author: Vaniver 01 December 2013 05:42:15AM *  2 points [-]

I'm not sure he has bluntly told that to anyone's face.

Well...

Comment author: hyporational 01 December 2013 05:48:48AM *  0 points [-]

I see. I'd rather suspect that person wasn't all that important, nor was the audience at that dinner party, but maybe that's just wishful thinking. I also suspect he's learned some social skills over the years.

Comment author: Vaniver 01 December 2013 06:18:56AM 0 points [-]

I'd rather suspect that person wasn't all that important, nor was the audience at that dinner party, but maybe that's just wishful thinking.

In the comments, he makes clear he held the "losing an argument is a good thing, it's on you if you fail to take advantage of it" position. He may no longer feel that way.

Comment author: christopherj 02 December 2013 01:52:26AM 1 point [-]

Actually, the reason for that title was because of a point I decided to leave out, but may as well spell out here: "Deciding to talk about politics, even though this may cause you to lose some of your audience" and "Deciding to tell people they're wrong, even though this may cause you to lose some of your audience" are both tradeoffs, and it's odd that LessWrong community norms go so far in one direction on one tradeoff and so far in the other direction on the other tradeoff (at least with regards to certain subjects).

I have had experience as a moderator at a science forum, and I can tell you that almost all of our moderating involved either A) the politics subforum, or B) indirect religious arguments, especially concerning evolution (the religion subforum was banned before my time due to impossibly high need for moderation). The rest was mostly the better trolls and people getting frustrated when someone wouldn't change their mind on an obvious thing.

However, I must say I don't see how people can discuss rationality, and how people fail at it, without someone telling someone else that they're wrong. After all, a major aspect of rationality is distinguishing correct from incorrect.

Incidentally, I've been really impressed at the quality of comments and users on this site. Consider what this user has observed about LW before you complain about how politics is not allowed.

Comment author: AlexSchell 01 December 2013 12:09:19AM 1 point [-]

I think you accidentally went up one meta level.

Comment author: Vaniver 01 December 2013 12:17:55AM 3 points [-]

If you read the "I agree" as sarcastic, then it looks like the right meta level. (I'm not sure it's a good thing I thought that was more plausible than the accident hypothesis when I first parsed that sentence.)

Comment author: hyporational 01 December 2013 04:40:54AM *  0 points [-]

Not sarcasm, although now that you've mentioned it I can definitely see it; just well-intentioned humour. See the other comment :)

Comment author: hyporational 01 December 2013 04:39:34AM 1 point [-]

As I was writing the comment, I realized applause light is an applause light too, so I decided to make fun of that.