MichaelVassar comments on How to Not Lose an Argument - Less Wrong

Post author: Yvain 19 March 2009 01:07AM (109 points)

Comment author: MichaelVassar 19 March 2009 02:45:08AM 1 point [-]

The inventors of the original form of rationalist virtue AND rhetoric sure didn't think that the latter was a dark art. Rationalists should WIN!

Comment author: AndySimpson 19 March 2009 06:23:05AM 12 points [-]

Rationalists shouldn't deny themselves the utility of rhetoric. Any rationalist can see that rhetoric is a path to winning, a kind of social theatre that lubricates decision-making within irrational or intermittently rational groups. If a group needs to be convinced of a position within a finite amount of time, bare reasoning isn't always the best option.

Maybe that is too Machiavellian to be "really" rational, but it is the winning path.

Comment author: Yvain 19 March 2009 03:42:44PM *  10 points [-]

I think I am using "rhetoric" in a different way than Aristotle did. For Aristotle, it was the art of speaking clearly and eloquently to communicate a position. I am using it more in the way people use it when they say "empty rhetoric" or "political rhetoric". "Unless you give up your rights, the terrorists have already won" is my idea of an archetypal rhetorical technique. That may not be fair to the field of rhetoric, but I need some word to describe it and I can't think of a better one, so "rhetoric" it is.

Rhetoric is a technique that may be useful to rationalists, but it's not a rationalist technique. Compare the use of force. I may, as a rationalist, decide the best way towards my goal is murdering all who oppose me, in which case I'll want to know techniques like how to use an assault weapon. But there's still something fundamentally shady about the technique of killing people; it may just barely be justified on utilitarian grounds for a sufficiently important goal, but it's one of those things that you use only as a last resort and even then only after agonizing soul-searching. I feel confident classifying the technique of murdering people effectively as a Dark Art.

I feel the same way about rhetoric (by my pessimistic definition). Tricking people into believing things they have no legitimate evidence for can certainly be helpful, but the more people do it the worse the world gets. Not only do people end up with less than maximally accurate beliefs, but every rhetorician needs to promote Dark Side Epistemology in order to keep zir job. And if I use rhetoric, you need to start using rhetoric just to keep up, and sooner or later everyone's beliefs are completely skewed and inaccurate. It's not quite as Dark an Art as force is, and it's much easier to justify, but it's in the same category.

Be careful about using the "rationalists should win" slogan too literally. Martial artists should win too, but that doesn't mean they should take an AK-47 to their next sparring match and blow their opponent's face off. Martial artists place high value on winning honorably. I see no reason why we shouldn't emulate them.

Comment author: Nebu 19 March 2009 07:55:43PM 4 points [-]

Martial artists place high value on winning honorably. I see no reason why we shouldn't emulate them.

Except, of course, for all those aspects of martial arts which we shouldn't emulate.

Comment author: pjeby 19 March 2009 10:39:56PM 3 points [-]

Um, isn't it kind of rhetorical to compare rhetoric to force and murder?

Also, all your articles here that I recall -- likewise those of Eliezer on Overcoming Bias -- are masterful applications of rhetoric. So I'm kind of confused here. Is this one of those "do as I say, not as I do" things?

Comment author: Yvain 20 March 2009 06:59:20PM 5 points [-]

If you mean the articles here are clear or well argued, thank you. I have no objection to clarity or good argument; see the first paragraph of the comment above. If you mean that I'm using dirty tricks like the "terrorists win" example, then I'd like to know exactly what you mean so I can avoid doing it in the future.

When I compare rhetoric (meaning "empty rhetoric", as mentioned) to force and murder, I'm not saying they're equally bad, or doing one leads to the other or anything like that. Just that they're bad for the same reason. Both are potentially "useful" techniques. But both prevent rational argument and if used too frequently lead to a world in which rational argument is impossible.

Comment author: SoullessAutomaton 21 March 2009 03:54:09PM 4 points [-]

If you mean the articles here are clear or well argued, thank you. I have no objection to clarity or good argument; see the first paragraph of the comment above. If you mean that I'm using dirty tricks like the "terrorists win" example, then I'd like to know exactly what you mean so I can avoid doing it in the future.

I think the point is that you do a little of both; loosely speaking, you are "guilty" of being fairly eloquent: presenting your ideas persuasively and engagingly, in a style that is inherently likely to increase acceptance.

It is an unavoidable facet of human communication that the same idea can be more or less persuasive depending on how it is presented. Over on OB, Robin uses a far more neutral (or at times even anti-persuasive) style, and if memory serves me he and Eliezer have argued a bit about such use of style.

Comment author: pjeby 21 March 2009 04:46:03PM 5 points [-]

But that is precisely the sort of "dirty trick" you claim to be against. By using murder as an example, you're setting off a "boo light" (opposite of applause light) and linking it to the thing you want people to dislike. That's rhetoric, and emotional manipulation.

And it's neither a good thing nor a bad thing, in itself. Used to strengthen a valid argument, it's fine. Arguing that it's bad in and of itself is a misunderstanding... and another "boo light" (e.g. "empty rhetoric", "dirty tricks").

Emotional manipulation is unavoidable, by the way. Boring presenters and neutral presentations are just manipulating people's emotions either towards boredom and not caring, or to "respect", "status", and "seriousness", depending on the audience. It's best to deliberately choose what emotions you want to create, in whom, rather than leaving the matter to chance.

Comment author: Emile 20 March 2009 04:44:55PM 9 points [-]

Be careful about using the "rationalists should win" slogan too literally. Martial artists should win too, but that doesn't mean they should take an AK-47 to their next sparring match and blowing their opponent's face off. Martial artists place high value on winning honorably. I see no reason why we shouldn't emulate them.

I disagree. The problem with using dishonest rhetoric to win a debate isn't that it's winning dishonorably; it's that it's winning at the wrong game, a game you wouldn't consider the most important if you looked at it closely.

To continue the martial arts analogy, imagine a Chinese kung fu master in World War 2 Nanjing who knows that Japanese soldiers are coming over to kill off all of his family. Should he try to win the fight honorably? Or just try to win, using every dirty trick in the book (including running away)? If he focuses on winning honorably, he's lost sight of his main goal (saving his family) in favor of a secondary one (winning honorably).

Similarly, if you focus on "winning the debate", and as a result push people into a corner that makes them dislike you and become more attached to their identity as a believer in whatever, you've focused on the wrong subgoal and lost at the one that was important to you.

Comment author: Yvain 20 March 2009 07:08:23PM 6 points [-]

I'm a precedent utilitarian. I try to maximize utility, except when doing so would set a bad precedent that would lower utility later.

Precedent utilitarians are usually good about refraining from force. Yes, killing a rich miser and distributing her money to the poor might increase utility. But it sets the precedent that anyone can kill someone if they think of a good enough reason, and most people won't be smart enough to limit themselves to genuinely good reasons. Therefore, precedent utilitarians generally respect the rule of not killing others. But in certain cases this rule breaks down. In the WWII example you mention, it doesn't seem particularly dangerous to set the precedent that you can use force against invaders coming to kill your family.

I try to use the same thought process when evaluating when to use rhetoric. If anyone can use rhetoric any time it furthers a goal that they consider genuinely good, then there's little incentive to use rational argument except on the rare hard-core rationalists who are mostly resistant to rhetorical tricks. I want to be able to condemn a demagogue who uses rhetoric without being a hypocrite. If I needed to use rhetoric in a situation where I couldn't blame anyone else for using rhetoric, like trying to save my family, I'd do it.

(The problem with precedent utilitarianism is that the calculations are impossible to do with real math and mostly involve handwaving. But I hope it at least gives a sketch of my thought processes.)

Comment author: AllanCrossman 20 March 2009 07:17:34PM *  7 points [-]

Yvain: "I'm a precedent utilitarian. I try to maximize utility, except when doing so would set a bad precedent that would lower utility later."

I think this is an odd thing to say. Any utilitarian ought to be declining short-term gains that result in long-term losses. So why the need for this specific disclaimer?

Comment author: topynate 21 March 2009 05:36:37PM 5 points [-]

Yvain seems to be using the term to mean a utilitarian (in the pure sense) who scrupulously considers the force of his example. The implication is that many don't - we're not talking about perfectly rational beings here, just people who agree with the principle of utility maximization.

Comment author: Nominull 19 March 2009 03:28:30PM 9 points [-]

We're running up against the equivocation at the core of this community, between rationalists as people who make optimal plays versus rationalists as people who love truth and hate lies.

Comment author: Annoyance 21 March 2009 03:29:39PM 2 points [-]

rationalists as people who make optimal plays versus rationalists as people who love truth and hate lies

It's only possible for us to systematically make optimal plays IF we have a sufficient grasp of truth. There's only an equivocation in the minds of people who don't understand that one goal is a necessary precursor for the other.

Comment author: Nebu 13 December 2015 07:20:47AM 1 point [-]

rationalists as people who make optimal plays versus rationalists as people who love truth and hate lies

It's only possible for us to systematically make optimal plays IF we have a sufficient grasp of truth. There's only an equivocation in the minds of people who don't understand that one goal is a necessary precursor for the other.

No, I think there is an equivocation here, though that's probably because of the term "people who love truth and hate lies" instead of "epistemic rationalist".

An epistemic rationalist wants to know truth and to eliminate lies from their mind. An instrumental rationalist wants to win, and one precursor to winning is to know truth and to eliminate lies from one's own mind.

However, someone who "loves truth and hates lies" doesn't merely want their own mind to be filled with truth. They want all minds in the universe to be filled with truth and lies to be eliminated from all minds. This can be an impediment to "winning" if there are competing minds.

Comment author: Annoyance 21 March 2009 03:28:36PM 1 point [-]

Rationalists should WIN!

Rationalists have better definitions of "winning". They don't necessarily include triumphing in social wrestling matches.

Comment author: Nebu 13 December 2015 07:24:00AM 1 point [-]

Actually, I think "Rationalists should WIN" regardless of what their goals are, even if that includes social wrestling matches.

The "should" here is not intended as moral prescriptivism. I'm not saying that in a morally/ethically ideal world, rationalists would win. Instead, I'm using "should" to help define what the word "rationalist" means. If some person is a rationalist, then given equal opportunity, resources, difficulty of goal, etc., they will, on average, win more often than someone who is not a rationalist. And if they happen to be an evil rationalist, well, that sucks for the rest of the universe, but that's still what "rationalist" means.

I believe this definitional-sense of "should" is also what the originator of the "Rationalists should WIN" quote intended.

Comment author: Lumifer 13 December 2015 11:18:01PM 1 point [-]

I'm using "should" to help define what the word "Rationalist" means.

There is a bit of a problem here in that the list of the greatest rationalists ever will be headed by people like Genghis Khan and Prophet Muhammad.

Comment author: Nebu 14 December 2015 05:41:16AM *  0 points [-]

People who win are not necessarily rationalists. A person who is a rationalist is more likely to win than a person who is not.

Consider someone who just happens to win the lottery vs someone who figures out what actions have the highest expected net profit.
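To make the comparison concrete, expected net profit is just the probability-weighted sum of payoffs. Here is a minimal sketch; the ticket price, jackpot, and odds below are made-up illustration numbers, not real lottery figures:

```python
def expected_value(outcomes):
    """Expected net profit of an action, given (probability, net payoff) pairs.

    The probabilities are assumed to cover all outcomes and sum to 1.
    """
    return sum(p * payoff for p, payoff in outcomes)

# Hypothetical lottery: a $2 ticket with a 1-in-10-million chance at $5M.
# Winning nets the jackpot minus the ticket price; losing nets -$2.
lottery = [(1e-7, 5_000_000 - 2), (1 - 1e-7, -2)]

print(expected_value(lottery))  # negative: buying the ticket loses money on average
```

The occasional lottery winner comes out ahead, but the expected value is about -$1.50 per ticket, which is why "she won the lottery" is evidence about luck, not about rationality.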

Edit: That said, be careful not to succumb to the argument from consequences (http://rationalwiki.org/wiki/Argument_from_consequences): maybe Genghis Khan really was one of the greatest rationalists ever. I've never met the guy nor read any of his writings, so I wouldn't know.

Comment author: Lumifer 14 December 2015 03:42:38PM 2 points [-]

Even ignoring the issue that "rationalist" is not a binary variable, I don't know how, in practice, you will be able to tell whether someone is a rationalist or not. Your definition depends on counterfactuals, and without them you can't disentangle rationality from luck.

Comment author: Nebu 16 December 2015 08:19:37AM 0 points [-]

I assume that you accept the claim that it is possible to define what a fair coin is, and thus what an unfair coin is.

If we observe some coin, at first, it may be difficult to tell if it's a fair coin or not. Perhaps the coin comes from a very trustworthy friend who assures you that it's fair. Maybe it's specifically being sold in a novelty store and labelled as an "unfair coin" and you've made many purchases from this store in the past and have never been disappointed. In other words, you have some "prior" probability belief that the coin is fair (or not fair).

As you watch the coin being flipped, you can keep track of the outcomes and adjust your belief. You can ask yourself, "Given the outcomes I've seen, is it more likely that the coin is fair or unfair?" and update accordingly.
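The updating described above is just Bayes' rule applied flip by flip. A minimal sketch, assuming (purely for illustration) that the hypothetical unfair coin lands heads 3/4 of the time:

```python
from fractions import Fraction

def update(prior_fair, flips, p_heads_unfair=Fraction(3, 4)):
    """Posterior probability the coin is fair, given a sequence of flips.

    prior_fair: prior probability that the coin is fair.
    flips: a string of 'H'/'T' outcomes.
    p_heads_unfair: assumed heads probability for the unfair hypothesis.
    """
    posterior = prior_fair
    for flip in flips:
        # Likelihood of this flip under each hypothesis.
        p_fair = Fraction(1, 2)
        p_unfair = p_heads_unfair if flip == 'H' else 1 - p_heads_unfair
        # Bayes' rule: P(fair | flip) = P(flip | fair) P(fair) / P(flip).
        numer = p_fair * posterior
        posterior = numer / (numer + p_unfair * (1 - posterior))
    return posterior

print(update(Fraction(1, 2), "HHHH"))  # heads-heavy evidence favors "unfair"
```

Each heads shifts belief toward the unfair hypothesis and each tails shifts it back; with no flips, you just keep your prior.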

I think the same applies for rationalist here. I meet someone new. Eliezer vouches for her as being very rational. I observe her sometimes winning, sometimes not winning. I expend mental effort and try to judge how easy/difficult her situation was and how much effort/skill/rationality/luck/whatever it would have taken her to win in that situation. I try to analyze how it came about that she won when she won, or lost when she lost. I try to dismiss evidence where luck was a big factor. She bought a lottery ticket, and she won. Should I update towards her being a rationalist or not? She switched doors in Monty Hall, but she ended up with a goat. Should I update towards her being a rationalist or not? Etc.

Comment author: Lumifer 16 December 2015 03:55:35PM 3 points [-]

Hm, OK. So you are saying that the degree of rationalism is an unobservable (hidden) variable and what we can observe (winning or losing) is contaminated by noise (luck). That's a fair way of framing it.

The interesting question then becomes what kind of accuracy you can achieve in the real world, given that noise levels are high, the information available to you is limited, and your perception is imperfect (e.g. it's not uncommon to interpret non-obvious high skill as luck).

Comment author: Nebu 18 December 2015 06:10:51AM 1 point [-]

Right, I suspect just having heard about someone's accomplishments would be an extremely noisy indicator. You'd want to know what they were thinking, for example by reading their blog posts.

Eliezer seems pretty rational, given his writings. But if he repeatedly lost in situations where other people tend to win, I'd update accordingly.

Comment author: ChristianKl 18 December 2015 11:27:14AM 1 point [-]

But what about the other case? People who don't seem rational given their writings but who repeatedly win?

Comment author: Lumifer 18 December 2015 04:04:07PM 0 points [-]

seems pretty rational, given his writings

If you define rationality as winning, why does it matter what his writings seem like?

Comment author: VoiceOfRa 16 December 2015 03:20:28AM 0 points [-]

Well, if what you want to accomplish is motivating large groups of people into supporting you and using them to conquer a large empire, you should study what they did and how they did it.

Comment author: Lumifer 16 December 2015 05:33:44AM 4 points [-]

Now that you mention it, I actually don't.