The product of Less Wrong is truth. However, the personality types drawn here - myself included - seem reluctant to sell that product. Here's my evidence:

Yvain said: But the most important reason to argue with someone is to change his mind. ... I make the anecdotal observation that a lot of smart people are very good at winning arguments in the first sense [(logic)], and very bad at winning arguments in the second sense [(persuasion)]. Does that correspond to your experience?

Eliezer said: I finally note, with regret, that in a world containing Persuaders, it may make sense for a second-order Informer to be deliberately eloquent if the issue has already been obscured by an eloquent Persuader - just exactly as elegant as the previous Persuader, no more, no less.  It's a pity that this wonderful excuse exists, but in the real world, well...

Robin Hanson said: So to promote rationality on interesting important topics, your overwhelming consideration simply must be: on what topics will the world’s systems for deciding who to hear on what listen substantially to you? Your efforts to ponder and make progress will be largely wasted if you focus on topics where none of the world’s “who to hear on what” systems rate you as someone worth hearing. You must not only find something worth saying, but also something that will be heard.

We actually label as "dark arts" many highly effective persuasive strategies that could be used to market our true ideas. What's the justification for this negative branding? A necessary evil is not evil. Even if - and this is a huge if - our future utopia is free of dark arts, that's not the world we live in today. Choosing not to use them is analogous to a peacenik wanting to rid the world of violence by suggesting that police not use weapons.

We treat our dislike of dark arts as if it were a simple corollary of the axiom of the virtue of truth. Does this mean we assume the ends (more people believing the truth) don't justify the means (persuasion toward the truth via exploiting cognitive biases)? Or are we just worried about being hypocrites? Whatever the reason, such an impactful assumption deserves an explanation. Speaking practically, the successful practice of dark arts requires the psychological skill of switching hats, to use Edward de Bono's terminology. While posting on Less Wrong, we can avoid dark arts - and are in fact praised for avoiding them - but we need to switch hats in other environments, and that's difficult. Frankly, we're not great at it, and it's very tempting to externalize the problem and say "the art is bad" rather than "we're bad at the art".

Our aesthetic and moral distaste for rhetorical tactics profoundly affects the way we communicate. That distaste is tightly coupled with the mental habit of always judging the value of what is said purely on its informational content, logical consistency, and insight. I'm basing the following question on my own introspection, but I wonder whether this almost religiously entrenched mental habit could make us blind to the value of the art of persuasion. Let's imagine, for a moment, the most convincing paragraph ever written. It was truly a world-wonder of persuasion - it converted fundamentalist Christians into atheists, suicide bombers into diplomats, and Ann Coulter-4-President supporters into Less Wrong sycophants. What would your reaction to the paragraph be? Would you "up-vote" this work of genius? No way. We'd be competing to tell the fundamentalist Christian that there were at least three logical fallacies in the first sentence, we'd explain to the suicide bomber that the rhetoric could be used equally well to justify blowing us all up right now, and for completeness we'd give the Ann Coulter supporter a brief overview of Bayesianism.


To have this attitude, you need a strong presumption of your own superiority. Instead of engaging them in a conversation where you can both better discover the truth, with you also remaining open to info they may offer, you decide this is war and "all is fair in love and war." You know what is true and what is good for the world and you've decided it is important enough to the world for them to believe your truth that you will force it upon them in any way you can. No doubt this is a possible situation, and there is possible evidence that could reasonably convince you that this is your situation. But do pause and consider whether your confidence might be overconfidence, biased by arrogance, and also consider the consequences of their hearing that this is in fact your attitude toward them.

9BenAlbahari
First of all, let me say that rationalist-to-rationalist conversations don't really have this problem. This is all about what happens when we talk to people who think less in "far-mode", if you will. What I've found is that when talking with non-rationalists, I have to consciously switch into a different mindset to get "flow". Let me give a personal example. I was once at a bar where I met some random people, and one of them told me something that a rationalist would consider "woo". He explained that he'd read that an atomic bomb was a particularly terrible thing, because unlike when you die normally, the radiation destroys the souls. I paused for a moment, swallowed all my rationalist impulses, and thought: "Is there any way what he said could be meaningful?" I responded: "Well, the terrible thing about an atomic explosion is that it kills not just a person in isolation, but whole families, whole communities... if just one person dies, their friends and families can respect that person's death and celebrate their memories, but when that many people die all at once, their entire history of who they are, their souls, are just erased in an instant". He told me that was deep, and bought me a drink. Did I feel dishonest? Not really. I decided what was relevant and what was not. Obviously the bit he said about radiation didn't make scientific sense, but I didn't feel the reason he'd brought up the idea was that he cared for a science lesson. Similarly, I could have asked the question: "Well, exactly what do you mean by a 'soul'?" Instead I chose an interpretation that seemed agreeable to both of us. Now, had he specifically asked me for an analytical opinion, I would have absolutely given him that. But for now, what I'd done had earned myself some credibility, so that later in the conversation, if I wanted to be persuasive about an important "rationalist" opinion, I'd actually be someone worth listening to.
3RobinHanson
Yes of course you should not habitually divert conversations into you lecturing others on how some specific thing they said might be in error. For example, when I listen to an academic talk I do not speak up about most of the questionable claims made - I wait until I can see the main point of the talk and see what questionable claims might actually matter for their main claim. Those are the points I consider raising. Always keep in mind the purpose of your conversation.
1BenAlbahari
Depending on the purpose of the conversation, do you think dark arts are sometimes legitimate? Or, perhaps a more interesting question for an economist: can you speculate as to the utility (let's say some measure of persuasive effectiveness) of dark arts, depending on the conversation type? (E.g. a State of the Union address and a conversation on Less Wrong would presumably be polar opposites.)
3PhilGoetz
I'd rather ask the question without the word "sometimes". Because what people do is use that word "sometimes" as a rationalization. "We'll only use the Dark Arts in the short term, in the run-up to the Singularity." The notion is that once everybody becomes rational, we can stop using them. I'm skeptical that will happen. As we become more complex reasoners, we will develop new bugs and weaknesses in our reasoning for more-sophisticated dark artists to exploit. And we will have more complicated disagreements with each other, with higher stakes; so we will keep justifying the use of the Dark Arts.
0ata
Are we expecting to become more complex reasoners? It seems to be the opposite to me. We are certainly moving in the direction of reasoning about increasingly complex things, but by all indications, the mechanisms of normal human reasoning are much more complex than they should be, which is why it has so many bugs and weaknesses in the first place. Becoming better at reasoning, in the LW tradition, appears to consist entirely of removing components (biases, obsolete heuristics, bad epistemologies and cached thoughts, etc.), not adding them. If the goal is to become perfect Bayesians, then the goal is simplicity itself. I realize that is probably an impossible goal — even if the Singularity happens and we all upload ourselves into supercomputer robot brains, we'd need P=NP in order to compute all of our probabilities to exactly where they should be — but every practical step we take, away from our evolutionary patchwork of belief-acquisition mechanisms and toward this ideal of rationality, is one less opportunity for things to go wrong.
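(A side note on the computational aside above: even the brute-force version of "computing all of our probabilities to exactly where they should be" scales exponentially, which is roughly why only approximations are on the table. Here is a minimal Python sketch, added as an illustration and not part of the original comment; the numbers are just arithmetic, not claims about any particular agent.)

```python
# Illustrative sketch: a full joint probability table over n binary
# propositions has 2**n entries, so exact Bayesian bookkeeping becomes
# infeasible long before an agent tracks a few hundred beliefs.
# (Exact inference in general Bayesian networks is known to be NP-hard.)

def joint_table_size(n: int) -> int:
    """Entries in a full joint distribution over n binary variables."""
    return 2 ** n

for n in (10, 20, 40, 300):
    print(n, "propositions ->", float(joint_table_size(n)), "table entries")

# 300 propositions -> ~2.0e90 entries, more than the ~1e80 atoms
# in the observable universe.
```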
0BenAlbahari
This is exactly the chain of reasoning I had in mind in my original post when I referred to the "big if".
4PhilGoetz
I'd like to hear some justification - some extensive justification, at least a sequence's worth - explaining how building a Friendly AI, with the already-expressed intent of beating all other AIs to the punch and then using your position of power to suppress or destroy construction of any other AIs at any cost, and to make yours a singleton designed in such a way that the values you programmed it with can never be altered -- can amount to anything other than what Robin just described. (Elaborating after a day with no responses.) I realize that the first answer is going to be something along the lines of, "But we don't program in the values. We just design an algorithm that can extrapolate values from everyone else." First, I've spoken with many of the people involved, and haven't heard any of them express views consistent with this - they want their values to be preserved; in fact, two said explicitly that they did not care what happened to the universe if their personal values were not preserved - and yet they also believe that their values are extreme minority views among humanity. What's more, they have views of transhumanism and humanity that make excluding lower animals from this extrapolated volition unjustifiable on any grounds that would not also exclude humans. Second, the problem of trying to program an AI in such a way that your values do not determine the values it acquires is isomorphic to the problem of trying to write a program to do Bayesian analysis in a way that will not be influenced by your priors, or trying to evaluate a new scientific idea in a way that isn't influenced by your current scientific paradigm. It can't be done, except by defining your terms in a way that hides the problem. Third, the greatest concern here is how much respect will be given to our free will when setting up the AI governor over us. Given that the people doing the setting up unanimously don't believe in free will, the only logical answer is: zero.
0David Althaus
Could you add more details? I'm really interested in this issue since I have similar concerns regarding the whole CEV idea: it presumes that the values of almost every human would somehow converge if they were only smarter, had more time, etc. I don't know; it would definitely be nice if this were true, but if I look at most people around me, read a random history book, or just watch 5 minutes of TV, I see values absurdly different from mine. To be frank, I think I would trust a CEV more if the FAI would only extrapolate the volition of highly intelligent people. Damn, thinking about it, I have to say: if I had to choose between an FAI only scanning the brain of Eliezer and an FAI scanning every human on earth, then I would choose Eliezer! Well, you could argue that this only shows that I'm a fanatic lunatic or a cynical misanthrope... Anyway, I would like to hear your current thoughts on this subject!
2ArisKatsaris
By 'extrapolated' we mean that the FAI is calculating what the wishes of those people would be IF they were as intelligent and well-informed as the FAI. Given that, what difference do you think it would make for the FAI to only scan intelligent people? I can imagine only negatives: a potential neglect of physical/non-intellectual pursuits as a potential source of Fun, greater political opposition if not everyone is scanned, a harder time justifying the morality of letting something take control that doesn't take EVERYONE'S desires into consideration...
0David Althaus
I don't think I understand this. If the FAI would make Stalin as intelligent and well-informed as the FAI, then this newly created entity wouldn't be Stalin anymore. In fact, it would be something totally different. But maybe I'm just too stupid and there is some continuity of identity going on. Then I have to ask: why not extrapolate the volition of every animal on earth? If you can make Stalin intelligent and moral and somehow don't annihilate the personality of Stalin, then I propose the same thing is possible for every (other) pig. Now you could say something like "Well, Stalin is an exception, he obviously was hopelessly evil", but even today Stalin is the hero of many people. To put it bluntly: many folks seem to me rather stupid or malicious or both. If the FAI makes them "as intelligent and well-informed as the FAI itself" then they would be different people. Anyway, these were only my misanthropic ramblings, and I don't believe they are accurate, since they are probably at least partly caused by depression and frustration. But somehow I felt the urge to utter them. ;) Look, I really hope this whole CEV stuff makes sense, but I think I'm not the only one who doubts it. (And not only ordinary folks like me, but e.g. Wei Dai, Phil Goetz, etc.) Do you know of any further explanations of CEV besides this one?
0ArisKatsaris
Yeah, so? Nobody said the intelligently extrapolated volition of Stalin would be Stalin. It would be a parameter in the FAI's calculation. We're not talking about simulating people; we're talking about extrapolating their volition. Continuity of identity has nothing to do with anything. Which is the exact point: we don't want the current volition of people (since people are currently stupid), we want their extrapolated volition - what they would choose if they were much more intelligent than they currently are.
Bongo120

About the use of dark arts, and the magic paragraph... Maybe there's an important difference between:

  • turning someone into an extreme rationalist who would see they had been "hacked" and be grateful for it.
  • giving someone correct opinions but otherwise having them remain irrational and ignorant of having been "hacked".
0sketerpot
I propose that we call dark-arts persuasion "brain hacking" henceforth. (Or ghost-hacking, if you prefer.) It's an excellent term, and I don't think that people usually appreciate just how creepy it is to try to hack into somebody's mind and change it like that.
3pjeby
FTFY. ;-) Seriously, it's hard to change your own mind even when you want with all your heart to do so. Dark arts don't persuade anybody of anything they don't already find attractive to believe. (i.e., things that humans are already biased towards thinking are true or useful) Successful direct marketers advise that you only advertise to people who already want what you're selling and are inclined to believe you, for this very reason. (And if anybody had incentive to develop truly "creepy" levels of persuasion capability, it'd be direct marketers, who can make millions more by increasing their persuasiveness a few percentage points.)
4sketerpot
In some circumstances, it can be hard to change people's minds. In others, it can be way too easy. For example, look at bullying in schools: a large number of people somehow come to agree on bullying a certain set of targets, even when they probably would never be so cruel on their own. Or look at political propaganda, when given by a trusted source and decked out in applause-lights-inducing standard rhetoric -- preaching a slightly new message to the choir. In both cases, the people being persuaded just weren't adequately defending themselves from the influence of the people around them. An obvious example of a case where it's hard to hack somebody's mind is trying to get someone to stop being racist. That's hard; it's easier to get them to stop being racist in public, by making them ashamed of it. So, sure, the Dark Arts are not all-powerful, but they're not weak either. And they are always creepy.
2BenAlbahari
Argument fallacies and rhetorical tactics are used all the time by people who aren't even aware of it. In a sense, this lack of awareness makes it even creepier. It's like one zombie fiddling with the brain of another zombie.
0bogdanb
I saw many comments referring to the dark arts as creepy, and I never understood what they meant* until I read this line :-) (*: as opposed to, e.g., reprehensible)
0BenAlbahari
I think that's a useful distinction. It's debatable whether extreme rationalism is healthy for everyone (if even feasible). See Is self-deception a fault?. If not, and we respect someone's well-being, then it may not be possible to persuade them with rationalist tactics alone. Yet we know their cognitive biases. And, to sound very Machiavellian, if we don't exploit these biases, someone else will.

Let's imagine, for a moment, the most convincing paragraph ever written. It was truly a world-wonder of persuasion - it converted fundamentalist Christians into atheists, suicide bombers into diplomats, and Ann Coulter-4-President supporters into Less Wrong sycophants.

Would it produce people who could explain the differences between Pascal's Wager and the expected utility argument for cryonics and why they should produce different answers? Or who could accept Many Worlds without thinking that they make all available decisions "equally"? Or who ...

Someone believes that the Singularity is a sack of absurdities and is sad about the fact that we're all deluded into believing it. To help pry us loose from our little cult, they post an argument that's really persuasive on a gut level: A photoshopped picture of me killing Meredith Kercher. (Shortly after, I am arrested by the Italian police.)

Is there a moral difference between that and what you're proposing besides, "Oh, but they're wrong and we're right?" If so, what do you think it is?

6Tyrrell_McAllister
They have vastly different consequences. Why should a consequentialist seriously consider the possibility that there is no moral difference?
5Vladimir_Nesov
That's quite an important difference.
3BenAlbahari
To lay out the dark-arts debate a little, we can discuss:

1. The underlying reasons why we dislike dark arts.
2. Whether we should like dark arts:
  2a. Efficiency (essentially assuming utilitarianism)
  2b. Morality (essentially denying utilitarianism)
3. Whether there's an "acceptable level of usage" (applies separately to 1, 2a, and 2b).

So you're asking about 2b and 3. I'll make another comment to dive into that specific path.
0jimrandomh
Yes, fabricating evidence of a murder is morally worse than almost any other type of lie.
7JGWeissman
Ok, here is a different example, one that doesn't frame anyone for murder: someone believes that cryonics could not possibly work, and uses the dark arts to convince people already signed up not to go through with it, effectively killing them (though the persuader does not believe this, and in their expectation has produced only good effects).
7loqi
Sadly, the scenario you describe is very realistic.
-1BenAlbahari
If you strip each of the two examples down to the point where all you have left to compare is the principle of "the ends justify the means", then sure, they're the same. If our categorical imperative were "the ends never justify the means", then we could stop gathering information. However, I think the specific means and specific ends matter. You've constructed your example - or in any case could clearly construct such an example - such that the value of the end in both examples is equivalent from the perspective of the person conducting dark arts. However, the means are clearly different. Framing someone for murder is far worse. That's the moral difference. If you changed the example to radically reduce the heinousness of the means, then I think the moral case is equivalent in both examples, from the perspective of the person conducting the dark arts. My final thought is that there's a grand hidden "ethical prior" that we're ignoring here. Regardless of the particular end in question, we should take into account the degree to which rational processes were used to justify that end. We can legitimately claim the moral high ground if the opposing side fails to provide any rational justification for the end they are promulgating.
ata70

We actually label persuasive strategies that can be used to market our true ideas as "dark arts".

The linked page specifies that the "dark arts" specifically take advantage of biases for persuasion. So it's a bit misleading to say "We actually label persuasive strategies that can be used…", because we do not label all strategies as such. Our goal should be to snap people out of their biases, so that they can naturally accept anything that turns out to be true. That could be taken as a "persuasive strategy", but it ...

3BenAlbahari
Thanks for the detailed reply - I'll try to respond to each of your points. First of all, using dark arts does not imply you have to tell outright lies. Secondly, you say "if a person ends up with better ideas but all the same biases, their heads can later just as easily be filled with whole new sets of bad ideas by other Dark Arts practitioners." When the alternative is that they only had bad ideas in their head, this is still a win. And your example is the minimum win possible. What if we used dark arts to help someone remove a cognitive bias? Is it now justified? Third, PZ Myers chose a very effective persuasion strategy, The Admirable Admission Pitch. However, one case where someone was effective sans dark arts hardly proves the sans-dark-arts approach is optimal in general. When you look at a heavyweight persuader of the world like Al Gore, you can see he makes heavy use of the dark arts. Finally, you're correct with respect to the problem you pointed out in your first paragraph. I'll tweak the post to fix it.
4ata
It does, in a very real way: if you say "You should believe X because Y", and Y is not actually a good reason to believe X, you have spoken a falsehood, whether Y is a lie about external reality or just some rhetorical trick. I am not convinced of this. You are right in a strictly utilitarian sense — it is better for someone to have good ideas for bad reasons than to have bad ideas for bad reasons — but in most cases, it's a false dilemma. At most, using the Dark Arts is only justified if someone is absolutely intractable in attempts to debias them. Using them too soon could amount to a missed opportunity — causing us to declare victory and move on too quickly, and/or causing them to be even less open to a more fundamental debiasing afterwards. Let's say we convince a Christian to believe in evolution by arguing as though it can be reconciled with a day-age reading of Genesis. This is a bit better than them being a literalist young-earth creationist, but they have not become more rational. And if you convince them of evolution thusly and then try to convince them that their religious epistemology is wrong altogether, I can imagine them saying "Come on, you convinced me to believe in evolution, what more would you want from me?" or "Liar! You said believing in evolution wouldn't be a slippery slope to atheism!" or "What, so I believed that science was incompatible with my religion, and you convinced me they're compatible after all, but now you've switched to arguing that they aren't?". If you want to make someone question their deepest and most cherished beliefs, they are likely to take you even less seriously if you previously convinced them of a lesser point by acting like those beliefs could be true. (That is a hypothesis, supported only by some personal experience and intuition. It can probably be tested; until then, I invite anyone with personal experience or other thoughts on this point, whether supporting or opposing it, to share them.) I'm not quite sure w
1BenAlbahari
Video of killing a cognitive bias with dark arts: http://www.youtube.com/watch?v=haP7Ys9ocTk (Also illustrates Bongo's first bullet point in a comment above)
2ata
To use the virus metaphor again, this is like a security expert who finds exploits and reports them so they can be fixed. (Or, more closely analogous to this video, working with an individual and delivering an active but harmless virus to their computer so they will become aware of and concerned about the potential for real harm.) The grey area, somehow using invalid but persuasive arguments against people's actual biases, is like my previous example making a virus that patches the very holes it uses to get in. Using the Dark Arts is like using those exploits to install things on people's computers on the basis that you're only using it to install really good software that people ought to have anyway. So, showing someone by demonstration how a particular thought pattern serves them poorly is not what I'm talking about. That's a good thing. (I was going to say it's "not the Dark Arts", but so we don't get into arguing about the definition, I'll just say that this is an example of something I support, while I think I've given enough examples of the sort of thing that I don't support for the distinction to be clear. It is indeed pretty much what Bongo is saying. My point right now is that the two concepts are different enough that we shouldn't be referring to them with the same term, especially a connotation-heavy one like "Dark Arts".)
0Eugine_Nier
Yes, this is one of the reasons I have serious doubts about global warming.
1markrkrebs
Saying there are white arts as well as dark ones is conceding the point, isn't it? One should be allowed to be persuasive as well as right, and sometimes just being right isn't enough, especially if the audience is judging the surface appeal of an argument (and maybe even accepting it or not!) prior to digging into its meat. In such situations, attractive wrapping isn't just pretty, it's a prerequisite. So, I love your idea of inventing a protocol for DAtDA.

"shut up and get rich" is the only strategy I see working for rationalists TBH.

0Douglas_Knight
If you get rich, they'll probably stop asking "If you're so smart, why aren't you rich?" Maybe you'll be cool enough that they venerate your words. But will they actually listen? (or maybe I don't know what you mean by "working"?)
1nazgulnarsil
The greater the degree to which you can demonstrate a winning strategy, the more people will listen to you. No one is going to listen to rationalists about winning unless we start winning.
4[anonymous]
But winning is subjective. You win if you reach your goals. And since other people have other goals they probably won't recognize your win. The discussion would probably go like this: They: If you're so smart, why aren't you rich? Me: Because becoming richer than I am is not my goal. They: Then what is your goal? Me: Solving intellectual puzzles. They: What kind of goal is that? - then walks away.
1nazgulnarsil
I think we actually agree. Let me try to state it differently: people will only listen to rationalist techniques from people with whom they share goals. I am assuming the goal with the highest commonality would be "get rich", though in the past I have made appeals to other goals, such as personal security for the self and loved ones.

The post and comments -- all very interesting so far -- tend to assume that a rationalist's goal in conversation with someone else is always a first-order one, where success and failure are represented only by whether the interlocutor changes his/her mind with respect to rationality or truth-seeking. There might be a limited category of cases where this assumption isn't good.

For example, if there were a young-earth creationist who was also a fourth-grade teacher, and who was in the habit of subtly undermining scientific truth with talk of "controvers...

4grouchymusicologist
Much shorter version: in cases where you probably won't succeed in turning crazy people into aspiring rationalists, underhanded tactics are fair game to make their particular brand of craziness a less damaging one to the overall cause of rationality.
2JGWeissman
If, on the other hand, he told these impressionable children that evolution was true because these verses in the bible say so, the broad cause of rationality has been hurt. (There may be a less convenient world where this is not a concern, but I want to make the point that you might not accurately predict the effects of effective persuasion through the dark arts, and their use might not be as contained as you would hope.)

I guess I'm for persuasion; I think the ends justify the means in this case. Otherwise you're all bringing knives to a gunfight and crying foul when you get shot down. Could there be a degree of "sour grapes" here, resulting from finding one's self inexplicably unpersuasive next to a velvet-tongued dummy? Are we sure we eschew the tactics of rhetoric because they're wrong? Is it even fair to describe "dark arts" as wrong?

I say rather that speech is meant to be persuasive. Better speech would then be more persuasive. As such persuasion backed by t...

3Jonathan_Graehl
What's the meta-point being made by your obnoxious metaphor-laden text? What's the reason for your abuse of rhetorical questions? It's not effective communication. You demonstrate just how, when you're too heavy on simpleton-wowing flash, you risk losing the part of the audience that demands respect. Here's my gloss of your three paragraphs: 1) "Losers always whine about their best." 2) If you're right, and you don't persuade, then your speech wasn't good enough. 3) It's fine to design your speech so as to mislead the dumb and the inattentive toward your side. To the last I'd add the obvious caveat that you should consider how much of your audience you lose if you do it gracelessly.
0markrkrebs
Jonathan, I'll try again, with less flair... My comments were to the original post, which asks if "dark arts" are justified, and I say simply: yes. I think lots of otherwise very smart people who like to communicate with bare logic and none of the cultural niceties of linguistic foreplay can actually alienate the people they hope to persuade. You may have just done that, assuming you were trying to persuade me of something. Re: losing the audience that demands respect, firstly I was trying to be illustrative in a funny, not disrespectful, way, and more importantly I was not talking about you at all. I am talking about arguing with people who are unswayed by logical content. "If the glove does not fit, you must acquit!" What? That's a freaking POEM: rhyming doesn't make it true! ...and yet, history teaches us that Johnnie Cochran was a genius: OJ walked. That's the world you and I live in, unfortunately. How shall we persuade it? Logic isn't enough. I'd presumed (and I suggest my tack is actually quite respectful of THIS readership) that the very part of the audience you're cautioning me not to lose doesn't need to be convinced, 'cause they can "do the math" already. The facts will work just fine for them. No, I am hunting smaller game. At the risk of another metaphor, I'll reach for resonance. Different antennae resonate to different wavelengths. A high-power signal of surpassing precision will pass unnoticed through an antenna not sized to receive it. It is possible to give one's opponents too much credit.
2wedrifid
I tend to be quite persuasive. I'm sure I'll find something else to be resentful about. 'Wrong'? Describing things as 'wrong' is a dark art.

The very definition of the "dark arts" is those arts of persuasion that don't particularly correlate with truth. So if such a super-persuasive paragraph existed, there's no reason to expect it to be one that favoured our beliefs over those of Ann Coulter. If we abandon those persuasive techniques that favour truth, we throw away the one advantage we believe we have over the many others who wish to persuade.

Also, don't throw around boo lights like "almost religiously entrenched" without evidence.

0BenAlbahari
Your statement seems idealistic: "If we abandon those persuasive techniques that favour truth, we throw away the one advantage we believe we have over the many others that wish to persuade." It reminds me of the statement from Obama: "It is precisely our ideals that give us the strength and the moral high ground to be able to effectively deal with the unthinking violence that we see emanating from terrorist organizations around the world." (Source) It gets applause, but when you're really trying to affect the world, things are not so black and white. I'm reminded of Machiavelli, who wrote: "And short-sighted writers admire [the rulers'] deeds from one point of view and from another condemn the principal cause of them." I would imagine you would criticize a politician for using dark arts, but they couldn't even be in that position without using them. P.S. I concede on the boo lights. Let me know the policy on tweaking posts.
2Paul Crowley
What you quote of mine does have some of the connotations you ascribe to it, but it does also denote something, and it feels as if you haven't engaged with that so it's a little hard to reply to you. Do you disagree with what it denotes? EDIT: tweaking is perfectly usual!
0BenAlbahari
To share my personal belief history here, I'd say I'm questioning whether I agree with your statement, despite my prior conviction. Except for a few rationalist types, most people I've met don't really seem to mind sophistry whatsoever; the sophist will seem more charismatic and more worth listening to. And when the sophist really does say an insightful but difficult-to-believe truth, they're the ones with the spoonful of sugar to make the medicine go down. P.S. I'd be very interested in hearing about someone applying Robin Hanson's take on status and signaling to this.
Rain40

I use self-promotion techniques that I know are more flair and feeling than rational points to convince people of how good I am at certain things; I then follow through by showing that I am, in fact, as good as or better than my marketing material showed me to be. This combination works very well, and I do not feel deceptive for properly calibrating their expectations before I am able to provide true evidence.

I suppose I'm more concerned with how well they're calibrated (beliefs) than with their methods of calibration (rationality); the latter provides far more benefits, but is also far more difficult to change.

loqi40

This post seems predicated on the notion that we've established a gospel to preach, and we're just not preaching it. What's the gospel? "Rationality"? "Truth"? The Sequences? I don't buy it. There's no finished product here to sell.

I'm reminded of conversations I've had with people who express deep frustration with Obama's inability to cajole Congress into moving quickly on health care reform, but don't have a good answer to the question, "Why do you think Congress will come up with a workable solution to the problem?"

1BenAlbahari
The contention of my post is that rationalist personality types are often bad at selling their beliefs to others, due to their reluctance to use what they believe to be "dark arts". Perhaps I should have expressed this meaning more literally in the post's title - the concept of "a product to sell" is really just a metaphor.
1loqi
Ah, that definitely scales down my objection, but it's still present in a similar form. At what point does one decide that a belief is ready for "sale"? If you have plenty of good reasons for believing something, then those are the arguments you want to field. I'm having trouble seeing what would convince me to go further and risk epistemic backwash, other than contrived scenarios designed to maximize certainty. In the worst-case scenario, you might steamroll right over a valid counter-argument, and miss a chance to be proven wrong under conditions of high confidence - one of the most valuable learning experiences I know of. I guess what I'm looking for is a solid rebuttal to this advice from the Supreme Leader:
2BenAlbahari
Well that's obvious. Just tell them the Supreme Leader wears them.

If you persuade people with bad reasons -- and if people are persuaded more by eloquence than they would have been by the substance of the argument, they are persuaded for bad reasons -- then in the end they will be worse off than before with respect to truth, even if they change their minds about some particular errors.

5sketerpot
Are you sure about that? Let me give a counterexample. I used to be a Christian. Around middle school, I was generally unhappy, and since I had been raised on a steady diet of science fiction and fantasy, I had a conflict between the religious worldview of Christianity and the secular worldview that I used the rest of the time. The rituals in the church just started to seem embarrassingly silly. This was a bad reason to stop believing, but it freed me to start actually thinking about faith and evidence. Without that bad reason, I might not have managed to break free for much longer, if ever. I shudder to think of it. People can develop the right reasons after bad reasons have freed their mind from the shackles that prevent it from working.
2Unknowns
For all you know, ten years from now you might give a different counterexample, explaining why you reconverted to Christianity for bad reasons, but now know that it's all true... The problem is that when you become convinced of something for bad reasons, even when you see that those reasons are bad, this doesn't stop you from rationalizing your decision with new reasons. These reasons may be bad as well, for all you know.
2khafra
I see sketerpot's story less as an arbitrary change in beliefs backfilled by rationalizations, and more as him learning that he can change his beliefs in such a fundamental way and then exploring beliefs with epistemic best practices in mind. But that might just be because it's also my story.

You shouldn't mistake the convention that people aren't supposed to use the Dark Arts here on LessWrong, to mean that the LW community rejects the Dark Arts.

I've spoken with dozens of people who are involved with SIAI or read/post on LessWrong. I would not characterize them as being slow to embrace the Dark Arts when compared with American intellectuals in general.

I like the metaphor of the peacenik wanting to rid the world of violence by suggesting that police not use weapons. Let's elaborate on the analogy between Dark Arts and violence.

Tit For Tat is a common policy for trying to control violence. One obvious and much lamented flaw in the strategy is that it tends to foster cycles of violence, with each side claiming that the other side started it and that "they" use more vicious tactics.
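The cycle dynamic is easy to make concrete. Here is a minimal simulation sketch (an illustration added for this point, not part of the original comment; the round count and the "accident" are hypothetical): two Tit For Tat players in an iterated Prisoner's Dilemma, where a single misperceived defection locks both sides into endless alternating retaliation, each round "caused" by the other side's previous move.

```python
# Minimal sketch: two Tit For Tat players and one accidental defection.

COOPERATE, DEFECT = "C", "D"

def tit_for_tat(opponent_history):
    """Cooperate first; thereafter copy the opponent's last move."""
    return opponent_history[-1] if opponent_history else COOPERATE

def play(rounds=10, accident_round=3):
    a_hist, b_hist = [], []
    for r in range(rounds):
        a = tit_for_tat(b_hist)
        b = tit_for_tat(a_hist)
        if r == accident_round:
            a = DEFECT  # one misread move, e.g. a perceived slight
        a_hist.append(a)
        b_hist.append(b)
    return "".join(a_hist), "".join(b_hist)

a, b = play()
print("A:", a)  # A: CCCDCDCDCD
print("B:", b)  # B: CCCCDCDCDC
# After round 3, each side is forever punishing the other's last
# punishment - a stable cycle with no agreed-upon "starter".
```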

To get past the problems of biased measurement of proportional response and so on, and thereby break the cycle of v...

4Jack
There might be worse things than being the targeted civilians... like being the soldiers. One thing soldiers report is that after they kill they stop valuing life as much as they did before; a psychological barrier is broken and violence becomes easier. If you are spending a lot of time arguing with your enemies and constantly practicing the Dark Arts, I think there is a much greater chance of the Dark Arts weaseling their way into the rest of your thinking. I used to participate in a few very popular liberal community blogs - part of the reason I don't anymore is that I've changed. But I think another part is that those communities made decisions to sacrifice certain norms of traditional rationality in order to win their rhetorical battles with the right (who we felt had long given up those norms). Once the communities grew up and internalized this rhetorical anarchism, the quality of thought degraded rapidly. Indeed, they even started to take policy positions that I believe they wouldn't have if there had been no political advantage in doing so.
0torekp
That's a very real danger, but that's where the "dramatically de-escalate" part comes in. One can also call foul on one's own side when excessively dark maneuvers are used.

It's tough to market existential risk because you end up sounding like a crazy doomsday prophet if the sell is too hard. Most people don't like to think about it and don't like being reminded that the world could suddenly end, unless it's a religion reminding them.

-1timtyler
http://www.whowillsurvive2012.com/ ...illustrates some of the current secular DOOM marketing.
3ata
The 2012 lunacy could be a good basis for satire as marketing for cryonics, though. "Tens of millions of people are DOOMED TO DIE in 2012. Don't let yourself or your loved ones be among them!"
0sketerpot
Intriguing; as soon as I saw that slogan, I wanted to protest that you're conflating meatdeath with infodeath. And yet your slogan is probably more effective than what I would instinctively come up with. Are rational thinking habits just antithetical to salesmanship? (Still, I think "YOUR MEAT MAY DIE IN 2012! Sign up for cryonics today!" is a pretty good slogan.)

The desire to persuade people isn't necessarily rational, especially when it comes to "enlightening" people on the superiority of rational thought. I think a truly rational person's allegiance should always rest in truth. Truth, in itself, is a very powerful notion that doesn't need the help of manipulative persuasive tactics to inspire people.

I think persuasive techniques can be adapted to help discover the truth, as long as the parties involved completely respect each other and are willing to ask questions that help the other better articulat...


This may have some connection to the akrasia discussions -- what are non-destructive methods (or even beneficial methods) for getting from thought to action? I'm assuming that thinking well is as much an action as physical movement -- there's some difference, since a lot of people have preferences and talent for one or the other -- but subjectively, I find there's an overlap.

Is there a qualitative difference between the methods that work to change your own behavior and methods to change other people's?