Edit: This is old material. It may be out of date.

Or is that just a point of view?

I'm going to assume familiarity with the common use of the following two terms on this site: Dark Arts and signaling.

If not, don't worry: I've hung out here for ages and I still need to update my cache of terms quite often. If you have questions about either term after reading the wiki, please feel free to ask, since there are people here much more knowledgeable than me who will probably answer them. I don't know if other users agree, but the Discussion section seems like the best place to ask questions that may already have been covered elsewhere, for people who have trouble despite extensive study. In a way, this post is an example of exactly that.

I'm also making the following assumptions:

  1. People are more often rational than otherwise when the rational answer happens to say good things about them.
  2. I hope people here agree that learning to be more rational will necessarily change your beliefs in at least some areas (it's unlikely that any one person is right about everything).
  3. If someone hasn't changed any beliefs, it's likely that they haven't employed rationality where it would do them the most good.
  4. People are better at defending or promoting a position when they think it's true.
  5. From the above points it seems to follow that people who become more rational than average will also send more bad signals than they would have if they hadn't become more rational.
  6. Most people would try to avoid 5.; it represents a disincentive to becoming more rational (a toy formalization follows this list).
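A minimal sketch of the argument in decision-theoretic terms (this is my own toy formalization, and it assumes the relevant benefits and costs can be collapsed into scalar utilities):

$$U(r) = A(r) - S(r), \qquad \Delta U = \big[A(r') - A(r)\big] - \big[S(r') - S(r)\big]$$

where $r' > r$ is a higher level of rationality, $A(r)$ is the value of the more accurate map it buys, and $S(r)$ is the expected signaling cost from points 1. to 5. Point 6. then amounts to the claim that for many people the anticipated cost $S(r') - S(r)$ looms large enough to make $\Delta U$ look negative, so they are deterred even when the accuracy gain $A(r') - A(r)$ is positive.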

The main question of this thread:

How can one work around 5. without employing Dark Arts to sanitize the feelings accompanying a conclusion? Is it even possible? Can or should we talk about this and try to find and catalogue ways to do this, since many of us are not skilled at social interaction (a higher-than-average share of self-identified non-neurotypicals visit LW)?

 

Notes:

- I also wish to emphasise that not only do some conclusions send bad signals; merely wanting to open *some* topics to rational inquiry often sends bad signals in itself, even if you eventually end up with a conclusion that sends good signals.

- I feel that, even if it isn't possible to hide bad signalling, the better map of reality one enjoys will offset these costs in other ways. Still, considering we are social animals, I think many people would quite strongly like to avoid this particular cost, myself included.

Comments (51):

You could learn the Dark Arts to lie and deceive, or you could use them to signal what you mean more effectively.

Like, "I want to be right" normally sounds pretty obnoxious but there are ways to state accurate beliefs that are semi-gregarious and don't involve lying.

A few examples:

  • Making a show out of changing your opinion to that of the person who changed your mind, in a buddy-buddy manner. "Ooooh! I get it now -- you're totally right," slap them on the back, then state your few caveats.

  • When someone talks about their pet political issue, act apologetic and try to change the subject. The appeasement makes it hard for them to feel like you're attacking their subject, even though you disagree.

  • Laughing off people who point out that you used to believe something else (then seriously addressing them). This downplays the issue and lets you seem confident even though you changed your mind.

This might be a slippery slope, but I think there's a difference between employing the Dark Arts to change someone else's mind and employing them to portray yourself in an accurate but positive light.

[anonymous]

This is very much in line with the kind of suggestions I was searching for.

How can one work around 5. without employing Dark Arts to sanitize the feelings accompanying a conclusion? Is it even possible?

Yes, if you want to manage your image optimally you need to lie or spin the truth. Either to yourself or to others.

Can or should we talk about this and try to find and catalogue ways to do this, since many of us are not skilled at social interaction (a higher-than-average share of self-identified non-neurotypicals visit LW)?

Probably a good idea, but I would not even bother trying to have such a discussion on LessWrong. The pressure to signal naive morality in that kind of area is just far too strong.

[anonymous]

A rational discussion about how to sanitize the bad signals sent by some rationalist discussions or conclusions can't be carried out effectively, because of the bad signals that it sends. However, this can be pointed out, since pointing it out happens to send good signals.

Humans are funny, aren't we?

Probably a good idea, but I would not even bother trying to have such a discussion on LessWrong. The pressure to signal naive morality in that kind of area is just far too strong.

I'd be interested to see you give it a shot.

I'd be interested to see you give it a shot.

I'm afraid I reached the limits of my patience even in this brief discussion. :)

Yes, there is sometimes a trade-off between truth and optimal signaling; and in those cases, if you aren't willing to lie, or aren't good at lying, rationality makes your signaling worse. But not always; there are far more cases where it's just a matter of recognizing what you're signaling and how, and fixing any incorrect signals. In those cases, rationality makes your signaling better. I believe the effects from the second case are usually stronger, so that becoming more rational represents a net gain, albeit a smaller gain than if everyone loved interacting with honest truth-seekers.

[anonymous]

Well, one of the questions in the OP is precisely about fixing the signals. How exactly does one go about that? I'm basically asking if this application of rationality can or should be discussed on LW in a way similar to, let's say, the discussion on akrasia.

Why would anyone want to avoid employing the dark arts as a general rule?

[anonymous]

If one can't or doesn't want to avoid Dark Arts in some circumstances, then I'm clearly asking whether the people on LW can or should discuss and study their use for this application.

Yes! Social skills are healthy. :)

Are you saying social skills are the same as the dark arts?

Are you saying social skills are the same as the dark arts?

I am alluding to a distinct overlap between social skills and that which is labelled the 'dark arts'. This is particularly the case when instinctive and emotion-driven behaviors have been raised to the level of self-awareness.

(I incidentally note that 'dark arts' includes features that are more or less necessary for healthy human interaction. "Dark Arts" are often beneficial for the person you are interacting with and sometimes expected as a courtesy.)

Can you give examples, either of emotion-driven behaviors becoming dark arts when raised to awareness, or of dark arts being necessary to healthy interaction? I think we are using different definitions of what dark arts are.

On the wiki the dark arts are defined as exploiting the biases of others so that they behave irrationally. This is morally wrong - I want others not only to accept the right answers, but to accept them for the right reasons.

Also, the dark arts are, well, dark.

If you're classifying the intentional use of human biases as wrong in a terminal moral sense, there's not much more to be said other than that I don't share your moral values, not even when you format them in italics.

If you're instead claiming they are wrong in some instrumental sense -- that is, that they lead to bad results -- I'd like to understand how you derive that.

In other words: suppose I want to convince people to do something, or to stop doing something, or to feel a certain way or stop feeling a certain way, or some other X. Suppose I then convince people to X by using the "dark arts" and "exploiting the biases of others."

For example, suppose I want someone to think that making use of human biases is a bad thing, and so I label that activity using words with negatively weighted denotations like "exploit" and "dark."

What have I made worse, by so doing?

Let's say I'm working with Bob. By exploiting his cognitive biases, I can convince him to do two things that I value. Without such exploitation, I can only convince him to do one. If I do exploit his biases, these bad things happen:

  1. I have less confidence that either of the two things was actually worthwhile.

  2. It is more likely that my enemy will be able to convince Bob to undo the valuable things he did.

  3. I have less trust in Bob in the future, and his total value to me is reduced.

In some cases these effects might outweigh the value of getting two things done rather than one.

Nobody doubts that doing stupid or ill-considered things with the dark arts could have undesirable consequences.

Note: the parent is another example of a dark arts persuasion technique.

Note: the parent is another example of a dark arts persuasion technique.

I think your problem is you have too broad a notion of what constitutes "dark arts".

I think your problem is

I don't accept disagreement with Eugine_Nier as a 'problem'.

you have too broad a notion of what constitutes "dark arts".

There is a time and a place for each of the following:

  • "Grey" arts.
  • Dark arts as defined on the wiki.
  • The alternate version of 'dark arts' that nerzhin presented.

Further, there are instances in each category where the use of dark arts is pro-social. It seems that the term 'dark arts' has become a hindrance to understanding instead of a help. It does not mean evil!

I agree that manipulating Bob makes it hard to rely on Bob for "sanity checks" of my motives, and that that's a significant loss if Bob would otherwise have been useful in that capacity.

And I can sorta see how it might be true in some cases that manipulating Bob might render him more manipulable by others, and therefore less valuable to me, than he would have been had I not manipulated him. (I have trouble coming up with a non-contrived example, though, so I'm not convinced.)

So, yes, agreed: in cases like those, it makes things worse.

For example, suppose I want someone to think that making use of human biases is a bad thing, and so I label that activity using words with negatively weighted denotations like "exploit" and "dark."

What have I made worse, by so doing?

Now that was well done. Although technically you would have counterfactually made the universe worse according to your values: you will have lost a small measure of respect from your audience and thereby reduced your social influence, a critical instrumental resource. Even worse, your credibility will have decreased, most specifically when it comes to moral authority.

(nods) Agreed, given that my audience is such that I lose more respect/influence/authority than I gain. Which some audiences are.

The description on the wiki does put a negative spin on it (although not quite as negative as you do - behavior is not even mentioned). From his description I get the impression that Konkvistador is including 'grey arts' too.

This is morally wrong

I reject this moral proscription and any other moral proscription that would make becoming more rational a form of self-sabotage.

I reject this moral proscription and any other moral proscription that would make becoming more rational a form of self-sabotage.

Given that you suck at dark arts, as demonstrated by the fact that you openly admit on a public forum that you're willing to use them, I don't see how this moral proscription is a form of self-sabotage.

Given that you suck at dark arts, as demonstrated by the fact that you openly admit on a public forum that you're willing to use them, I don't see how this moral proscription is a form of self-sabotage.

Your attitude is objectionable and your understanding of signalling strategy lacks nuance.

Are you sure the beliefs you're using dark arts to promote are correct? If a belief you're promoting turns out to be wrong, it'll be nearly impossible to backpedal. Read this for a more detailed description.

That strikes me more as an excuse to say avoiding the dark arts is the desirable thing to do than an actual reason.

People are more often rational than otherwise when the rational answer happens to say good things about them.

I would not call that 'more rational'.

[anonymous]

All things being equal, people have less trouble being rational when the right answer happens to be convenient; I wanted to emphasise that the convenient answer sometimes is the right one.

All things being equal, people have less trouble being rational when the right answer happens to be convenient; I wanted to emphasise that the convenient answer sometimes is the right one.

It is true that the convenient thing is sometimes the right one. That just doesn't make giving the convenient answer 'rational'. Just like a clock that is broken is not measuring time, even during those two minutes per day when it shows the correct time.

[anonymous]

I see I still wasn't clear on this. I was not talking about being right for the wrong reason.

Reasoning is hard. It seems easier when you like where it is going, but sometimes you quite clearly feel that you are entering an ugh field: you really don't want to do this, and it suddenly seems harder. And because it's harder, sometimes you don't go through with it, or you botch the process by being sloppy.

That is an interesting observation. It does seem plausible. I am still a little reluctant to call it 'more rational', given that it could be said to be describing the aetiology of a particular kind of irrational bias. Yet at the same time I can accept it as a reasonable conclusion when coming from a certain way of modelling and evaluating the thinking process.

How do you get (5) from (1-4)? What signals are you talking about?

[anonymous]

Voted up for the (in retrospect) good question.

1.

People are more often rational than otherwise when the rational answer happens to say good things about them.

'Good things' in this context is synonymous with good signals. It's easier to be rational when being rational, or the conclusion that you reach, implies good things about you to other people.

2.

I hope people here agree that learning to be more rational will necessarily change your beliefs in at least some areas (it's unlikely that any one person is right about everything).

People who learn rationality will likely change some of the beliefs that did not have a rational foundation before. And they are likely to (eventually) question all of them. Pre-rational beliefs are more likely than not things that are either neutral or that send good signals about you to your social group.

Putting 1. and 2. together makes it seem that people who are rational might get into trouble either by opening debates about a subject (even if they eventually reach the "approved" conclusion) or by reaching a conclusion that would make them look bad in front of others.

Point 3. leads me to believe that the scenario of the previous paragraph would occur more often than I currently feel it does if I were more rational than I am, since I am to a certain extent selective in employing rationality, using it more often when it gives a convenient result.

If one is very rational, one should also be better at finding ways to avoid sending bad signals, but point 4. seems very strong in people and may overwhelm this; hence 5. seems true.

A clear way around this at first glance seems to be the Dark Arts, but I don't want to use them, because some seem unethical, while others may cause bad habits that reduce my own rationality. Thus I decided to ask the community the bolded question.

What signals are you talking about?

Could you please rephrase this question? I'm afraid I'm not sure what exactly you're asking about. I thought the link to signaling would clear up my usage of the words 'signaling' and 'signal', and I assume you are familiar with that use.

Here is a summary of what I think you are saying:

"Acquiring the skills of rationality changes you. You will acquire new ways of assessing beliefs, and will forsake some old beliefs for new ones. This change may result in your fitting less well into the social niche that you occupied. This may be a disincentive to making such a change."

Yes, this is a standard observation in all fields of personal development. The greatest resistance to change comes first from the person making that change, then from those around them, in order from the closest outwards. The only question to ask is, is it worth it? In the case of rationality, I think there is a very clear and simple answer: Yes.

I am minded to suggest some advice for rationality akin to Michael Pollan's advice for diet. ("Eat food. Not too much. Mostly plants.")

"Be rational. All the time. About everything."

The Sequences are mostly about how to be rational, but the basic concept here is ultra-simple. Anything more is over-thinking it.

"Be rational. All the time. About everything."

Be rational about everything, including optimal allocation of cognitive resources.

That's just a minor detail of the how-to.

Except in as much as it amounts to discarding both "all the time" and "about everything" in all but the most esoteric technical sense. Being rational all the time about everything is a terrible idea when running on human hardware.

I still see this as nothing but a trite nitpick. What examples would you give where it is irrational to be rational? Where it's smart to take stupid pills?

[anonymous]

Where it's smart to take stupid pills?

Sometimes thinking about a problem in all its depth costs you more than you would lose by forgoing the optimization.

Then the smart thing to do is to not sweat over it.

Speaking of which, this conversation has become a case in point.

Seems to me that Richard is roughly talking about instrumental rationality, while Konkvistador is roughly talking about epistemic rationality. Let's not quibble over the word rationality.

I think you must be talking about epistemic rationality.

I don't think instrumental rationality runs into quite the same problem.

[anonymous]

Yes, this debate is most topical with epistemic rationality. However, I feel someone using instrumental rationality can encounter similar problems; it's easy to imagine sending bad signals to people about yourself when discussing how to go about "winning".

it's easy to imagine sending bad signals to people about yourself when discussing how to go about "winning".

That is, when having an epistemologically rational discussion of instrumental rationality considerations. Instrumental rationality just lies.

[anonymous]

Yes.