Yes, there is sometimes a trade-off between truth and optimal signaling, and in those cases, if you aren't willing or able to lie, rationality makes your signaling worse. But not always: there are far more cases where it's simply a matter of recognizing what you're signaling and how, and fixing any incorrect signals. In those cases, rationality makes your signaling better. I believe the effects of the second case are usually stronger, so becoming more rational represents a net gain, albeit a smaller one than if everyone loved interacting with honest truth-seekers.
Well, one of the questions in the OP is precisely about fixing the signals. How exactly does one go about that? I'm basically asking whether this application of rationality can or should be discussed on LW in a way similar to, say, the discussion on akrasia.
Edit: This is old material. It may be out of date.
Or is that just a point of view?
I'm going to assume familiarity with the common use of the following two terms on this site:
If you aren't familiar with them, don't worry: I've hung out here for ages and I still need to update my cache of terms quite often. If you have questions about either term after reading the wiki, please feel free to ask, since there are people much more knowledgeable than me who will probably answer them. I don't know if other users agree, but the Discussion section seems like the best place to ask questions that may already have been covered elsewhere, for people who have trouble despite extensive study. In a way, this OP is itself an example of that.
I'm also making the following assumptions:
The main question of this thread:
How can one work around 5. without employing Dark Arts to sanitize the feelings accompanying a conclusion? Is it even possible? Can or should we talk about this, and try to find and catalogue ways to do it, given that many of us are not skilled at social interaction? (A higher-than-average share of self-identified non-neurotypicals visit LW.)
Notes:
- I also wish to emphasise that not only do some conclusions send bad signals; wanting to open *some* topics to rational inquiry in itself often sends bad signals, even if you eventually end up with a conclusion that sends good signals.
- I feel that, even if it isn't possible to hide bad signalling, the better map of reality one gains will offset these costs in other ways. Despite this, since we are social animals, I think many people would like to avoid this particular cost quite strongly, myself included.