Intellectual honesty: being up-front not just about what you believe, but also why you believe it, what your motivations are in saying it, and the degree to which you have evidence for it.
If you have extensive knowledge about a topic, it's often not possible to communicate all of it. Attempting to do so also takes up a lot of valuable speaking time, especially in group discussions where only one person can speak at a time.
In the time you do take to speak about a subject, it's also worthwhile to focus on arguments that the other person can evaluate. We believe a lot of things based on personal experience. When there's published scientific evidence for those beliefs, it's better to appeal to the scientific evidence than to one's personal experience, even when that might mislead the other person about why one holds a certain belief.
I'm currently reading Keith Stanovich's book "How To Think Straight About Psychology". In it he argues that scientists should not be open about personal beliefs they hold for which they have no scientific evidence.
It's not possible to communicate all the reasons, agreed -- it's often not even possible to articulate all of one's reasons, even given unlimited time. However, the difference I'm pointing at is larger than the time-allocation problem. It's the difference between agreeing with someone as a sign of social support vs. as a sign that you have further evidence in the same direction. This changes the way conversational resources are allocated (often saving a lot of time, as I argued in my original post), but the change stems from a shift in the underlying goal of the conversation.
Communicating why you believe in A is a different goal than communicating reasons why it makes sense for another person to believe in A.
When it comes to the latter, scientific evidence is more important than it is for the former.
I somewhat agree, but I am a little confused about it.
Focusing on truth rather than status in a conversation tends to save time with respect to the goal of truth.
Conveying reasons is an important sub-goal of conveying truth, especially when we can't fully trust each other's rationality.
Conveying only scientific reasons rather than personal ones, and therefore restricting ourselves to scientifically established truths, sounds like a safeguard to keep science from being contaminated by the weird beliefs of compartmentalized thinkers or simply poor rationalists. (Is that the intention?)
It also sounds somewhat like the false humility which comes from trusting science above all other things; the kind of humility which would complain of Einstein's arrogance.
Being as honest as possible about the holes in evidence, the contrary findings, and any possible doubts is an important part of scientific honesty.
Looking for the cause of belief, not only the justifications, seems a useful safeguard against the clever arguer.
Communicating why you believe A is different from communicating why it makes sense for other people to believe A. However, if the two are very different, something has gone wrong:
Therefore rational beliefs are contagious, among honest folk who believe each other to be honest. And it's why a claim that your beliefs are not contagious—that you believe for private reasons which are not transmissible—is so suspicious. If your beliefs are entangled with reality, they should be contagious among honest folk.
If your model of reality suggests that the outputs of your thought processes should not be contagious to others, then your model says that your beliefs are not themselves evidence, meaning they are not entangled with reality. You should apply a reflective correction, and stop believing.
It seems like what you are talking about applies to scientists speaking publicly about science, but does not apply very well to scientists speaking privately to each other.
Focusing on truth rather than status in a conversation tends to save time with respect to the goal of truth.
Those two aren't the only possible ways of having a discussion. There's a lot more that goes into having discussions.
Communicating why you believe A is different from communicating why it makes sense for other people to believe A. However, if the two are very different, something has gone wrong:
Not at all.
At the LW-Europe Community Camp I ran a workshop on Focusing. There are two ways to provide evidence that Focusing works.
I personally choose mental techniques by trying different techniques and experiencing what they do. I can speak about my own empirical experience.
I can also point to Eugene Gendlin being a respected academic psychologist and to the fact that there are many published studies supporting Focusing.
Both arguments are entangled with reality, but it's more useful to talk about the scientific evidence: it's more likely to convince my audience that Focusing is valuable.
Cross-posted to my blog.
A while ago, I wrote about epistemic trust. The thrust of my argument was that rational argument is often more a function of the group dynamic than of how rational the individuals in the group are. I assigned meanings to several terms in order to explain this:
Intellectual honesty: being up-front not just about what you believe, but also why you believe it, what your motivations are in saying it, and the degree to which you have evidence for it.
Intellectual-Honesty Culture: The norm of intellectual honesty. Calling out mistakes and immediately admitting them; feeling comfortable with giving and receiving criticism.
Face Culture: Norms associated with lack of intellectual honesty. In particular, a need to save face when one's statements turn out to be incorrect or irrelevant; the need to make everyone feel included by praising contributions and excusing mistakes.
Intellectual trust: the expectation that others in the discussion have common intellectual goals; that criticism is an attempt to help, rather than an attack. The kind of trust required to take other people's comments at face value rather than being overly concerned with ulterior motives, especially ideological motives. I hypothesized that this is caused largely by ideological common ground, and that this is the main way of achieving intellectual-honesty culture.
There are several subtleties which I did not emphasize last time.