Cross-posted to my blog.


A while ago, I wrote about epistemic trust. The thrust of my argument was that the quality of rational argument is often more a function of the group dynamic than of how rational the individual group members are. I assigned meaning to several terms in order to explain this:

Intellectual honesty: being up-front not just about what you believe, but also why you believe it, what your motivations are in saying it, and the degree to which you have evidence for it.

Intellectual-Honesty Culture: The norm of intellectual honesty. Calling out mistakes and immediately admitting them; feeling comfortable with giving and receiving criticism.

Face Culture: Norms associated with lack of intellectual honesty. In particular, a need to save face when one's statements turn out to be incorrect or irrelevant; the need to make everyone feel included by praising contributions and excusing mistakes.

Intellectual trust: the expectation that others in the discussion have common intellectual goals; that criticism is an attempt to help, rather than an attack. The kind of trust required to take other people's comments at face value rather than being overly concerned with ulterior motives, especially ideological motives. I hypothesized that this is caused largely by ideological common ground, and that this is the main way of achieving intellectual-honesty culture.

There are several subtleties which I did not emphasize last time.

  • Sometimes it's necessary to play at face culture. The skills that go along with face culture are important. It is generally a good idea to try to make everyone feel included and to praise contributions even when they turn out to be incorrect. It's important to make sure that you do not offend people with criticism. Many people feel that they are under attack when engaged in critical discussion; wanting to work against this tendency is not an excuse for ignoring it.
  • Face culture is not the error. Being unable to play the right culture at the right time is the error. In my experience, some people are unable to give up face-culture habits in more academic settings where intellectual honesty is the norm. This causes great strife and heated arguments! There is no gain in playing for face when you're in the midst of an honesty culture, unless you can do it very well and subtly. You gain a lot more face by admitting your mistakes! On the other hand, there's no honor in playing for honesty when face culture is dominant. That also tends to cause more trouble than it's worth.
  • It's a cultural thing, but it's not just a cultural thing. Some people have personalities much better suited to one culture or the other, while other people are able to switch freely between them. I expect that groups can switch further toward intellectual honesty as a result of establishing intellectual trust, but that is not the only factor. Try to estimate the preferences of the individuals you're dealing with (while keeping in mind that people may surprise you later on).
Comments

Consider moving to Main.

Seconded.

Ok, I moved it!

Intellectual honesty: being up-front not just about what you believe, but also why you believe it, what your motivations are in saying it, and the degree to which you have evidence for it.

If you have extensive knowledge about a topic, it's often not possible to communicate all of it. Attempting to do so also takes up a lot of valuable speaking time, especially in group discussions where only one person can speak at a time.

In the time you do take to speak, it's also worthwhile to focus on arguments that the other person can evaluate. We believe a lot of things based on personal experience. When there's published scientific evidence for those beliefs, it's better to appeal to the scientific evidence than to one's personal experience, even when that might mislead the other person about why one holds a certain belief.

I'm currently reading Keith Stanovich's book "How To Think Straight About Psychology". In it he argues that scientists should not be open about personal beliefs they hold for which they have no scientific evidence.

It's not possible to communicate all the reasons, agreed -- it's often not even possible to articulate all of one's reasons, even given unlimited time. However, the difference I'm pointing at is larger than the time-allocation problem. It's the difference between agreeing with someone as a sign of social support vs. as a sign that you have further evidence in the same direction. This changes the way conversational resources are allocated (often saving a lot of time, as I argued in my original post), but the change comes from a shift in the underlying goal of the conversation.

Communicating why you believe A is a different goal from communicating reasons why it makes sense for another person to believe A.

When it comes to the latter, scientific evidence is more important than it is for the former.

I somewhat agree, but I am a little confused about it.

Focusing on truth rather than status in a conversation tends to save time with respect to the goal of truth.

Conveying reasons is an important sub-goal of conveying truth, especially when we can't fully trust each other's rationality.

Conveying only scientific reasons rather than personal reasons, and therefore restricting oneself to scientifically established truths, sounds like a safeguard to keep science from being contaminated by weird beliefs held by compartmentalized thinkers or simply poor rationalists. (Is that the intention?)

It also sounds somewhat like the false humility which comes from trusting science above all other things; the kind of humility which would complain of Einstein's arrogance.

Being as honest as possible about the holes in the evidence, the contrary findings, and any possible doubts is an important part of scientific honesty.

Looking for the cause of belief, not only the justifications, seems a useful safeguard against the clever arguer.

Communicating why you believe A is different from communicating why it makes sense for other people to believe A. However, if the two are very different, something has gone wrong:

Therefore rational beliefs are contagious, among honest folk who believe each other to be honest. And it's why a claim that your beliefs are not contagious—that you believe for private reasons which are not transmissible—is so suspicious. If your beliefs are entangled with reality, they should be contagious among honest folk.

If your model of reality suggests that the outputs of your thought processes should not be contagious to others, then your model says that your beliefs are not themselves evidence, meaning they are not entangled with reality. You should apply a reflective correction, and stop believing.

It seems like what you are talking about applies to scientists speaking publicly about science, but does not apply very well to scientists speaking privately to each other.

Focusing on truth rather than status in a conversation tends to save time with respect to the goal of truth.

Those two aren't the only possible ways of having a discussion. There's a lot more that goes into having discussions.

Communicating why you believe A is different from communicating why it makes sense for other people to believe A. However, if the two are very different, something has gone wrong:

Not at all.

At the LW-Europe Community Camp, I did a workshop on Focusing. There are two ways to provide evidence that Focusing works.

I personally choose mental techniques by trying different ones and experiencing what they do. I can speak about my empirical personal experience.

I can also point to Eugene Gendlin being a respected academic psychologist and to the fact that there are many published studies that support Focusing.

Both arguments are entangled with reality, but it's more useful to talk about the scientific evidence; it's more likely to convince my audience that Focusing is valuable.
