Today's post, Two Cult Koans, was originally published on 21 December 2007. A summary (taken from the LW wiki):

 

Two Koans about individuals concerned that they may have joined a cult.


Discuss the post here (rather than in the comments to the original post).

This post is part of the Rerunning the Sequences series, where we'll be going through Eliezer Yudkowsky's old posts in order so that people who are interested can (re-)read and discuss them. The previous post was Politics and Awful Art, and you can use the sequence_reruns tag or rss feed to follow the rest of the series.

Sequence reruns are a community-driven effort. You can participate by re-reading the sequence post, discussing it here, posting the next day's sequence reruns post, or summarizing forthcoming articles on the wiki. Go here for more details, or to have meta discussions about the Rerunning the Sequences series.


I have to admit that I find this post impossible to understand, even after reading Eliezer's clarification in the comments. If there's a correct interpretation, what is it?

I think that there is an actual impossibility lurking here.

Imagine the situation in which the pupil is fixed in this view: "To learn from my teacher, it suffices to learn his words off by heart." He never does anything more. The teacher notices the problem and tries to help the pupil by telling him that it is not enough to learn the words; he must understand the meaning and put the teachings into practice.

The pupil is grateful for the teaching and writes it in his notebook: "To learn from my teacher, it is not enough to learn the words, I must understand the meaning, and put the teachings into practice." It is only twenty-four words, many of them short. The pupil puts the teaching into his spaced repetition software, and is soon word perfect, although he continues to ignore the meaning of the teachings he memorizes.

What more can the teacher do? Can words point beyond words? Obviously yes, but if the pupil looks at the finger and not at the moon, is there anything that the teacher can say that will get through to the pupil? It may be that it is actually impossible for the teacher to help the pupil until the pupil is changed by the impact of external events.

I think that I have understood the first koan. I am still mystified by the second koan. I suspect that it relates to things that Eliezer has experienced or seen: he presumably has particular examples of followership in mind, and in that context the koans would be clear.

' I am still mystified by the second koan.': The novice associates {clothing types which past cults have used} with cults, and fears that his group's use of these clothing types suggests that the group may be cultish.

In practice (though the clothing may have an unrelated advantage), the clothing one wears has no effect on the validity of the logical arguments used in reasoning/debate.

The novice fears a perceived connection between the clothing and cultishness (where cultishness is taken to be a state of faith over rationality, or in any case irrationality). The master uses the extreme example of the silly hat to show that clothing has no effect on the subjects under discussion, pointing out the absurdity of supposing that wearing it could affect one's ability to use probability theory (or to exercise rationality in any practical way).

This is similar to the first koan in that what matters is whether the (mental/conceptual) tools actually work and yield useful results.

The student, more-or-less enlightened by this, takes it to heart and serves as an example to others by always discussing important concepts in absurd clothing, to get across to his own students (and others he interacts with) that the clothing someone wears has nothing to do with the validity or accuracy of their ideas.

(Or, at least, that's my interpretation.)

Edit: Another way of describing this is to imagine that the novice is treating the clothing-cult correlation as though it were causation, and the master uses absurdity to point out that there cannot be clothing->cult causation for the same reason that there cannot be silly_hat->comprehension causation. (What counts is the usefulness of the hammer and the validity of the theories used, rather than unrelated things which coincide with them.)

In practice (though the clothing may have an unrelated advantage), the clothing one wears has no effect on the validity of the logical arguments used in reasoning/debate.

The purpose of the clothing is to make people aware of the dangers of cultishness, even though wearing identical clothing, all else equal, encourages cultishness. All else is not equal: it is a worthwhile cost to bring the issue to the fore and force people to compensate by thinking non-cultishly (not counter-cultishly).

A novice rationalist approached the master Ougi and said, "Master, I worry that our rationality dojo is... well... a little cultish."
"That is a grave concern," said Ougi.
The novice waited a time, but Ougi said nothing more.
So the novice spoke up again: "I mean, I'm sorry, but having to wear these robes, and the hood - it just seems like we're the bloody Freemasons or something."
"Ah," said Ougi, "the robes and trappings."

Note how Ougi waited for the novice to explain himself: Ougi wanted to know whether the thought patterns or the clothing was causing the concern.

There is no direct relationship between clothing and probability theory, but there is a relationship that goes through the human. Human beliefs are influenced by social factors.

The student, more-or-less enlightened by this

The student learned only what was nearly explicitly said, that there is no direct relationship between clothing and probability theory.

The student failed to learn the lesson about cultishness. He is only said to have reached the rank of grad student, not master, unlike the student in the other koan. Bouzo "would only discuss rationality while wearing a clown suit." Only when wearing a clown suit - this is cultish counter-cultishness. To avoid giving the impression that understanding or cultishness has anything to do with mystic clothing, he never tried to increase understanding while in mystic clothing, lest that cause cultishness - oops for him.

What causes cultishness depends deeply on the audience, what meta-level of contrarianism each person is at, whether they will bristle at or be swept along by cultishness when wearing the same uniform, etc.

This is different from the first koan because the student is not a role model. One should not assume that the characters in a leader's koan are role models and think of how to justify their behavior. Instead, one must independently ask whether it makes sense for Bouzo to only discuss rationality while wearing a clown suit in the context of the story. As in most contexts, the answer to that question is "no, that's silly."


The sentence I found most salient in Two Cult Koans:

Disordered thoughts begin as feelings of attachment to preferred conclusions.

[This comment is no longer endorsed by its author]

A very interesting perspective: Thank you!

Is there anything that the teacher can say that will get through to the pupil?

No, but there may be something the teacher can do.

In particular, the teacher could tell the student to take on various tasks or do experiments which might lead to a breakthrough, or possibly work to lower the student's level of background anxiety; the latter is plausible to me because there might be a reason the student has no trust in his or her own perceptions and thoughts, and that reason might be locked in place by fear. Of course, I'm just guessing about a hypothetical student; the teacher (if competent) should be able to form hypotheses about why the student is so stuck and what might help.

Imagine the situation in which the pupil is fixed in this view

Fixed and dilated?

I think the point is that if rationality works, you should be able to use it to actually do things.

A hammer's ability to drive nails determines its hammerness, not its price.

When the student succeeds the master, he enforces this idea by making the novices use the techniques of rationality without naming them. They're supposed to grok the thought processes that are rationality and then apply those thought processes to produce what they say, rather than just referring to thought processes that they may or may not use.

When they actually brush up against the real world, whether or not they're rational becomes obvious. Either they win or they don't. The nail's nailed, or it's not.

When they actually brush up against the real world, whether or not they're rational becomes obvious. Either they win or they don't. The nail's nailed, or it's not.

Whether they're rational or not becomes determinable with moderate certainty.

If they happen to get hit by a meteorite while walking on the sidewalk, they lose, but were not irrational for walking on the sidewalk. If they happen to win the lottery, they win, but were (let's stipulate it is a normal scenario) irrational for playing.

One should evaluate choices by their probable, rather than actual, outcomes. In practice this is very difficult and actual outcomes are a good proxy.

In practice, you don't win the lottery.

I am beginning to suspect that novice rationalists should actually go buy a scratch ticket — do the math, then try it anyway, actually notice that you don't win, that it was a dumb idea (or at least, a money-losing idea) to play, and that this is what a one-in-a-million chance feels like: it feels like losing a buck, 999999 times out of a million.

With one ticket, there is usually something like a one out of fifty chance of winning a few dollars.

That's perhaps an even better lesson, if you keep proper accounting.

Winning once in a while, very small amounts, losing on average... when what originally motivated you was an idea of winning a huge amount, which has never happened, and remains so unlikely to ever happen... and yet feeling a desire to continue... that's a very insidious way of losing.
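
To make the "proper accounting" concrete, here is a minimal sketch of the expected-value arithmetic; the ticket price, prizes, and odds below are made-up numbers for illustration, not figures from any real lottery.

```python
# Illustrative sketch only: made-up price, prizes, and odds.
ticket_price = 1.00                     # dollars per scratch ticket
prizes = [2.00, 500_000.00]             # hypothetical small prize and jackpot
odds = [1 / 50, 1 / 1_000_000]          # hypothetical chance of each prize

expected_winnings = sum(prize * p for prize, p in zip(prizes, odds))
expected_value = expected_winnings - ticket_price

print(f"Expected winnings per ticket: ${expected_winnings:.2f}")
print(f"Expected value per ticket:    ${expected_value:+.2f}")
# With these numbers you win $2 roughly once every 50 tickets,
# yet each ticket still costs you about $0.46 on average.
```

The occasional small win shows up in the ledger, but the average stays negative, which is exactly the insidious part.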

I don't understand this sentence:

Because you needed to resolve your cognitive dissonance, you were willing to put on a silly hat.

Read: agreeing to put on the silly hat is evidence that you desire to resolve your cognitive dissonance (the strength of the desire being at least proportional to the unpleasantness of wearing a silly hat).

I know the master said this, but isn't the more likely reason why Ougi put on the hat simply that his master told him to, meaning that if Ougi had no cognitive dissonance he would have still put on the hat? I don't understand how the master determined that Ougi putting on the hat showed that Ougi had cognitive dissonance.

if Ougi had no cognitive dissonance he would have still put on the hat

Well, the master said that wearing the hat is necessary in order to resolve the dissonance. Therefore this seems implausible.

(Ougi is the name of the master in that story.)

There are so many comments here about what does and doesn't count as a "cult", and whether Lesswrong is cultish. People, it doesn't matter. The point is that that's not the point.

Why are we wary of cults? Because they're harmful in various ways. There are particular ways in which they're harmful, and particular things that cause them to be harmful in those ways.

Suppose that the inclusion criteria for "cult" had nothing to do with the harmful parts and consisted entirely of beneficial features. Suppose, for instance, that a cult is merely "any social thing that tends to make participants happy". Then it would be a good thing to be in a cult, and any harm done by the activities of the cult would be due entirely to the harmful activities themselves and not to its status as a cult.

When we try to pin down how our real notions of cult differ from that scenario, we end up with a list of features. Some of those features are harmful, and some are beneficial. Moreover, some of the features are both harmful and beneficial. We can circle those features. Then, regardless of what we call ourselves, we can avoid the items on the list that are merely harmful, mitigate or eliminate the potential damage done by the items that are also beneficial, and thereby create a kick-ass thing that helps us win. Whether it's rightly called a cult is irrelevant to whether it's a good thing.

The point is not to avoid being a cult. The point is to avoid causing the damage cults tend to cause, especially while borrowing their most useful strategies. Remember: "Do not lose reasonably. Win."

My interpretation is that cultishness is very human, a human bias. Lesswrong is a cult, but it is a cult that strives to overcome its culty biases.

I think the atom of cultishness is this: As long as any one of us believes that someone else knows something that we do not, and that we want to learn, we are believing through faith rather than through a private rational process. Believing that someone else knows true things that we do not know is an alternative to personal rationality. In the metaphor of the sequence, I see Eliezer hammer some nails in with his hammer, I try it myself and the nails bend and break most of the time. In matters of nails, I put more faith in Eliezer than I do in my very own hammer. As long as I see (or believe I see) Eliezer hammering nails that I cannot hammer (for example, proofs about FAI or CEV), I will either believe them through faith in Eliezer, or I won't believe them because I can't or won't or haven't hammered them myself.

If I want to learn from Eliezer how to do these things, that desire has to be preceded by faith in Eliezer's hammering: a faith that a rationality I do not yet fully possess, and may never fully possess, exists.

I don't think either of the two koans is as instructive as desired.

  1. As previously pointed out, a cult can still teach useful information; noticing that you have learned useful information is not sufficient to differentiate a cult from a non-cult.

  2. It seems there are two different interpretations of the ending of the second koan: that the student understood that clothing doesn't matter, or that he didn't. It depends on whether the clown suit merely represents Robes 2.0 or whether it is counter-cultish behavior.

--- Robes 2.0: What is the point of the robes, then? If the student is questioning the point of the robes, the answer is not to say, "it doesn't matter; just focus on my teachings". Okay, then why not take off the robes? The question goes completely unaddressed. It's not as though the master treated it as a teaching opportunity, where the whole point of the robes is for the student to question the robes. Or was that the point? In that case, the learning occurs more as an "ok, Master is full of shit" moment rather than an "aha, I see that Master has taught another great lesson" moment, which I think is less than ideal.

--- Counter-cultish: What are we supposed to take from this story? The student had a legitimate question which the master refused to answer, and instead solidified a mistaken impression on the student? Sounds like a real wise master.


Could you clarify the mindset behind the question? I am not sure I have a meaningful opinion of it as I am not sure what is meant by the language in the second part of the koan.

[This comment is no longer endorsed by its author]

What is your opinion about the traditional Ganto's Axe koan?

Could you clarify the mindset behind the question? I am not sure I have a meaningful opinion of it as I am not sure what is meant by the language in the second part of the koan.

Sure. You were commenting on a post that was riffing on a particular style of Zen koan, and I didn't quite understand, as you put it, the mindset of the comment. So I figured one step towards clarifying it was figuring out whether you were reacting to the general koan-nature of it, or to something specific about this post within that context. One way to get at that was to get your opinion about other things that share the koan-nature. Ganto's Axe is among my favorite Zen koans, and it also seems to share certain thematic elements with the post you were commenting on, so I picked that one.