Previously in series: Whining-Based Communities
"But there is a reason why many of my students have achieved great things; and by that I do not mean high rank in the Bayesian Conspiracy. I expected much of them, and they came to expect much of themselves." —Jeffreyssai
Among the failure modes of martial arts dojos, I suspect, is that a sufficiently dedicated martial arts student will dream of...
...becoming a teacher and having their own martial arts dojo someday.
To see what's wrong with this, imagine going to a class on literary criticism, falling in love with it, and dreaming of someday becoming a famous literary critic just like your professor, but never actually writing anything. Writers tend to look down on literary critics' understanding of the art form itself, for just this reason. (Orson Scott Card uses the analogy of a wine critic who listens to a wine-taster saying "This wine has a great bouquet", and goes off to tell their students "You've got to make sure your wine has a great bouquet". When the student asks, "How? Does it have anything to do with grapes?" the critic replies disdainfully, "That's for grape-growers! I teach wine.")
Similarly, I propose, no student of rationality should study with the purpose of becoming a rationality instructor in turn. You do that on Sundays, or full-time after you retire.
And to place a go stone blocking this failure mode, I propose a requirement that all rationality instructors must have secret identities. They must have a life outside the Bayesian Conspiracy, which would be worthy of respect even if they were not rationality instructors. And to enforce this, I suggest the rule:
Rationality_Respect1(Instructor) = min(Rationality_Respect0(Instructor), Non_Rationality_Respect0(Instructor))
That is, you can't respect someone as a rationality instructor more than you would respect them if they were not a rationality instructor.
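The rule above can be sketched as a one-line function. This is just an illustration of the min() bound; the numeric "respect scores" and the Laplace/Newton values are hypothetical, chosen to match the discussion below:

```python
def rationality_respect(as_instructor: float, outside_rationality: float) -> float:
    """Respect as a rationality instructor, capped by the respect the
    person would earn if they were not a rationality instructor at all."""
    return min(as_instructor, outside_rationality)

# Hypothetical scores on an arbitrary 0-10 scale:
# Laplace: strong rationalist commentary AND a first-rate scientific career.
laplace = rationality_respect(9, 9)    # -> 9
# Newton: towering scientific fame, but a weaker record as a rationalist.
newton = rationality_respect(3, 10)    # -> 3
```

Note that the cap only binds in one direction: a stellar secret identity (Newton's 10) cannot raise a weak rationality score, which is the point of the upper bound rather than an equality.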
Some notes:
• This doesn't set Rationality_Respect1 equal to Non_Rationality_Respect0. It establishes an upper bound. This doesn't mean you can find random awesome people and expect them to be able to teach you. Explicit, abstract, cross-domain understanding of rationality and the ability to teach it to others is, unfortunately, an additional discipline on top of domain-specific life success. Newton was a Christian etcetera. I'd rather hear what Laplace had to say about rationality—Laplace wasn't as famous as Newton, but Laplace was a great mathematician, physicist, and astronomer in his own right, and he was the one who said "I have no need of that hypothesis" (when Napoleon asked why Laplace's works on celestial mechanics did not mention God). So I would respect Laplace as a rationality instructor well above Newton, by the min() function given above.
• We should be generous about what counts as a secret identity outside the Bayesian Conspiracy. If it's something that outsiders do in fact see as impressive, then it's "outside" regardless of how much Bayesian content is in the job. An experimental psychologist who writes good papers on heuristics and biases, a successful trader who uses Bayesian algorithms, a well-selling author of a general-audiences popular book on atheism—all of these have worthy secret identities. None of this contradicts the spirit of being good at something besides rationality—no, not even the last, because writing books that sell is a further difficult skill! At the same time, you don't want to be too lax and start respecting the instructor's ability to put up probability-theory equations on the blackboard—it has to be visibly outside the walls of the dojo and nothing that could be systematized within the Conspiracy as a token requirement.
• Apart from this, I shall not try to specify what exactly is worthy of respect. A creative mind may have good reason to depart from any criterion I care to describe. I'll just stick with the idea that "Nice rationality instructor" should be bounded above by "Nice secret identity".
• But if the Bayesian Conspiracy is ever to populate itself with instructors, this criterion should not be too strict. A simple test to see whether you live inside an elite bubble is to ask yourself whether the percentage of PhD-bearers in your apparent world exceeds the 0.25% rate at which they are found in the general population. Being a math professor at a small university who has published a few original proofs, or a successful day trader who retired after five years to become an organic farmer, or a serial entrepreneur who lived through three failed startups before going back to a more ordinary job as a senior programmer—that's nothing to sneeze at. The vast majority of people go through their whole lives without being that interesting. Any of these three would have some tales to tell of real-world use, on Sundays at the small rationality dojo where they were instructors. What I'm trying to say here is: don't demand that everyone be Robin Hanson in their secret identity; that is setting the bar too high. Selective reporting makes fantastically high-achieving people seem far more common than they really are. So if you ask for your rationality instructor to be as interesting as the sort of people you read about in the newspapers—and a master rationalist on top of that—and a good teacher on top of that—then you're going to have to join one of three famous dojos in New York, or something. But you don't want to be too lax and start respecting things that others wouldn't respect if they weren't specially looking for reasons to praise the instructor. "Having a good secret identity" should require way more effort than anything that could become a token requirement.
Now I put to you: If the instructors all have real-world anecdotes to tell of using their knowledge, and all of the students know that the desirable career path can't just be to become a rationality instructor, doesn't that sound healthier?
Part of the sequence The Craft and the Community
Next post: "Beware of Other-Optimizing"
Previous post: "Whining-Based Communities"