Previously in series: Whining-Based Communities
"But there is a reason why many of my students have achieved great things; and by that I do not mean high rank in the Bayesian Conspiracy. I expected much of them, and they came to expect much of themselves." —Jeffreyssai
Among the failure modes of martial arts dojos, I suspect, is that a sufficiently dedicated martial arts student will dream of...
...becoming a teacher and having their own martial arts dojo someday.
To see what's wrong with this, imagine going to a class on literary criticism, falling in love with it, and dreaming of someday becoming a famous literary critic just like your professor, but never actually writing anything. Writers tend to look down on literary critics' understanding of the art form itself, for just this reason. (Orson Scott Card uses the analogy of a wine critic who listens to a wine-taster saying "This wine has a great bouquet", and goes off to tell their students "You've got to make sure your wine has a great bouquet". When the student asks, "How? Does it have anything to do with grapes?" the critic replies disdainfully, "That's for grape-growers! I teach wine.")
Similarly, I propose, no student of rationality should study with the purpose of becoming a rationality instructor in turn. You do that on Sundays, or full-time after you retire.
And to place a go stone blocking this failure mode, I propose a requirement that all rationality instructors must have secret identities. They must have a life outside the Bayesian Conspiracy, which would be worthy of respect even if they were not rationality instructors. And to enforce this, I suggest the rule:
Rationality_Respect1(Instructor) = min(Rationality_Respect0(Instructor), Non_Rationality_Respect0(Instructor))
That is, you can't respect someone as a rationality instructor more than you would respect them if they were not a rationality instructor.
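As a toy sketch of the rule, with entirely hypothetical respect scores on an arbitrary 0–10 scale (the function name and the scale are my own illustration, not anything canonical):

```python
def bounded_rationality_respect(rationality_respect: float,
                                non_rationality_respect: float) -> float:
    """Respect for someone as a rationality instructor is capped by the
    respect they would earn even if they were not a rationality instructor."""
    return min(rationality_respect, non_rationality_respect)

# Hypothetical scores, per the Newton/Laplace comparison below:
# Newton: towering achievement, but weak explicit rationality.
# Laplace: great achievement *and* "I have no need of that hypothesis."
print(bounded_rationality_respect(3, 10))  # Newton  -> 3
print(bounded_rationality_respect(9, 8))   # Laplace -> 8
```

Note that the bound is an upper limit, not an equality: a high non-rationality score does not by itself produce a high bounded score.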
Some notes:
• This doesn't set Rationality_Respect1 equal to Non_Rationality_Respect0. It establishes an upper bound. This doesn't mean you can find random awesome people and expect them to be able to teach you. Explicit, abstract, cross-domain understanding of rationality and the ability to teach it to others is, unfortunately, an additional discipline on top of domain-specific life success. Newton was a Christian etcetera. I'd rather hear what Laplace had to say about rationality—Laplace wasn't as famous as Newton, but Laplace was a great mathematician, physicist, and astronomer in his own right, and he was the one who said "I have no need of that hypothesis" (when Napoleon asked why Laplace's works on celestial mechanics did not mention God). So I would respect Laplace as a rationality instructor well above Newton, by the min() function given above.
• We should be generous about what counts as a secret identity outside the Bayesian Conspiracy. If it's something that outsiders do in fact see as impressive, then it's "outside" regardless of how much Bayesian content is in the job. An experimental psychologist who writes good papers on heuristics and biases, a successful trader who uses Bayesian algorithms, a well-selling author of a general-audiences popular book on atheism—all of these have worthy secret identities. None of this contradicts the spirit of being good at something besides rationality—no, not even the last, because writing books that sell is a further difficult skill! At the same time, you don't want to be too lax and start respecting the instructor's ability to put up probability-theory equations on the blackboard; the secret identity has to be visibly outside the walls of the dojo, not something that could be systematized within the Conspiracy as a token requirement.
• Apart from this, I shall not try to specify what exactly is worthy of respect. A creative mind may have good reason to depart from any criterion I care to describe. I'll just stick with the idea that "Nice rationality instructor" should be bounded above by "Nice secret identity".
• But if the Bayesian Conspiracy is ever to populate itself with instructors, this criterion should not be too strict. A simple test to see whether you live inside an elite bubble is to ask yourself whether the percentage of PhD-bearers in your apparent world exceeds the 0.25% rate at which they are found in the general population. Being a math professor at a small university who has published a few original proofs, or a successful day trader who retired after five years to become an organic farmer, or a serial entrepreneur who lived through three failed startups before going back to a more ordinary job as a senior programmer—that's nothing to sneeze at. The vast majority of people go through their whole lives without being that interesting. Any of these three would have some tales to tell of real-world use, on Sundays at the small rationality dojo where they were instructors. What I'm trying to say here is: don't demand that everyone be Robin Hanson in their secret identity; that's setting the bar too high. Selective reporting makes it seem that fantastically high-achieving people have a far higher relative frequency than their real occurrence. So if you ask for your rationality instructor to be as interesting as the sort of people you read about in the newspapers—and a master rationalist on top of that—and a good teacher on top of that—then you're going to have to join one of three famous dojos in New York, or something. But you don't want to be too lax and start respecting things that others wouldn't respect if they weren't specially looking for reasons to praise the instructor. "Having a good secret identity" should require way more effort than anything that could become a token requirement.
Now I put to you: If the instructors all have real-world anecdotes to tell of using their knowledge, and all of the students know that the desirable career path can't just be to become a rationality instructor, doesn't that sound healthier?
Part of the sequence The Craft and the Community
Next post: "Beware of Other-Optimizing"
Previous post: "Whining-Based Communities"