My experience of the rationality community is one where we value Daniel's poems and Raemon's songs. The vibe of the LessWrong community weekend is not one of cultural values that tell people to avoid art.
To the extent that this is true, what subset of the rationality community suffers from it?
(By the way, Eliezer Yudkowsky, this is what post-Rationalists are, it’s not that complicated—they don’t have explicit principles because they’ve moved on from thinking that life is entirely about explicit principles. Perhaps you don’t intuitively grasp this because you don’t see the social group you’ve founded as a social group.)
The key reason he won't grasp that is that he doesn't think life is entirely about explicit principles.
To the extent that this is true, what subset of the rationality community suffers from it?
I recall having had this feeling. In particular, I once mentioned to another member of the community that I was thinking about working on a fiction-writing project, but I felt bad admitting it because I was afraid he'd look down on me for spending time on something so frivolous. (This was quite a while ago, around 2014.)
It seems like any cultural prospiracy to increase standards to exceptional levels, which I see as a good thing, would be quickly branded as 'toxic' by this outlook. It is a matter of contextual objection-solving whether or not large parts of you can be worse than a [desirably [normatively basic]] standard. If it is toxic, then it is not obvious to me that toxicity is bad, and if toxicity must be bad, then it is not clear to me that you can, in fair order, sneak in that connotation by characterizing rationalist self-standards as toxic.
The OP provides examples to illustrate what they mean by an overly extreme standard. They also say that many EA/rationalist principles are good, and that there’s less toxicity in these communities than in others.
Might be good to taboo “toxicity.” My definition is “behavior in the name of a worthy goal that doesn’t really deliver that goal, but makes the inflictor of the toxicity feel better or get a selfish benefit in the short run, while causing problems and bad feelings for others.”
For example, a trainer berating trainees in the name of high standards after a failure, in an attempt to punish them into working harder, or in order to make the trainees into the objects of blame by the trainer’s superiors and not the trainer.
Or a person beating themselves up over a $5 purchase for themselves in the name of charity, only to burn out on EA entirely after a few years. This isn’t obviously toxic, by the definition above, except through some sort of internal family systems framework in which one part of themselves is getting off on the mental beratement, while another part suffers. Seems linked to Eliezer’s idea of “hammering down on one part of yourself” from his most recent post here.
This critique seems to rely on a misreading of the post. The author isn’t saying the rationality community has exceptionally toxic social norms.
I’m not mentioning these communities because I think they’re extra toxic or anything, by the way. They’re probably less toxic than the average group, and a lot of their principles are great.
Rather, the claim is that goals, even worthy goals, can result in certain toxic social dynamics that no one would endorse explicitly:
Sometimes—often—these forbidden thoughts/actions aren’t even contrary to the explicit values. They just don’t fit in with the implied group aesthetic, which is often a much stricter, more menacing guideline, all the more so because it’s a collective unwritten fiction.
There’s a bit of an aesthetic parallel to AI alignment. It would be surprising if the poorly understood process that produces social dynamics just so happened to be healthy for everyone involved in the case of the rationality project. Verbalizing some of the implicit beliefs gives people the ability to reflect on which ones they want to keep.
I would expect the author to agree that most (all?) communities contain toxic dynamics.
It seems like any cultural prospiracy to increase standards to exceptional levels, which I see as a good thing, would be quickly branded as 'toxic' by this outlook.
I read it not as saying that having high standards would be bad by itself, but that the toxicity is about a specific dynamic where the standards become interpreted as disallowing even things that have nothing to do with the standards themselves. E.g. nothing about having high standards for rationality requires one to look down on art.
You make good points. Toxicity is relative to some standard. A set of social norms that are considered toxic from the perspective of, say, a postmodern studies department (where absolute non-offensiveness is prime), might be perfectly healthy from the perspective of a physics research department (where objectivity is prime). It’s important to ask, “Toxic according to who, and with respect to what?”
Emile Durkheim asked his readers to imagine what would happen in a “society of saints.” There would still be sinners because “faults which appear venial to the layman” would there create scandal.
Command-f quote marks.
It's highly suggestive that every single "quote" Sasha uses here to illustrate the supposed social norms of the EA/Rat community is invented. He literally could not find a single actual source to support his claims about what EA/Rats believe.
“don’t ever indulge in Epicurean style, and never, ever stop thinking about your impact on the world.”
Does GiveWell endorse that message in any public materials? Does OpenPhil? FHI? The only relevant EA writing I'm aware of (Scott Alexander, Ben Kuhn, Kelsey Piper) is about how that is specifically not the attitude they endorse.
Come on, this is pure caricature.
I don’t think I agree that this is made up, though. You’re right that the quotes are things people wouldn’t say out loud, but people do imply them through social behavior.
I suppose you’re right that it’s hard to point to specific examples of this happening, but that doesn’t mean it isn’t happening. I have personally felt, on multiple occasions, the need to do exactly the things Sasha writes about: talk about and justify various things I’m doing as “potentially high impact”; justify my food choices or donation choices or career choices as being self-improvement initiatives; etc.
This article points at something real.
The drowning child argument comes close enough to endorsing that message that Eliezer felt a need to push back on it.
There's certainly been discussion of people in EA feeling a moral obligation to spend all their time and money on making a positive impact. I've personally felt it and know several others who have, and e.g. these [1 2] articles discuss it, to name just a few examples.
I have probably spent hundreds of hours reading EA material, and have literally never come across an institutional publication with a phrase along the lines of:
And Sasha never claims that you would! In fact he explicitly notes that you won't:
Generally, toxic social norms don’t develop intentionally, nobody wants them to happen, they’re not written down, and nobody enforces them explicitly.
Social norms and what's publicly endorsed are not the same thing. It's still debatable whether those norms exist, but this is a bad argument.
I agree with the general consensus in the comments that Sasha is under a wrong impression of what the rationality community is about. However, I think this false impression is very telling: I suspect a lot of people share the same wrong impression. This seems like an important thing for us to pay attention to.
What can we do about it? I'm not sure. One thing that comes to mind is to simply repeat ourselves a lot.
I myself sometimes feel bad when I engage in, say, writing fiction. (Reading fiction is pretty obviously useless, so I know I am just doing it for “fun.” It doesn’t confuse me the way producing fiction does.) I was like this before I even knew there was a Rationality subculture. I don’t try to justify these behaviors at all; I am just not sure if they are aligned with my values, or not, and in what quantities they are healthy.
So while I agree with the gist of this post, I believe the core issue to be more of a tradeoff rather than an obvious evil.
This was highly insightful. Thanks for sharing.
How would we go about disincentivizing this drift towards undesirable social norms? This seems like a situation in which individuals acting in their parochial best interest (virtue signalling, gaining social acceptance, over-identifying with an ideal) is detrimental to the group as a whole—and ultimately to the individuals whose identity has become defined by the group. I’m reminded of this quote from The Greatest Show on Earth by Dawkins:
Natural selection […] chooses between rival individuals within a population. Even if the entire population is diving to extinction, driven down by individual competition, natural selection will still favour the most competitive individuals, right up to the moment when the last one dies. Natural selection can drive a population to extinction, while constantly favouring, to the bitter end, those competitive genes that are destined to be the last to go extinct.
I don’t deny that these phenomena exist, but I don’t think they are particular to the rationality and EA communities. My personal experiences of these communities (for what they are worth) have been largely positive. I find LW to be reasonably tolerant; I recall reading a funny criticism of LW on RationalWiki claiming that LW is too tolerant of less epistemically rigorous ideas. I’ve read horror stories on reddit about toxic spirituality communities (mere anecdotes, I don’t have data on the toxicity of spirituality vs. rationality). The drift towards cultishness is present in any human community, and as argued elsewhere on LW, it takes an unwavering effort to resist it.
How would we go about disincentivizing this drift towards undesirable social norms?
Perhaps it would be useful if we had some high-status members of the community who would sometimes very visibly do something non-rational, non-effective, non-altruistic, just because it is fun for them.
As an extreme thought experiment, imagine Eliezer Yudkowsky writing and publishing fan fiction. LOL
writing and publishing fan fiction
That made me chuckle. Or writing some of the funniest philosophical humour I've read.
I don't understand the view that "rationalists" are emotionless and incapable of appreciating aesthetics. I haven't seen much evidence to back this claim, only anecdotes. If anything, people who see reality more clearly can see more of its beauty. As Feynman put it, a scientist can see more beauty in the world than an artist, because the scientist sees the surface-level beauty as well as the beauty in the layers of abstraction all the way down to fundamental physics.
If someone consistently fails to achieve their instrumental goals by adhering too firmly to some rigid and unreasonable notion of "rationality", then what they think rationality is must be wrong/incomplete.
Downvoted. Do you actually consider HPMOR non-rational and non-effective? It isn't just fan fiction; it's a tiny layer of fan fiction wrapped around the Sequences. Judging from the numerous comments in the Open Threads starting with "I've discovered LW through HPMOR", I think we could argue that HPMOR was more effective than the Sequences themselves (at least with respect to the goal of creating more aspiring rationalists).
More generally, every single piece of fiction written by EY that I've read so far involves very rational characters doing very rational things, and that's kind of the point. No one is saying that you shouldn't write fiction in general, but I do say that you shouldn't stop being rational while writing fiction. Or poetry. A rationalist poet should lean toward didactic poetry or the like (at least, that's what I would do). I am probably biased against silly poetry in general, but I personally regard writing dumb verses the way I regard eating unhealthy cookies: do it if you need it to have fun, but don't be proud of it.
More generally, every single piece of fiction written by EY that I've read so far involves very rational characters doing very rational things, and that's kind of the point. No one is saying that you shouldn't write fiction in general, but I do say that you shouldn't stop being rational while writing fiction.
This feels to me like a goalpost being moved.
Yes, Eliezer's characters do smart things, but the likely reason is that he likes writing them that way, the audience enjoys reading that, and he has a comparative advantage doing that. (Kinda like Dick Francis writes about horse racing all the time.)
And I guess HPMOR really is "the Sequences for people who wouldn't read the Sequences otherwise". But was it also strategically necessary to write this or this? New audience, perhaps, but strongly diminishing returns.
The original objection was that rationalists and effective altruists feel like they are not allowed to do things that are not optimal (fully rational, or fully altruistic). Writing HPMOR could have been an optimal move for Eliezer, but the stories that followed probably were not. They are better explained by the hypothesis that Eliezer enjoys writing.
(more behind the link)