Ezekiel comments on Ritual 2012: A Moment of Darkness - Less Wrong
Correct me if I'm wrong, but it looks like you're talking about anti-deathism (weak or strong) as if it were a defining value of the LessWrong community. This bothers me.
If you're successful, these rituals will become part of the community identity, and I personally would rather LW try to be about rationality, and only rationality, as much as it can. Everything else that correlates with membership - transhumanism, nerdiness, thinking Eliezer is awesome - I would urge you not to include in the rituals. It's inevitable that they'd turn up, but I wouldn't give them extra weight by including them in codified documents.
As an analogy, one of the things that bugged me about Orthodox Judaism was that it claims to be about keeping the Commandments, but there's a huge pile of stuff that's done just for tradition's sake, that isn't commanded anywhere (no, not even in the Oral Law or by rabbinical decree).
Well, it depends what you mean by "defining value". The LW community includes all sorts of stuff that simply becomes much more convincing/obvious/likely when you're, well, more rational. Atheism, polyamory, cryonics ... there are quite a few of these beliefs floating around. That seems like it's as it should be; if rationality didn't cause you to change your beliefs, it would be meaningless, and if those beliefs weren't better correlated with reality, it would be useless.
As of now, there is no evidence that the average LessWronger is more rational than the average smart, educated person (see the LW poll). Therefore, a lot of LWers thinking something is not any stronger evidence for its truth than any other similarly-sized group of smart, educated people thinking it. Therefore, until we get way better at this, I think we should be humble in our certainty estimates, and not do mindhacky things to cement the beliefs we currently hold.
Who said anything about mindhacking? I'm just saying that we should expect rationalists to believe some of the same things, even if nonrationalists generally don't believe those things. Considering the whole point of this site is to help people become more rational, recognize and overcome their biases, etc., I'm not sure what you're doing here if you don't think that actually, y'know, happens.
Raemon did. It's a ritual, deliberately styled after religious rituals, some of the most powerful mindhacks known.
I ... didn't get the impression that this was intended to mindhack people into moving closer to LessWrong consensus.
Oh, sorry, neither did I. I'm not trying to accuse Raemon of deliberate brainwashing. But getting together every year to sing songs about, say, existential risk will make people more likely to disregard evidence showing that X-risk is lower than previously thought. Same for every other subject.
Ah, I guess it was the use of "deliberately" that confused me. Now I come to think of it, this is mentioned as a possible risk in the article, and dismissed as much less powerful than, y'know, talking about it all the damn time.