Comment author: MugaSofer 26 December 2012 01:48:17AM 1 point [-]

I ... didn't get the impression that this was intended to mindhack people into moving closer to LessWrong consensus.

Comment author: Ezekiel 26 December 2012 11:07:41AM 5 points [-]

Oh, sorry, neither did I. I'm not trying to accuse Raemon of deliberate brainwashing. But getting together every year to sing songs about, say, existential risk will make people more likely to disregard evidence showing that X-risk is lower than previously thought. Same for every other subject.

Comment author: MugaSofer 26 December 2012 01:26:20AM -1 points [-]

Who said anything about mindhacking? I'm just saying that we should expect rationalists to believe some of the same things, even if nonrationalists generally don't believe these things. Considering the whole point of this site is to help people become more rational, recognize and overcome their biases, etc., I'm not sure what you're doing here if you don't think that actually, y'know, happens.

Comment author: Ezekiel 26 December 2012 01:33:10AM 3 points [-]

Who said anything about mindhacking?

Raemon did. It's a ritual, deliberately styled after religious rituals, some of the most powerful mindhacks known.

Comment author: MugaSofer 25 December 2012 10:41:18PM 7 points [-]

Correct me if I'm wrong, but it looks like you're talking about anti-deathism (weak or strong) as if it was a defining value of the LessWrong community. This bothers me.

Well, it depends what you mean by "defining value". The LW community includes all sorts of stuff that simply becomes much more convincing/obvious/likely when you're, well, more rational. Atheism, polyamory, cryonics ... there's quite a few of these beliefs floating around. That seems like it's as it should be; if rationality didn't cause you to change your beliefs, it would be meaningless, and if those beliefs weren't better correlated with reality, it would be useless.

Comment author: Ezekiel 26 December 2012 12:17:33AM 6 points [-]

As of now, there is no evidence that the average LessWronger is more rational than the average smart, educated person (see the LW poll). Therefore, a lot of LWers thinking something is not any stronger evidence for its truth than any other similarly-sized group of smart, educated people thinking it. Therefore, until we get way better at this, I think we should be humble in our certainty estimates, and not do mindhacky things to cement the beliefs we currently hold.

Comment author: NancyLebovitz 25 December 2012 06:38:12PM 0 points [-]

I thought there was a rule about not breaking tradition, even if the tradition isn't otherwise supported. No?

Comment author: Ezekiel 25 December 2012 08:04:24PM 0 points [-]

The line that people tend to quote there is "מנהג ישראל דין הוא" (the custom of Israel is law), but most people have never looked up its formal definition. Its actual halachic bearing is much too narrow to justify (for example) making kids sing Shabbat meal songs.

Comment author: Ezekiel 25 December 2012 11:39:08AM 14 points [-]

Correct me if I'm wrong, but it looks like you're talking about anti-deathism (weak or strong) as if it was a defining value of the LessWrong community. This bothers me.

If you're successful, these rituals will become part of the community identity, and I personally would rather LW tried to be about rationality and just that as much as it can. Everything else that correlates with membership - transhumanism, nerdiness, thinking Eliezer is awesome - I would urge you not to include in the rituals. It's inevitable that they'd turn up, but I wouldn't give them extra weight by including them in codified documents.

As an analogy, one of the things that bugged me about Orthodox Judaism was that it claims to be about keeping the Commandments, but there's a huge pile of stuff that's done just for tradition's sake, that isn't commanded anywhere (no, not even in the Oral Lore or by rabbinical decree).

Comment author: Ezekiel 25 December 2012 11:27:08AM 3 points [-]

So everyone in the human-superiority crowd gloating about how they're superior to mere machines and formal systems, because they can see that Gödel's Statement is true just by their sacred and mysterious mathematical intuition... "...Is actually committing a horrendous logical fallacy [...] though there's a less stupid version of the same argument which invokes second-order logic."

So... not everyone. In Gödel, Escher, Bach, Hofstadter presents the second-order explanation of Gödel's Incompleteness Theorem, and then goes on to discuss the "human-superiority" crowd. Granted, he doesn't give it much weight - but for reasons that have nothing to do with first- versus second-order logic.

Don't bash a camp just because some of their arguments are bad. Bash them because their strongest argument is bad, or shut up.

(To avoid misunderstanding: I think said camp is in fact wrong.)

Comment author: chaosmosis 04 December 2012 06:05:25PM *  1 point [-]

In my experience, the people on this site don't perceive signalling as wrong or useless, even when it's superficial. I do not understand why that's so because I perceive most of signalling as a waste of resources and think that cultivating a community which tried to minimize unnecessary signalling would be good.

Comment author: Ezekiel 04 December 2012 07:03:10PM 1 point [-]

I perceive most of signalling as a waste of resources and think that cultivating a community which tried to minimize unnecessary signalling would be good.

Correcting spelling errors doesn't waste many resources. But yeah, the amount of pointless signalling that goes on in the nerd community is kind of worrying.

Why do I do it myself? Force of habit, probably. I was the dumbest person in my peer group throughout high school, so I had to consciously cultivate an image that made me worth their attention, which I craved.

In response to comment by Ezekiel on LessWrong podcasts
Comment author: pinyaka 03 December 2012 03:38:09PM *  9 points [-]

I've been lurking on this site for a few months and seeing this in my RSS feed this morning was surprisingly shocking. I guess I just assumed that people trying to be more logical never made this kind of mistake. It was a good reminder that a mistake only invalidates the conclusions drawn from the mistake, so spelling and grammar errors should be pretty low on the list of offenses. It's kind of saddening that this kind of problem draws my attention much quicker than serious logical problems.

In response to comment by pinyaka on LessWrong podcasts
Comment author: Ezekiel 03 December 2012 07:20:22PM 3 points [-]

It's kind of saddening that this kind of problem draws my attention much quicker than serious logical problems.

To be fair, they're a hell of a lot easier to notice. Although there's probably a signalling issue involved as well - particular kinds of pedantry are good ways of signalling "nerdiness", and I think most LWers try to cultivate that kind of image.

In response to LessWrong podcasts
Comment author: Ezekiel 03 December 2012 03:26:06PM 6 points [-]

The founders of Castify are big fans of Less Wrong so their rolling out their beta with some of our content.

Twitch.

But seriously, this is great. I'm trying to get into the habit of using podcasts and recorded lectures to make better use of my time, especially while travelling.

Comment author: Larks 05 November 2012 10:25:12PM 8 points [-]

Voting is more like stealing thousands of dollars to donate to an ok charity.

Comment author: Ezekiel 06 November 2012 04:26:33PM 0 points [-]

Stealing?
