Jiro comments on Rationality Quotes December 2014 - Less Wrong

Post author: Salemicus 03 December 2014 10:33PM




Comment author: Jiro 17 December 2014 03:51:25PM, 1 point

Now replace "abortion is unconditionally bad" with "creationism should not be taught as science in public schools".

If you would still be creeped out by that, then your creep detector is miscalibrated; that would mean nobody can have an organization dedicated to a cause without creeping you out.

If you would not be creeped out by that, then your initial reaction to the abortion example probably came from being mind-killed by abortion, not from the fact that a lot of people agreed on something.

Comment author: dxu 17 December 2014 04:00:13PM, 2 points

Just because I agree with a group's ideas doesn't mean I won't find the group creepy. A cult is a cult, regardless of what it promotes. If I wanted to join an anti-creationist community, I certainly wouldn't join that one, and there are plenty of such communities that manage to get their message across without coming off as cultish.

Comment author: Jiro 17 December 2014 04:15:54PM, 2 points

The example is supposed to sound cultish because the people think alike. But I have a hard time seeing how a non-cultish anti-creationist group would produce different arguments against creationism.

The non-cultish group could, of course, avoid all using the same welcome phrase, but that's not really the heart of what the example is supposed to illustrate.

Comment author: dxu 18 December 2014 11:16:46PM, 3 points

There are multiple anti-creationist arguments out there, so if they all immediately jump to the same one, I'd be suspicious. But even beyond that, it's natural for humans to disagree about stuff, because we're not perfect Bayesians. If you see a bunch of humans agreeing completely, you should immediately think "cult", or at the very least "these people don't think for themselves". (I'd be much less suspicious if we replaced the humans with Bayesian superintelligences, however, because those actually satisfy the conditions of Aumann's Agreement Theorem.)
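For readers unfamiliar with the theorem being invoked, here is a rough sketch of Aumann's 1976 result (an informal statement, not part of the original thread):

```latex
% Aumann's Agreement Theorem (informal sketch):
% Two agents share a common prior P and receive different private
% information, represented by their information partitions I_1, I_2.
% If their posterior probabilities for an event E,
%   q_1 = P(E | I_1)  and  q_2 = P(E | I_2),
% are common knowledge between them, then the posteriors must agree:
\[
  q_1 = P(E \mid \mathcal{I}_1), \quad
  q_2 = P(E \mid \mathcal{I}_2), \quad
  \text{common knowledge of } (q_1, q_2) \;\implies\; q_1 = q_2.
\]
```

The theorem's preconditions, a common prior and honest Bayesian updating with common knowledge of posteriors, are exactly what real humans lack, which is why persistent disagreement among humans is unremarkable while perfect unanimity looks suspicious.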