
private_messaging comments on Things You Can't Countersignal - Less Wrong

Post author: Alicorn 19 February 2010 12:18AM


Comments (122)


Comment author: private_messaging 03 February 2015 03:46:11AM 0 points

Well, if someone ironically says that they are "dropping out of school to join a doomsday cult" (and they are actually dropping out of school to join something), they've got to be joining something that has something to do with a doomsday, rather than, say, another school, or a normal job, or the like.

Comment author: Nornagest 03 February 2015 03:55:19AM 0 points

There are a lot of doomsdays out there. My first assumption, if I were talking to someone outside core rationalist demographics, would probably be climate change advocacy or something along those lines -- though I'd probably find it funnier if they were joining NORAD.

Comment author: private_messaging 03 February 2015 04:08:07AM 0 points

Well, you start with a set containing Google, McDonald's, and all other organizations one could be joining, inclusive of all doomsday cults, and then you end up with a much smaller set of organizations, still inclusive of all doomsday cults. That ought to boost the probability of them joining an actual doomsday cult, even if said probability would arguably remain below 0.5 or 0.9 or whatever threshold of credence.
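[The update being described here can be sketched numerically. All numbers below are hypothetical, chosen only to illustrate the shape of the argument: conditioning on "the joke fits" shrinks the reference class while keeping every actual doomsday cult in it, so Bayes' rule pushes the posterior up.]

```python
def posterior(prior, p_evidence_if_cult, p_evidence_if_not):
    """P(cult | evidence) by Bayes' rule over a binary hypothesis."""
    numerator = p_evidence_if_cult * prior
    return numerator / (numerator + p_evidence_if_not * (1 - prior))

prior = 0.001          # P(joining an actual doomsday cult) -- assumed
p_if_cult = 1.0        # every cult-joiner could make the joke -- assumed
p_if_not = 0.01        # few other organizations fit the joke -- assumed

print(posterior(prior, p_if_cult, p_if_not))  # roughly 0.09, up from 0.001
```

Even with a posterior well below 0.5, the narrowing produces a large *relative* boost, which is the point of the comment.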

Comment author: Nornagest 03 February 2015 04:13:18AM 1 point

Yes, I understand the statistics you're trying to point to. I just don't think it's as simple as narrowing down the reference class. I expect material differences in behavior between the cases "joining a doomsday cult or something that could reasonably be mistaken for one" and "joining something that kinda looks enough like a doomsday cult that jokes about it are funny, but which isn't", and those differences mean that this can't be solved by a single application of Bayes' Rule.

Maybe your probability estimate ends up higher by epsilon or so. That depends on all sorts of fuzzy readings of context and estimations of the speaker's character, far too fuzzy for me to do actual math to it. But I feel fairly confident in saying that it shouldn't adjust that estimate enough to justify taking any sort of action, which is what actually matters here.
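[Nornagest's counterpoint can be sketched the same way, again with hypothetical numbers: if people joining real doomsday cults behave differently -- say, they are barely more likely to make the joke than people joining merely joke-worthy organizations -- the likelihood ratio sits near 1 and the posterior hardly moves from the prior.]

```python
def posterior(prior, likelihood_ratio):
    """Posterior from prior odds times likelihood ratio P(joke|cult)/P(joke|not)."""
    odds = prior / (1 - prior) * likelihood_ratio
    return odds / (1 + odds)

prior = 0.001
print(posterior(prior, 1.1))  # ~0.0011: barely above the 0.001 prior
```

With a likelihood ratio this close to 1, the update is "epsilon or so" -- too small to justify acting on, which is the comment's conclusion.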

Comment author: private_messaging 03 February 2015 04:47:30AM 1 point

Well, a doomsday cult is not only a doomsday cult; it also kinda looks enough like a doomsday cult. Of the people joining something that kinda looks enough like a doomsday cult, some are joining an actual doomsday cult. Do those people, in your model, know that they're joining a doomsday cult, so that they can avoid joking about it?