Manfred comments on Should we discount extraordinary implications? - Less Wrong

Post author: XiXiDu 29 December 2011 02:51PM




Comment author: Manfred 30 December 2011 06:02:55AM 1 point

For very large or very small probabilities, I agree it's important to start taking "model uncertainty" into account. And if some argument leads to the conclusion that 2=1 (or that you should never act as if you'll die, which is similarly wrong), of course you discount it, not in defiance of probability but with probability, since we have so much evidence against that claim.
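The model-uncertainty correction above can be sketched as a simple mixture: weight your model's answer by the probability the model is sound, and weight a more ordinary fallback estimate by the probability the model is badly wrong. This is only an illustrative sketch; the function name and all the numbers below are assumptions for the example, not anything from the thread.

```python
def discounted_probability(p_model, p_model_wrong, p_if_wrong):
    """Mix a model's probability estimate with a fallback estimate.

    p_model       -- probability the model assigns to the claim
    p_model_wrong -- your credence that the model itself is badly wrong
    p_if_wrong    -- a vaguer fallback probability, used in that case
    """
    return (1 - p_model_wrong) * p_model + p_model_wrong * p_if_wrong


# An argument concluding something as extreme as "2=1" might claim
# p_model = 1e-9 of being mistaken, but even 1% model uncertainty
# with a 50% fallback floors the corrected estimate near 0.005:
p = discounted_probability(1e-9, 0.01, 0.5)
print(p)  # roughly 0.005: model uncertainty dominates the extreme claim
```

The point the mixture makes concrete is that once a model emits a probability far more extreme than your confidence in the model itself, the corrected estimate is driven almost entirely by the model-uncertainty term.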

However, in the "donating to SIAI" case, I don't think we're actually talking about particularly large or small probabilities, or about fallacious arguments. Implications can be labeled "extraordinary" merely for being socially unusual, and that sort of extraordinary doesn't seem like it should be discounted.

Comment author: Eugine_Nier 30 December 2011 11:48:24PM 1 point

However, in the "donating to SIAI" case, I don't think we're actually talking about particularly large or small probabilities, or about fallacious arguments. Implications can be labeled "extraordinary" merely for being socially unusual, and that sort of extraordinary doesn't seem like it should be discounted.

This behavior isn't actually "socially unusual"; in fact, there are many social institutions that it resembles, at least from an outside view: they're commonly called "cults". What this means is that humans seem to have a bias in favor of donating to "their cult" while believing they're acting rationally in doing so. As such, you should consider whether your belief that it's rational to donate to SIAI is affected by the same bias.

Comment author: Manfred 31 December 2011 07:46:37AM 1 point

As such, you should consider whether your belief that it's rational to donate to SIAI is affected by the same bias.

You're right; you should. Although there are serious holes in the claim that SIAI looks like a cult on the outside view, that's not entirely relevant here.

My point is that you should correct for this kind of bias using probabilities, rather than saying "well, I don't find the conclusions aesthetically pleasing, so since I'm biased I can just throw them out." And if you correct for the bias and the model uncertainty, include all the evidence, and still get the aesthetically unpleasing answer, well, tough.