WrongBot comments on Some Thoughts Are Too Dangerous For Brains to Think - Less Wrong

15 Post author: WrongBot 13 July 2010 04:44AM


Comment author: WrongBot 13 July 2010 03:03:20PM *  6 points [-]

I am grouping together "everything that goes into your brain," which includes lots and lots of stuff, most of it unconscious. See research on priming, for example.

This argument is explicitly about encouraging people to justify ignoring relevant data about reality. It is, I recognize, an extremely dangerous proposition, of exactly the sort I am warning against!

At risk of making a fully general counterargument, I think it's telling that a number of commenters, yourself included, have all but said that this post is too dangerous.

  • You called it "deeply worrisome."
  • RichardKennaway called it "defeatist scaremongering."
  • Emile thinks it's Dark Side Epistemology. (And see my response.)

These are not just people dismissing this as a bad idea (which would have encouraged me to do the same); these are people worrying about a dangerous idea. I'm more convinced I'm right than I was when I wrote the post.

Comment author: Jonathan_Graehl 13 July 2010 04:43:59PM 5 points [-]

It doesn't make you right. It just makes them as wrong (or lazy) as you.

If you feel afraid that incorporating a belief would change your values, that's fine. It's understandable that you won't then dispassionately weigh the evidence for it; perhaps you'll bring a motivated skepticism to bear on the scary belief. If it's important enough that you care, then the effort is justified.

However, fighting to protect your cherished belief is going to lead to a biased evaluation of evidence, so refusing to engage the scary arguments is just a more extreme and honest version of trying to refute them.

I'd justify both practices situationally: considering the chance that you weigh the evidence dispassionately but still get the answer quite wrong (even your confidence estimate may be off), you can err on the side of caution in protecting your most cherished values. That is, your objective function isn't just to have the best Bayesian-rational track record.

Comment author: Bongo 14 July 2010 07:56:51AM 3 points [-]

Your post is not dangerous knowledge. It's dangerous advice about dangerous knowledge.

Comment author: Vladimir_Nesov 13 July 2010 03:19:38PM 6 points [-]

Heh. So most of the critics ground their disapproval of your post's argument in essentially the same considerations the post itself discusses.

Comment author: mattnewport 13 July 2010 06:21:07PM 3 points [-]

These are not just people dismissing this as a bad idea (which would have encouraged me to do the same); these are people worrying about a dangerous idea. I'm more convinced I'm right than I was when I wrote the post.

Becoming more convinced of your own position when presented with counterarguments is a well known cognitive bias.

Comment author: WrongBot 13 July 2010 06:38:03PM 3 points [-]

Knowing about biases may have hurt you. The counterarguments are not what convinced me; it's that the counterarguments describe my post as bad because it belongs to the class of things that it is warning against.

There are other counterarguments in the comments here that have made me less convinced of my position; this is not a belief of which I am substantially certain.

Comment author: JoshuaZ 13 July 2010 03:14:05PM 2 points [-]

"Deeply worrisome" may have been bad wording on my part. It might be more accurate to say that this attitude is so much more often wrong than right that it is better to acknowledge the low probability of such knowledge existing than to actually, deliberately keep knowledge out.