Grognor comments on People who "don't rationalize"? [Help Rationality Group figure it out] - Less Wrong

Post author: Mercurial 02 March 2012 11:38PM

Comment author: Vladimir_Nesov 03 March 2012 12:25:22AM  7 points

I don't think I rationalize to any significant extent. Even the examples I came up with for Anna's thread concern inefficient allocation of attention and using zero-information arguments, not something specifically directed at defending a position. I admit to being wrong or confused on simple things, sometimes incorrectly (so that I have to go back and embrace a momentarily rejected position). It's possible that I'm completely incapable of noticing rationalization and would need a new basic skill to fix that, but that doesn't seem very likely.

(Alternatively, perhaps "rationalization" needs to be unpacked a bit, so that problems like those in the examples I referred to above can find a place in that notion. As it is, they seem more like flaws in understanding that are unbiased with respect to a favored conclusion, unless that conclusion is selected in hindsight.)

Comment author: Grognor 16 March 2012 03:06:56AM  1 point

What about this? Do you not count this because you were sleepy at the time, because it was a minor incident, or what?

(Also, I did not go through your comments to find that. Just thought I'd point that out because of shminux's comment.)

Comment author: Vladimir_Nesov 16 March 2012 09:46:10AM 2 points

I don't remember the experience, but it sounds like a collection of absent-minded System 1 responses that build on each other; there doesn't appear to be a preferred direction to them. This is also the characterization from the comment itself:

My mind confused this single thing for the light turning off, and then produced a whole sequence of complex thoughts around this single confusion, all the way relying on this fact being true.

As I understand it, "rationalization" refers to something like optimization of thoughts in the direction of a preferred conclusion, not to any kind of thinking under a misconception. If I believe something wrong, of course I'll build on the wrong thing and draw further wrong conclusions until I notice that it's wrong.