Grognor comments on People who "don't rationalize"? [Help Rationality Group figure it out] - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (85)
You'd think not. Yet even Eliezer seems to think that one of our case studies really, truly might not ever rationalize and possibly never has before. This seems to be a case of a beautiful, sane theory beaten to death by a small gang of brutal facts.
It means that I don't know how to measure how strong someone's rationality skills are other than talking to others who I intuitively want to say are good rationalists and comparing notes. So I'm hedging my assertions. But to whatever degree several people at the Singularity Institute are able to figure out who is or is not a reasonably good rationalist, some of our sample "non-rationalizers" appear to us to be good rationalists, and some appear not to be.
Sure. We tell them the kinds of situations in which Tarski is useful, including some personal examples of our own applications of it, and they just blink at us and completely fail to relate. For instance, I might say, "So once I was walking past a pizza place and smelled pizza. Cheese turns out to be really bad for me, but at the time I was hungry. So I watched my mind construct arguments like, 'I haven't gotten much calcium for the last while.'" Nothing of this sort - fake justification, selective search, nothing - seems to connect to something they can relate to. So they just don't see where they'd ever use Tarski.
And yes, we've had at least one person be openly skeptical that anyone could possibly find Tarski useful because he didn't think anyone rationalized the way we were describing. And another of our case studies seemed to know rationalization only as a joke. ("The cake has fewer calories and doesn't count if I eat it while standing, right?")
This is VERY interesting. I'm as baffled as you are, sorry to say.
It seems like you've described rationalizations that prevent true (or 'maximally accurate') beliefs. Have you tried asking these case studies their rationales for decision-making? One theme of my rationalization factory is spitting out true but misleading reasons for doing things, seldom letting me reason my way toward doing what I somehow know I should. Said factory operates by preventing me from thinking certain thoughts. Perhaps something similar goes on in these people?
I've performed one heck of an update thanks to your comment; it made me realize I was generalizing from only a few examples.