RobinZ comments on What Cost for Irrationality? - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
But this is just unfair. You're judging rationality according to rational arguments, and so OF COURSE you end up finding that rationality is sooo much better.
I, on the other hand, judge my irrationality on an irrational basis, and find that actually it's much better to be irrational.
What's the difference? Of course in response to this question you're bound to come up with even more rational arguments to be rational, but I don't see how this gets you any further forward.
I, on the other hand, being irrational, don't have to argue about this if I don't want to. What kind of sense would it make to argue rationally about the advantages of irrationality anyway? Surely this is a contradiction in terms? But the nice thing about being irrational is that I can irrationally use rationality from time to time anyway, and then just stop and go back to being irrational again when irrationality is clearly more inspired.
OK - so I'm messing about. But you can't prove rationality is more rational by rational argument. Well, you can, but it's irrational in a way, as you're assuming the very thing you're trying to prove. It's an example of trying to pick yourself up by your own bootstraps.
I think many of us have considered these ideas before. Eliezer Yudkowsky certainly has.
The fact of the matter is: either you are so crazy that you will be incapable of developing a rationality that works ... or you aren't. If you are, you will lose. If you aren't, you can probably judge the rationality you have by the rational arguments available to you, and use them to develop a better rationality.
Just had a look at what Eliezer said there. I think it's not quite the same thing as what I'm talking about here. It's true that if you have a system of rationality in your mind, it leads you, in a rational way, to improve what you have over time. I agree this works if you have the required intelligence and don't start with an entirely pathological system of rationality.
Let me give a slightly more concrete example. I had a conversation some time ago regarding homeopathy - that branch of alternative medicine that uses ingredients which have been diluted by a factor of 10 - in this case 120 times in succession. This results in an overall dilution of 1 in 10^120. Since there are only about 10^80 atoms in the entire observable universe, this provides a very high degree of certainty that there is none of the active ingredient in the homeopathic bottle that this person swore was highly effective.
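The arithmetic here is easy to check. A minimal sketch, assuming (hypothetically) that the preparation begins with one full mole of active ingredient before the 120 successive 1:10 dilutions:

```python
# Expected molecules of active ingredient surviving 120 successive 1:10 dilutions,
# assuming (generously) that we start with one mole of the substance.
AVOGADRO = 6.022e23            # molecules per mole
dilutions = 120                # number of 1:10 dilution steps
total_dilution = 10 ** dilutions   # overall dilution factor of 10^120

expected_molecules = AVOGADRO / total_dilution
print(f"Overall dilution: 1 in 10^{dilutions}")
print(f"Expected molecules remaining: {expected_molecules:.1e}")
# roughly 6e-97 -- effectively zero chance that even one molecule survives
```

Even with the most generous starting assumption, the expected number of remaining molecules is about 10^96 times smaller than one.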
Pointing this out had no effect, as you might expect. In fact, the power of the treatment is said to rise as it becomes more dilute. The person absolutely believed in the power of that remedy, even though they agreed with my argument that in fact there were no molecules of the original substance in the bottle. I don't suppose talking about placebos and hypnotic suggestion would have made any difference either - in fact I believe I did mention the placebo effect. No difference at all.
We've all come across stuff like this. My point is that the applicability of rationality itself is what is at issue in arguments like this. I say it applies - they say that in some way it doesn't. My argument stops me from buying the homeopathic remedy, but it is almost irrelevant to the other person, because rationality itself is what is in question.
Wait, are you asking how to convince an irrational human being to be rational?
Sort of. And we all know the answer to that question is that it's often completely impossible.
Some of the examples in the article are matters where human hardware tends to lead us in the wrong direction. But others - particularly the Albanian case - are to a large extent failures of intent. Good-quality rationality is a long-term investment that many people choose not to make. The result is vulnerability to believing impossible things. Irrationality is often a choice, and I think that, in the long term, our failure to be rational springs as much from choosing not to be as from failures in execution when sincerely trying. You can compensate, to a degree, for our hardware-based inclinations to see patterns where none exist, or to stick with what we have. But nothing compensates for choosing the irrational.
We can all see that irrationality is expensive to varying degrees, depending on what you do. But this is only convincing to those of us who are already convinced and don't need persuading. So what was the article intending to do?
So yes - sort of.
Not to sound insufficiently pessimistic, but I don't think that's been rigorously established. It doesn't seem impossible to raise the sanity waterline - it seems more likely that there are inferential distances to cross, and armor built to protect false beliefs that we must pierce.
I like this comment. Given the historical improvements that have already come about, it can't be unreasonable to look for more.