I am passionately fond of the idea of creating an “Art of Rationality” sensibility/school as described in the [A Sense That More is Possible](http://lesswrong.com/lw/2c/a_sense_that_more_is_possible/) article.
The most formidable obstacle I see in such an undertaking is that, no matter how much “rational software” our brains absorb, we cannot escape the fact that we exist within the construct of “irrational hardware”.
My physical body binds me to countless irrational motivations. Just to name a few:

1) Sex. In an overpopulated world, what is the benefit of yearning for sexual contact on a daily basis? How often does the desire for sex influence rational thought? Is “being rational” sexy? If not, it is in direct conflict with my body’s desire and therefore undesirable (whereas being able to “kick someone’s ass” is definitely sexy in cultural terms).

2) Mortality. Given an expiration date, it becomes fairly easy to justify immediately/individually beneficial behavior over long-term/expansively beneficial behavior that I will not be around long enough to enjoy.

3) Food, water, shelter. My body needs a bare minimum in order to survive. If being rational conflicts with my ability to provide my body with its basic needs (because I exist within an irrational construct), what are the odds that rationality will be tossed out in favor of irrational compliance that assures those needs will be met?
As far as I can tell, being purely rational is in direct opposition to being human. In essence, our hardware is in conflict with rationality.
The reason there is not a “School of Super Bad Ass Black Belt Rationality” could be as simple as this: it doesn’t make people want to mate with you. It’s just not sexy in human terms.
I’m not sure being rational will be possible until we transcend our flesh-and-blood bodies, at which point creating “human-friendly” AI would be rather irrelevant. If AI materializes before we transcend, it seems more likely that human beings, rather than the purely rational AI, will cause a conflict, so shouldn’t the focus be on human transcendence rather than FAI?
Yes. The obvious one to me is that it is totally irrational of me to want to eat a pile of sweets that I know from previous experience will make me feel bad about myself ten minutes after eating it, and which I don’t need nutritionally. I can make myself not do it, but making myself not want to is like trying not to see an optical illusion...
Wants are like pains: sometimes they’re useful information that something needs attention, and sometimes they’re irrelevant distractions that just have to be endured and otherwise ignored because there are more important things to do.