Comment author: hhadzimu 10 March 2009 09:29:09PM 22 points

We're forgetting signaling. Robin would never forgive us, because he sees it in a lot of things, and I happen to agree with him that it's far more pervasive than people think.

In fact, the Tversky example gives people two opportunities to signal: not only do they get to demonstrate higher pain tolerance [especially important for men], they also get to "demonstrate" a healthier heart. Both should be boosts in status.

The same goes for Calvinists: though you truly believe in the elect when you stop to think about it, you don't think about it for most of your life [as we know, much of our day-to-day life is subconsciously guided]; instead, you're focused on signaling your elect status by living a good life.

For good measure, it even works with the car: you buy a new car to signal wealth to signal health.

However, I do believe that we engage in lots of automatic self-deception [making it easier to deceive others into believing we have higher status]: thus, we may actually believe that an extra car/a good life/a higher pain tolerance would improve our life expectancy/grace/heart, but that's merely the proximate cause. Ultimately, we're driven by status-seeking.

In response to Define Rationality
Comment author: jimmy 06 March 2009 02:18:57AM 2 points

The only problem I have with Robin's definition (more "rational" means better believing what is true, given one's limited info and analysis resources) is that it doesn't make a point to distinguish between irrationality and other forms of stupidity.

I wouldn't call someone irrational if they dropped a sign in a calculation, or were simply not intelligent enough to understand how to calculate the answer, but if someone correctly calculates the optimal trajectory, then takes a different route because "faith" tells him to, I would call that irrational.

My concept of rationality fits better to the idea of skillfully choosing which "Rituals Of Cognition" to trust. To put it another way, someone is rational to the extent that their preferred rituals of cognition "win" at the game of believing what is true (even if they manage to fail at implementing their "ROC"- that just makes them stupid/error prone).

The "given one's limited analysis resources" clause seems to cover some of this, but only vaguely, and would seem to give someone "rationality points" for coming up with a better algorithm that requires fewer clock cycles, while I would just give them "cleverness points". If one counts "non-rationality intelligence" as a limited resource, then Robin's definition seems to agree, but "intelligence" is not very well defined either, so defining "rationality" in terms of "intelligence" won't help us nail it down concretely.

Does anyone else have any thoughts on the difference between "irrational" and "other stupidity"?

In response to comment by jimmy on Define Rationality
Comment author: hhadzimu 06 March 2009 03:27:27AM 1 point

I disagree... I think "limited analysis resources" accounts for the very difference you speak of. I think the "rituals of cognition" you mention are themselves subject to rationality analysis: if I'm understanding you correctly, you are talking about someone who knows how to be rational in theory but cannot implement that theory in practice. I think you run into three possibilities there.

One, the person has insufficient analytical resources to translate their theory into action, which Robin accounts for. The person is still rational, given their budget constraint.

Two, the person could gain the ability to make the proper translation, but the costs of doing so are so high that the person is better off with the occasional translation error. The person rationally chooses not to learn better translation techniques.

Three, the person systematically makes mistakes in the translations. That, I think, we can fairly call a bias, which is what we're trying to avoid here. The person is acting irrationally - if there is a predictable bias, it should have been corrected for.

On your last point: "[Robin would] give someone "rationality points" for coming up with a better algorithm that requires fewer clock cycles, while I would just give them "cleverness points"." I think I have to side with Robin here. On certain issues it might not matter how quickly or efficiently the rational result is arrived at, but in almost all situations coming up with a faster way to arrive at a rational result is more rational, since individuals face constraints of time and resources. While the faster algorithm isn't more rational on a single, isolated issue [assuming both algorithms lead to the same rational result], the person running it can move on to a different issue sooner and thus has more resources available to be rational in a different setting.

In response to Define Rationality
Comment author: hhadzimu 05 March 2009 11:38:12PM 1 point

I think we're missing a fairly basic definition of rationality, one that I think most people would intuitively come to. It turns on the question of at what stage evidence enters the decision-making calculus.

Rationality is a process: it involves making decisions after weighing all available evidence and calculating the ideal response. Relevant information is processed consciously [though see Clarification below] before a decision is rendered.

This approach is opposed to a different, less conscious process: our instinctive and emotional responses to situations. In these situations, actual evidence doesn't enter a conscious decision-making process; instead, our brains, having evolved over time to respond in certain ways to certain stimuli, automatically react in certain pre-programmed ways. Those ways aren't random, of course, but adaptations to the ancestral environment. The key is that evidence specific to the situation isn't actually weighed and measured: the response is based on the brain's evolved automatic reaction.

Clarification: A process that is nearly automatic is still a rational process if it is the result of repeated training rather than being innate. For example, those who drive manual transmission cars will tell you that after a short while, you don't think about shifting: you just do. It becomes "second nature." This is still a conscious process: over time, you become trained to interpret information more efficiently and react quickly. This differs from innate emotional and instinctive responses: we are instinctively attracted to beautiful people, for example, without having to learn it over and over again - it's "first nature." Though the responses are similar in appearance, I think most people would say that the former is rational, the latter is not.
