Vladimir_Nesov comments on Comments for "Rationality" - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Suppose I have a blind man and a sighted man. The blind man has a cataract that could be repaired with surgery. I tell them that a jar contains a very large number of pebbles, all of them either green or blue, and that there are twice as many of one color as the other. I pull out a random pebble, which turns out to be green, and show it to both men. I ask them to write down what they think the probability is that the next pebble I pull out will be green. The sighted man writes 51%; the blind man writes 50%. Who is more rational?
You are being cryptic.
The sighted man is executing an incorrect probability update on better information, which nonetheless gives him a slightly higher expected score. I answer that the blind man is more rational, unless he has refused to repair the cataract for no apparent reason, in which case he is exhibiting a different, unusual, and in this case slightly more damaging form of irrationality.
But if you define rationality as either "obtaining beliefs that correspond to reality as closely as possible" or "achieving your values", it seems that the sighted man has been more successful. I guess "believing, and updating on evidence, so as to systematically improve the correspondence between your map and the territory" is better, since the blind man has less evidence. I think the question now, though, is when a person "has" a given piece of evidence. What if I fail to recognize that a certain fact is evidence for a certain hypothesis? What if I do recognize this, but don't have the time to apply Bayes' law?
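For what it's worth, the correct update in the pebble example can be worked out exactly. This sketch assumes a uniform 50/50 prior over which color is the majority (the comment doesn't state a prior, so that is an added assumption):

```python
from fractions import Fraction

# Jar setup from the thought experiment: twice as many of one color as the other.
# Assumption (not stated in the thread): a uniform 50/50 prior over which color
# is the majority.
prior_green_majority = Fraction(1, 2)

# Likelihood of drawing a green pebble under each hypothesis.
p_green_if_green_majority = Fraction(2, 3)
p_green_if_blue_majority = Fraction(1, 3)

# Bayes' law: posterior probability that green is the majority color,
# given that the first draw was green.
evidence = (prior_green_majority * p_green_if_green_majority
            + (1 - prior_green_majority) * p_green_if_blue_majority)
posterior = prior_green_majority * p_green_if_green_majority / evidence

# Posterior predictive: probability that the next pebble is green.
p_next_green = (posterior * p_green_if_green_majority
                + (1 - posterior) * p_green_if_blue_majority)

print(posterior)      # 2/3
print(p_next_green)   # 5/9, about 0.556
```

Under that prior, the ideal sighted observer should write about 55.6%, not 51% — which is why the sighted man's update is incorrect, even though 51% still scores better in expectation than the blind man's 50%.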
In other words, it's possible to construct and then resolve an arbitrary problem based on the given description.