Thus runs the ancient parable:
If a tree falls in a forest and no one hears it, does it make a sound?
One says, "Yes it does, for it makes vibrations in the air."
Another says, "No it does not, for there is no auditory processing in any brain."
So begins a long, acrimonious battle...
The conventional resolution is that the two are fighting over the definition of a word, and such labels do not have intrinsic definitions, only agreed-upon definitions.
Yet if you need to know about the forest for any pragmatic reason - if there is anything you plan on doing with the knowledge - then the answer is no longer a matter of mutual agreement. If, for example, you need to know whether landmines will be set off by the tree falling, then you cannot make the landmines explode or unexplode by any possible amount of agreement about the meaning of the word "sound". You can get the whole world to agree, one way or the other, and it still won't make a difference.
You find yourself in an unheard-falling-tree dilemma only when you become curious about a question with no pragmatic use and no predictive consequences. Which suggests that you may be playing loose with your purposes.
So does this mean that truth reduces to usefulness? But this, itself, would be a purpose-loss, a subgoal stomp, a mistaking of the indicator for the indicated. Usefulness for prediction, and demonstrated powers of manipulation, is one of the best indicators of truth. This does not mean that usefulness is truth. You might as well say that the act of driving to the supermarket is eating chocolate.
There is, nonetheless, a deep similarity between the pragmatic and the epistemic arts of rationality, in the matter of keeping your eye on the ball.
In pragmatic rationality, keeping your eye on the ball means holding to your purpose: Being aware of how each act leads to its consequence, and not losing sight of utilities in leaky generalizations about expected utilities. If you hold firmly in your mind the image of a drained swamp, you will be less likely to get lost in fighting alligators.
In epistemic rationality, keeping your eye on the ball means holding to your question: Being aware of what each indicator says about its indicatee, and not losing sight of the original question in fights over indicators. If you want to know whether landmines will detonate, you will not get lost in fighting over the meaning of the word "sound".
Both cases deal with leaky generalizations about conditional probabilities: P(Y=y|X=x) is nearly but not quite 1.
In the case of pragmatic rationality: driving to the supermarket may almost always get you chocolate, but on some occasions it will not. If you forget your final purpose and think that X=x guarantees Y=y, then you will not be able to deal with cases where the supermarket is out of chocolate.
In the case of epistemic rationality: seeing a "Chocolate for sale" sign in the supermarket may almost always indicate that chocolate is available, but on some occasions it will not. If you forget your original question and think that X=x guarantees Y=y, then you will go on arguing "But the sign is up!" even when someone calls out to you, "Hey, they don't have any chocolate today!"
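The leakiness of such a generalization can be made concrete with a small simulation. This is only an illustrative sketch: the probabilities below (how often chocolate is stocked, how often a stale sign stays up) are invented for the example, not drawn from the text. It estimates P(chocolate | sign) and shows that it is high but not 1.

```python
import random

random.seed(0)

def simulate_visits(n=100_000):
    """Estimate P(chocolate available | sign is up) in a toy world.

    Assumed (made-up) model: the shelf is stocked 95% of the time, and
    when it is empty, a stale sign is nonetheless left up half the time.
    """
    sign_and_chocolate = 0
    sign_total = 0
    for _ in range(n):
        chocolate = random.random() < 0.95            # shelf usually stocked
        sign_up = chocolate or random.random() < 0.5  # stale sign sometimes remains
        if sign_up:
            sign_total += 1
            sign_and_chocolate += chocolate
    return sign_and_chocolate / sign_total

p = simulate_visits()
print(f"P(chocolate | sign) is approximately {p:.3f}")
```

Under these assumed numbers the estimate comes out near 0.97: the sign is one of the best available indicators of chocolate, yet treating "sign is up" as identical to "chocolate is available" still fails a few percent of the time - the leaky generalization in miniature.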
This is a deep connection between the human arts of pragmatic and epistemic rationality...
...which does not mean they are the same thing.