One of the most commonly proposed Noble Lies is belief in an afterlife. Surely, goes the argument, the crushing certainty of absolute annihilation in a few decades is too much for any human being to bear. People need hope - if they don't believe in an afterlife, they won't be able to live.
Surely this must be the strongest of all arguments for Noble Lies. You can find Third Alternatives to many dilemmas, but can you find one to Death?
Well, did you close your eyes and think creatively about the problem for five minutes? No excuses, please; just answer "Yes" or "No". Did you, or did you not, brainstorm the problem for five minutes by the clock before giving up?
The assumed task is to find a source of hope against looming death. So at the very least I would cite medical nanotechnology, the argument from actuarial escape velocity, cryonics, or meddling with the forbidden ultimate technology. But do you think that anyone who actually argued for afterlife as a Noble Lie would be glad to hear about these Third Alternatives? No, because the point was not really to find the best strategy for supplying hope, but rather to excuse a fixed previous belief from criticism.
You can argue against the feasibility of one of the above Third Alternatives, or even argue against the feasibility of all of them, but that's not the point. Any one of those Third Alternatives stretches credulity less than a soul - that is, (a) an imperishable dualistic substance floating alongside the brain, which (b) malfunctions exactly as the brain is neurologically damaged and yet (c) survives the brain's entire death. Even if we suppose the above Third Alternatives to be false-in-fact, they are packaged with far fewer associated absurdities, and put far less of a strain on the Standard Model.
Thus on the presentation of any one of these Third Alternatives, afterlife-ism stands immediately convicted because it cannot be the best strategy even as a Noble Lie. The old Noble Lie is dominated in the payoff table. If you decided to lie (to others or yourself) to soften the horror of personal extinction, then you'd nudge the balance of evidence a little on actuarial escape velocity - not spin up a soul from whole cloth.
(A truly fanatic rationalist - like me - would refuse to judge between these two lies, regarding them both as equal transgressions of the deontological commandments Thou Shalt Not Nudge Thy Probability Assignments and Thou Shalt Not Pursue Hope As An Emotion, Only Actual Positive Outcomes. Which is still no argument in favor of afterlife-ism; when a negative utility drops off my radar screen and becomes incomparable, I generally don't choose that policy.)