endoself comments on [SEQ RERUN] Just Lose Hope Already - Less Wrong
Comments (15)
From the comments:
--Eliezer Yudkowsky
This was a great addition and probably should have been in the post, so I'm reposting it here for everyone.
Is this ("It has a small probability of success, but we should pursue it, because the probability if we don't try is zero") not a standard pro-cryonics argument? Given a sufficiently large expected payoff, it seems perfectly valid ...
Cryonics should just work if everything we currently already believe about the brain is true and there are no surprises. It is not a small probability. It is the default mainline probability.
Cryonics being possible given advanced technology is the default mainline probability. But the probability of being revived given that you prepare to be cryo-preserved is not.
"My head remains in stasis in a facility that remains functional until such time as an agent in the future is willing and able to revive me" is not something that just happens. It could even be said to be a long shot. But the only shot available.
That's exactly what I meant. A lot of practical things can go wrong even if our beliefs about the brain are entirely correct. Rationality, Cryonics and Pascal's Wager gives a probability of 0.228, which is indeed not that improbable, but still less than 50%.
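An overall estimate like this is typically built by chaining the success probabilities of each practical step that has to go right. As a sketch only (these per-stage numbers are made up for illustration, not the figures from that post):

```python
# Hypothetical, illustrative per-stage probabilities for cryonics revival;
# the actual figures in "Rationality, Cryonics and Pascal's Wager" differ.
stages = {
    "preservation is performed well enough": 0.8,
    "the facility remains functional until revival is possible": 0.6,
    "revival technology is developed": 0.7,
    "a future agent is willing and able to revive you": 0.7,
}

# Assuming (roughly) independent stages, the overall probability is the product.
p_revival = 1.0
for stage, p in stages.items():
    p_revival *= p

print(round(p_revival, 3))  # 0.235 with these made-up numbers
```

The point of the decomposition is that even when each stage is more likely than not, the conjunction can easily land well below 50%.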
"I conclude, then, that the supposedly useless heuristic described above is useless only if the probability of success is very small."
No. In cryonics we do an explicit cost-benefit calculation in order to see whether we value it enough to spend money that could be used elsewhere. Eliezer is referring to the specific case where something is found to be far less likely than cryonics (which isn't that improbable) but is pursued anyway because the alternative has exactly zero benefit. Such situations almost always ignore some cost or benefit in order to rationalize a choice despite ~0 probability.
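The explicit cost-benefit calculation described here can be sketched in a few lines. The payoff and cost figures below are hypothetical placeholders, not values from the thread; only the 0.228 probability comes from the discussion above:

```python
# Expected-value comparison: pursue a plan iff p * benefit - cost > 0.
p_success = 0.228        # probability estimate quoted in the thread
benefit = 1_000_000      # hypothetical value of success, arbitrary units
cost = 100_000           # hypothetical cost of pursuing the plan

expected_value = p_success * benefit - cost
print(expected_value > 0)  # True: worth pursuing under these assumptions

# Contrast: a plan that is "better than nothing" only because the
# alternative is exactly zero can still lose once costs are counted.
p_long_shot = 0.001
print(p_long_shot * benefit - cost > 0)  # False
```

This is the distinction being drawn: the nonzero-versus-zero comparison ignores the cost term, while the full calculation can reject a long shot even though its probability is strictly greater than zero.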
Shut up and do the impossible!
Doug was right and Eliezer was wrong, at least as the quote is stated. That is not to say that a heuristic of avoiding low-probability plans isn't usually a good idea.
I think there is a distinction. In this case, literally the entire argument is "It has a small probability of success, but we should pursue it, because the probability if we don't try is zero". It would be valid to justify a low probability with a high utility, but sometimes people simply ignore or refuse to calculate probabilities because they believe that all alternatives are futile, even in the face of repeated counterevidence pushing the probability ever lower. While such a situation is possible, beliefs of this type are far more likely to be caused by rationalization.