The "Is driving worth the risk?" question calculates based on 2 assumptions:

  1. Surviving until approximately 2060 would give one the chance to be made immortal by an artificial superintelligence (ASI).

  2. Dying before the advent of ASI would prevent one from ever being made immortal by the ASI.

This related question asks whether assumption 2 holds up.

When people sign up for cryonics, they express the belief that an ASI will eventually be able to resurrect people, or parts of people, who currently qualify as deceased. Cryonics assumes that an AI capable of granting immortality could also resurrect the dead, so long as the dead go out of their way to make that task as easy as possible.

And of course, there's an infamous infohazard whose premise is that an ASI will eventually be able to simulate every human who existed before its creation. People's tendency to take that infohazard seriously suggests that many believe an ASI could simply resurrect the dead.

If any ASI capable of granting immortality-grade life extension would also eventually be able to resurrect people who were considered deceased at the time it came into existence, then the cost of dying in a car crash a few years before the singularity drops from "losing infinite years" to "losing the number of years it takes the ASI to solve resurrection". That puts the value of not dying near the order of magnitude of the original $10 million estimate.
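As a rough sketch of that comparison (both numbers below are illustrative assumptions, not figures from the original question):

```python
# Illustrative only: both inputs are assumptions made for the sake of the comparison.
dollars_per_life_year = 200_000        # hypothetical value of one life-year
years_until_resurrection_solved = 30   # hypothetical gap before the ASI can revive the dead

# If resurrection is eventually solved, dying early costs a finite number of years:
cost_of_dying_early = years_until_resurrection_solved * dollars_per_life_year
print(f"${cost_of_dying_early:,}")  # $6,000,000 -- the same rough magnitude as $10M
```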

Comments

FWIW, I did have this question in mind when I asked "Is driving worth the risk?". I chose not to ask this question because it is sort of a beast to answer, whereas the driving question gets at it while keeping things ballpark-y, approachable, and relatable, which seems to make it a good first step.

nim:

I agree that driving is more concrete, and thus slightly easier to find real numbers about.

The difference in likelihood between an immortality-and-resurrection ASI and an immortality-without-resurrection ASI seems to me to be smaller than the difference in likelihood between "ASI is possible" and "ASI as we imagine it is impossible for some reason we haven't discovered yet". (Taking "ASI as we imagine it" to mean a superintelligence that both can and wants to make us immortal, the "is impossible" might be as simple as it deciding that there's some watertight ethical case against immortality which we just weren't smart enough to figure out.)

I think that guesstimating an actual likelihood that an ASI which could offer immortality couldn't offer resurrection is a worthwhile exercise in reasoning about the limits of the hypothetical ASI. That would in turn offer a structure for reasoning about the likelihood that an ASI might never exist, or that it might exist and decide that giving us eternal happiness or immortality or whatever is actually not a good idea.

Thank you for this. The idea of "if you die before the singularity but are signed up for cryonics you might be revived" didn't really register with me until now. I feel silly. It's a hugely important thing that I overlooked. It's kinda shaken me. Currently, avoiding death is a pretty big thing for me, but given this, it may not be something worth prioritizing so much. Let me try to play with some numbers.

I suppose we're just multiplying by the probability of immortality without resurrection. E.g., if I die right now, let's ignore the ~50 years of pre-singularity life I lose and focus on me losing the 10% chance of living 100k post-singularity years, i.e. an expectation of 10k years. But I only lose those 10k years if it's immortality without resurrection. So what is the probability of immortality without resurrection? Suppose it's 30%. Then the expectation is 3k years instead of 10k.
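A quick sketch of that multiplication (the 10% and 100k figures are from above; the 30% no-resurrection probability is just a supposition):

```python
# Back-of-the-envelope expected years lost by dying now (all inputs are guesses).
p_live_100k_years = 0.10          # 10% chance of living 100k post-singularity years
post_singularity_years = 100_000
p_no_resurrection = 0.30          # supposed probability of immortality *without* resurrection

expected_years = p_live_100k_years * post_singularity_years   # expected post-singularity years
expected_loss = expected_years * p_no_resurrection            # years actually lost by dying now
print(expected_years, expected_loss)  # 10000.0 3000.0
```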

Furthermore, if it's immortality without resurrection, I think those life years are more likely to be unpleasant. I might not even want to be living those life years. Doesn't immortality without resurrection indicate pretty strongly that the AI is unfriendly? In which case, it wouldn't make sense to go to great lengths trying to avoid death, e.g. by not riding in cars.

On the other hand, when people die in car accidents, it seems like the type of thing where your brain could be damaged badly enough that you wouldn't be able to be cryonically frozen. Hm, this feels pretty cruxy. There's gotta be at least a 10% chance that the car accident that kills you would also prevent you from being cryonically frozen, right? If so, we're only cutting things down by an order of magnitude. That seems like a lower bound. In reality, I'd think that it's more like a 50% chance.
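Folding that into the earlier numbers (the 3,000- and 10,000-year figures come from the sketch above; the crash probabilities are the guesses just mentioned):

```python
# Expected post-singularity years lost in a fatal crash, combining both guesses.
p_cryonics_ruined = 0.10   # lower-bound guess that the crash also destroys the brain

# Brain preserved: lose years only in no-resurrection worlds (3,000 from above).
# Brain destroyed: lose the full 10,000-year expectation regardless.
expected_loss = p_cryonics_ruined * 10_000 + (1 - p_cryonics_ruined) * 3_000
print(expected_loss)  # 3700.0 at p = 0.1; 6500.0 if p is more like 0.5
```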