Even if you don't think that a Malthusian scenario is likely, it is still likely that the future will be an approximately steady-state economy, which means there would be strong disincentives against adding more people.
I'm inclined to agree, actually, but I would expect a post-scarcity "steady-state economy" large enough that absorbing such a tiny number of people is negligible.
With that said:
Honestly, it doesn't sound all that implausible that humans will find ways to expand - if nothing else, without FTL (I infer you don't anticipate FTL) there's pretty much always going to be a lot of unused universe out there for many billions of years to come (until the universe expands enough we can't reach anything, I guess.)
Brain emulations sound extremely plausible. In fact, the notion that we will never get them seems ... somewhat artificial in its constraints. Are you sure you aren't penalizing them merely for sounding "exotic"?
I can't really comment on turning Jupiter into processing substrate and living there, but ... could you maybe throw out some numbers regarding the amounts of processing power and population numbers you're imagining? I think I have a higher credence for "extreme singularity scenarios" than you do, so I'd like to know where you're coming from better.
Hanson, for instance, predicts that his brain emulation society would be a Malthusian subsistence economy. I don't think that such a society could afford to ever revive any significant number of cryopatients, even if they had the technology (how Hanson can believe that society is likely and still be signed up for cryonics is beyond my understanding).
That ... is strange. Actually, has he talked anywhere about his views on cryonics?
Honestly, it doesn't sound all that implausible that humans will find ways to expand - if nothing else, without FTL (I infer you don't anticipate FTL)
Obviously I don't anticipate FTL. Do you?
there's pretty much always going to be a lot of unused universe out there for many billions of years to come (until the universe expands enough we can't reach anything, I guess.)
Yes, but exploiting resources in our solar system is already difficult and costly. Currently there is nothing in space worth the cost of going there or bringing it back, maybe in the fut...
Background:
On the most recent LessWrong readership survey, I assigned a probability of 0.30 on the cryonics question. I had previously been persuaded to sign up for cryonics by reading the sequences, but this thread and particularly this comment lowered my estimate of the chances of cryonics working considerably. Also relevant from the same thread was ciphergoth's comment:
Based on this, I think there's a substantial chance that there's information out there that would convince me that the folks who dismiss cryonics as pseudoscience are essentially correct, that the right answer to the survey question was epsilon. I've seen what seem like convincing objections to cryonics, and it seems possible that an expanded version of those arguments, with full references and replies to pro-cryonics arguments, would convince me. Or someone could just go to the trouble of showing that a large majority of cryobiologists really do think cryopreserved people are information-theoretically dead.
However, it's not clear to me how worthwhile it is to seek out such information. Coming up with decisive information would likely be hard, especially since e.g. ciphergoth has put a lot of energy into trying to figure out what the experts think about cryonics and come away without a clear answer. And part of the reason I signed up for cryonics in the first place is that it doesn't cost me much: the largest component is the life insurance for funding, only $50 / month.
So I've decided to put a bounty on being persuaded to cancel my cryonics subscription. If no one succeeds in convincing me, it costs me nothing, and if someone does succeed in convincing me the cost is less than the cost of being signed up for cryonics for a year. And yes, I'm aware that providing one-sided financial incentives like this requires me to take the fact that I've done this into account when evaluating anti-cryonics arguments, and apply extra scrutiny to them.
Note that while there are several issues that ultimately go into whether you should sign up for cryonics (the neuroscience / evaluation of current technology, estimates of the probability of a "good" future, various philosophical issues), I anticipate the greatest chance of being persuaded by scientific arguments. In particular, I find questions about the personal identity and consciousness of uploads made from preserved brains confusing, but I think there are very few people in the world, if any, who are likely to have much chance of getting me un-confused about those issues. The offer is blind to the exact nature of the arguments given, but I mostly foresee being persuaded by the neuroscience arguments.
And of course, I'm happy to listen to people tell me why the anti-cryonics arguments are wrong and I should stay signed up for cryonics. There's just no prize for doing so.