daenerys comments on On "Friendly" Immortality - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (103)
Unknown Consequences Discussion Thread
I'm going to bring up the classic Aubrey de Grey response, which works for all of these issues, but for this one in particular. Yes, there will be problems if we live much longer lives. Yes, they will be very big problems. No, we don't yet know what those problems will be. But those problems will pale in comparison to the needless deaths of hundreds of thousands of people every day.
Yup, there's ginormous status quo bias going on here. If we lived in a world stretched to the limit for resources, where policy-making is impossible because everyone is a bigoted moral fossil and the few sane leaders left have no clue what they're doing because it's all so new, and you proposed "Hey, I know! Let's kill everyone over 80!"... everyone would just stare and ask "Have you been reading Pebble in the Sky again?".
I think that's the hand-waving the OP was talking about.
How is this hand-waving?
If the consequences of life extension really are unknown, then they might just as easily be really good as really bad. Giving those up has a high opportunity cost. Without more information about the nature of these "unknown" consequences, they are not relevant (for non-risk-averse utility maximizers).
Reproductive rights deactivated at birth and restored pending parent certification, maybe? But even if we don't get nanotech or FAI, I bet we'll master biology via massively fast computers juggling the sequenced genes of all life on Earth. Combine this with a much greater understanding of the brain, and a lot of stuff gets demystified and de-romanticized*, and consequently optimized. Fewer things will be left up to chance, because we won't have to leave them to chance. This could be restricted solely to the developed world. I don't know.
Human nature is the elephant in the room. Will 10 billion un-adjusted humans accept that they can't ever expect to use resources like an average American circa 2005 did? Will we be able to update our stone-age software en masse without FAI?
I have my doubts about space colonization unless we can create Earth-like environments.
*If anyone wants to attempt a "de-romanticized in 2050" list, I'd really love to see it.