Digital immortality seems much cheaper than cryonics and of similar effectiveness. Why isn't it more popular?
I question whether a future society would be willing to bring someone back to life even if it were clear that the person wanted to be brought back and sufficient information had been stored to make it possible.
There might not be a moral reason to bring someone back to life: if future agents value contented agents, they could presumably create far more contented agents far more easily by engineering them from scratch for maximum contentment with minimum resource use.
There might not be an economic reason to bring someone back to life, because...
You suggested reading Longecity. However, it seems that most articles on Longecity are relevant only to whoever posted them, for example by asking what to do in a very specific situation, or aren't relevant at all to increasing one's chance of becoming immortal. Given this, how do you recommend reading Longecity, if at all?
Okay, when you would like me to help, email me at 0xakaria@gmail.com.
If a version is written in English, I'll probably be willing to review and proofread it. I'm a decent writer in English, and I know a fairly large amount about immortality. I wrote Immortality: A Practical Guide using a different Less Wrong account, in case you're interested.
Thanks for the response. Do you know if the book will be made available in English, and if so, approximately when?
Yes, that's why I'm asking here.
In case you didn't know, storing text as images, as your mind map does, is bad for accessibility. Readers who aren't visual, such as search-engine indexing bots and blind humans, have difficulty reading such text.
Thanks for the post. The bottom of the mind map references the book Immortality by Alexey Turchin, but an Internet search failed to turn up any links to it or any discussion of it. Do you know where it can be found?
I have been looking for articles discussing the extent to which terminal values change. This question is important because changes to terminal values are generally very harmful to the accomplishment of those values, as explained for AI under "Basic AI drives" here.
This article says that some values change. This paper suggests that there are core values that are unlikely to change. However, neither source says whether the values it examined are terminal values, and I'm not knowledgeable enough about psychology to determine whether they are.
Any relevant thoughts or ...
I have been considering the possibility that demographic changes due to mind uploading could be even more extreme than one might initially think. This could be driven by people who are both willing to create massive numbers of copies of themselves and better suited for an economic niche than anyone else is, or at least than anyone else willing to make very large numbers of copies of themselves. In that situation, it would be more profitable for a firm to hire such a person than to hire others, which may result in that niche...
I'm considering taking anti-androgens, but I'm not sure what effect this would have on lifespan.
Would anti-androgen use have effects on lifespan similar to those of castration? I know that both anti-androgens and castration decrease testosterone production, but I know almost nothing about this sort of thing, so I don't know whether that's relevant.
Anti-androgens are much easier to obtain than castration. According to this, "WPATH Standards of Care no longer encourage therapy as a requirement to access hormones".
Also, according to the article I linked, "...