
Vladimir_Nesov comments on 2013 Census/Survey: call for changes and additions - Less Wrong Discussion

27 points · Post author: Yvain 05 November 2013 03:10AM



Comment author: Vladimir_Nesov 06 November 2013 11:20:15AM *  5 points

The anti-agathics question has the same problem as the cryonics question, and arguably a worse one: the total probability doesn't inform any decisions. The current question is as follows:

P(Anti-Agathics)
What is the probability that at least one person living at this moment will reach an age of one thousand years?

In my estimate, most of the factors influencing the answer have nothing to do with life-prolonging technologies. For a person to live for 1000 years, there must be no global catastrophe that kills everyone during that entire time. But most of the probability mass for the next 1000 years goes to either a global catastrophe or a FAI, with some going to stable upload systems, where it's unclear how to count the years. On the FAI branch, most humans likely won't persist in their original form, if at all, which further reduces the probability. The resulting probability therefore has almost nothing to do with longevity. (CharlesR also points out that there is a cryonics loophole, so I'm changing "will reach the age of ..." to "will live for at least ...".)
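The decomposition above can be sketched with toy numbers. All of the probabilities below are illustrative assumptions for the sake of the sketch, not estimates from the comment; the point is only that the answer is dominated by the catastrophe/FAI split rather than by longevity technology:

```python
# Toy decomposition of P(someone alive today lives 1000 years).
# Every number here is an illustrative assumption, not an actual estimate.
p_catastrophe = 0.4   # global catastrophe kills everyone within 1000 years
p_fai = 0.4           # FAI arrives; most humans don't persist in original form
p_uploads = 0.1       # stable upload systems; unclear how to count the years
p_mundane = 1 - p_catastrophe - p_fai - p_uploads  # "ordinary" future

p_survive_given_fai = 0.05       # few humans persist in their original form
p_longevity_given_mundane = 0.5  # anti-agathics actually deliver 1000 years

# Catastrophe contributes ~0; the upload branch is set aside as uncountable.
p_total = (p_fai * p_survive_given_fai
           + p_mundane * p_longevity_given_mundane)
print(round(p_total, 3))  # → 0.07
```

Under these assumptions, shifting `p_longevity_given_mundane` from 0.5 to 0.9 moves the answer far less than shifting the catastrophe/FAI split does, which is the sense in which the unconditional probability "has almost nothing to do with longevity".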

I suggest replacing the question with a hypothetical:

P(Anti-Agathics | No external defeaters)
What is the probability that at least one person living at this moment will live for at least one thousand years, conditional on no global catastrophe and no strong AI?