Mirzhan_Irkegulov comments on Leaving LessWrong for a more rational life - Less Wrong

33 [deleted] 21 May 2015 07:24PM




Comment author: Vaniver 21 May 2015 08:56:43PM 16 points

On the other hand, de Grey and others who are primarily working on the scientific and/or engineering challenges of singularity and transhumanist technologies were far less likely to subject themselves to epistemic mistakes of significant consequence.

This part isn't clear to me. The researcher who goes into generic anti-cancer work, instead of SENS-style anti-aging work, probably has made an epistemic mistake with moderate consequences, because of basic replaceability arguments.

But to say that MIRI's approach to AGI safety is due to a philosophical mistake, and one with significant consequences, seems like it requires much stronger knowledge. Shooting very high instead of high is riskier, but not necessarily wronger.

Thankfully there is an institution that is doing that kind of work: the Future of Life Institute (not MIRI).

I think you underestimate how much MIRI agrees with FLI.

Why they do not get more play in the effective altruism community is beyond me.

SENS is the second largest part of my charity budget, and I recommend it to my friends every year (on the obvious day to do so). My speculations on why EAs don't favor them more highly mostly have to do with the difficulty of measuring progress in medical research vs. fighting illnesses, and possibly also the specter of selfishness.

Comment author: Mirzhan_Irkegulov 21 May 2015 11:01:28PM 6 points

Yudkowsky obviously supports immortality. Quote from his letter on his brother's death:

If you object to the Machine Intelligence Research Institute then consider Dr. Aubrey de Grey's Methuselah Foundation, which hopes to defeat aging through biomedical engineering.

If SENS is not sufficiently promoted as a target for charity, I have no idea why that is, and I dispute that it's because of the LW community's philosophical objections, unless somebody can convince me otherwise. BTW, the EA community != the LW community, so maybe lots of effective altruists just don't consider immortality the same way they do malaria (cached thoughts etc.).

Comment author: [deleted] 21 May 2015 11:31:40PM 2 points

If SENS is not sufficiently promoted as a target for charity, I have no idea why is that, and I dispute that it's because of LW community's philosophical objections, unless somebody can convince me otherwise.

To be clear, this was not an intended implication. I'm aware that Yudkowsky supports SENS; in fact, my memory is fuzzy, but it may have been through exactly the letter you quote that I first heard of SENS.