Eliezer_Yudkowsky comments on Normal Cryonics - Less Wrong

Post author: Eliezer_Yudkowsky 19 January 2010 07:08PM




Comment author: Eliezer_Yudkowsky 20 January 2010 04:06:07AM 3 points

If your life were literally at stake and I were a Friendly AI, I bet I could wake you up next to someone who could become fast friends with you within five hours. It doesn't seem like a weak link in the chain, let alone the weakest one.

Comment author: Alicorn 20 January 2010 04:10:09AM 2 points

It is the most terrifying link in the chain. Most of the other links, if they break, just look like a dead Alicorn, not a dead Alicorn who killed herself in a fit of devastating, miserable starvation for personal connection.

If you thought it was reasonably likely that, given the success of cryonics, you'd be obliged to live without something you'd presently feel suicidal without (I'm inclined to bring up your past analogy of sex and a heroin fix here, but substitute whatever works for you), would you be so gung-ho?

Comment author: Eliezer_Yudkowsky 20 January 2010 04:20:52AM 7 points

I could sorta understand this if we were talking about one person you couldn't live without; it's the idea of worrying about not having any deep friends in general that's making me blink.

Some people are convinced they'll have to live without the strangest things after the Singularity... having encountered something possibly similar before, I do seriously wonder if you might be suffering from a general hope-in-the-future deficiency.

PS/Edit: Spider Robinson's analogy, not mine.

Comment author: Kevin 20 January 2010 12:23:47PM 4 points

If you were the Friendly AI and Alicorn failed to make a fast friend as predicted, and that resulted in suicidal depression, would that depression be defined as mental illness and treated as such? Would recent wake-ups have the right to commit suicide? I think that's an incredibly hard question, so please don't answer if you don't want to.

Have you written anything on suicide in the metaethics sequence or elsewhere?

Comment author: wedrifid 20 January 2010 12:34:38PM 3 points

"would that depression be defined as mental illness and treated as such?"

And the relevant question extends to the assumption behind the phrase 'and treated as such'. Do people have the right to be nuts in general?

Comment author: Alicorn 20 January 2010 04:27:08AM 2 points

I have only managed to live without particular persons who've departed from my life for any reason by virtue of already having other persons to console me.

That said, there are a handful of people whose loss would trouble me especially terribly, but I could survive it with someone else around to grieve with.