MichaelGR comments on Normal Cryonics - Less Wrong

Post author: Eliezer_Yudkowsky 19 January 2010 07:08PM

Comment author: MichaelGR 20 January 2010 03:10:44PM 2 points

That was my understanding, but I think that any world in which there is an AGI that isn't Friendly probably won't be very stable. If that happens, I think it's far more likely that humanity will be destroyed quickly and you won't be woken up than that a stable but "worse than death" world will form and decide to wake you up.

But maybe I'm missing something that makes such "worse than death" worlds plausible.

Comment author: wedrifid 20 January 2010 03:34:27PM 2 points

> That was my understanding, but I think that any world in which there is an AGI that isn't Friendly probably won't be very stable.

I think you're right. The main risk would be a Friendly-to-Someone-Else AI.