
gurugeorge comments on "Immortal But Damned to Hell on Earth" - Less Wrong Discussion

1 Post author: Bound_up 29 May 2015 07:55PM




Comment author: gurugeorge 30 May 2015 09:38:57PM 0 points

Oh, true for the "uploaded prisoner" scenario. I was just thinking of someone who'd deliberately uploaded themselves and wasn't restricted - clearly suicide would be possible for them.

But even for the "uploaded prisoner", given sufficient time it would be possible - there's no absolute impermeability to information anywhere, is there? And where there's information flow, control is surely ultimately possible? (The image that just popped into my head was something like training mice, via flashing lights, to gnaw the wires :) )

But that reminds me of the problem of trying to isolate an AI once built.

Comment author: Lumifer 01 June 2015 03:07:00PM 1 point

I was just thinking of someone who'd deliberately uploaded themselves and wasn't restricted - clearly suicide would be possible for them.

That is not self-evident to me at all. If you don't control the hardware (and the backups), how exactly would that work? As a parallel, imagine yourself as a sole mind, without a body. How would your sole mind kill itself?

And where there's information flow, control is surely ultimately possible?

Huh? Of course not. Information is information and control is control. Don't forget that as you accumulate information, so do your jailers.