James_Miller comments on Why I haven't signed up for cryonics - Less Wrong

Post author: Swimmer963 12 January 2014 05:16AM


Comment author: James_Miller 12 January 2014 07:06:52PM 0 points

For scenario 1, it would almost certainly require less free energy simply to extract the information directly from the brain without ever bringing the person back to consciousness.

For scenario 2, you would seriously consider suicide if you feared that a failed friendly AI might soon be developed. Indeed, since there is a chance you will become incapacitated (say, by falling into a coma), you might want to destroy your brain long before such an AI could arise.