Nornagest comments on "Ray Kurzweil and Uploading: Just Say No!", Nick Agar - Less Wrong

4 Post author: gwern 02 December 2011 09:42PM




Comment author: Nornagest 03 December 2011 12:50:00AM 3 points

I'd rate the chance that early upload techniques miss some necessary components of sapience as reasonably high, but that's a technical problem rather than a philosophical one. My confidence in uploading in principle, on the other hand, is roughly equivalent to my confidence in reductionism: which is to say pretty damn high, although not quite one, or even one minus epsilon. Specifically: for all possible upload techniques to generate a discontinuity in a way that, say, sleep doesn't, it seems to me that not only would minds need to involve some kind of irreducible secret sauce, but also that it would need to be bound to its substrate in a non-transferable way, which would be rather surprising. Some kind of delicate QM nonsense might fit the bill, but that veers dangerously close to woo.

The most parsimonious explanation seems to be that, yes, it involves a discontinuity in consciousness, but so do all sorts of phenomena that we don't bother to note or even notice. Which is a somewhat disquieting thought, but one I'll have to live with.

Comment author: vi21maobk9vp 03 December 2011 09:07:23AM 0 points

Actually, http://lesswrong.com/lw/7ve/paper_draft_coalescing_minds_brain/ seems to discuss a way for uploading to be a non-destructive transition. We now know that the brain can learn to use implanted neurons under some very special conditions; so maybe you could first learn to use an artificial mind-holder (without a mind yet) as a minor supplement, then learn to use it more and more, until the death of your original brain is just a flesh wound. Maybe not, but it does seem to be a technological problem.

Comment author: Nornagest 03 December 2011 06:09:19PM 1 point

Yeah, I was assuming a destructive upload for simplicity's sake. Processes like the one you outline don't generate an obvious discontinuity, so I imagine they'd seem less intuitively scary; still, a strong Searlean viewpoint probably wouldn't accept them.