lavalamp comments on Timeless Identity - Less Wrong

Post author: Eliezer_Yudkowsky 03 June 2008 08:16AM





Comment author: shminux 01 October 2013 05:24:29PM, -1 points

"A strong subjective experience of moment-to-moment continuity" is an artifact of the algorithm your brain implements. It certainly exists, inasmuch as the algorithm itself exists. So does your personal identity. If it becomes possible in the future to run the same algorithm on different hardware, it will still produce this sense of personal identity and will feel like "you" from the inside.

Comment author: [deleted] 01 October 2013 05:53:06PM, 0 points

Yes, I'm not questioning whether a future simulation / emulation of me would have an identical subjective experience. To reject that would be a retreat to epiphenomenalism.

Let me rephrase the question so as to expose the problem: if I were to use advanced technology to have my brain scanned today, then get hit by a bus and cremated, and then 50 years from now that brain scan were used to emulate me, what would my subjective experience be today? Do I experience "HONK Screeeech, bam" and then wake up in a computer, or is it "HONK Screeeech, bam" and then oblivion?

Yes, I realize that both cases result in a computer simulation of Mark in 2063 claiming to have just woken up in the brain scanner, with a subjective feeling of continuity. But is that belief true? The two situations have very different outcomes for the Mark of 2013. If you can't see that, then I think we are talking about different things, and maybe we should taboo the phrase "personal/subjective identity".

Comment author: lavalamp 01 October 2013 07:17:20PM, 0 points

Do I experience “HONK Screeeech, bam” then wake up in a computer, or is it “HONK Screeeech, bam” and oblivion?

Non-running algorithms have no experiences, so the latter is not a possible outcome. I think this is perhaps an unspoken axiom here.

Comment author: [deleted] 01 October 2013 07:25:01PM, 1 point

Non-running algorithms have no experiences, so the latter is not a possible outcome. I think this is perhaps an unspoken axiom here.

No disagreement here; that's what I meant by oblivion.

Comment author: lavalamp 01 October 2013 07:31:37PM, 1 point

OK, cool, but now I'm confused. If we mean the same thing, I don't understand how it can still be a question: "not running" isn't something an algorithm can experience; it's a logical impossibility.