gwern comments on The Anthropic Trilemma - Less Wrong

Post author: Eliezer_Yudkowsky 27 September 2009 01:47AM




Comment author: gwern 10 October 2009 03:54:43PM 0 points

So only anything in immediate consciousness counts? Fine, let's remove all of the long-term memories of one of the copies - after all, he's not thinking about his childhood...

As far as a merging, well, in that case who, precisely, is the one that's being killed?

Obviously whichever one isn't there afterwards; if the bit is 1, then 0 got killed off & vice versa. If we erase both copies and replace them with the original, then both were killed.

Comment author: Psy-Kosh 10 October 2009 04:16:40PM 0 points

I'd have to say that IF two (equivalent) instances of a mind count as "one mind", then removing an unaccessed data store does not change that, so long as the effect of the removal doesn't propagate, directly or indirectly, to the conscious bits.

If one then restores that data store before its absence is noticed, then, conditional on the assumption that the two instances originally counted as one being, they remain one being.

EDIT: to clarify, though... my overall issue here is that I think we may be implicitly treating conscious agents as irreducible entities. If we're ever going to find an actual proper reduction of consciousness, we probably need to ask ourselves things like: "What if two agents are bit-for-bit identical... except for these couple of bits here? What if they were completely identical? Is the couple-bit difference enough that they might as well be completely different?" etc...
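(As a toy illustration of the "couple of bits" question above, and nothing more: one could model two agent states as equal-length byte strings and count the bits at which they differ. The names and states here are entirely hypothetical; this only makes the "bit-for-bit identical vs. a couple of bits apart" distinction concrete.)

```python
# Toy sketch: treat two agent "states" as byte strings and count
# differing bits. Hypothetical names; not a model of consciousness.

def bits_differing(a: bytes, b: bytes) -> int:
    """Number of bit positions at which two equal-length states differ."""
    assert len(a) == len(b)
    return sum(bin(x ^ y).count("1") for x, y in zip(a, b))

state_a = b"agent state"
state_b = bytearray(state_a)
state_b[0] ^= 0b00000001  # flip a single bit in one copy

print(bits_differing(state_a, state_a))         # 0: bit-for-bit identical
print(bits_differing(state_a, bytes(state_b)))  # 1: "a couple of bits" apart
```

The open question in the thread is whether that count of 1 matters at all before the difference propagates to the conscious bits.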

Comment author: Eliezer_Yudkowsky 10 October 2009 04:46:23PM 0 points

And if we restore a different long-term memory instead?

Comment author: Psy-Kosh 10 October 2009 04:59:32PM 0 points

I think I'd still have to say "nothing of significance happens until memory access occurs".

Until then, well, how is it any different than stealing your books... and then replacing them before you notice?

Now, as I said, we probably ought to be asking questions like: what if, in the actual "conscious processing" part of the agent, a few bits were changed in one instance, but just that? Initially, before the change propagates enough for the instances to completely diverge, what should we say? To say it instantly changes everything seems too much like saying "conscious agents are irreducible", so...

(just to clarify: I'm more laying out a bit of my confusion here rather than anything else, plus noting that we seem to have been, in our quest to find reductions for aspects of consciousness, implicitly treating agents as irreducible in certain ways)

Comment author: gwern 10 October 2009 06:49:00PM 0 points

(just to clarify: I'm more laying out a bit of my confusion here rather than anything else, plus noting that we seem to have been, in our quest to find reductions for aspects of consciousness, implicitly treating agents as irreducible in certain ways)

Indeed. It's not obvious what we can further reduce agents into without losing agents entirely; bit-for-bit identity is at least clear in a few situations.

(To continue the example - if we see the unaccessed memory as being part of the agent, then obviously we can't mess with it without changing the agent; but if we intuitively see it as like the agent having Internet access and the memory being a webpage, then we wouldn't regard it as part of the agent's identity.)