lackofcheese comments on "Solving" selfishness for UDT - Less Wrong

Post author: Stuart_Armstrong 27 October 2014 05:51PM


Comment author: Stuart_Armstrong 29 October 2014 04:49:49PM 3 points

Probabilities are a function that represents what we know about events

As I said to lackofcheese:

If we create 10 identical copies of me and expose 9 of them to one stimulus and 1 to another, what is my subjective anticipation of seeing one stimulus over the other? 10% is one obvious answer, but I might take a view of personal identity that fails to distinguish between identical copies of me, in which case 50% is correct. What if identical copies will be recombined later? Eliezer had a thought experiment where agents were two-dimensional and could get glued to or separated from each other, and wondered whether this made any difference. I do too. And I'm also very confused about quantum measure, for similar reasons.

In general, the question "how many copies are there" may not be answerable in certain weird situations (or can be answered only arbitrarily).
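The 10% vs 50% split above can be made concrete with a toy calculation. This is my own sketch, not anything from the thread; the function name, parameters, and the "distinguish_copies" framing are all illustrative assumptions.

```python
from fractions import Fraction

def anticipation(n_a, n_b, distinguish_copies):
    """Toy subjective anticipation of seeing stimulus A, given that
    n_a copies will see A and n_b copies will see B, under two
    different views of personal identity."""
    if distinguish_copies:
        # "Count the copies" view: each copy is a distinct successor,
        # so anticipation is proportional to how many see each stimulus.
        return Fraction(n_a, n_a + n_b)
    # "Identical copies are one person" view: only the distinct
    # experience-classes count, and here there are exactly two.
    return Fraction(1, 2)

print(anticipation(9, 1, True))   # 9/10 for the majority stimulus
print(anticipation(9, 1, False))  # 1/2 under the non-distinguishing view
```

The point of the dispute is that nothing in the setup forces one branch of the `if` over the other; both return a well-defined number.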

EDIT: with copying and merging and similar, you get odd scenarios like "the probability of seeing something is x, the probability of remembering seeing it is y, the probability of remembering remembering it is z, and x, y, and z are all different." Objectively it's clear what's going on, but in terms of "subjective anticipation" it's not clear at all.
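One way x and y can come apart, sketched under an assumed copy-then-merge scenario of my own construction (not from the thread): you are copied into two, only one copy sees the stimulus, and then both copies are merged into one self that inherits that memory.

```python
from fractions import Fraction

# Hypothetical scenario (my construction, for illustration only):
# t0: you are copied into two; only copy A sees the stimulus.
# t1: both copies are merged into one self inheriting A's memory.

def p_see():
    # At t0 you have 2 equally-weighted successors; 1 of them sees it.
    return Fraction(1, 2)

def p_remember():
    # At t1 there is only 1 successor, and it remembers seeing it.
    return Fraction(1, 1)

print(p_see(), p_remember())  # x = 1/2 but y = 1, so x != y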

Or put more simply: there are two identical copies of you. They will be merged soon. Do you currently have a 50% chance of dying soon?

Comment author: lackofcheese 29 October 2014 08:00:17PM 1 point

You definitely don't have a 50% chance of dying in the sense of "experiencing dying". In the sense of "ceasing to exist" I guess you could argue for it, but I think that it's much more reasonable to say that both past selves continue to exist as a single future self.

Regardless, this stuff may be confusing, but it's entirely conceivable that with the correct theory of personal identity we would have a single correct answer to each of these questions.

Comment author: Stuart_Armstrong 30 October 2014 09:39:01AM 1 point

Conceivable. But it doesn't seem to me that such a theory is necessary, as its role seems to be merely to state probabilities that don't influence actions.