Simulation_Brain comments on MWI, copies and probability - Less Wrong

13 [deleted] 25 June 2010 04:46PM

Comment author: Simulation_Brain 25 June 2010 06:07:15PM 5 points

I think the point is that not valuing non-interacting copies of oneself might be inconsistent. I suspect it is: consistency requires valuing parallel copies of ourselves just as we value future variants of ourselves, which is why we preserve our lives. Our future selves also can't "interact" with our current self.

Comment author: Morendil 25 June 2010 07:07:13PM 2 points

The poll in the previous post had to do with a hypothetical guarantee to create "extra" (non-interacting) copies.

In the situation presented here, there is nothing justifying the use of the word "extra", and it seems analogous to quantum-lottery situations that have been discussed previously. I clearly have a reason to want the world to be such that (assuming MWI) as many of my future selves as possible experience a future that I would want to experience.

As I have argued previously, the term "copy" is misleading anyway; on top of that, the word "extra" reinforced the connotations of copy-as-backup, whereas in MWI nothing of the sort is happening.

So, I'm still perplexed. Possibly a clack on my part, mind you.

Comment deleted 25 June 2010 07:42:05PM
Comment author: Morendil 25 June 2010 08:11:11PM 5 points

I value having a future that accords with my preferences. I am in no way indifferent to your tossing a grenade my way, with a subjective 1/2 probability of dying. (Or non-subjectively, "forcing half of the future into a state where all my plans, ambitions and expectations come to a grievous end.")

I am, however, indifferent to your taking an action (creating an "extra" non-interacting copy) which has no influence on what future I will experience.

Comment deleted 25 June 2010 08:47:17PM
Comment author: Morendil 25 June 2010 09:09:04PM 2 points

I wouldn't be happy to experience waking up and realizing that I was a copy about to be snuffed (or even wondering whether I was). So I would prefer not to inflict that on any future selves.

Comment deleted 25 June 2010 09:12:50PM
Comment author: Morendil 25 June 2010 09:21:55PM 3 points

It doesn't really seem to matter, in that case, that you wake them up at all.

And no, I wouldn't get very worked up about the fate of such patterns (except insofar as I would like them to be preserved for backup purposes).

Comment deleted 25 June 2010 07:44:04PM
Comment author: Morendil 25 June 2010 08:03:21PM 6 points

As cousin_it has argued, "selectively killing most of my future selves" is something that I subjectively experience as "having a sizeable probability of dying". That doesn't appeal.

Comment deleted 25 June 2010 08:06:31PM
Comment author: Morendil 25 June 2010 08:14:45PM 0 points

Yup.