
Pentashagon comments on DRAFT: Ethical Zombies - A Post On Reality-Fluid

Post author: MugaSofer 09 January 2013 01:38PM




Comment author: Pentashagon 10 January 2013 11:22:49PM 0 points

Is it worse to torture a virtual person running on redundant hardware (say, 3 computers in lock-step, as the Space Shuttle used) whose permanent state (or backups) is stored on a RAID1 array, than one running on a single CPU with one disk? Or, even simpler: is it worse to torture a more massive person than a less massive one? Personally, I would say no.

Just like there's only one electron, I think there's only one of any particular thing, at least in the map. The territory may actually be weird and strange, but I don't have any evidence that redundant exact copies have as much moral weight as a single entity. I think that it's worse to torture 1 non-redundant person than it is to torture n-1 out of n exact copies of that person, for any n. That only applies if it's exactly the same simulation n-1 times. If those simulations start to diverge into n different persons, it starts to become as bad as torturing n different unique people. Eventually even those n-1 exact copies would diverge enough from the original to be considered copies of a different person with its own moral weight. My reasoning is just probabilistic expected utility: it's worse for an agent to expect p(torture) = 1 than p(torture) = (n-1)/n, and an identical agent can't distinguish between identical copies of itself (including its environment).
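To make that comparison explicit (a minimal sketch in my own notation, assuming a fixed disutility D per subjective experience of torture, and that exact copies count as one person):

\begin{align*}
E[\text{disutility} \mid \text{torture the 1 non-redundant person}] &= 1 \cdot D = D \\
E[\text{disutility} \mid \text{torture } n-1 \text{ of } n \text{ exact copies}] &= \frac{n-1}{n} \cdot D \;<\; D
\end{align*}

The inequality holds for every finite n, which is why I rank torturing the single non-redundant person as worse.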

Comment author: OrphanWilde 10 January 2013 11:31:22PM 1 point

As soon as you start torturing one of those identical agents, it ceases to be identical.

I guess the question from there is: does this produce a cascade of utility, as small divergences in the simulated universe produce slightly different agents for the other 6 billion people in the simulation, whose utility then exists independently?

Comment author: MugaSofer 11 January 2013 12:39:31PM -2 points

The position that people gain moral worth the more "real" they get is one I have seen on LW; it is unintuitive, but the arguments for it do seem reasonable. (It is also rather more coherent when used in a Big Universe.) This post assumes that position, and includes a short version of the most common argument for it.

Incidentally, I used to hold the position you describe; how do you deal with the fact that a tortured copy is, by definition, no longer "part" of the original?