
wedrifid comments on Lifeism, Anti-Deathism, and Some Other Terminal-Values Rambling - Less Wrong Discussion

Post author: Pavitra 07 March 2011 04:35AM




Comment author: wedrifid 07 March 2011 02:32:37PM 3 points [-]

Do you push the button?

Yes. You included a lot of disclaimers and they seem to be sufficient.

According to my preferences there are already more humans around than desirable, at least until we have settled a few more galaxies. This emphasizes just how important the no-externalities clause was to my judgement: even the externality of diluting the neg-entropy in the cosmic commons slightly further would make the creation a bad thing.

I don't share your preference intuitions regarding self-clone-torture. I consider copies to be part of the output. If they are identical copies having identical experiences, then they mean little more than having a backup available. If some are getting tortured, then the overall output of the relevant computation really does suffer (in the 'gets slightly worse' sense, although I suppose it is literal too).

Also, I would hesitate to torture copies of other people, on the grounds that there's a conflict of interest and I can't trust myself to reason honestly. I might feel differently after I'd been using my own fork-slaves for a while.

It's OK. I (lightheartedly) reckon my clone army could take out your clone army if it became necessary to defend myselves. I/we'd then have to figure out how to put 'ourselfs' back together again without merge conflicts once the mobilization was no longer required. That sounds like a tricky task, but it could be fun.

Comment author: Pavitra 08 March 2011 04:22:21AM 0 points [-]

I don't share the same preference intuitions as you regarding self-clone-torture. I consider copies to be part of the output.

I derive my intuitions from the analogy of a CPU-inefficient interpreted language. I don't care about the 99% wasted cycles, except secondarily as a moderate inconvenience; I care about whether the job gets done.