wedrifid comments on Lifeism, Anti-Deathism, and Some Other Terminal-Values Rambling - Less Wrong Discussion
Comments (87)
Yes. You included a lot of disclaimers and they seem to be sufficient.
According to my preferences there are already more humans around than is desirable, at least until we have settled a few more galaxies. That emphasizes just how important the no-externalities clause was to my judgement. Even the externality of slightly further diluting the neg-entropy in the cosmic commons would make the creation a bad thing.
I don't share your preference intuitions regarding self-clone-torture. I consider copies to be part of the output. If they are identical copies having identical experiences, then they mean little more than having a backup available. If some are getting tortured, then the overall output of the relevant computation really does suffer (in the 'get slightly worse' sense, although I suppose it is literal too).
It's OK. I (lightheartedly) reckon my clone army could take out your clone army if it became necessary to defend myselves. I/we'd then have to figure out how to put 'ourselves' back together again without merge conflicts once the mobilization was no longer required. That sounds like a tricky task, but it could be fun.
I derive my intuitions from the analogy of a CPU-inefficient interpreted language. I don't care about the 99% of wasted cycles, except secondarily as a moderate inconvenience. I care about whether the job gets done.