shminux comments on Advice for AI makers - Less Wrong

7 points · Post author: Stuart_Armstrong 14 January 2010 11:32AM


Comments (196)


Comment author: shminux 28 June 2014 07:26:11AM 2 points

The standard excuse given by those who pretend to believe in many worlds is that you are likely to get maimed in the universes where you get shot but don't die, which is somewhat unpleasant. If you come up with a more reliable way to commit quantum suicide, like using a nuke, they find another excuse.

Comment author: [deleted] 28 June 2014 04:17:24PM 0 points

Methinks that is still a lack of understanding, or a disagreement on utility calculations. I myself would rate the universes where I die as lower utility still than those where I get injured (indeed, the lowest possible utility).

Better still if in all the universes I don't die.

Comment author: DefectiveAlgorithm 29 June 2014 02:47:36AM 0 points

I do think 'a disagreement on utility calculations' may indeed be a big part of it. Are you a total utilitarian? I'm not. A big part of that comes from the fact that I don't consider two copies of myself to be intrinsically more valuable than one. Perhaps they are instrumentally valuable, if those copies can interact, sync their experiences and cooperate, but that's another matter. With experience-syncing, I am mostly indifferent to the number of copies of myself that exist (leaving aside potential instrumental benefits), but without it my utility decreases as the number of copies increases, since I assign zero terminal value to multiplicity but positive terminal value to the uniqueness of my identity.

My brand of utilitarianism is informed substantially by these preferences. I adhere to neither average nor total utilitarianism, but I lean closer to average. Whilst I would be against the use of force to turn a population of 10 with X utility each into a population of 3 with (X + 1) utility each, I would in isolation consider the latter preferable to the former (there is no inconsistency here - my utility function simply admits information about the past).
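The population example above can be made concrete with a little arithmetic. This is a minimal sketch, assuming an arbitrary per-person utility of X = 10 (the number and the function names are my placeholders, not anything from the thread):

```python
# Hypothetical illustration of the 10-person vs 3-person example.
# X = 10 is an arbitrary choice of per-person utility.

def total_utility(population):
    """Total utilitarianism: sum utility across everyone."""
    return sum(population)

def average_utility(population):
    """Average utilitarianism: mean utility per person."""
    return sum(population) / len(population)

X = 10
before = [X] * 10        # population of 10, each with utility X
after = [X + 1] * 3      # population of 3, each with utility X + 1

# A total utilitarian prefers the larger population:
print(total_utility(before), total_utility(after))      # 100 vs 33

# An average utilitarian prefers the smaller, slightly happier one:
print(average_utility(before), average_utility(after))  # 10.0 vs 11.0
```

The two criteria rank the same pair of outcomes in opposite orders, which is the disagreement the comment is gesturing at.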

Comment author: [deleted] 29 June 2014 06:41:43AM 0 points

That line of thinking leads directly to recommending immediate probabilistic suicide, or at least indifference to it. No thanks.

Comment author: DefectiveAlgorithm 29 June 2014 07:28:26AM 0 points

How so?