FeepingCreature comments on Forked Russian Roulette and Anticipation of Survival - Less Wrong

Post author: FeepingCreature 06 April 2012 03:57AM, 7 points



Comment author: Mitchell_Porter 06 April 2012 10:48:38AM *  6 points

I know that's your idea, I'm saying it's stupid. If I torture you every night and wipe your memory before morning, are you just indifferent to that? I could add this to the torture: "I asked your daylight self after the mindwipe if it would be wrong to do what I'm doing to you, and he said no, because by black-box reasoning torturing you now doesn't matter, so long as I erase the effects by morning."

ETA: Maybe it's harsh to call it stupid when your original scenario wasn't about deliberately ignoring torture inside the black box. It was just an innocent exercise in being copied and then having one of the two of you deleted.

But you cannot presume that the person who anticipates surviving with certainty is correct, just because a copy of them certainly survives to get the bigger payoff. Your argument is: hey cryonics skeptic, here we see someone with a decision procedure which identifies the original with its copies, and it gets the bigger payoff; so judged by the criterion of results obtained ("winning") this is the superior attitude, therefore the more rational attitude, and so your objection to cryonics is irrational.

However, this argument begs the question of whether the copy is the same person as the original. A decision procedure would normally be regarded as defective if it favors an outcome because of mistaken identity - because person X gets the big payoff, and it incorrectly identifies X with the intended beneficiary of the decision-making. And here I might instead reason as follows: that poor fool who volunteers for iterated Russian roulette has been fooled by the setup into thinking that he gets to experience the payoff, just because a copy of him does.

As I recently wrote here, there is a "local self", the "current instance" of you, and then there may be a larger "extended self" made of multiple instances with which your current instance identifies. In effect, you are asking people to adopt a particular expansive identity theory - you want them to regard their copies as themselves - because it means bigger payoffs for them in your thought-experiment. But the argument is circular. For someone with a narrow identity theory ("I am only my current instance"), to run the gauntlet of iterated Russian roulette really is to make a mistake.

The scenario where we torture you and then mindwipe you is not an outright rebuttal of an expansive attitude to one's own personal identity, but it does show that the black-box argument is bogus.

Comment author: FeepingCreature 06 April 2012 01:38:22PM 0 points

And your edit leaves you with an interesting conundrum.

It can put you in a situation where you see people around you adopting one of two strategies: the people who adopt one strategy consistently win, and the people who adopt the other consistently lose, yet you still refuse to adopt the winning strategy because you think the people who win are... wrong.

I'm not sure if you can call that a win.

Comment author: Mitchell_Porter 06 April 2012 02:10:34PM *  5 points

"Win" by what standards? If I think it is ontologically and factually incorrect - an intellectual mistake - to identify with your copies, then those who do aren't winning, any more than individual lemmings win when they dive off a cliff. If I am happy to regard a person's attitude to their copies as a matter of choice, then I may regard their choices as correct for them and my choices as correct for me.

Robin Hanson predicts a Malthusian galactic destiny, in which the posthuman intelligences of the far future are all poorer than human individuals of the present, because selection will favor value systems which are pro-replication. His readers often freak out over Robin's apparent approval of this scenario of crowded galactic poverty; he approves because he says that these far-future beings will be emotionally adapted to their world; they will want things to be that way.

So this is a similar story. I am under no obligation to adopt an expansive personal identity theory, even if that is a theory whose spread is favored by the conditions of uploaded life. That is merely a statement about how a particular philosophical meme prospers under new conditions, and about the implications of that for posthuman demographics; it is not a fact which would compel me to support the new regime out of self-interest, precisely because I do not already regard my copies as me, and I therefore do not regard their winnings as mine.

Comment author: Luke_A_Somers 06 April 2012 06:59:20PM 3 points

Winning by the standard that a person who thinks gaining $1k is worth creating 1023 doomed copies of themselves will, in this situation, get ahead by $1k.
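The arithmetic behind this can be sketched as follows. Only the 1023 doomed copies and the $1k prize come from the comment; the forking structure (ten rounds of doubling leaving a single survivor) is a guess at the scenario's setup, so treat this as an illustration of the two identity theories' accounting, not as the original post's exact rules:

```python
# Payoff arithmetic for "forked Russian roulette" (setup assumed: 10 rounds
# of forking produce 2**10 = 1024 instances, of which exactly one survives
# to collect the prize; the 1023 and $1k figures are from the comment).

ROUNDS = 10
PRIZE = 1_000  # dollars

instances = 2 ** ROUNDS      # 1024 instances in total
doomed = instances - 1       # 1023 copies that get deleted
survivors = 1

# Narrow identity theory ("I am only my current instance"): a randomly
# chosen instance survives with probability 1/1024, so the expected
# payoff per instance is under a dollar.
ev_narrow = survivors / instances * PRIZE

# Expansive identity theory ("all my copies are me"): the surviving
# copy's winnings count as mine in full, so I anticipate the whole prize.
ev_expansive = PRIZE

print(doomed)                # 1023
print(round(ev_narrow, 2))   # 0.98
print(ev_expansive)          # 1000
```

The disagreement in the thread is precisely over which of these two numbers, `ev_narrow` or `ev_expansive`, is the right one to feed into the decision procedure.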