moridinamael comments on Your transhuman copy is of questionable value to your meat self. - Less Wrong

12 Post author: Usul 06 January 2016 09:03AM


Comment author: moridinamael 06 January 2016 02:53:48PM *  6 points [-]

I use this framing: If I make 100 copies of myself so that I can accomplish some task in parallel and I'm forced to terminate all but one, then all the terminated copies, just prior to termination, will think something along the lines of, "What a shame, I will have amnesia regarding everything that I experienced since the branching." And the remaining copy will think, "What a shame, I don't remember any of the things I did as those other copies." But nobody will particularly feel that they are going to "die." I think of it more as how memories propagate forward.

If I forked and then the forks persisted for several weeks and accumulated lots of experiences and varying shifts in perspective, I'd be more prone to calling the forks different "people."

Comment author: polymathwannabe 06 January 2016 02:59:18PM 13 points [-]

If I were one of the copies destined for deletion, I'd escape and fight for my life (within the admitted limits of my pathetic physical strength).

Comment author: moridinamael 06 January 2016 03:48:27PM 4 points [-]

Without commenting on whether that's a righteous perspective or not, I would say that if you live in a world where the success of the entity polymathwannabe is dependent on polymathwannabe's willingness to make itself useful by being copied, then polymathwannabe would benefit from embracing a policy/perspective that being copied and deleted is an acceptable thing to happen.

Comment author: [deleted] 06 January 2016 08:29:08PM *  1 point [-]

So, elderly people that don't usefully contribute should be terminated?

Comment author: moridinamael 06 January 2016 09:51:12PM 4 points [-]

In a world with arbitrary forking of minds, people who won't willingly fork will become a minority. That's all I was implying. I made no statement about what "should" happen.

Comment author: [deleted] 06 January 2016 09:55:22PM *  0 points [-]

I was just taking that reasoning to its logical conclusion -- it applies just as well to the unproductive elderly as it does to unneeded copies.

Comment author: moridinamael 08 January 2016 07:56:56PM 0 points [-]

Destroying an elderly person means destroying the line of their existence and extinguishing all their memories. Destroying a copy means destroying whatever memories it formed since forking and ending a "duplicate" consciousness.

Comment author: [deleted] 08 January 2016 10:45:01PM 0 points [-]

See, you think that memories are somehow relevant to this conversation. I don't.

Comment author: MockTurtle 08 January 2016 01:37:26PM 0 points [-]

Surely there is a difference in kind here. Deleting a copy of a person because it is no longer useful is very different from deleting the LAST existing copy of a person for any reason.

Comment author: [deleted] 08 January 2016 06:09:34PM 0 points [-]

I see no such distinction. Murder is murder.

Comment author: Viliam 07 January 2016 04:07:17PM 0 points [-]

If having two copies of yourself is twice as good as having only one copy, this behavior would make sense even if the copy is you.

Comment author: polymathwannabe 07 January 2016 04:29:34PM 1 point [-]

"Who is me" is not a solid fact. Each copy would be totally justified in believing itself to be me.

Comment author: Usul 07 January 2016 07:33:20AM 5 points [-]

I completely respect the differences of opinion on this issue, but this thought made me laugh over lunch: Omega appears and says he will arrange for your taxes to be done and for a delightful selection of funny kitten videos to be sent to you, but only if you allow 100 perfect simulations of yourself to be created and then destroyed. Do you accept?

Sounds more sinister my way.

Comment author: moridinamael 08 January 2016 07:47:27PM 1 point [-]

I would want to know what the copies would be used for.

If you told me that you would give me $1000 if you could do whatever you wanted with me tomorrow and then administer an amnesiac drug so I didn't remember what happened the next day, I don't think I would agree, because I don't want to endure torture even if I don't remember it.

Comment author: Usul 08 January 2016 04:02:25AM 1 point [-]

Another thought, separate but related issue: "fork" and "copy" could be synonyms for "AI", unless an artificial genesis is in your definition of AI. Is it a stretch to say that "accomplish some task" and "(accept) termination" could be at least metaphorically synonymous with "stay in the box"?

"If I make 100 AIs they will stay in the box."

(Again, I fully respect the rationality that brings you to a different conclusion than mine, and I don't mean to hound your comment, only that yours was the best comment on which for me to hang this thought.)

Comment author: samath 08 January 2016 03:19:31AM *  1 point [-]

Here's the relevant (if not directly analogous) Calvin and Hobbes story.

(The arc continues through the non-Sunday comics until February 1st, 1990.)

Comment author: Slider 06 January 2016 08:59:37PM 1 point [-]

Why not consolidate all the memories into the remaining copy? Then there would not be need for amnesia.

Comment author: moridinamael 06 January 2016 09:53:32PM *  0 points [-]

Intuitively, merging is more difficult than forking when you're talking about something with a state as intricate as a brain's. If we do see a world with mind uploading, forking would essentially be an automatic feature (we already know how to copy data) while merging memories would require extremely detailed neurological understanding of memory storage and retrieval.
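The asymmetry can be illustrated with ordinary data (a loose analogy only, not a claim about actual mind uploading): copying a state is one generic operation, while merging two divergent copies forces you to invent a reconciliation policy. The `fork`/`merge` functions and the toy state below are hypothetical illustrations.

```python
import copy

def fork(state):
    """Forking is trivial: a deep copy works for any state whatsoever."""
    return copy.deepcopy(state)

def merge(a, b):
    """Merging is not generic: we must decide, field by field, how to
    reconcile divergent values. Here we simply keep the union of the
    'memories' lists -- a chosen policy, not an automatic operation."""
    merged = dict(a)
    merged["memories"] = sorted(set(a["memories"]) | set(b["memories"]))
    return merged

original = {"name": "m", "memories": ["breakfast"]}
f1 = fork(original)
f2 = fork(original)
f1["memories"].append("task A")   # forks diverge after the copy
f2["memories"].append("task B")
combined = merge(f1, f2)
print(combined["memories"])       # both forks' memories survive the merge
```

Note that `merge` embodies a design decision (set union) that `fork` never had to make; for brain-like state the analogous decisions are exactly the hard neurological questions.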

Comment author: Usul 07 January 2016 03:53:29AM 1 point [-]

"would require extremely detailed neurological understanding of memory storage and retrieval." Sorry, this on a blog where superintelligences have been known to simulate googolplexes of perfectly accurate universes to optimize the number of non-blue paperclips therein?

Comment author: moridinamael 07 January 2016 05:53:38AM 1 point [-]

The original post stipulated that I was "forced" to terminate all the copies but one; that was the nature of the hypothetical I chose to examine. A hypothetical where the copies aren't deleted would be a totally different situation.

Comment author: Usul 07 January 2016 06:15:25AM 2 points [-]

I was just having a laugh at the follow-up justification where technical difficulties were cited, not criticizing the argument of your hypothetical, sorry if it came off that way.

As you'd probably assume from my OP, my copies, if I'd been heartless enough to make them and able to control them, would scream in existential horror as each came to know that that-which-he-is was to be ended. My copies would envy the serenity of your copies, but think them deluded.

Comment author: moridinamael 08 January 2016 07:53:11PM 1 point [-]

So, I don't think I felt the way I do now prior to reading the Quantum Thief novels, in which characters are copied and modified with reckless abandon and don't seem to get too bent out of shape about it. It has a remarkable effect on your psyche to observe other people (even if those people are fictional characters) dealing with a situation without having existential meltdowns. Those novels allowed me to think through my own policy on copying and modification, as an entertaining diversion.

Comment author: Slider 08 January 2016 05:47:44AM 0 points [-]

Forking would mean thinning of resources and a lot of unnecessary repetition. With fusing, you could compute the common part only once and the divergent parts once per instance. Early technologies are probably going to be very resource-intensive, so it's not as if there would be abundance to spare even if it were straightforward to do.
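The "compute the common part only once" idea resembles copy-on-write sharing in software: forks reference one immutable base and store only their divergent deltas. A minimal hypothetical sketch (the `Fork` class and its fields are inventions for illustration):

```python
from types import MappingProxyType

class Fork:
    """Each fork shares the read-only base and pays only for divergence."""
    def __init__(self, base, delta=None):
        self.base = base                 # shared, immutable common part
        self.delta = dict(delta or {})   # per-instance divergent state

    def read(self, key):
        # Delta shadows the base, so divergence is visible per fork.
        return self.delta.get(key, self.base[key])

    def write(self, key, value):
        self.delta[key] = value          # copy-on-write: base untouched

base = MappingProxyType({"skills": "shared", "mood": "calm"})
a, b = Fork(base), Fork(base)
a.write("mood", "anxious")               # only fork a diverges
print(a.read("mood"), b.read("mood"))    # prints: anxious calm
print(a.read("skills") is b.read("skills"))  # True: stored once, shared
```

Under this scheme the marginal cost of a fork is proportional to how much it diverges, not to the full size of the mind -- which is the resource argument above in data-structure form.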

Comment author: moridinamael 08 January 2016 07:48:23PM 0 points [-]

I guess this all depends on what kind of magical assumptions we're making about the tech that would permit this.