
provocateur_tmp comments on Open Thread, April 1-15, 2012 - Less Wrong Discussion

3 Post author: OpenThreadGuy 01 April 2012 04:24AM




Comment author: provocateur_tmp 04 April 2012 11:18:34PM 0 points [-]

If I could copy you, atom for atom, then kill your old body (painlessly), and give your new body $20, would you take the offer? Be as rational as you wish, but start your reply with "yes" or "no". Imagine that a future superhuman AGI will read the LW archives and honor your wish without further questions.

Comment author: [deleted] 05 April 2012 07:11:46PM 3 points [-]

No. It might copy me atom for atom and then not actually connect the atoms together to form molecules on the copy.

You also didn't mention that I would be in a safe place at the time, which means the AI could do it while I was driving along in my car. I would be confused about why I was suddenly sitting in the passenger seat (the new me is made first, so I obviously can't be in the driver's seat) with a 20-dollar bill in my hand while my car veered into oncoming traffic, and I would die in a car crash.

If an AI actually took the time to explain the specifics of the procedure, had been shown to perform it several times on other living beings, were doing it at a time I actually chose, and had an established 99.9999% safety record, then that's different. I would be far more likely to consider it. But the necessary safety measures aren't described as being there, and simply assuming "safety measures will exist even though I haven't described them" is just not a good idea.

Alternatively, you could offer more than just twenty, since given a sufficiently large amount of money and some heirs, I would be much more willing to take this bet even without guaranteed safety measures. That assumes I could at least be sure the money would be safe (although I doubt I could, since "Actually, your paper cash was right here, but it burned up in the fireball when an antimatter-matter reaction was used to power the process" is also a possible failure mode).

But "At some random point in the future, would you like someone very powerful whom you don't trust to mess with your constituent atoms in a way you don't fully understand and that will not be fully described to you? It'll pay you twenty bucks." is not really a tempting offer when evaluating risks and rewards.

Comment author: Zack_M_Davis 05 April 2012 07:17:08AM 0 points [-]

Yes, it's a free $20. Why is this an interesting question?

Comment author: TheOtherDave 04 April 2012 11:32:44PM 1 point [-]

My willingness to take the offer depends, roughly speaking, on my confidence that you actually can do that, the energy costs involved, how much of a pain in my ass the process would be, etc., but assuming threshold-clearing values for all that stuff, sure. Which really means "no" unless the future superhuman AGI is capable of determining what I ought to mean by "etc." and what values my threshold ought to be set at, I suppose. Anyway, you can keep the $20; I would do it just for the experience, given those constraints.

Comment author: TimS 04 April 2012 11:41:18PM 0 points [-]

And the caveat that memories/personality are in the atoms, not in more fundamental particles.

Comment author: TheOtherDave 04 April 2012 11:51:52PM 0 points [-]

Yeah, definitely. I took "atom for atom" as a colloquial way of expressing "make a perfect copy".
The "etc" here covers a multitude of sins.