Johnicholas comments on What makes you YOU? For non-deists only. - Less Wrong
A challenge to all doubters.
Omega comes to you with a proposition (heck, I may be able to do it myself in a few decades): I offer to create N atomically precise clones of you on the other side of the planet and to give each one the dollar value of all your assets. You can set N as high as you like, provided there is some value of N at which you'll accept the bargain. The price is that I'll kill you ten seconds later.
I assure you that, from past experience, you will not notice the creation of these clones during the ten seconds you have to live. Your extreme similarity will not cause any magical sharing of experiences. And the clones will not notice anything when, ten seconds from their creation, you die; no empirically measurable (including self-reported) consciousness transfers from you to them.
Do you accept the offer?
If you accept, and you don't care about your own death, and your reason isn't the accomplishment of some external goal that matters more to you than your life - then there's a fundamental disconnect between you and me. Indeed, between you and what I naively consider to be universal human ways of thinking. (Of course, if this is the case, the fault lies in my understanding; I'm not trying to denigrate anyone who responds.)
If someone thinks the external-goals thingy is a problem (e.g., because you believe a continuous self does not exist, all decisions are ultimately taken to satisfy external goals), then I can try to formulate a version of the scenario where this is not a consideration.
I think in practice I'd probably set N pretty high (10? 100? 10k?) - it's hard to know what one will do in extreme situations, particularly such unlikely ones.
But an alternative question might be: what should a rational entity do? The answer to this alternative is much easier to compute, and I think it's where the N=1 or N=2 answers are coming from. Would you agree that a creature evolved in an environment with such teleporter-and-duplicators would casually use them at N=1 and eagerly use them at N=2?
Yes, of course such a creature would agree at N=1 and N=2. It's a direct way to maximize the number of descendants.
Don't describe it as the rational choice, though. Rationality by itself says nothing about which goals to have: accepting is the right thing to do only if your goal is to maximize the number of descendants, or of clones.
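To make the copy-counting arithmetic behind the N=1/N=2 claim explicit, here is a minimal sketch (not from the original thread; the function names are mine) of how a pure copy-maximizer would score the two options:

```python
# Minimal sketch: how an agent that only counts surviving copies of itself
# would score the offer "create N clones, then destroy the original".

def copies_if_accept(n_clones: int) -> int:
    # The original is killed ten seconds later; only the clones remain.
    return n_clones

def copies_if_decline() -> int:
    # No clones are made; the original survives.
    return 1

for n in (1, 2, 10, 100):
    print(f"N={n}: accept -> {copies_if_accept(n)}, decline -> {copies_if_decline()}")

# A pure copy-maximizer is indifferent at N=1 (one copy either way) and
# strictly prefers accepting for any N >= 2 -- hence "casually at N=1,
# eagerly at N=2" for a creature evolved around such devices.
```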
I agree with you that an entity with different goals would behave differently, and that evolution's "goal" isn't (entirely) the same as my goals.
However, there's a sense of coherence with the physical world that I admire about evolution's decisions, and I want to emulate that coherence in choosing my own goals.
The fact that evolution values "duplicate perfectly, then destroy original" equivalently to "teleport" isn't a conclusive argument that I should value them equivalently, but it's a suggestive argument towards that conclusion. The fact that my evolutionary environment never contained anything like that suggests that my gut feeling about it isn't likely to be helpful.
The balance of evidence seems to be against any such thing as continuous experience existing; it looks like an adaptive illusion, analogous to the blind spot. Valuing continuous experience highly just doesn't seem to cut nature at its joints.
I think you run into logistical problems when N gets large, by the way.