Consider a future person living a happy and fulfilling life. They're unfortunate enough to suffer a severe accident, but there's time to preserve and then scan their brain fully, after which they can be brought back up in an emulator on a computer. [1] It doesn't matter that they're now running on digital hardware instead of in a biological brain; they're still a person and they still count.

Now imagine this person or "em" asks to be left alone, cuts off all communication with the rest of the world, and rejoices privately in finally being able to fully explore their introverted nature. This isn't what I imagine myself doing, but it is a choice I can respect.

Someone comes along and suggests turning off this person's emulation on the grounds that no one will know the difference, and we can use the hardware for something else. This seems wrong. Which means this computational process is valuable entirely for its own sake, independent of its effect on the world.

Unlike biological brains, computational processes are very flexible. We could run many copies, or run them much faster or slower than usual. We could run a specific segment of their independent experience repeatedly, perhaps the happiest few moments. It also seems unlikely that a full emulation of a human is the only thing that's valuable. Perhaps there are simpler patterns we could emulate that would be much better in terms of value per dollar?

I'm trying to reduce my concept of value, and I keep running into strange questions.

I also posted this on my blog.


[1] I think this will be possible, but not for a while.


15 comments

It seems to me that the unfamiliarity of the scenario is preventing you from harnessing some perfectly standard moral intuitions. Let's suppose, instead, that we have a Mark 1 human running on ordinary carbon, who announces his intention to be a hermit off in the woods and never speak to anyone again. Further, he says, he will need regular supplies of apples so as to avoid scurvy. Would you, on the grounds that this person has value in himself, go to the appointed spot every week with a bag of apples? If not, why would you supply the em with electricity? I observe in passing that some cultures have indeed supplied their holy hermits with regular offerings, but it is not clear to me that they did so while never seeing the hermit or receiving a blessing from him.

Now, if the em is not demanding charity but is running on electricity he owns, then the question seems quite different. In that case the observation "nobody will know the difference" is factually wrong: At least two people will know that they live in a culture in which contracts or property rights are not always respected when the owner is not there to defend his rights. There are all kinds of good reasons not to take any steps towards such a society, which have nothing to do with the value of any particular em.

Would you, on the grounds that this person has value in himself, go to the appointed spot every week with a bag of apples?

I would see this as a potential charitable act, in competition with other charities. As such it's not particularly efficient: my time plus a bag of apples weekly to keep him free of scurvy is nowhere near as good as something like the AMF or SCI.

So perhaps the value of keeping him alive is too low for the cost, and similarly we could have this for an emulated person. But the important thing is that they do have value independent of their effect on others, and there may be cases where supporting emulations could be the most effective charity.

Well, there you go then: Now we have a standard problem in efficient charity. No new intuitions required.

It's still a problem, in that valuing a computational process remains somewhat bizarre. Questions like whether it's still valuable to run exactly identical copies or rerun the same computation repeatedly from the same state just don't come up with people.

[-]fare110

You should DEFINITELY read Greg Egan's "Permutation City", where he explores all kinds of such concepts even to the point of absurdity -- but you are the one who gets to decide where it starts to be absurd and why; he just does the exploring in a delicious SF novel.

I agree that it does seem likely that humans (would) value certain classes of computations.

Someone comes along and suggests turning off this person's emulation on the grounds that no one will know the difference, and we can use the hardware for something else.

You could consider the rest of humanity to be its own computation, and then it also seems obvious that it would be wrong for this lone emulation to shut down the rest of humanity.

It also seems unlikely that a full emulation of a human is the only thing that's valuable. Perhaps there are simpler patterns we could emulate that would be much better in terms of value per dollar?

The first things that come to mind are babies and pets. However, I don't think it is as valuable to keep a computation at baby-level as it would be to allow it to extend to normal human level (by growing up to an adult, essentially). And for pets, I think at least part of the value comes from the interaction with a human-level computation (since people don't seem to value arbitrary animals used for meat nearly as much as pets are valued). So I don't think that either of these cases could be used as a substitute; at least I wouldn't find it very valuable if we tiled the universe in baby or cat emulations.

It also seems unlikely that a full emulation of a human is the only thing that's valuable.

I actually find this moderately likely, at least in the sense that I think most people would consider it very undesirable not to have a "complete" life experience. What constitutes a "complete" life might vary with culture (is death required for a complete life?), but I think there would be some sort of minimum valuable computation.

It's entirely possible that the universe that we live in is a computational process. If so, it does not appear to take any input, and could well not provide any output. If we were to find out that this were the case (I can see the headline now: 'GOOD NEWS: FREE UNLIMITED ENERGY -- BAD NEWS: RELIES ON FLOATING POINT ROUNDING ERROR'), I would not suddenly conclude that my life had no value. This being the case, I must extend the same sort of value to computational processes running under our universe.

[-][anonymous]00

A typical formulation of Turing machines stipulates that they take no input. For certain purposes, Turing machines have as their only output whether or not they halt.

Turing machines that take input take it only as the initial state of their tape. Those that produce output produce it as the final state of their tape.
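For concreteness, here's a minimal sketch of that convention in Python (the simulator, the unary-incrementer transition table, and the name run_turing_machine are my own illustration, not anything referenced in this thread): the machine's only input is the initial tape, and its only output is whatever the tape holds when it halts.

    def run_turing_machine(transitions, tape, state="start", blank="_", max_steps=10000):
        # transitions maps (state, symbol) -> (new_state, new_symbol, move), move in {-1, 0, +1}.
        cells = dict(enumerate(tape))  # sparse tape: position -> symbol
        head = 0
        for _ in range(max_steps):
            if state == "halt":
                break
            symbol = cells.get(head, blank)
            state, cells[head], move = transitions[(state, symbol)]
            head += move
        # The machine's only "output" is the final tape, read left to right.
        return "".join(cells[i] for i in sorted(cells)).strip(blank)

    # Example machine: a unary incrementer. Its "input" is just the initial tape.
    increment = {
        ("start", "1"): ("start", "1", +1),  # walk right over the 1s
        ("start", "_"): ("halt", "1", 0),    # write one more 1, then halt
    }

    print(run_turing_machine(increment, "111"))  # prints "1111"

Nothing in the run itself consumes input or emits output; the tape before and after is all there is.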

Someone comes along and suggests turning off this person's emulation on the grounds that no one will know the difference, and we can use the hardware for something else. This seems wrong. Which means this computational process is valuable entirely for its own sake, independent of its effect on the world.

I expect it only feels wrong because if you were in an introverted state, you would prefer the power not be turned off on you (so you wish for others to have the same courtesy).

I'm not sure that changes the logic, though. Why do I prefer that the power not be turned off to introverted-me, if not because I consider executing introverted-me valuable?

I was drawing a distinction between "I consider introverted-me valuable and worthy of avoiding execution" and "this computational process is valuable entirely for its own sake". Which, on reflection, may not be much of a distinction.

[-]Shmi20

This seems wrong.

Why does this seem wrong to you? Can you trace your reasoning to your terminal values?

[-]fare-10

Suppose I put some em in a context that makes him happy and that somehow "counts". What if I take the one em whose happiness is maximal (by size / cost / whatever measure), then duplicate the very same em, in the very same context, ad infinitum, and have a gazillion copies of him, e.g. being repeatedly jerked off by $starlet? Does each new copy count as much as the original? Why? Why not? What if the program was run on a tandem computer for redundancy, with two processors in lock step doing the same computation? Is it redundant in that case, or does it count double? What if I build a virtual machine in which this entire simulation happens in one instruction? Since the simulation has no I/O, what if my optimized implementation does away with it?
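To make that last question concrete, here's a toy sketch (the LCG-style state update and the names simulate and seed are made-up stand-ins, not anything from this comment): a deterministic computation with no I/O can be memoized, so "running it again" is externally indistinguishable from not running it at all, which is exactly what makes it unclear whether the second run should count.

    import functools

    # A deterministic "simulation step" with no I/O: the result depends only on
    # the starting state, so nothing outside can tell a recomputation from a
    # cache hit. The update rule is an arbitrary 64-bit LCG, not a model of an em.
    @functools.lru_cache(maxsize=None)
    def simulate(seed, steps):
        state = seed
        for _ in range(steps):
            state = (state * 6364136223846793005 + 1442695040888963407) % 2**64
        return state

    first = simulate(42, 100000)   # actually computed
    second = simulate(42, 100000)  # served from the cache: never re-run
    print(first == second)         # True; externally indistinguishable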

You're still deep into the fairy dust theory of utility. More nano-paperclips, please!

I tried that earlier and ran into a different issue with resource allocation:

http://lesswrong.com/lw/d8z/a_small_critique_of_total_utilitarianism/6x3y

Also, for isolated computational processes it is not even quite clear if running them or not running them can feel any different from inside. Running once or running several times ought to feel the same from inside.

I recommend launching yourself into space with your computational substrate strapped to a thorium reactor, with enough delta-v to make catching you a waste of energy.