Consider a future person living a happy and fulfilling life. They're unfortunate enough to suffer a severe accident, but there's time to preserve and then scan their brain fully, after which they can be brought back up in an emulator on a computer. [1] It doesn't matter that they're now running on digital hardware instead of in a biological brain; they're still a person and they still count.
Now imagine this person, or "em", asks to be left alone, cuts off all communication with the rest of the world, and rejoices privately in finally being able to fully explore their introverted nature. This isn't what I imagine myself doing, but it's a choice I can respect.
Someone comes along and suggests turning off this person's emulation, on the grounds that no one will know the difference and we can use the hardware for something else. This seems wrong. But since the em no longer affects anyone else, the wrongness can't come from effects on the rest of the world: the computational process must be valuable entirely for its own sake.
Unlike biological brains, computational processes are very flexible. We could run many copies, or run them much faster or slower than usual. We could run a specific segment of their independent experience repeatedly, perhaps the happiest few moments. It also seems unlikely that a full emulation of a human is the only thing that's valuable. Perhaps there are simpler patterns we could emulate that would be much better in terms of value per dollar?
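To make this flexibility concrete, here's a toy sketch (in Python, with made-up names; nothing like a real emulator) that treats an em as a deterministic state-transition process. Running a second copy, or replaying a saved segment, is just re-applying the same update function:

```python
# Toy illustration only: an "em" modeled as a pure, deterministic state update.
# The update rule below is an arbitrary stand-in, not anything brain-like.

def step(state: int) -> int:
    """One tick of the toy emulation: a deterministic state update."""
    return (state * 6364136223846793005 + 1442695040888963407) % 2**64

def run(state: int, ticks: int) -> list[int]:
    """Run the emulation forward, recording every intermediate state."""
    trace = [state]
    for _ in range(ticks):
        state = step(state)
        trace.append(state)
    return trace

# Many copies: the same starting state run twice gives bit-identical histories.
copy_a = run(42, 1000)
copy_b = run(42, 1000)
assert copy_a == copy_b

# Replaying a segment: the "happiest few moments" can be re-run from a saved
# snapshot as many times as we like, at whatever speed the hardware allows.
snapshot = copy_a[500]
for _ in range(3):
    assert run(snapshot, 10) == copy_a[500:511]
```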
I'm trying to reduce my concept of value, and I keep running into strange questions like these.
I also posted this on my blog.
[1] I think this will be possible, but not for a while.
Suppose I put some em in a context that makes him happy, and that somehow "counts". What if I take the one em whose happiness is maximal (by size, cost, or whatever measure), then duplicate that very same em, in that very same context, ad infinitum, so that I have a gazillion copies of him, e.g. being repeatedly jerked off by $starlet? Does each new copy count as much as the original? Why or why not?

What if the program were run on a tandem computer for redundancy, with two processors in lockstep doing the same computation? Is that computation redundant, or does it count double? What if I build a virtual machine in which this entire simulation happens in one instruction? And since the simulation has no I/O, what if my optimized implementation does away with running it at all?
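One way to make that last question vivid: if running the em is a pure function of its starting state (no I/O), then an ordinary optimization like memoization means the copies beyond the first never physically execute at all. A toy sketch, with hypothetical names:

```python
# Toy illustration: deduplicating identical, I/O-free runs via memoization.
from functools import lru_cache

@lru_cache(maxsize=None)
def run_em(seed: int, ticks: int) -> int:
    """Stand-in for running one em to completion; deterministic, no I/O."""
    state = seed
    for _ in range(ticks):
        state = (state * 6364136223846793005 + 1442695040888963407) % 2**64
    return state

# "A gazillion copies": after the first call, every duplicate is a cache hit,
# so only one computation ever actually happens.
results = [run_em(42, 10_000) for _ in range(1_000_000)]
assert len(set(results)) == 1
print(run_em.cache_info())  # e.g. CacheInfo(hits=999999, misses=1, ...)
```

Whether those 999,999 cache hits count for anything is exactly the question.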
You're still deep into the fairy dust theory of utility. More nano-paperclips, please!