Decius comments on DRAFT:Ethical Zombies - A Post On Reality-Fluid - Less Wrong Discussion
Comments (116)
What does it mean to simulate someone, and why should I value manipulation of a simulation?
How good does the simulation have to be before I value it? Should I value a book in which I get cakes more? What about a fairly good simulation of the world in which the contents of my pantry are randomized each time I open it? Should I value that simulation more if the expected number of cakes the next time the simulated me opens the pantry is higher?
I was assuming perfect quantum-level modelling of you and everything you interact with, acquired and sustained via magic. It makes things much simpler.
As for your actual question ... I'm not sure. The sim would have to be conscious, obviously, but the point at which it becomes "you" is ... unclear. It seems trivially true that a magical perfect simulation as above is "you", but an AI programmed to believe it's you is not. Between those two extremes ... it's tricky to say.
Of course, if utilities are additive, two almost-yous should be worth as much as one you with twice as much reality-fluid. So I guess humans can get away with ignoring the distinction between me and you, at least as long as they're using TDT or similar.
How close is a model that has an arbitrary number of cakes added?
I also say that no simulation has value to me if I am in a frame that knows they are a simulation. Likewise for quantum states that I don't manipulate.
Perfectly so before the cakes are added.
To be clear, are you actually asserting this or merely suggesting a possible resolution to the dilemma?
So you believe that it is irrelevant whether or not Omega' (a resident of the universe running a simulation) can create things of value to you but chooses not to? You have no preference for living in a world with constant physical laws?
It's a solution, but for it to apply to others they would have to share my values. What I'm saying is that the orientations of electrons have no intrinsic value to me when they merely represent a number that, under some transformation function, yields another number perfectly analogous to me, or to any other person. Other people are permitted to value the integrity of those electrical orientations representing bits as they see fit.
So you, in fact, do not value simulations of yourself? Or anyone else, for that matter?
With the caveat that I am not a simulation for the purposes of that judgement. I care only about my layer and the layers which are upstream of (simulating) me, if any.
Well, obviously this post is not aimed at you, but I must admit I am curious as to why you hold this belief. What makes "downstream" sims unworthy of ethical consideration?
Maybe I've got a different concept of 'simulation'. I consider a simulation to be fully analogous to a sufficiently well-written computer program, and I don't believe that representations of numbers are morally comparable to living creatures, even if those numbers undergo transformations completely analogous to those creatures.
Why should I care if you calculate f(x) or f'(x), where x is the representation of the current state of the universe, f() is the standard model, and f'() is the model with all the cake?
Does that stay true if those representations are implemented in a highly distributed computer made out of organic cells?