MugaSofer comments on DRAFT:Ethical Zombies - A Post On Reality-Fluid - Less Wrong Discussion

0 Post author: MugaSofer 09 January 2013 01:38PM

Comment author: MugaSofer 13 January 2013 10:29:47AM -2 points [-]

I was assuming perfect quantum-level modelling of you and everything you interact with, acquired and sustained via magic. It makes things much simpler.

As for your actual question ... I'm not sure. The sim would have to be conscious, obviously, but the point at which it becomes "you" is ... unclear. It seems trivially true that a magical perfect simulation as above is "you", but an AI programmed to believe it's you is not. Between those two extremes ... it's tricky to say.

Of course, if utilities are additive, two almost-yous should be worth as much as one you with twice as much reality-fluid. So I guess humans can get away with ignoring the distinction between me and you, at least as long as they're using TDT or similar.
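The additivity claim above can be put as a toy calculation. This is purely illustrative (the `utility` function and its linearity in "reality-fluid" are assumptions for the sketch, not anything specified in the post):

```python
# Toy model of additive utility over "reality-fluid" (measure).
# The function and numbers are illustrative assumptions only.

def utility(copies: int, fluid_per_copy: float) -> float:
    """Total utility, assumed linear in copy count and in measure."""
    return copies * fluid_per_copy

# Two almost-yous, each with measure 1.0 ...
two_copies = utility(copies=2, fluid_per_copy=1.0)
# ... versus one you with twice the reality-fluid.
one_doubled = utility(copies=1, fluid_per_copy=2.0)

assert two_copies == one_doubled  # additivity makes them equivalent
```

Under this assumption the two cases come out exactly equal, which is what licenses ignoring the distinction.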

Comment author: Decius 13 January 2013 08:16:28PM 0 points [-]

How close is a model that has an arbitrary number of cakes added?

I also say that no simulation has value to me if I am in a frame that knows they are a simulation. Likewise for quantum states that I don't manipulate.

Comment author: MugaSofer 13 January 2013 09:20:26PM -2 points [-]

> How close is a model that has an arbitrary number of cakes added?

Perfectly so before the cakes are added.

> I also say that no simulation has value to me if I am in a frame that knows they are a simulation.

To be clear, are you actually asserting this or merely suggesting a possible resolution to the dilemma?

Comment author: Decius 13 January 2013 11:39:50PM *  0 points [-]

So you believe that it is irrelevant whether or not Omega' (a resident of the universe running a simulation) can create things of value to you but chooses not to? You have no preference for living in a world with constant physical laws?

> I also say that no simulation has value to me if I am in a frame that knows they are a simulation.
>
> To be clear, are you actually asserting this or merely suggesting a possible resolution to the dilemma?

It's a solution, but for it to apply to others they would have to share my values. What I'm saying is that there is no intrinsic value to me in the orientations of electrons representing a number whose transformation function yields another number perfectly analogous to me, or to any other person. Other people are permitted to value the integrity of those electrical orientations representing bits as they see fit.

Comment author: MugaSofer 14 January 2013 09:59:43AM *  -2 points [-]

So you, in fact, do not value simulations of yourself? Or anyone else, for that matter?

Comment author: Decius 14 January 2013 01:47:47PM 0 points [-]

With the caveat that I am not a simulation for the purposes of that judgement. I care only about my layer and the layers which are upstream of (simulating) me, if any.

Comment author: MugaSofer 14 January 2013 02:38:01PM -2 points [-]

Well, obviously this post is not aimed at you, but I must admit I am curious as to why you hold this belief. What makes "downstream" sims unworthy of ethical consideration?

Comment author: Decius 14 January 2013 03:35:30PM 0 points [-]

Maybe I've got a different concept of 'simulation'. I consider a simulation to be fully analogous to a sufficiently well-written computer program, and I don't believe that representations of numbers are morally comparable to living creatures, even if those numbers undergo transformations completely analogous to those creatures.

Why should I care if you calculate f(x) or f'(x), where x is the representation of the current state of the universe, f() is the standard model, and f'() is the model with all the cake?
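Decius's question can be made concrete with a toy sketch. Everything here is a made-up stand-in (the state dict, `f`, and `f_prime` are hypothetical placeholders, not an actual world-model): two transition functions that agree on everything except an injected cake term.

```python
# Illustrative sketch of the f(x) vs f'(x) question: two update rules
# identical except that one adds an arbitrary number of cakes.

def f(state: dict) -> dict:
    """Stand-in for 'the standard model' as a state-transition function."""
    return {**state, "tick": state["tick"] + 1}

def f_prime(state: dict) -> dict:
    """The same transition, with an arbitrary number of cakes added."""
    new_state = f(state)
    new_state["cakes"] = new_state.get("cakes", 0) + 10**6
    return new_state

x = {"tick": 0}  # toy representation of the current universe state
print(f(x))        # {'tick': 1}
print(f_prime(x))  # {'tick': 1, 'cakes': 1000000}
```

Both runs contain the same simulated person; they differ only in the cake term, which is the sense in which the choice between them is claimed not to matter.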

Comment author: TheOtherDave 14 January 2013 03:39:52PM 0 points [-]

> I don't believe that representations of numbers are morally comparable to living creatures

Does that stay true if those representations are implemented in a highly distributed computer made out of organic cells?

Comment author: Decius 14 January 2013 03:45:41PM 0 points [-]

Are you trying to blur the distinction between a simulated creature and a living one, or are you postulating a living creature which is also a simulator? I don't have moral obligation regarding my inner Slytherin beyond any obligations I have regarding myself.