Cyan comments on Consciousness of simulations & uploads: a reductio - Less Wrong
Some opinions expressed on another thread disagree with that.
The key question is whether terminating a simulation actually does harm to the simulated entity. Some thought experiments may improve our moral intuitions here.
I tend to agree with your invocation of xenia, but I'm not sure it applies to simulations. At what point do simulated entities become my guests? When I buy the shrink-wrap software? When I install the package? When I hit start?
I really remain unconvinced that the metaphor applies.
Applying the notion of information-theoretic death to simulated beings results in the following answers:
Slowing down a simulation also does harm if it makes interactions that the simulated entity would prefer to maintain more difficult or impossible.
The same would apply to halting a simulation.
Request for clarification:
Do I understand this correctly: if the stopped simulation had been derived from the save-file state using non-deterministic or control-console inputs, inputs that are not duplicated in the restarted simulation, then harm is done?
Hmmm. I am imagining a programmer busy typing messages to his simulated "creations":
Looks at what was entered...
Thinks about what just happened... "Aw Sh.t!"
As I understand it, yes. But the harm might not be as bad as what we currently think of as death, depending on how far back the restore went. Backing one's self up is a relatively common trope in a certain brand of Singularity fic (e.g. Glasshouse).
(I needed three parentheses in a row just now: the first one, escaped, for the Wikipedia article title, the second one to close the link, and the third one to appear as text.)