JenniferRM comments on The Stamp Collector - Less Wrong

23 Post author: So8res 01 May 2015 11:11PM


Comment author: JenniferRM 06 May 2015 07:34:42AM, 3 points

This is the first time I've seen anyone on LW make this point with quite this level of explicitness, and I appreciated reading it.

Part of why it might be a useful message around these parts is that it has interesting implications for simulationist ethics, depending on how you treat simulated beings.

Caring about "the outer world" in the context of simulation links naturally to thought experiments like Nozick's Experience Machine, but, depending on one's approach to decision theory, it also has implications for simulated torture threats.

The decision-theoretic simulated torture scenario (where your subjective experience of making the decision has lots of simulation measure via your opponent having lots of CPU, and where the non-complying answer causes torture in all simulated cases) has been kicking around since at least 2006 or so. My longstanding position (if I were being threatened with simulated torture) has always been to care, as a policy, only about "the substrate" universe.

In terms of making it emotionally plausible that I would stick to my guns on this policy, I find it helpful to think about all my copies (in the substrate and in the simulation) being in solidarity with each other on this point, in advance.

Thus, my expectation is that when I sometimes end up experiencing torture for refusing to comply with such a threat, I will get the minor satisfaction of a decisive signal that I'm in a simulation, plus the sense of "taking one for the team". Conversely, when I sometimes end up not experiencing simulated torture, it increments my belief that I'm "really real" (or at least in a sim where the Demon is playing a longer and more complex game) and should keep my eye on the reality ball so that my sisters in the sims aren't suffering in vain.

The only strong argument against this policy covers the special case where I'm being simmed with relatively high authenticity for the sake of modeling what the real me is likely to do in a hostile situation in the substrate... like a wargame sort of thing. In that case, it could be argued that acting "normally" (so the simulation is very useful) is a traitorous act toward the version of me that "really matters", whom all my reality-focused copies, in the sim and in the real world, would presumably prefer to be less predictable.

For the most part I discount the wargame possibility in practice, because it is such a weirdly paranoid setup that it seems to deserve a heavy discount. (Also, it would be ironic if telling me that I might be in an enemy-run wargame sim made the me that counts the most act erratically in case she might be in the sim!)

I feel like the insight that "the outer world matters" has almost entirely healthy implications. Applying the insight to simulationist issues is fun but probably not that pragmatically productive, except possibly if one is prone to schizophrenia or some such... and this seems like more of an empirical question that could be settled by psychiatrists than by philosophers ;-)

However, the fact that reality-focused value systems and related decision theories are somewhat determinative of the kinds of simulations that are worth running (and hence of which simulations are likely to have measure as embeddings within larger systems) seems like a neat trick. Normally metaphysicians claim to be studying "the most philosophically fundamental thing", but this perspective gives reason to think that the most fundamental thing (even prior to metaphysics?) might be how decisions about values work :-)