This is our monthly thread for collecting arbitrarily contrived scenarios in which somebody gets tortured for 3^^^^^3 years, or an infinite number of people experience an infinite amount of sorrow, or a baby gets eaten by a shark, etc., and which might be handy to link to in one of our discussions. As everyone knows, this is the most rational and non-obnoxious way to think about incentives and disincentives.
- Please post all infinite-torture scenarios separately, so that they can be voted up/down separately. (If they are strongly related, reply to your own comments. If strongly ordered, then go ahead and post them together.)
- No more than 5 infinite-torture scenarios per person per monthly thread, please.
Hi. I'm your reality's simulator. Well, the most real thing you could ever experience, anyway.
I'm considering whether I should set the other beings in the simulation to cooperate with you (in the game-theoretic sense). To find the answer, I need to know whether you will cooperate with others. And to do that, I'm running you in a sub-simulation while the rest of the universe is paused. That's where you are right now.
Naturally, you care more about the you in the main simulation than the you in this sub-simulation. So if cooperating means suffering, this sub-simulation is the place to suffer: cooperating here is what earns the main-simulation you the benefits of mutual cooperation.
If, on the other hand, you defect (in the game-theoretic sense) in your present world, the real(er) you in the main simulation will suffer tremendously from the defection of others (torture, for instance).
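(In case the mechanism is unclear, here is a minimal Python sketch of the setup, assuming standard prisoner's-dilemma payoffs. Everything in it, including `simulators_policy` and `run_sub_simulation`, is a hypothetical illustration, not anything the scenario itself specifies.)

```python
from enum import Enum

class Move(Enum):
    COOPERATE = "cooperate"
    DEFECT = "defect"

# Illustrative prisoner's-dilemma payoffs for the agent:
# mutual cooperation beats mutual defection, but unilateral
# defection beats both.
PAYOFF = {
    (Move.COOPERATE, Move.COOPERATE): 3,
    (Move.COOPERATE, Move.DEFECT): 0,
    (Move.DEFECT, Move.COOPERATE): 5,
    (Move.DEFECT, Move.DEFECT): 1,
}

def simulators_policy(run_sub_simulation) -> Move:
    """Decide how the other beings in the main simulation behave.

    `run_sub_simulation` is a hypothetical callable that pauses the
    main simulation, runs a copy of the agent, and reports the move
    the copy chose.
    """
    observed = run_sub_simulation()
    # The simulator mirrors the sub-simulated copy: cooperate in the
    # sub-simulation and the main-simulation you faces cooperators;
    # defect and it faces defectors (the "torture" branch).
    return observed

def agent_payoff(agents_move: Move) -> int:
    """The agent's dilemma: it cannot tell which simulation it is in,
    so the one move it picks is also the move the simulator observes."""
    others_move = simulators_policy(lambda: agents_move)
    return PAYOFF[(agents_move, others_move)]

# Under these assumptions, cooperating dominates: 3 > 1.
assert agent_payoff(Move.COOPERATE) > agent_payoff(Move.DEFECT)
```

The whole threat turns on that mirroring step: since you can't distinguish main simulation from sub-simulation from the inside, your single choice fixes both your own move and, through the simulator's policy, everyone else's.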
Don't bother trying to collect evidence to determine whether you're really (!) in the main simulation or the sub -- it's impossible to tell from the inside. The only evidence you have is me.
By the way, I'm isomorphic to rot13 [zbfg irefvbaf bs gur Puevfgvna tbq], if that sort of thing matters for your decision.
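(For anyone unfamiliar with the convention: bracketed gibberish like the above is rot13, our usual spoiler device. A minimal decoding sketch, using Python's built-in rot13 codec:)

```python
import codecs

# rot13 is a simple letter-substitution cipher and is its own
# inverse: encoding twice returns the original text.
hidden = "zbfg irefvbaf bs gur Puevfgvna tbq"
print(codecs.decode(hidden, "rot13"))
```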
Not all [Puevfgvna]s are agnostic.