This is our monthly thread for collecting arbitrarily contrived scenarios in which somebody gets tortured for 3^^^^^3 years, or an infinite number of people experience an infinite amount of sorrow, or a baby gets eaten by a shark, and so on; scenarios like these might be handy to link to in one of our discussions. As everyone knows, this is the most rational and non-obnoxious way to think about incentives and disincentives.
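(In case the notation above is unfamiliar: the carets stand in for Knuth's up-arrows, where one arrow is ordinary exponentiation and each additional arrow iterates the operation below it. A minimal LaTeX sketch of the definition:)

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Knuth up-arrow notation (the carets in the post stand in for arrows):
% one arrow is ordinary exponentiation; each extra arrow iterates the
% operation below it.
\[
a \uparrow b = a^{b},
\qquad
a \uparrow^{n} b =
  \underbrace{a \uparrow^{n-1} \bigl( a \uparrow^{n-1} ( \cdots ( a \uparrow^{n-1} a ) ) \bigr)}_{b\ \text{copies of}\ a}
\]
% Already at two arrows the numbers explode:
\[
3 \uparrow\uparrow 3 = 3^{3^{3}} = 3^{27} = 7{,}625{,}597{,}484{,}987,
\]
% so 3^^^^^3 (five arrows) dwarfs any physically meaningful quantity.
\end{document}
```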
- Please post all infinite-torture scenarios separately, so that they can be voted up/down separately. (If they are strongly related, reply to your own comments. If strongly ordered, then go ahead and post them together.)
- No more than 5 infinite-torture scenarios per person per monthly thread, please.
That would fall under "nitpicking". When I said "impossible," I meant "they won't work on us here", or that they will work with only negligible probability, which is pretty much the same thing. My question to Carl stands: does he agree that it's impossible/pointless to save people in the past by building rescue sims? Is this a consequence of UDT as he understands it?
A word on nitpicking: even if I believe you probably meant a given thing, if it's not actually clear that you didn't mean something else, or the presentation doesn't make that clear to other readers, it's still better to debias the discussion from the illusion of transparency by explicitly disambiguating than to rely on fitting the words to a model that was never explicitly tested.