The relevant question is: does surrendering, or not surrendering, control the probability of the ultimatum having been given? If it doesn't, we should surrender. If the aliens would have been sufficiently less likely to make the ultimatum had we been committed to refusing it, we shouldn't surrender. Furthermore, we should look for third options whose choice could also control the aliens' actions.
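Here's a sketch of that criterion in expected-utility terms; the notation is mine, not from the story:

```latex
% Let p(\pi) be the probability that the aliens issue the ultimatum,
% given that we are committed to policy \pi, with S = surrender if
% they do and R = refuse if they do. Let U_0 > U_S > U_R be the
% utilities of: no ultimatum; surrendering to one; refusing one.
\mathrm{EU}(S) = p(S)\,U_S + \bigl(1 - p(S)\bigr)\,U_0
\mathrm{EU}(R) = p(R)\,U_R + \bigl(1 - p(R)\bigr)\,U_0

% Refusal wins exactly when the drop in ultimatum probability
% outweighs the worse conditional outcome:
\mathrm{EU}(R) > \mathrm{EU}(S)
  \iff p(S)\,\bigl(U_0 - U_S\bigr) > p(R)\,\bigl(U_0 - U_R\bigr)

% If p(S) = p(R), i.e. our choice doesn't control the probability,
% the condition can never hold (since U_0 - U_R > U_0 - U_S),
% and we should surrender.
```

A third option would enter as just another policy with its own probability term and conditional payoff.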
Since this information is not given in the story, the only thing we can go on is the anthropomorphic intuition that we shouldn't give in to blackmail (having the disposition not to give in really does control the probability of being blackmailed by humans). So no correct answer is actually determined, which defeats part of the appeal of a puzzle like this and creates some unnecessary memetic hazard.
For the same reason, focusing on whether one "cares about simulations" in this context is misleading, a false dilemma, since this is not the most relevant consideration. It's like asking whether you should cross a road on prime-numbered minutes, and pointing out examples of people who did cross the road on prime-numbered minutes and were run over, instead of focusing on how you should respond to traffic lights.
It seems you can set up a similar scenario without simulations: have the aliens physically kidnap 90% of the human race, then try to blackmail the other 10% into killing themselves. That would make for an interesting moral dilemma too.
And you can set up a scenario without dragging in torture and extinction. Aliens from Ganymede are about to ink a contract to trade us tons of Niobium in exchange for tons of Cobalt. But then the aliens reveal that they have billions of cloned humans working as an indentured proletariat in the mines of the Trojan asteroids. These humans are generally well treated, but the aliens offer to treat them even better - feed them ice cream - if we send the Cobalt without requiring payment in Niobium.
The central problem in all of these thought experiments is the crazy notion that we should give a shit about the welfare of other minds simply because they exist and experience things analogously to the way we experience things.
Is there a standard name for the logical fallacy where you attempt a reductio ad absurdum but fail to notice that you're deriving the absurdity from more than one assumption? Why conclude that it's the caring about far-away strangers that is crazy, as opposed to the decision algorithm that says you should give in to extortions like this?
But then changing your values to not care about simulated torture won't prevent the extortion attempt either (since the aliens will think there's a small chance you haven't actually changed your values and it costs them nothing to try). Unless you already really just don't care about simulated torture, it seems like you'd want to have a decision algorithm that makes you go to war against such extortionists (and not just ignore them).
If Fred cared about the aliens exterminating China and Thud didn't, then if the aliens instead threatened to exterminate China, Fred would again have a problem and Thud again wouldn't.
A rock doesn't care about anything, and therefore it has no problems at all.
This topic isn't really about simulation; it's about the fact that caring about anything opens the possibility of sacrificing something else for it. Anything that isn't our highest value may end up traded away, sure.
You can travel from here to China and back. Therefore, caring about China has at least a potential instrumental consequence for the rest of my life.
That's the only sane reason you believe can exist for caring about distant people at all? That you can potentially travel to them?
So suppose you're a paraplegic who doesn't want to travel anywhere, can't travel anywhere, and knows he'll die in two weeks anyway. You get a choice: push a button or don't. If you push it, you get one dollar right now, but a billion Chinese people will die horrible deaths in two weeks, after your own death.
Are you saying that the ONLY "sane" choice is to push the button, because you can use the dollar to buy bubblegum or something, while there'll never be any consequence for you from having a billion Chinese die horrible deaths after your own death?
If so, your definition of sanity isn't the definition most people have. You're talking about the concept commonly called "selfishness", not "sanity".
Err, the point of having a decision theory that makes you go to war against extortionists is not to have war, but to have no extortionists. Of course you only want to do that against potential extortionists who can be "dissuaded". Suffice it to say that the problem is not entirely solved, but the point is that it's too early to say "let's not care about simulated torture because otherwise we'll have to give in to extortion" given that we seem to have decision theory approaches that still show promise of solving such problems without having to change our values.
"I'll give you utility if you give me utility" is a trade.
"I won't cause you disutility if you give me utility" is extortion.
I don't think that's exactly the right distinction. Let's say you go to your neighbour because he's being noisy.
Scenario A: He says "I didn't mean to disturb you, I just love my music loud. But give me 10 dollars, and sure, I'll turn the volume down." I'd call that a trade, though it's still about him not giving you disutility.
Scenario B: He says "Yeah, I do that on purpose, so that I can make people pay me to turn the volume down. It'll be 10 bucks." I'd call that extortion.
The difference isn't in the outcomes of accepting or rejecting the offer: the outcomes and their utilities for you are the same in each case (loud music, or silence at a cost of 10 dollars).
The difference is that in Scenario B, you wish the other person had never decided to make this offer. It's not the utilities of your options that are to be compared with each other, but the utility of the timeline where the trade can be made vs. the utility of the timeline where the trade can't be made...
In the trade scenario, if you can't make a deal with the person, he's still being noisy, and your utility is minimized. In the extortion scenario, if you can't make a deal with the person, he has no reason to be noisy, and your utility is maximized.
I'll probably let someone else transform the above description into equations containing utility functions.
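Here's one rough attempt at those equations, offered as a sketch; the specific numbers (quiet = 0, loud music = -20, payment = 10) are my own illustrative assumptions, chosen only to match the two scenarios above:

```latex
% A = accept the offer, R = reject it, N = the counterfactual
% timeline in which the offer cannot be made at all.
% The conditional payoffs are identical in both scenarios:
U(A) = U(\text{quiet}) - 10 = -10, \qquad U(R) = U(\text{loud}) = -20

% Scenario A (trade): the noise exists whether or not a deal is
% possible, so
U(N_{\mathrm{A}}) = U(\text{loud}) = -20 \le \max\bigl(U(A), U(R)\bigr)

% Scenario B (extortion): absent the offer he has no reason to be
% noisy, so
U(N_{\mathrm{B}}) = U(\text{quiet}) = 0 > \max\bigl(U(A), U(R)\bigr)

% Candidate criterion: an offer is extortion iff
U(N) > \max\bigl(U(A), U(R)\bigr)
```

On this criterion the accept/reject payoffs by themselves tell you nothing; only the no-offer timeline separates the two cases, which is exactly the point made above.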
No, I think the central "problem" is that having preferences that others can thwart with little effort is risky because it makes you more vulnerable to extortion.
For example, if you have a preference against non-prime heaps of pebbles existing, the aliens can try to extort you by building huge numbers of non-prime heaps on their home planet and sending you pictures of them, and therefore, the argument goes, it's crazy and stupid to care about non-prime heaps.
The argument also yields a heuristic that the farther away a thing is from you, the more stupid and crazy it is to care about it.
Doubtful. Losing 90% of Earth's population would likely collapse the world's cultural and economic institutions, including nation states, making for a very different test. Maybe if they had clone farms and produced physical humans to torture offworld?
Which brings to mind that we should expend effort on "rescue" attempts, such as hacking the little spheres or stopping the aliens through other means.
Thud yawned. "Fred, you're fired. Get out of my house." As Fred left, Thud closed his curtains and tried to get back to sleep.
I loved this line. It gave me the same warm glow I get when I read about a guy heroically saving the entire world, and Thud pulls it off by going back to sleep. That's a whole new level above Chuck Norris. Even a level beyond The Doctor: Tennant went to all the trouble of challenging the alien leader to a duel to the death when the Sycorax pulled this stunt!
Would it make a difference if, instead of simulation, they had gotten human DNA and were speed-growing clones to torture?
Stuart Armstrong posted a similar scenario earlier. A lot of the discussion there is relevant here.
Some of the responses on that thread are hilarious; when did LW become so serious?
(I guess the other thing to do is to look at other similarly old threads and see if they have equally irreverent comments.)
Don't give in to blackmail, duh.
This would be more interesting if they were offering some sort of simulated reward instead. For sufficiently large, otherwise unreachable rewards, giving in might even be the correct answer.
If the aliens care so much about us building death camps and marching our citizens into those, why don't they simulate that directly instead of simulating torture in an indirect attempt to achieve the primary goal?
What happens if we surrender? Do they start treating the tortured simulations well? What kind of torture do they subject us to?
I think we have to take this as motivation for war with the aliens, rather than giving in to or ignoring their demands. Assume that the torturers try this sort of thing with more than one set of victims. If any of the victims successfully defeats the torturers, that spares the would-be subsequent victims from the dilemma. Assume that there are multiple species of aliens that would try this torture technique, and some will concoct more persuasive versions of their threat. If none are stopped, we receive a series of alien torture threats that continues unabated.
Why exactly is Fred changing his preferences? If he's trying to make it so his future preferences are fulfilled, he should change them to preferring the universe to be exactly the way it is. If he's trying to make it so his current preferences are fulfilled, he probably shouldn't change them.
And what's he changing them to? Don't care about cheaply made people?
This has been said below, but in two parts, after a fashion.
Thud is in a position of protecting the people of earth. Thus it is his job to do whatever is best for that group.
If Thud makes a policy of giving in to simulated torture, it seems likely (although our experience with inhuman intelligence is a limiting factor here) that aliens or whatnot would be more likely to simulate and torture people.
So someone in Thud's position has a specific obligation to stop people like Fred from making the universe see earth as a pushover planet that will give up their resources to simulators.
Why would we be willing to believe that they'd stop the torture if we killed ourselves? Further, if we killed ourselves, why wouldn't they delete or kill the people being tortured?
They gave us the diagrams of themselves, if I understand correctly, and from the story we have the means to tell that they're telling the truth. What is to stop us from building massive supercomputers that simulate the torture of trillions of them unless they stop simulating the torture of us? By communicating this possibility to the aliens and letting them know that if...
These aliens seem temptingly fragile. They would make perfect targets to aim at when cane toad whacking. There is something appealing about treating that kind of threat with complete and utter contempt. Take them out with projectile vermin!
Since there are only a couple of thousand aliens around, using such an inaccurate cane-toad-launching mechanism will just serve to make the fun last longer. But if more of the aliens start floating around, upgrading the targeting system could be viable.
"General Thud! General Thud! Wake up! The aliens have landed. We must surrender!" General Thud's assistant Fred turned on the lights and opened the curtains to help Thud wake up and confront the situation. Thud was groggy because he had stayed up late supervising an ultimately successful mission carried out by remotely piloted vehicles in some small country on the other side of the world. Thud mumbled, "Aliens? How many? Where are they? What are they doing?" General Thud looked out the window, expecting to see giant tripods walking around and destroying buildings with death rays. He saw his lawn, a bright blue sky, and hummingbirds hovering near his bird feeder.
Fred was trying to bring Thud up to speed as quickly as possible. "Thousands of them, General! 2376, to be precise. They gave us a map; we know where they all are. They aren't doing anything overt, but the problem is their computation! I have one here, if you'd like to look." Fred removed a black sphere two inches in diameter from his pocket and gave it to Thud.
Thud sat on his bed holding the small sphere and staring at it dumbfounded. "Okay, you think we should surrender to a few thousand small spheres. Why is that, exactly?" The sphere seemed a little flexible in Thud's hand. As he experimented a few seconds to see just how flexible, it collapsed in his hand, converting itself into a loose clump of alien sand that landed in his lap and started to dribble onto his bed and the floor. Thud stood up and brushed the rest of the sand off of his pyjamas and bed, and thought for a moment about where he left his vacuum cleaner bags. He was not impressed with these aliens.
Fred said "I don't think you wanted to do that, sir. Their ultimatum states that for every alien we destroy, they'll manufacture two in the outer reaches of the Solar System where we'll never find them!"
Thud said, "Okay, so now you think we should surrender to 2375 small spheres, and two or more small spheres that are out of the battlefield for the moment. Why is that?"
Fred said "Well, you remember a few years back when some people copied their brain state into a computer and posted it to the Internet? Apparently somebody copied the data across an unencrypted wireless link, the aliens picked it up with their radio telescopes, and now they are simulating those poor people in these black spheres and torturing the simulations! They sent us videos!" Fred held up his cell phone, pushed a button, and showed the video to Thud.
Thud looked at the video for a moment and said, "Yep, that's torture. Do these people know anything potentially useful to the aliens?"
Fred said, "Well, they know how to break into a laboratory that has brain scanning tools and push some buttons. That was apparently the high point of their lives. But none of that matters, the aliens don't seem to be torturing them for information anyway."
Thud was still suffering from morning brain fog. He rubbed his eyes. "And why should we surrender?"
Fred said, "The aliens have made a trillion copies of these poor people and will run the torture simulations on the little black spheres until we march all of our citizens into the death camps they demand we build! We have analyzed these black spheres and the engineering diagrams the aliens gave us, and we know this to be true. We only have ten billion citizens, and this simulated torture is much worse than simulated death, so the total utility is much greater if we surrender!"
Thud yawned. "Fred, you're fired. Get out of my house." As Fred left, Thud closed his curtains and tried to get back to sleep.
-----
Michael said "So I take it you no longer assist Thud. What are you doing now?"
Fred reclined comfortably on the analyst's couch. "I help out at the cafeteria as a short order cook. But I'm not worried about my career right now. I have nightmares about all these simulated people being tortured in the flimsy alien spheres."
"Thud surely knows the simulations are being tortured too. Do you think he has nightmares about this?"
"No, he doesn't seem to care."
"Have you always cared about the well-being of simulations?"
"No, when I was a teenager I was self-centered and conceited and didn't care about anybody else, including simulated people."
"So at some point you self-modified to care about simulations. If it helps you, you could self-modify again."
"But I don't want to!"
"Did you want to self-modify to care about simulations in the first place?"
"No, it just sort of happened as I grew up."
"Is there any logical inconsistency in Thud's position?"
Fred thought for a bit. "Not that I can see. The value one assigns to simulations seems to be an arbitrary choice. Ignoring the alien invasion certainly hasn't harmed his career."
"Concern about simulations seems to give the aliens more influence over you than Thud would prefer. What would you prefer?"
"Well, I'd also prefer the aliens not to be able to jerk me around. I really don't have room in my life for it now. In the grand scheme of things, it seems just wrong -- they shouldn't be able to genocide a species with a few thousand stupid spheres that just sit there converting sunlight to heat."
Michael passed Fred a piece of paper with a short list of bulleted items. "This is the procedure I teach my clients who want to change their preferences. After you've learned it, you can decide whether and how you want to use it..."