mikem comments on The Trolley Problem in popular culture: Torchwood Series 3 - Less Wrong

Post author: botogol 27 July 2009 10:46PM 16 points

Comment author: mikem 28 July 2009 09:10:47AM 4 points [-]

Lesswrongers will be encouraged to learn that the Torchwood characters were rationalists to a man and woman - there was little hesitation in agreeing to the 456's demands.

Are you joking? They weren't rationalists, they were selfish. There is a distinction. They were looking after their own asses and those of their families (note that the complicit politicians specifically excluded their own families' children from selection, regardless of 'worth').

children - or units as they were plausibly referred to

What do you mean by 'plausibly'? They were referred to as units in order to dehumanize them. Because the people referring to the children as such recognized that what they were doing was abhorrently wrong, and so had to mask the fact, even to themselves, by obscuring the reality of what they were discussing: the wholesale slaughter of their fellows.

... governments paying attention to round up the orphans, the refugees and the unloved - for the unexpectedly rational reason of minimising the suffering of the survivors

That's laughable. It had nothing to do with minimizing suffering; that was a rationalization. They were doing it for the same reason any government targets the vulnerable: because there are few willing to protect them and argue for them. It was pretty clear if you watched the show that the children being targeted were hardly 'unloved'.

You can't consider the scenario without considering the precedent that it would set. The notion that there are wide swaths of the population -- children, who've never even had the opportunity to truly prove themselves or do much of anything -- who are completely without worth and sacrificeable at the whim of the government is untenable in a society that values things like individuality, personal autonomy, the pursuit of happiness and, well, human life! They would not be saving humanity, they would be mutilating it.

The poster failed to mention that the sacrificed children were being sentenced to an eternal fate worse than death.

And there is a difference between the actions of the government and the actions of the main character. One of them was fighting the monsters. The others were the monsters' business partners.

Comment author: cousin_it 28 July 2009 09:40:28AM *  8 points [-]

The poster failed to mention that the sacrificed children were being sentenced to an eternal fate worse than death.

I really wonder what other LWers will say about this. Would you prefer to give one person huge disutility, or destroy humankind? For extra fun consider a 1/2^^^3 chance of 3^^^3 disutility to that one person.

Eliezer in particular considers his utility function to be provably unbounded in the positive direction at least, thinks we have much more potential for pain than pleasure, thinks destroying humankind has finite disutility on the order of magnitude of "billions of lives lost" (otherwise he'd oppose the LHC no matter what), and he's an altruist and expected utility consequentialist. Together this seems to imply that he will have to get pretty inventive to avoid destroying humankind.

Comment author: Eliezer_Yudkowsky 10 September 2009 10:12:08PM 1 point [-]

1/2^^^3 = 2^^(2^^2) = 2^^(2^2) = 2^^4 = 2^2^2^2 = 65536.

Comment author: cousin_it 10 September 2009 11:03:38PM *  1 point [-]

Oh, shit. Well... uhhhh.. in the least convenient impossible possible world it isn't! :-)

Comment author: Larks 10 September 2009 10:22:02PM 1 point [-]

1/65536, surely?

Comment author: Eliezer_Yudkowsky 14 September 2009 06:25:12PM 0 points [-]

Er, yes.
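
A quick sanity check on the corrected arithmetic: Knuth's up-arrow notation can be evaluated directly for small arguments. The short Python sketch below is not from the original exchange (the function name up_arrow is just an illustrative label), but it confirms that 2^^^3 = 65536, so the probability in question works out to 1/65536:

    def up_arrow(a, n, b):
        # Knuth's up-arrow: a with n arrows applied to b.
        if n == 1:
            return a ** b                 # one arrow is ordinary exponentiation
        if b == 0:
            return 1                      # a ^^...^ 0 = 1 by convention
        return up_arrow(a, n - 1, up_arrow(a, n, b - 1))

    print(up_arrow(2, 3, 3))      # 2^^^3 = 65536
    print(1 / up_arrow(2, 3, 3))  # hence 1/2^^^3 = 1/65536, as Larks notes
    # 3^^^3, by contrast, is far too large to evaluate this way.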

Comment author: cousin_it 14 September 2009 04:53:37AM 0 points [-]

Come to think, I don't even see how your observation makes the question any easier.

?

Comment author: Eliezer_Yudkowsky 14 September 2009 06:26:22PM *  0 points [-]

1/65536 probability of someone suffering 3^^^3 disutilons? If humanity's lifespan is finite, that's far worse than wiping out humanity. (If humanity's lifespan is infinite, or could be infinite with probability greater than 1/65536, the reverse is true.)
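
Spelled out as an expected-disutility comparison (a rough sketch, not from the original comment; U_extinction is just a placeholder for whatever finite disutility one assigns to wiping out a humanity whose lifespan is finite):

\[
\frac{1}{65536}\cdot 3\uparrow\uparrow\uparrow 3 \;\gg\; U_{\mathrm{extinction}},
\]

since 3^^^3 dwarfs 65536 times any such finite bound.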

Comment author: cousin_it 14 September 2009 06:39:36PM *  0 points [-]

I'll take that for an answer. Now let's go over the question again: if humanity's lifespan is potentially huge... counting "expected deaths from the LHC" is the wrong way to calculate disutility... the right way is to take the huge future into account... then everyone should oppose the LHC no matter what? Why aren't you doing it then - I recall you hoped to live for infinity years?

Comment author: CarlShulman 14 September 2009 06:56:38PM *  2 points [-]

The very small probability of a disaster caused directly by the LHC is swamped by the possible effects (positive or negative) of increased knowledge of physics. Intervening too stridently would be very costly in terms of existential risk: prominent physicists would be annoyed at the interference (asking why those efforts were not being dedicated to nuclear disarmament or biodefence efforts, etc) and could discredit concern with exotic existential risks (e.g. AI) in their retaliation.

Comment author: Eliezer_Yudkowsky 14 September 2009 07:17:47PM 2 points [-]

Agree with all except the first sentence.

Comment author: cousin_it 14 September 2009 10:27:10PM *  2 points [-]

...Okay. You do sound like an expected utility consequentialist, I didn't quite believe that before. Here's an upvote. One more question and we're done.

Your loved one is going to be copied a large number of times. Would you prefer all copies to get a dust speck in the eye, or one copy to be tortured for 50 years?

Comment author: CarlShulman 14 September 2009 07:23:03PM 1 point [-]

Hmm? In light of Bostrom and Tegmark's Nature article?

Comment author: Eliezer_Yudkowsky 14 September 2009 07:50:12PM 1 point [-]

We don't know enough physics to last until the end of time, but we know enough to build computers; if I made policy for Earth, I would put off high-energy physics experiments until after the Singularity. It's a question of timing. But I don't make such policy, of course, and I agree with the rest of the logic for why I shouldn't bother trying.

Comment author: rhollerith_dot_com 14 September 2009 08:49:03PM 0 points [-]

Same here. Unless I am missing something (and please do tell me if I am), the knowledge gained by the LHC is very unlikely to help much to increase the rationality of civilization or to reduce existential risk, so the experiment can wait a few decades, centuries or millennia till civilization has become vastly more rational (and consequently vastly better able to assess the existential risk of doing the experiment).