aspera comments on Torture vs. Dust Specks - Less Wrong

Post author: Eliezer_Yudkowsky 30 October 2007 02:50AM


Comment author: mantis 10 September 2012 05:08:40PM 0 points

If dust specks have a value of 0, then what's the smallest amount of discomfort that has a nonzero value instead?

I don't know exactly where I'd make the qualitative jump from the "discomfort" scale to the "pain" scale. There are so many different kinds of unpleasant stimuli, and it's difficult to compare them. For electric shock, say, there's probably a particular curve of voltage, amperage, and duration below which the shock qualifies as discomfort, with a zero value on the pain scale, and above which it becomes pain (I'll even go so far as to say that for short periods of contact, the voltage and amperage values lie between those of a violet wand and those of a stun gun). For localized heat, I think it would have to be at least enough to cause a small first-degree burn; for localized cold, enough to cause the beginnings of frostbite (i.e., a few living cells lysed by the formation of ice crystals in their cytoplasm). For heat and cold over the whole body, it would have to be enough to overcome the body's natural thermostat, initiating hypothermia or heatstroke.

It occurs to me that I've purposely endured levels of discomfort I would probably regard as pain, with a non-zero value on the torture scale, had they been inflicted on me involuntarily: working out at the gym (which has an expected payoff in health and appearance, of course), and wearing an IV for two 36-hour periods in a pharmacokinetic study for which I'd volunteered (it paid $500). I would certainly do so again for the same inducements. Choice makes a big difference in our subjective experience of an unpleasant stimulus.

50 years of torture for one person is probably not as bad as 25 years of torture for a trillion people.

Of course not; by the scale I posited above, 50 years for one person isn't even as bad as 25 years for two people.

If we keep doing this (halving the torture length, multiplying the number of people by a trillion) then are we always going from bad to worse?

No, but the length has to get pretty tiny (probably somewhere between a millisecond and a microsecond) before we reverse the direction.
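The halving argument above can be sketched numerically. This is a hedged toy model, not anyone's actual utility function: it assumes per-person disutility is linear in duration above a perception threshold of roughly a millisecond (the low end of the range the comment guesses at) and zero below it. All the specific numbers are assumptions for illustration.

```python
# Toy model of the halving argument: at each step, halve the duration
# and multiply the number of people by a trillion. Disutility is
# assumed linear in duration above a ~1 ms perception threshold and
# zero below it. Exact rational arithmetic avoids float overflow from
# the astronomically large population counts.
from fractions import Fraction

THRESHOLD = Fraction(1, 1000)  # assumed: stimuli under ~1 ms aren't felt

def disutility(duration_s):
    """Hypothetical per-person disutility: linear above the threshold,
    exactly zero below it."""
    return duration_s if duration_s >= THRESHOLD else Fraction(0)

duration = Fraction(50 * 31557600)  # 50 years, in seconds
people = 1
prev_total = disutility(duration) * people

step = 0
while True:
    step += 1
    duration /= 2            # halve the torture length
    people *= 10**12         # a trillion times as many people
    total = disutility(duration) * people
    if total <= prev_total:  # the direction finally reverses
        break
    prev_total = total

print(step)  # prints 41 under these assumptions
```

With these assumptions each step multiplies the total by 10^12/2, so things keep getting worse until the duration falls below the threshold, at step 41, where 50 years halved 41 times is about 0.7 ms. That lands in the millisecond-to-microsecond range the comment guesses at.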

And do we ever get to the point where each individual person tortured experiences about as much discomfort as our replacement dust speck?

Yes, we do; in fact, we eventually get to a point where each person "tortured" experiences no discomfort at all, because the nervous system is neither infinitely fast nor infinitely sensitive. If you're using temperature for your torture, heat transfer happens at a finite speed; no matter how hot or cold the material that touches your skin, there's a possible time of contact short enough that it wouldn't change your skin temperature enough to cause any discomfort at all. Even an electric shock could be brief enough not to register.

Comment author: aspera 16 November 2012 08:06:02PM 1 point

The idea that utility should be continuous is mathematically equivalent to the idea that an infinitesimal change on the discomfort/pain scale should give an infinitesimal change in utility. If you don't use that axiom to derive your utility function, you can have sharp jumps at arbitrary pain thresholds. That's perfectly OK - but then you have to choose where the jumps are.
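A minimal sketch of the discontinuous alternative being described here: a disutility function with a sharp jump at a chosen pain threshold. The threshold value and the size of the jump are assumed purely for illustration.

```python
# Hypothetical discontinuous disutility: stimuli below an arbitrarily
# chosen pain threshold count as zero-valued discomfort; at the
# threshold, disutility jumps sharply to a nonzero value.

PAIN_THRESHOLD = 3.0  # assumed: where "discomfort" becomes "pain"

def disutility(intensity):
    """Zero below the threshold, a sharp jump at and above it."""
    if intensity < PAIN_THRESHOLD:
        return 0.0               # dust-speck regime: value 0
    return 10.0 + intensity      # jump: even threshold pain costs 10+

# An infinitesimal change in intensity across the threshold gives a
# large, non-infinitesimal change in disutility:
print(disutility(2.999), disutility(3.001))  # 0.0 vs ~13.0
```

The function violates the continuity axiom exactly as stated: the limit of disutility as intensity approaches 3.0 from below is 0, but the value at 3.0 is 13, and the location of that jump (here, 3.0) is a free choice.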

Comment author: shminux 16 November 2012 09:25:14PM 1 point

then you have to choose where the jumps are

It could be worse than that: there might not be a way to choose the jumps consistently, say, to include different kinds of discomfort, some related to physical pain and others not (tickling? itching? anguish? ennui?).

Comment author: mantis 21 November 2012 08:54:13PM 1 point

I think that's probably more practical than trying to make it continuous, considering that our nervous systems are incapable of perceiving infinitesimal changes.

Comment author: aspera 23 November 2012 05:40:16AM 0 points

Yes, we are running on corrupted hardware at about 100 Hz, and I agree that defining broad categories to make first-cut decisions is necessary.

But if we were designing a morality program for a super-intelligent AI, we would want to be as mathematically consistent as possible. As shminux implies, we can construct pathological situations that exploit the particular choice of discontinuities to yield unwanted or inconsistent results.
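The kind of pathological case being gestured at here can be made concrete. A hedged sketch, using an assumed threshold-style disutility function (all values hypothetical): splitting one supra-threshold harm into many sub-threshold pieces drives the computed total to zero, so an agent optimizing against this function would treat the two situations wildly differently.

```python
# Exploit sketch: with a hard discontinuity at a pain threshold, one
# painful event can be split into many sub-threshold pieces, each of
# which the utility function values at zero. All names and threshold
# values here are assumed for illustration.

PAIN_THRESHOLD = 3.0

def disutility(intensity):
    """Zero below the assumed threshold; a sharp jump at it."""
    return 0.0 if intensity < PAIN_THRESHOLD else 10.0 + intensity

one_big = disutility(6.0)                            # a single painful event
many_small = sum(disutility(6.0 / 1000) for _ in range(1000))

print(one_big, many_small)  # 16.0 vs 0.0: same "total" stimulus,
                            # wildly different computed disutility
```

An AI steering by this function would happily convert the first situation into the second (or be indifferent to inflicting the second at any scale), which is the sort of unwanted, inconsistent result a particular choice of discontinuities can be exploited to produce.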