Comment author: Unknown3 30 January 2008 12:10:00PM 1 point [-]

To your voting scenario: I vote to torture the terrorist who proposes this choice to everyone. In other words, asking each person individually, "Would you rather be dust-specked or have someone randomly tortured?" would be much like a terrorist demanding $1 per person (from the whole world), threatening otherwise to kill someone. In that case, of course, one would kill the terrorist.

I'm still thinking about the best way to set up the lever to make the point the most obvious.

Comment author: Unknown3 30 January 2008 09:56:00AM 3 points [-]

Ben: suppose the lever has a continuous scale of values between 1 and 3^^^3. When the lever is set to 1, one person is tortured (and the torture will last for 50 years). If you set it to 2, two people will be tortured by an amount less than the first person's torture by 1/3^^^3 of the difference between the 50 years and a dust speck. If you set it to 3, three people will be tortured by an amount less than the first person's by 2/3^^^3 of that difference. Naturally, if you pull the lever all the way to 3^^^3, that number of people will suffer the dust specks.

Will you pull the lever over to 3^^^3? And if so, will you assert that things are getting better during the intermediate stages (for example, when you are torturing a googol of people by an amount less than the first person's torture by an entirely insignificant quantity)? And if things first get worse and then get better, where does it change?

Will you try to pull the lever over to 3^^^3 if there's a significant chance the lever might get stuck somewhere in the middle?
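For concreteness, the lever can be sketched in code, replacing 3^^^3 with a small stand-in N (3^^^3 is far too large for any computer to represent). All of the particular numbers and names below are my own illustrative assumptions, not part of the original scenario:

```python
N = 1_000_000        # stand-in for 3^^^3, which no computer can represent
TORTURE = 50.0       # pain of 50 years of torture, in arbitrary units
SPECK = 1e-12        # pain of a single dust speck, in the same units

def lever(k, n=N):
    """Lever position k (1..n): k people each suffer the returned pain.

    Pain falls linearly from TORTURE at k=1 to SPECK at k=n, matching
    the "1/3^^^3 of the difference" per notch described above (using
    n-1 steps so the endpoints come out exact).
    """
    pain_each = TORTURE - (k - 1) / (n - 1) * (TORTURE - SPECK)
    return k, pain_each

people, pain = lever(1)   # one person, the full 50-year torture
people, pain = lever(N)   # N people, each suffering only a speck
```

On any view that aggregates pain, the total at the far end (N × SPECK) is tiny compared to the 50-year torture at the near end, yet each intermediate notch differs from its neighbor by an insignificant amount; that is the force of the question.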

Comment author: Unknown3 30 January 2008 06:20:00AM 0 points [-]

Adam, by that argument the torture is worth 0 as well, since after 1,000,000 years, no one will remember the torture or any of its consequences. So you should be entirely indifferent between the two, since each is worth zero.

Comment author: Unknown3 29 January 2008 07:48:00PM 0 points [-]

Also: I wonder if Robin Hanson's comment shows concern about the lack of comments on his posts?

Comment author: Unknown3 29 January 2008 07:46:00PM 0 points [-]

Eisegetes, would you pull the lever if it would stop someone from being tortured for 50 years, but inflict one day of torture on each human being in the world? And if so, how about one year? Or 10 years, or 25? In other words, the same problem arises as with the specks. Perhaps you can defend one punch per human being, but there must be some number of human beings for whom one punch each would outweigh the torture.

Salutator, I never said utilitarianism is completely true.

In response to Circular Altruism
Comment author: Unknown3 26 January 2008 02:38:00PM 2 points [-]

Ben: as I said when I brought up the sand example, Eliezer used dust specks to illustrate the "least bad" bad thing that can happen. If you think that it is not even a bad thing, then of course the point will not apply. In that case you should simply move to the least bad thing which you consider to be actually bad.

In response to Circular Altruism
Comment author: Unknown3 26 January 2008 02:35:00PM 1 point [-]

Paul: "Slapping each of 100 people once each is not the same as slapping one person 100 times."

This is absolutely true. But no one has said that these two things are equal. The point is that it is possible to assign each case a value, and these values are comparable: either you prefer to slap each of 100 people once, or you prefer to slap one person 100 times. And once you begin assigning preferences, in the end you must admit that the dust specks, distributed over multiple people, are preferable to the torture in one individual. Your only alternatives to this will be to contradict your own preferences, or to admit to some absurd preference such as "I would rather torture a million people for 49 years than one person for 50."

In response to Circular Altruism
Comment author: Unknown3 26 January 2008 06:19:00AM 2 points [-]

Caledonian, offering an alternative explanation for the evidence does not imply that it is not evidence that Eliezer expends some resources overcoming bias: it simply shows that the evidence is not conclusive. In fact, evidence usually can be explained in several different ways.

In response to Circular Altruism
Comment author: Unknown3 26 January 2008 06:10:00AM 4 points [-]

Eliezer's question for Paul is not particularly subtle, so I presume he won't mind if I give away where it is leading. If Paul says yes, there is some number of dust specks which add up to a toe stubbing, then Eliezer can ask if there is some number of toe stubbings that add up to a nipple piercing. If he says yes to this, he will ultimately have to admit that there is some number of dust specks which add up to 50 years of torture.

Rather than actually going down this road, however, perhaps it would be as well if those who wish to say that the dust specks are always preferable to the torture considered the following facts:

1) Some people have a very good imagination. I could personally think of at least 100 gradations between a dust speck and a toe stubbing, 100 more between the toe stubbing and the nipple piercing, and as many as you like between the nipple piercing and the 50 years of torture.

2) Arguing about the point at which to say no, the lesser pain can never add up to the slightly greater pain, would look a lot like creationists arguing about which transitional fossils are merely ape-like humans and which are merely human-like apes. There is a point in the transitional fossils where the fossil is so intermediate that 50% of the creationists say that it is human, and 50% that it is an ape. Likewise, there will be a point where 50% of the Speckists say that dust specks can add up to this intermediate pain, but the intermediate pain can't add up to torture, while the other 50% will say that the intermediate pain can add up to torture, but the specks can't add up to the intermediate pain. Do you really want to go down this path?

3) Is your intuition about the specks being preferable to the torture really stronger than the intuition you violate by positing such an absolute division? Suppose we go down the path mentioned above, and at some point you say that specks can add up to pain X, but not to pain X+.00001 (a representation of the minute degree of greater pain in the next step, if we choose a fine enough division). Do you really want to say that you prefer that a trillion people (or a googol, or a googolplex, etc.) suffer pain X rather than that one person suffer pain X+.00001?

While I was writing this, Paul answered no, the specks never add up to a toe stub. This suggests that he rounds the speck down to nothing, as if you don't even notice it. Remember, however, that Eliezer originally posited that you feel the irritation for a fraction of a second, so there is some pain there. In any case, Paul's answer to this question is simply a step down the path laid out above. I would like to see his answer to the rest of it. Remember the (minimally) 100 gradations between the dust speck and the toe stub.
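The chain of comparisons described above can be made concrete with a toy calculation. Suppose, purely as an illustrative assumption, that a million instances of each harm outweigh one instance of the next-worse harm (the gradations come from the discussion; the million-to-one exchange rate is mine):

```python
# Gradations from the discussion; the exchange rate is an assumed toy value.
grades = ["dust speck", "toe stub", "nipple piercing", "50-year torture"]
RATE = 1_000_000  # assume: a million of each harm outweighs one of the next

count = 1
for worse in grades[1:]:
    count *= RATE  # this many dust specks now outweigh one instance of `worse`

# By transitivity, `count` (here 10^18) dust specks outweigh one
# 50-year torture -- and with 100 gradations per step instead of 3 total,
# the same multiplication still terminates at some finite number.
```

Denying the conclusion requires denying one of the pairwise steps, which is exactly the "absolute division" that point 3 asks about.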

In response to Circular Altruism
Comment author: Unknown3 26 January 2008 05:51:00AM 2 points [-]

The fact that Eliezer has changed his mind several times on Overcoming Bias is evidence that he expends some resources overcoming bias; if he didn't, we would expect exactly what you say. It is true that he hasn't changed his mind often, so this fact (at least by itself) is not evidence that he expends many resources in this way.
