In response to Hand vs. Fingers
Comment author: Unknown3 31 March 2008 06:24:00AM 0 points [-]

Paul: "we are morally obliged to kill everyone we meet" has no scientific implications, but it definitely has moral implications. To speak plainly, your position is false, and obviously so.

Some children (2-4 years of age) assume that other human beings are zombies, because other people do not fit their model of a conscious observer: adults, for example, don't go and eat the ice cream in the freezer, even though no one can stop them, and any conscious being would of course eat that ice cream if it were in its power.

This fact is actually one proof that Caledonian and others are talking nonsense when they say the zombie world is incoherent: everyone involved in this discussion, including Caledonian, knows exactly what the world would be like if it were a zombie world, and none of us think the world is actually that way (except maybe poke -- he may actually believe the zombie world is real).

Comment author: Unknown3 26 March 2008 07:23:00AM 1 point [-]

Roko, I strongly suspect that a limitedly caring universe just reduces to a materialist universe with very complex laws. For example, isn't it kind of like magic that when I want to lift my hand, it actually moves? What would be the difference if I could levitate or change lead into gold? If the universe obeys my will about lifting my hand, why shouldn't it obey in other things, and if it did, why would this be an essential difference?

Comment author: Unknown3 06 February 2008 06:21:00AM 0 points [-]

Jeffrey, do you really think serial killing is no worse than murdering a single individual, since "Subjective experience is restricted to individuals"?

In fact, if you kill someone fast enough, he may not subjectively experience it at all. In that case, is it no worse than a dust speck?

Comment author: Unknown3 05 February 2008 04:24:00AM 2 points [-]

Jeffrey, on one of the other threads, I volunteered to be the one tortured to save the others from the specks.

As for "Real decisions have real effects on real people," that's absolutely correct, and that's the reason to prefer the torture. The utility function implied by preferring the specks would also prefer lowering all the speed limits in the world in order to save lives, and ultimately would ban the use of cars. It would promote raising taxes by a small amount in order to reduce the amount of violent crime (including crimes involving torture of real people), and ultimately would promote raising taxes on everyone until everyone could barely survive on what remains.

Yes, real decisions have real effects on real people. That's why it's necessary to consider the total effect, not merely the effect on each person considered as an isolated individual, as those who favor the specks are doing.

Comment author: Unknown3 01 February 2008 07:09:00PM 4 points [-]

Eliezer, I have a question about this: "There is no finite amount of life lived N where I would prefer a 80.0001% probability of living N years to an 0.0001% chance of living a googolplex years and an 80% chance of living forever. This is a sufficient condition to imply that my utility function is unbounded."

I can see that this preference implies an unbounded utility function, given that a longer life has a greater utility. However, simply stated in that way, most people might agree with the preference. But consider this gamble instead:

A: Live 500 years and then die, with certainty.
B: Live forever, with probability 0.000000001%; die within the next ten seconds, with probability 99.999999999%

Do you choose A or B? Is it possible to choose A and have an unbounded utility function with respect to life? It seems to me that an unbounded utility function implies the choice of B. But then what if the probability of living forever becomes one in a googolplex, or whatever? Of course, this is a kind of Pascal's Wager; but it seems to me that your utility function implies that you should accept the Wager.
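To make the implication concrete, here is a small sketch. The specific utility functions (identity for the unbounded case, arctangent for the bounded one) and the probability figures are my own illustrative assumptions, not anything stated in the thread; the point is only that any unbounded utility in years lived makes B dominate A at any nonzero probability of immortality, while a bounded utility reverses the preference.

```python
import math

def expected_utility(lottery, utility):
    """Expected utility of a lottery given as [(probability, outcome)] pairs."""
    return sum(p * utility(outcome) for p, outcome in lottery)

TEN_SECONDS = 10 / (365.25 * 24 * 3600)  # ten seconds, expressed in years

# Outcomes are years lived; float('inf') stands in for "live forever".
A = [(1.0, 500)]
B = [(1e-11, float('inf')), (1.0 - 1e-11, TEN_SECONDS)]

# Unbounded utility (identity): any nonzero chance of infinite life wins.
unbounded = lambda years: years
assert expected_utility(B, unbounded) > expected_utility(A, unbounded)

# Bounded utility (atan saturates at pi/2): the sure 500 years wins instead.
bounded = math.atan
assert expected_utility(A, bounded) > expected_utility(B, bounded)
```

Shrinking B's probability of immortality further (one in a googolplex, say) changes nothing in the unbounded case, which is exactly the Pascal's Wager structure described above.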

It also seems to me that the intuitions suggesting to you and others that Pascal's Mugging should be rejected similarly are based on an intuition of a bounded utility function. Emotions can't react infinitely to anything; as one commenter put it, "I can only feel so much horror." So to the degree that people's preferences reflect their emotions, they have bounded utility functions. In the abstract, not emotionally but mentally, it is possible to have an unbounded function. But if you do, and act on it, others will think you a fanatic. For a fanatic cares infinitely for what he perceives to be an infinite good, whereas normal people do not care infinitely about anything.

This isn't necessarily against an unbounded function; I'm simply trying to draw out the implications.

Comment author: Unknown3 31 January 2008 07:26:00PM 0 points [-]

About the slugs, there is nothing strange in asserting that the utility of the existence of something depends partly on what else exists. Consider chapters in a book: one chapter might be useless without the others, and one chapter repeated several times would actually add disutility.

So I agree that a world with human beings in it is better than one with only slugs: but this says nothing about the torture and dust specks.

Eisegetes, we had that discussion previously in regard to the difference between comparing actions and comparing outcomes. I am fairly sure I would not torture someone to save New York (at least not for 50 years), but this doesn't mean I think that the fact of someone being tortured, even for 50 years, outweighs the lives of everyone in New York. I might simply accept Paul's original statement on the matter, "Torture is wicked. Period."

It does matter how it is done, though. In my lever case, if the lever were set to cause the dust specks, I would definitely move it over to the 50 year torture side.

Another factor that no one has yet considered (to make things more realistic). If there were 3^^^3 people, googolplexes of them would certainly be tortured for 50 years (because the probability of someone being tortured for 50 years is certainly high enough to ensure this). So given an asymptotic utility function (which I don't accept), it shouldn't matter if one more person is tortured for 50 years.

Comment author: Unknown3 31 January 2008 02:58:00PM 0 points [-]

Z.M. Davis, that's an interesting point about the slugs, I might get to it later. However, I suspect it has little to do with the torture and dust specks.

Doug, here's another problem for your proposed function: according to it, it doesn't matter whether a single person takes all the pain or whether it is distributed, as long as it sums to the same amount.

So let's suppose that the pain of solitary confinement without anything interesting to do can never add up to the pain of 50 years torture. According to this, would you honestly choose to suffer the solitary confinement for 3^^^3 years, rather than the 50 years torture?

I suspect that most people would prefer to take the torture and get on with their lives, instead of suffering the confinement for eternity.

But if you modify the function to allow for this, more preference reversals are coming: for we can begin to decrease the length of the solitary confinement by a microsecond while increasing the number of people who suffer it by a large amount.

In order to prevent an extremely short confinement for 3^^^3 people from exceeding the torture (which would presumably imply the same possibility for dust specks), you will have to say that there is some length of solitary confinement, for some number of people, such that confinement shorter by the smallest noticeable period, suffered by almost infinitely many people, can never be worse.

Would you hold to this too, or would you honestly prefer the 3^^^3 years confinement to 50 years torture?

Comment author: Unknown3 31 January 2008 07:14:00AM 0 points [-]

Notice that in Doug's function, suffering with intensity less than 0.393 can never add up to 50 years of torture, even when multiplied infinitely, while suffering of 0.394 will be worse than torture if it is sufficiently multiplied. So there is some number of 0.394-intensity pains such that no number of 0.393-intensity pains can ever be worse, despite the fact that these intensities differ by only 0.001, stipulated by Doug to be the pain of a dust speck. This is the conclusion that I pointed out follows with mathematical necessity from the position of those who prefer the specks.
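Doug's exact formula isn't reproduced in this thread, but any function with the shape described above behaves like the following toy aggregator (entirely my own construction, including the numbers): the total disutility of n pains of intensity i saturates toward an asymptote i/(1-i), so any intensity whose asymptote falls below the torture's disutility can never add up to it, however large n becomes.

```python
import math

TORTURE = 0.65  # assumed disutility of 50 years of torture (illustrative)

def total_disutility(intensity, n):
    """Total disutility of n pains of the given intensity, saturating
    toward the asymptote intensity / (1 - intensity) as n grows."""
    asymptote = intensity / (1.0 - intensity)
    return asymptote * (1.0 - math.exp(-n / 1e6))

# 0.393-intensity pains never reach the torture, even in vast numbers...
assert total_disutility(0.393, 10**15) < TORTURE
# ...while enough 0.394-intensity pains eventually exceed it.
assert total_disutility(0.394, 10**9) > TORTURE
```

With these particular constants the asymptote for 0.393 is about 0.647 and for 0.394 about 0.650, straddling the torture's 0.65: exactly the discontinuity of treatment between two pains differing by one dust speck.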

Doug, do you actually accept this conclusion (about the 0.393 and 0.394 pains), or are you just trying to show that the position is not logically impossible?

Comment author: Unknown3 30 January 2008 07:55:00PM 0 points [-]

Ben P: the arrangement of the scale is meant to show that the further you move the lever toward 3^^^3 dust specks, the worse things get. The torture decreases linearly simply because there's no reason to decrease it by more; the number of people increases in the way that it does because of the nature of 3^^^3 (i.e. the number is large enough to allow for this). The more we can increase it at each step, the more obvious it is that we shouldn't move the lever at all, but rather we should leave it at torturing 1 person 50 years.

Comment author: Unknown3 30 January 2008 03:07:00PM 1 point [-]

Ben, here's my new and improved lever. It has 7,625,597,484,987 settings. On setting 1, 1 person is tortured for 50 years plus the pain of one dust speck. On setting 2, 3 persons are tortured for 50 years minus one 7,625,597,484,987th of 50 years, i.e. they are tortured for a minute fraction of a second less than 50 years, again plus the pain of one dust speck. On setting 3, 3^3 = 27 persons are tortured for 50 years minus two such fractions, plus the pain of one dust speck. On setting 4, 3^27 = 7,625,597,484,987 persons are tortured for 50 years minus 3 such fractions, plus the pain of one dust speck....

Once again, on setting 7,625,597,484,987, 3^^^3 persons are dust specked.
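The person counts above follow a tower of threes, i.e. tetration: the count at setting k is 3^^(k-1). A quick check of the small cases (3^^^3 itself equals 3^^(3^^3) = a tower of 7,625,597,484,987 threes, far beyond any computation):

```python
def tetration(base, height):
    """base^^height: a right-associated power tower of `height` copies of base."""
    result = 1
    for _ in range(height):
        result = base ** result
    return result

# Person counts at settings 1 through 4: 1, 3, 3^3, 3^27.
assert [tetration(3, h) for h in range(4)] == [1, 3, 27, 3**27]
assert 3**27 == 7_625_597_484_987  # also the number of lever settings
# 3^^^3 = tetration(3, 7_625_597_484_987): far too large to evaluate.
```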

Will you still push the lever over?
