
cousin_it comments on Purchase Fuzzies and Utilons Separately - Less Wrong

75 Post author: Eliezer_Yudkowsky 01 April 2009 09:51AM




Comment author: cousin_it 01 April 2009 10:42:16AM 4 points

Maybe relevant to this post: the googolplex dust specks issue seems to be settled by nonlinearity/proximity.

Other people's suffering is non-additive because we value different people differently. The pain of a relative matters more to me than the pain of a stranger. A googolplex people can't all be important to me because I don't have enough neural circuitry for that. (The monkeysphere is about 150 people.) This means each subsequent person-with-dust-speck matters less to me than the previous one, because they're further from me. The sum, even extended to infinitely many people, may converge to a finite value that I feel is smaller than 50 years of torture.
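
The convergence claim can be sketched numerically. The decay factor and disutility values below are illustrative assumptions of mine, not figures from the thread:

```python
# Sketch: if each additional dust-speck sufferer is discounted by a constant
# factor r < 1 as "neurological distance" grows, total disutility is a
# geometric series bounded above no matter how many sufferers there are.

def total_speck_disutility(n_people, per_speck=1.0, decay=0.999):
    """Sum of per_speck * decay**k for k in range(n_people), in closed form."""
    return per_speck * (1 - decay**n_people) / (1 - decay)

bound = 1.0 / (1 - 0.999)        # limit as n_people -> infinity: ~1000
torture_disutility = 50_000.0    # hypothetical value for 50 years of torture

# Even with astronomically many sufferers, the discounted sum never exceeds
# the bound, which can sit well below the torture disutility.
print(total_speck_disutility(10**6))   # already near the ~1000 bound
print(bound < torture_disutility)      # True under these assumed numbers
```

Whether the trade comes out this way depends entirely on the assumed decay rate and harm values; the sketch only shows that a discounted sum can stay finite while an undiscounted one grows without limit.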

It seems that to shut up and multiply, an altruist/rationalist needs to accept a non-obvious axiom that each person's joy or suffering carries equal weight regardless of proximity to the altruist. I for one refuse to accept this axiom because it's immoral to me; think about it.

Comment author: Stuart_Armstrong 01 April 2009 10:58:29AM 1 point

Quite a few holes here... You don't need any proximity axiom for the googolplex. The person to be tortured can be made more remote than any of those suffering from dust specks (if you insist on mentioning proximity, consider the balancing between a googolplex squared of dust speck sufferers versus a mere googolplex of torture victims).

(I personally reject the googolplex dust speck argument simply because I don't consider a single dust speck to amount to suffering; I accept the argument at about the level of a toe stubbing that would still be felt the next day.)

Comment author: cousin_it 01 April 2009 11:36:58AM 5 points

There are two ways you might be wrong. First, the neg-utility of dust specks could approach zero as distance increases, and the neg-utility of torture could approach a nonzero value that's greater than the sum of infinitely many dust specks. Second, I could imagine accepting torture if the victim were sufficiently neurologically distant from me, say on the empathetic level of a fictional character. (Neurological distance is, more or less, the degree of our gut acknowledgement that a given person actually exists. The existence of a googolplex people is quite a leap of faith.) Take your pick.

I still believe proximity solves the dust speck and Pascal's mugging parables. Well, not quite "solves": proximity gives a convincing rationalization for the common-sense decision of a normal person that rationalism so cleverly argues against. Unfortunately, scholasticism without experiment can't "solve" a problem in any larger sense.

Comment author: randallsquared 01 April 2009 08:34:26PM 2 points

I don't see why anyone would think the dust speck problem is a problem. The simplest solution seems to be to acknowledge that suffering (and other utility, positive or negative) isn't additive. Is there some argument that it is or should be?
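
One way to formalize non-additive suffering is a sketch under my own assumptions, borrowing the ~150-person "monkeysphere" figure from upthread: let the number of people whose suffering can register saturate at a cap, so totals stay bounded.

```python
import math

# Hypothetical non-additive aggregation across persons (my formalization,
# not randallsquared's): the effective count of sufferers saturates at a
# "monkeysphere" cap of ~150, so no crowd of tiny harms grows without bound.

MONKEYSPHERE = 150.0

def effective_count(n):
    """Saturating count: ~n for small n, approaching 150 as n grows."""
    return MONKEYSPHERE * (1 - math.exp(-n / MONKEYSPHERE))

def total_disutility(n_people, harm_each):
    return harm_each * effective_count(n_people)

speck_harm = 1e-6     # assumed disutility of one dust speck
torture_harm = 1e6    # assumed disutility of 50 years of torture

specks = total_disutility(10**100, speck_harm)   # bounded by 150 * 1e-6
torture = total_disutility(1, torture_harm)
print(specks < torture)   # True: here no number of specks outweighs torture
```

Under additive utility the comparison flips for large enough crowds; under this saturating aggregation it never does, which is exactly the disagreement the thread is circling.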

Comment author: cousin_it 01 April 2009 09:49:16PM 2 points

Well, you're right, but I wasn't completely satisfied by such a blunt argument and went on to invent an extra layer of rationalization: justify non-additivity with proximity. Of course none of this matters except as a critique of the "shut up and multiply" maxim. I wouldn't want to become a utility-additive mind without proximity modifiers. Maybe Eliezer would; who knows.

Comment author: Regex 11 October 2015 06:17:30PM 0 points

It seems that to shut up and multiply, an altruist/rationalist needs to accept a non-obvious axiom that each person's joy or suffering carries equal weight regardless of proximity to the altruist. I for one refuse to accept this axiom because it's immoral to me; think about it.

I have the exact opposite intuition. It is not obvious to me at all that closeness (emotional or physical) to someone changes the weight of their suffering. If someone is going to get their fingers slammed in a door, it matters not whether I know them personally or am a thousand light-years distant.

Admittedly, I may have a slightly more visceral reaction when someone I know gets in a car wreck than when reading the statistics, but I disagree that this means it is Right for me to prevent the car wreck of someone close, only to thereby cause another and, in addition, lead someone to stub their toe.

Comment author: RichardKennaway 12 October 2015 06:41:46AM 1 point

It is not obvious at all to me that the closeness (emotionally or physically) to someone changes the weight of their suffering.

Where they are does not change their suffering, but perhaps it changes the weight of your obligation to do something about it?

Comment author: Regex 13 October 2015 11:31:44PM 0 points

In social situations, perhaps. But that's only because you can't physically act, or because it is more efficient economically and logistically for everyone to manage their own sphere of influence. If you have two buttons in front of you and must press one, this changes nothing.