
Dacyn comments on The "Intuitions" Behind "Utilitarianism" - Less Wrong

Post author: Eliezer_Yudkowsky 28 January 2008 04:29PM


Comment author: wafflepudding 19 September 2015 12:22:52AM 0 points

I believe that the vast majority of people in the dust speck thought experiment would be quite willing to endure the impact of a dust speck, if only to play a small role in saving a man from 50 years of torture. I would choose the dust specks on behalf of those hurt by them, as I can be close to certain that most of them would consent to it.

A counterargument might be that, since 3^^^3 is such a vast number, the collective pain of even the small fraction of people who would not consent to the dust speck still multiplies out to far more than the pain the tortured man would endure. Thus, I would most likely be making a nonconsensual tradeoff in favor of pain. However, I do not value the comfort of those who would condemn a man to 50 years of torture in order to alleviate a moment's mild discomfort, so 100% of the people whose lack of pain I value would willingly make the trade.
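The arithmetic behind this counterargument can be sketched in logarithms (a rough Python illustration; the one-in-a-googol dissent rate and the tower-truncation lower bound on 3^^^3 are stand-ins of my own, not part of the original thought experiment):

```python
from math import log10

# Illustrative assumption: only one person in a googol withholds consent.
GOOGOL = 10**100

# 3^^^3 is a power tower of 3s of height 3^^3 = 3**27. Even truncating the
# tower at its fourth floor, 3**(3**27), gives a number whose log10 is
# about 3.6 trillion -- a very weak lower bound on 3^^^3.
log_tower4 = 3**27 * log10(3)

# Restricting to the non-consenting minority divides the population by a
# googol, which subtracts a mere 100 from that logarithm:
log_dissenters = log_tower4 - log10(GOOGOL)

# The dissenting minority is still, for all practical purposes, 3^^^3-sized.
assert log_dissenters / log_tower4 > 1 - 1e-10
```

So even if nearly everyone consents, the holdouts alone would outweigh the torture by the same astronomical margin, which is why the comment needs the further move of discounting their comfort.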

If someone can sour that argument for me, I'll concede that I prefer the torture.

Comment author: Dacyn 10 December 2017 10:26:13PM 0 points

The only people who would consent to the dust speck are people who would choose SPECKS over TORTURE in the first place. Are you really saying that you "do not value the comfort of" Eliezer, Robin, and others?

However, your argument raises another interesting point, which is that the existence of people who would prefer that SPECKS be chosen over TORTURE, even if their preference is irrational, might change the outcome of the computation, because a choice of TORTURE amounts to violating their preferences. If TORTURE violates ~3^^^3 people's preferences, then perhaps it is after all a harm comparable to SPECKS. This would certainly be true if everyone found out whether SPECKS or TORTURE was chosen, in which case TORTURE would make it harder for a lot of people to sleep at night.

On the other hand, maybe you should force them to endure the guilt, because maybe then they will be motivated to research why the agent who made the decision chose TORTURE, and so the end result will be some people learning some decision theory / critical thinking...

Also, if SPECKS vs TORTURE decisions come up a lot in this hypothetical universe, then realistically people will only feel guilty over the first one.

Comment author: g_pepper 12 December 2017 02:39:34PM 0 points

On the other hand, maybe you should force them to endure the guilt, because maybe then they will be motivated to research why the agent who made the decision chose TORTURE, and so the end result will be some people learning some decision theory / critical thinking...

The argument that 50 years of torture of one person is preferable to 3^^^3 people suffering dust specks presumes utilitarianism. A non-utilitarian will not necessarily prefer torture to dust specks even if his/her critical thinking skills are up to par.

Comment author: Dacyn 15 December 2017 01:49:28AM 0 points

I'm not a utilitarian. The argument that 50 years of torture is preferable to 3^^^3 people suffering dust specks only presumes that preferences are transitive, and that there exists a sequence of gradations between torture and dust specks with the properties that (A) N people suffering one level of the spectrum is always preferable to N*(a googol) people suffering the next level, and (B) the spectrum has at most a googol levels. I think it's pretty hard to consistently deny these assumptions, and I'm not aware of any serious argument put forth to deny them.
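The magnitudes this argument relies on can be checked with a quick logarithmic sketch (Python, using the googol-per-step multiplier and googol-level bound from (A) and (B); the five-floor truncation of 3^^^3's power tower is my own illustrative lower bound):

```python
from math import log10

GOOGOL = 10**100   # population multiplier per gradation, from (A)
LEVELS = 10**100   # upper bound on the number of gradations, from (B)

# Chaining (A) across every gradation by transitivity: one person at the
# torture end is preferable to GOOGOL**LEVELS people at the dust-speck end.
# That population is 10**(10**102), so take logs twice to compare it.
loglog_chain = log10(LEVELS * 100)   # log10(log10(GOOGOL**LEVELS)) ~= 102

# Lower-bound 3^^^3 by only the first five floors of its power tower (the
# full tower has 3^^3 = 3**27 floors): tower5 = 3**(3**(3**27)), and
# log10(log10(tower5)) is roughly 3**27 * log10(3), about 3.6 trillion.
loglog_tower5 = 3**27 * log10(3)

# Even this tiny slice of 3^^^3 dwarfs the chained googol**googol factor,
# so the chain of preferences reaches from TORTURE all the way to SPECKS.
assert loglog_tower5 > loglog_chain
```

Since (a googol)^(a googol) = 10^(10^102) falls so far short of 3^^^3, the stepwise comparisons never run out of headroom before reaching 3^^^3 dust-speck sufferers.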

It's true that a deontologist might refrain from torturing someone even if he believes it would result in the better outcome. I was assuming a scenario where either way you are not torturing someone, just refraining from preventing them from being tortured by someone else.

Comment author: entirelyuseless 15 December 2017 02:04:35PM 0 points

Right. Utilitarianism is false, but Eliezer was still right about torture and dust specks.