Comment author: Nornagest 12 November 2011 01:06:04AM *  1 point [-]

It is simple why I chose 10: it is to highlight the paradox of those who choose torture. I have made it easier for you. Let's say we increase 10 to 3^^^3 deprived rapists. The point is, if you surely would not let the victim be raped when 3^^^3 deprived rapists are suffering, you surely would not allow it to happen when only 10 rapists are suffering. So with that said, how is it different?

I just went over how the scenarios differ from each other in considerable detail. I could repeat myself in grotesque detail, but I'm starting to think it wouldn't buy very much for me, for you, or for anyone who might be reading this exchange.

So let's try another angle. It sounds to me like you're trying to draw an ethical equivalence between dust-subjects in TvDS and rapists in TvCR: more than questionable in real life, but I'll grant that level of suffering to the latter for the sake of argument. It also misses the point of drawing attention to scope insensitivity, but that's only obvious if you're already running a utilitarian framework, so let's go ahead and drop it for now. That leaves us with the mathematics of the scenarios, which do have something close to the same form.

Specifically: in both cases we're depriving some single unlucky subject of N utility in exchange for not withholding N * K utility divided up among several subjects for some K > 1. At this level we can establish a mapping between both thought experiments, although the exact K, the number of subjects, and the normative overtones are vastly, sillily different between the two.
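For what it's worth, the total-utility bookkeeping here can be sketched in a few lines (Python; the values of N, K, and m are arbitrary placeholders of mine, not anything from the thread):

```python
# One subject loses N utility, versus withholding N*K utility (K > 1)
# split among m subjects. Under plain total utilitarianism, m cancels out:
N, K = 100.0, 1.5
for m in (10, 1_000_000):
    per_subject = N * K / m       # each slice shrinks as m grows...
    total = per_subject * m       # ...but the total stays N*K
    assert abs(total - N * K) < 1e-9
assert N * K > N                  # so withholding is always the larger harm
```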

Fine so far, but you seem to be treating this as an open-and-shut argument on its own: "you surely would not let the victim [suffer]". Well, that's begging the question, isn't it? From a utilitarian perspective it doesn't matter how many people we divide up N * K among, be it ten or some Knuth up-arrow abomination, as long as the resulting suffering can register as suffering. The fewer slices we use, the more our flawed moral intuitions take notice of them and the more commensurate they look; actually, for small numbers of subjects it starts to look like a choice between letting one person suffer horribly and doing the same to multiple people, at which point the right answer is either trivially obvious or cognate to the trolley problem depending on how we cast it.

About the only way I can make sense of what you're saying is by treating the N case -- and not just for the sake of argument, but as an unquestioned base assumption -- as a special kind of evil, incommensurate with any lesser crime. Which, frankly, I don't. It all gets mapped to people's preferences in the end, no matter how squicky and emotionally loaded the words you choose to describe it are.

Comment author: D227 12 November 2011 02:51:09AM *  -1 points [-]

From a utilitarian perspective it doesn't matter how many people we divide up N * K among, be it ten or some Knuth up-arrow abomination, as long as the resulting suffering can register as suffering.

I agree with this statement 100%. That was the point of my TvCR thought experiment. People who obviously picked T should again pick T. Only one commenter actually conceded this point.

The fewer slices we use, the more our flawed moral intuitions take notice of them and the more commensurate they look; actually, for small numbers of subjects it starts to look like a choice between letting one person suffer horribly and doing the same to multiple people, at which point the right answer is either trivially obvious or cognate to the trolley problem depending on how we cast it.

Again, I feel as if you are making my argument for me. The problem is, as you say, either trivially obvious or cognate to the trolley problem depending on how we cast it.

You say my experiment is not really the same as Eliezer's. Fine. It doesn't matter, because we could just use your example. If utilitarians do not care how many people we divide N*K among, then these utilitarians should state that they would indeed allow T to happen, whatever the subject matter, as long as K > 1.

Comment author: Nornagest 11 November 2011 08:41:28PM *  2 points [-]

I will consider this claim, if you can show me how it is really different.

Torture vs. Dust Specks attempts to illustrate scope insensitivity in ethical thought by contrasting a large unitary disutility against a fantastically huge number of small disutilities. And I really do mean fantastically huge: if the experiences are ethically commensurate at all (as is implied by most utilitarian systems of ethics), it's large enough to swamp any reasonable discounting you might choose to perform for any reason. It also has the advantage of being relatively independent of questions of "right" or "deserving": aside from the bare fact of their suffering, there's nothing about either the dust-subjects or the torture-subject that might skew us one way or another. Most well-reasoned objections to TvDS boil down to finding ways to make the two options incommensurate.

Your Ten Very Committed Rapists example (still not happy about that choice of subject, by the way) throws out scope issues almost entirely. Ten subjects vs. one subject is an almost infinitely more tractable ratio than 3^^^3 vs. one, and that allows us to argue for one option or another by discounting one of the options for any number of reasons. On top of that, there's a strong normative component: we're naturally much less inclined to favor people who get their jollies from socially condemned action, even if we've got a quasi-omniscient being standing in front of us and saying that their suffering is large and genuine.

Long story short, about all these scenarios have in common is the idea of weighing suffering against a somehow greater suffering. Torture vs. Dust Specks was trying to throw light on a fairly specific subset of scenarios like that, of which your example isn't a member. Nozick's utility monster, by contrast, is doing something quite a lot like you are, i.e. leveraging an intuition pump based on a viscerally horrible utilitarian positive. I don't see the positive vs. negative utility distinction as terribly important in this context, but if it bothers you you could easily construct a variant Utility Monster in which Utilizilla's terrible but nonfatal hunger is temporarily assuaged by each sentient victim or something.

Comment author: D227 12 November 2011 12:07:49AM 0 points [-]

Torture vs. Dust Specks attempts to illustrate scope insensitivity in ethical thought by contrasting a large unitary disutility against a fantastically huge number of small disutilities

Your Ten Very Committed Rapists example (still not happy about that choice of subject, by the way) throws out scope issues almost entirely. Ten subjects vs. one subject is an almost infinitely more tractable ratio than 3^^^3 vs. one, and that allows us to argue for one option or another by discounting one of the options for any number of reasons.

I do sincerely apologize if you are offended, but rape is torture as well, and Eliezer's example can be equally reprehensible, if not more so.

It is simple why I chose 10: it is to highlight the paradox of those who choose torture. I have made it easier for you. Let's say we increase 10 to 3^^^3 deprived rapists. The point is, if you surely would not let the victim be raped when 3^^^3 deprived rapists are suffering, you surely would not allow it to happen when only 10 rapists are suffering. So with that said, how is it different?

Comment author: Nornagest 11 November 2011 07:31:44PM *  3 points [-]

You've essentially just constructed a Utility Monster. That's a rather different challenge to utilitarian ethics than Torture vs. Dust Specks, though; the latter is meant to be a straightforward scope insensitivity problem, while the former strikes at total-utility maximization by constructing an intuitively repugnant situation where the utility calculations come out positive. Unfortunately it looks like the lines between them have gotten a little blurry.

I'm really starting to hate thought experiments involving rape and torture, incidentally; the social need to signal "rape bad" and "torture bad" is so strong that it often overwhelms any insight they offer. Granted, there are perfectly good reasons to test theories on emotionally loaded subjects, but when that degenerates into judging ethical philosophy mostly by how intuitively benign it appears when applied to hideously deformed edge cases, it seems like something's gone wrong.

Comment author: D227 11 November 2011 07:53:26PM 0 points [-]

Unfortunately it looks like the lines between them have gotten a little blurry.

I will consider this claim, if you can show me how it is really different.

I have taken considerable care to construct a problem in which we are indeed dealing with trading suffering for potentially more suffering. It does not affect me one bit that the topic has now switched from specks to rape. In fact, if "detraction" happens, shouldn't it be the burden of the person who feels detracted to explain it? I merely ask for consistency.

In my mind I choose to affiliate with the "I do not know the answer" camp. There is no shame in that; I have not resolved the question yet. Yet there are people for whom it is obvious to choose torture, who refuse to answer the rape question. I am consistent in that I claim not to know, not to have resolved the question yet. May I ask for the same amount of consistency?

Comment author: DanielLC 11 November 2011 02:26:33AM 4 points [-]

Assume that dollars are utilons and they are linear (2 dollars indeed gives twice as much utility).

If each dollar gives the same amount of utility, then one person with $0 and one person with $1,000,000 would be just as good as two people with $500,000. That's how utility is defined. If Bob doesn't consider these choices just as good, then they do not give the same utility according to his PVG.

If you are a prioritarian, you'd go for specks. That said, I think you'd be less prioritarian if you had less of a scope insensitivity. If you really understood how much torture 3^^^3 dust specks produces, and you really understood how unlikely a 1/3^^^3 chance is, you'd probably go with torture.
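DanielLC's definitional point, and the prioritarian alternative he mentions, can be shown numerically (a throwaway Python sketch of mine, not anything from the comment):

```python
import math

# With linear utility u(x) = x, only the sum matters, so the two
# distributions DanielLC describes are exactly as good as each other:
u = lambda dollars: dollars
assert u(0) + u(1_000_000) == u(500_000) + u(500_000)

# A prioritarian (or anyone with a concave u) weights the worst-off
# more heavily, and the equality breaks in favor of the even split:
assert math.sqrt(0) + math.sqrt(1_000_000) < 2 * math.sqrt(500_000)
```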

Comment author: D227 11 November 2011 07:22:39PM -2 points [-]

If you really understood how much torture 3^^^3 dust specks produces...

You make a valid point; I will not deny that it is a strong one. All I ask is that you remain consistent with your reasoning. I have reposted a thought experiment; please tell me what your answer is:

Omega has given you the choice to allow or disallow 10 rapists to rape someone. Why 10 rapists? Omega knows the absolute utility across all humans, and, as terrible as it sounds, the suffering of the 10 rapists not being able to rape is greater than the suffering the victim feels. What do you do? 10 is far less than 3^^^3 suffering rapists, so lucky you: Omega need not burden you with the suffering of 3^^^3 if you choose to have the rapists suffer. It is important that you not finagle your way out of the question. Please do not say that not being able to rape is not torture; Omega has already stated that these rapists do indeed suffer. It matters not whether you yourself would suffer from such a thing.

Disclaimer: I am searching for the truth through rationality. I do not care whether the answer is torture, specks, or rape, only that it is the truth. If the rational answer is rape, I can do nothing but accept that for I am only in search of truth and not truth that fits me.

There are implications to choosing rape as the right answer. It means that in a rational society we must allow bad things to happen if allowing them produces less total suffering. We have to be consistent: Omega has given you a number of rapists far, far less than 3^^^3, so surely you must allow the rape to occur.

Literally, DanielLC walks into a room with 10 rapists and a victim. The rapists tell him to "go away, and don't call the cops." Omega appears and says: you may stop it if you want to, but I am all-knowing, and the suffering the rapists experience from being deprived of raping is indeed greater than the suffering of the victim. What does Daniel do?

If you really understood how much torture 3^^^3 ~~dust specks~~ deprived rapists produces...

Comment author: peter_hurford 11 November 2011 05:10:05PM 0 points [-]

No one ever understands just how friggin large 3^^^3 is.

One could safely argue that it is better for the entire current world population to suffer a dust speck each than for someone to get tortured for fifty years, but expand that to 3^^^3 people? Radically different story.
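The scale gap Peter points to can be made concrete. A small sketch (Python, purely illustrative; `tetrate` is my own helper name) computes the first few up-arrow power towers:

```python
def tetrate(base, height):
    """Knuth's base^^height: a right-associative power tower of the given height."""
    result = 1
    for _ in range(height):
        result = base ** result
    return result

print(tetrate(3, 2))  # 27
print(tetrate(3, 3))  # 7625597484987 -- roughly a thousand times the world population
# 3^^4 = 3**7625597484987 has about 3.6 trillion digits, and
# 3^^^3 = 3^^7625597484987 is a tower of *that* height: already far
# beyond anything the intuition behind "world population" can track.
```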

Comment author: D227 11 November 2011 06:32:37PM *  -2 points [-]

That is the crux of the problem. Bob understands what 3^^^3 is just as well as you claim to. Yet he chooses the "Dust Holocaust".

First, let me assume that you, peter_hurford, are a "Torturer", or rather, from the camp that obviously chooses the 50 years. I have no doubt in my mind that you bring extremely rational and valid points to this discussion. You are poking holes in Bob's reasoning at its weakest points. This is a good thing.

I whole-heartedly concede that you have compelling points, made by poking holes in Bob's reasoning. But let's start poking around your reasoning now.

Omega has given you the choice to allow or disallow 10 rapists to rape someone. Why 10 rapists? Omega knows the absolute utility across all humans, and, as terrible as it sounds, the suffering of the 10 rapists not being able to rape is greater than the suffering the victim feels. What do you do? 10 is far less than 3^^^3 suffering rapists, so lucky you: Omega need not burden you with the suffering of 3^^^3 if you choose to have the rapists suffer. It is important that you not finagle your way out of the question. Please do not say that not being able to rape is not torture; Omega has already stated that these rapists do indeed suffer. It matters not whether you yourself would suffer from such a thing.

Disclaimer: I am searching for the truth through rationality. I do not care whether the answer is torture, specks, or rape, only that it is the truth. If the rational answer is rape, I can do nothing but accept that for I am only in search of truth and not truth that fits me.

There are implications to choosing rape as the right answer. It means that in a rational society we must allow bad things to happen if allowing them produces less total suffering. We have to be consistent: Omega has given you a number of rapists far, far less than 3^^^3, so surely you must allow the rape to occur.

Literally, peter_hurford walks into a room with 10 rapists and a victim. The rapists tell him to "go away, and don't call the cops." Omega appears and says: you may stop it if you want to, but I am all-knowing, and the suffering the rapists experience from being deprived of raping is indeed greater than the suffering of the victim. What does Peter do?

Edit: Grammar

Comment author: Manfred 11 November 2011 04:14:41AM 2 points [-]

Are you familiar with prospect theory? You seem to be describing what you (an imperfectly rational agent) would choose, simply using "PVG" to label the stuff that makes you choose what you actually choose, and you end up taking probability into consideration in a way similar to prospect theory.

Are you also familiar with the reasons why prospect theory and similar probability-dependent values are pretty certainly irrational?

Comment author: D227 11 November 2011 04:54:51AM 1 point [-]

Are you familiar with prospect theory?

No, but I will surely read up on that now.

You seem to be describing what you (an imperfectly rational agent) would choose, simply using "PVG" to label the stuff that makes you choose what you actually choose, and you end up taking probability into consideration in a way similar to prospect theory.

Absolutely. In fact I can see how a theist will simply say, "it is my PVG to believe in God, therefore It is rational for me to do so."

I do not have a response to that. I will need to learn more before I can work this out in my head. Thank you for the insightful comments.

Do the people behind the veil of ignorance vote for "specks"?

1 D227 11 November 2011 01:26AM

The veil of ignorance, as Rawls put it: "...no one knows his place in society, his class position or social status; nor does he know his fortune in the distribution of natural assets and abilities, his intelligence and strength, and the like."


The device allows certain issues, like slavery and income distribution, to be determined beforehand. Would one vote for a society in which there is a chance of severe misfortune but greater total utility? E.g., a world where 1% earn $1 a day and 99% earn $1,000,000 a day, vs. a world where everyone earns $900,000 a day. Assume that dollars are utilons and are linear (2 dollars indeed gives twice as much utility). What is the obvious answer? Bob chooses $900,000 a day for everyone.
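Behind the veil, the choice is a lottery over birth positions, so the arithmetic can be checked directly (a quick Python sketch of the numbers above):

```python
# World A: 1% earn $1 a day, 99% earn $1,000,000 a day.
# World B: everyone earns $900,000 a day. Dollars are linear utilons.
p_poor = 0.01
ev_a = p_poor * 1 + (1 - p_poor) * 1_000_000   # expected utilons per person
ev_b = 900_000.0

assert ev_a > ev_b  # World A wins on total (and expected) utility...
# ...yet Bob picks World B: he gives up ~90,000 expected utilons per
# person to drive the probability of the $1-a-day outcome to zero.
```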


But Bob is clever and he does not trust himself that his choice is the rational choice, so he goes into self-dialogue to investigate:

Q: "What is my preference, value or goal (PVG), such that instrumental rationality may achieve it?"

A: "My preference/value/goal is for there to be a world in which total utility is less, but severe misfortune is eliminated for everyone."

Q: "As an agent, are you maximizing your own utility by your action of choosing a $900,000-a-day world?"

A: "Yes, my actions are consistent with my preferences; I will maximize my utility by achieving my preference of limiting everyone's utility. This preference takes precedence."

Q: "I will now attack your position with the transitivity argument. At which point does your consistency change? What if the choices were 1% earns $999,999 and 99% earn $1,000,000?"

A: "My preferences, values and goals have already determined a threshold; in fact, my threshold is my PVG. Regardless of the fact that my threshold may differ from everyone else's, my threshold is my PVG, and achieving my PVG is rational."

Q: "I will now attack your position one last time, with the "piling argument": what if every time you save one person from destitution, you must pile punishment on the others, such that eventually everyone is suffering?"

A: "If piling is allowed, then it is to me a completely different question, altering what my PVG is. I have one set of values for the non-piling scenario and another for the piling one. I am consistent because piling and not piling are two different problems."


In the insurance industry, purchasing insurance comes with a price: perhaps a premium of 1.5% of the cost of reimbursing you for your house, which may burn down. The actuaries have run the probabilities and determined that you have a 1% chance that your house will burn down. Assume that all dollar amounts are utilons across all assets. Bob once again is a rational man. Every year Bob chooses to pay the 1.5% premium even though his average risk is technically a 1% loss, because Bob is risk-averse. So risk-averse that he prefers a world in which he has less wealth; the extra 0.5% goes to the insurance company's profit. Once again Bob questions his rationality in purchasing insurance:

Q: "What is my preference?"

A: "I would prefer to sacrifice more than my share of losses( .5% more), for the safety-net of zero chance catastrophic loss."

Q "Are your actions achieving your values?"

A "Yes, I purchased insurance, maximizing my preference for safety."

Q "Shall I attack you with the transitivity argument?"

A "It wont work.  I have already set my PVG, it is a premium price at which I judge to make the costs prohibitive.  I will not pay 99% premium to protect my house , but I will pay 5%."

Q "Piling?"

A "This is a different problem now."
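Bob's seemingly irrational purchase checks out under any sufficiently concave utility function. A sketch with made-up numbers (the wealth figure, house value, and log utility are my assumptions, not from the post):

```python
import math

wealth, house = 1_000_000, 900_000   # the house is most of Bob's wealth
p_fire, rate = 0.01, 0.015           # 1% chance of fire; premium is 1.5% of house
premium = rate * house               # 13,500
expected_loss = p_fire * house       # 9,000: the premium overpays by 0.5% of house

u = math.log                         # concave utility, i.e. risk-averse
eu_insured = u(wealth - premium)
eu_uninsured = (1 - p_fire) * u(wealth) + p_fire * u(wealth - house)

assert premium > expected_loss       # Bob pays more than his average risk...
assert eu_insured > eu_uninsured     # ...yet insuring still maximizes his expected utility
```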


Eliezer's post on Torture vs. Dust Specks has generated lots of discussion, as well as what Eliezer describes as interesting ways of avoiding the question. We will do no such thing in this post; we will answer the question as intended. I will interpret the eye specks as cumulatively greater suffering than the 50 years of torture.

My PVG tells me that I would rather have a speck in my eye, as well as in the eyes of 3^^^3 people, than risk having one person (perhaps me) suffer torture for 50 years, even though behind the veil of ignorance my risk of being that person is only 1/(3^^^3). My PVG is what I will maximize, and doing so is the definition of instrumental rationality.

In short, the rational answer is neither TORTURE nor SPECKS; it depends on what your preferences, values and goals are. You may be one of those whose preference is to let that one person feel torture for 50 years; as long as your actions steer the future toward outcomes ranked higher in your preferences, you are right too.

Correct me if I am wrong, but I thought rationality did not imply that there were absolute rational preferences, but rather rational ways to achieve your preferences...


I want to emphasize that in no way did I intend for this post to declare anything, and I want to thank everyone in advance for picking apart every single word I have written. Being wrong is like winning the lottery. I do not claim to know anything; the assertive manner in which I wrote this post was merely a way to convey my ideas, of which I am not sure.
