Does this analysis focus on pure, monotone utility, or does it include the huge ripple effect putting dust specks into so many people's eyes would have? Are these people with normal lives, or created specifically for this one experience?
The answer that's obvious to me is that my mental moral machinery -- both the bit that says "specks of dust in the eye can't outweigh torture, no matter how many there are" and the bit that says "however small the badness of a thing, enough repetition of it can make it arbitrarily awful" or "maximize expected sum of utilities" -- wasn't designed for questions with numbers like 3^^^3 in them. In view of which, I profoundly mistrust any answer I might happen to find "obvious" to the question itself.
Since there was a post on this blog a few days ago about how what seems obvious to the speaker might not be obvious to the listener, I thought I would point out that it was NOT AT ALL obvious to me which should be preferred: torturing one man for 50 years, or a speck of dust in the eyes of 3^^^3 people. Can you please clarify/update what the point of the post was?
The dust speck is described as "barely enough to make you notice", so however many people it would happen to, it seems better than even something far less severe than 50 years of horrible torture. There are so many irritating things that a human barely notices in his/her life; what's an extra dust speck?
I think I'd trade the dust specks for even a kick in the groin.
But hey, maybe I'm missing something here...
If 3^^^3 people get dust in their eye, an extraordinary number of people will die.
The premise assumes it's "barely enough to make you notice", which was supposed to rule out any other unpleasant side-effects.
Anon, I deliberately didn't say what I thought, because I guessed that other people would think a different answer was "obvious". I didn't want to prejudice the responses.
Even when applying the cold, cruel calculus of moral utilitarianism, I think most people acknowledge that egalitarianism in a society has value in itself, and assign it positive utility. Would you rather be born into a country where 9 out of 10 people are destitute (<$1,000/yr) and the tenth is very wealthy ($100,000/yr)? Or be born into a country where almost all people subsist on a modest amount ($6,000-8,000/yr)?
Any system that allocates benefits (say, wealth) more fairly might be preferable to one that allocates more wealth in a more unequal fashion. And, the same goes for negative benefits. The dust specks may result in more total misery, but there is utility in distributing that misery equally.
The dust specks seem like the "obvious" answer to me, but I couldn't easily say how large the tiny harm would have to be before the unthinkably huge number of occurrences outweighs a single tremendous harm, since clearly I don't think that simply totaling the amount of harm caused is the right measure.
It seems obvious to me to choose the dust specks, because that would mean the human species would have to exist for an awfully long time for the total number of people to equal that number, and that minimum amount of annoyance would be something they were used to anyway.
I too see the dust specks as obvious, but for the simpler reason that I reject utilitarian sorts of comparisons like that. Torture is wicked, period. If one must go further, it seems like the suffering from torture is qualitatively worse than the suffering from any number of dust specks.
Anon prime: dollars are not utility. Economic egalitarianism is instrumentally desirable. We don't normally favor all types of equality, as Robin frequently points out.
Kyle: cute
Eliezer: My impulse is to choose the torture, even when I imagine very bad kinds of torture and very small annoyances (I think that one can go smaller than a dust mote, possibly something like a letter on the spine of a book that your eye sweeps over being set in a slightly less well-chosen font). Then, however, I think of how much longer the torture could last and still not outweigh the trivial annoyances if I take the utilitarian perspective, and my mind breaks. Condoning 50 years of torture, or even a day's worth, is pretty much the same as condoning universes of agonium lasting for eons in the face of numbers like these, and I don't think that I can condone that for any amount of a trivial benefit.
Personally, I choose C: torture 3^^^3 people for 3^^^3 years. Why? Because I can.
Ahem. My morality is based on maximizing average welfare, while also avoiding extreme individual suffering, rather than cumulative welfare.
So torturing one man for fifty years is not preferable to annoying any number of people.
This is different when the many are also suffering extremely, though - then it may be worthwhile to torture one even more to save the rest.
Trivial annoyances and torture cannot be compared in this quantifiable manner. Torture is not only suffering, but lost opportunity due to imprisonment, permanent mental hardship, activation of pain and suffering processes in the mind, and a myriad of other unconsidered things.
And even if the torture was 'to have flecks of dust dropped in your eyes', you still can't compare a 'torturous amount' applied to one person to a substantial number dropped in the eyes of many people: we aren't talking about CPU cycles here - we are trying to quantify the qualitative.
If ...
I think this all revolves around one question: Is "disutility of dust speck for N people" = N*"disutility of dust speck for one person"?
This, of course, depends on the properties of one's utility function.
How about this... Consider one person getting, say, ten dust specks per second for an hour vs 10*60*60 = 36,000 people getting a single dust speck each.
This is probably a better way to probe the issue at its core. Which of those situations is preferable? I would probably consider the second. However, I suspect one person getting a billion dust specks in their eye per second for an hour would be preferable to 1000 people getting a million per second for an hour.
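A minimal sketch of the totals being compared here, in Python (the per-second rates come from the comment above; the only point is that the totals match while the distribution differs):

```python
# Ten specks per second for an hour, concentrated on one person,
# versus 36,000 people getting a single speck each.
specks_one_person = 10 * 60 * 60
print(specks_one_person)            # 36000 -- the same total, spread out

# A billion specks/second on one person vs a million specks/second on 1000 people:
concentrated = 10**9 * 60 * 60
spread = 1000 * (10**6 * 60 * 60)
print(concentrated == spread)       # True: equal totals, very different experiences
```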
Suffering isn't linear in dust specks. Well, actually, I'm not sure subjective states in general can be viewed in a linear way. At least, if there is a potentially valid "linear qualia theory", I'd be surprised.
But as far as the dust specks vs torture thing in the original question? I think I'd go with dust specks for all.
But that's one person vs a bunch of people with dust specks.
Oh, just had a thought. A less extreme yet quite related real world situation/question would be this: What is appropriate punishment for spammers?
Yes, I understand there're a few additional issues here that would make it more analogous to, say, a case where the potential torturee was planning on deliberately causing all those people a DSE (Dust Speck Event).
But still, the spammer issue gives us a more concrete version, involving quantities that don't make our brains explode, so considering that may help work out the principles by which these sorts of questions can be dealt with.
The problem with spammers isn't that they cause a singular dust speck event: it's that they cause dust speck events repeatedly to individuals in the population in question. It's also a 'tragedy of the commons' question, since there is more than one spammer.
To respond to your question: What is appropriate punishment for spammers? I am sad to conclude that until Aubrey DeGray manages to conquer human mortality, or the singularity occurs, there is no suitable punishment for spammers.
After either of those, however, I would propose unblocking everyone's toilets and/or triple shifts as a Fry's Electronics floor lackey until the universal heat death, unless you have even >less< interesting suggestions.
If you could take all the pain and discomfort you will ever feel in your life, and compress it into a 12-hour interval, so you really feel ALL of it right then, and then after the 12 hours are up you have no ill effects - would you do it? I certainly would. In fact, I would probably make the trade even if it were 2 or 3 times longer-lasting and of the same intensity. But something doesn't make sense now... am I saying I would gladly double or triple the pain I feel over my whole life?
The upshot is that there are some very nonlinear phenomena involved with calculating amounts of suffering, as Psy-Kosh and others have pointed out. You may indeed move along one coordinate in "suffering-space" by 3^^^3 units, but it isn't just absolute magnitude that's relevant. That is, you cannot recapitulate the "effect" of fifty years of torturing with isolated dust specks. As the responses here make clear, we do not simply map magnitudes in suffering space to moral relevance, but instead we consider the actual locations and contours. (Compare: you decide to go for a 10-mile hike. But your enjoyment of the hike depends more on where you go, than the distance traveled.)
Yes the answer is obvious. The answer is that this question obviously does not yet have meaning. It's like an ink blot. Any meaning a person might think it has is completely inside his own mind. Is the inkblot a bunny? Is the inkblot a Grateful Dead concert? The right answer is not merely unknown, because there is no possible right answer.
A serious person-- one who take moral dilemmas seriously, anyway-- must learn more before proceeding.
The question is an inkblot because too many crucial variables have been left unspecified. For instance, in order for thi...
The non-linear nature of 'qualia' and the difficulty of assigning a utility function to such things as 'minor annoyance' has been noted before. It seems to some unsolvable. One solution, presented by Dennett in 'Consciousness Explained', is to suggest that there is no such thing as qualia or subjective experience. There are only objective facts. As Searle calls it, 'consciousness denied'. With this approach it would (at least theoretically) be possible to objectively determine the answer to this question based on something like the number of ergs needed to...
Uh... If there's no such thing as qualia, there's no such thing as actual suffering, unless I misunderstand your description of Dennett's views.
But if my understanding is correct, and those views were correct, then wouldn't the answer be "nobody actually exists to care one way or another?" (Or am I sorely mistaken in interpreting that view?)
Regarding your example of income disparity: I might rather be born into a system with very unequal incomes, if, as in America (in my personal and biased opinion), there is a reasonable chance of upping my income through persistence and pluck. I mean hey, that guy with all that money has to spend it somewhere-- perhaps he'll shop at my superstore!
But wait, what does wealth mean? In the case where everyone has the same income, where are they spending their money? Are they all buying the same things? Is this a totalitarian state? An economy without disparity ...
If even one in a hundred billion of the people is driving and has an accident because of the dust speck and gets killed, that's a tremendous number of deaths. If one in a hundred quadrillion of them survives the accident but is mangled and spends the next 50 years in pain, that's also a tremendous amount of torture.
If one in a hundred decillion of them is working in a nuclear power plant and the dust speck makes him have a nuclear accident....
We just aren't designed to think in terms of 3^^^3. It's too big. We don't habitually think much about one-in-a-million chances, much less one in a hundred decillion. But a hundred decillion is a very small number compared to 3^^^3.
Douglas and Psy-Kosh: Dennett explicitly says that in denying that there are such things as qualia he is not denying the existence of conscious experience. Of course, Douglas may think that Dennett is lying or doesn't understand his own position as well as Douglas does.
James Bach and J Thomas: I think Eliezer is asking us to assume that there are no knock-on effects in either the torture or the dust-speck scenario, and the usual assumption in these "which economy would you rather have?" questions is that the numbers provided represent the situati...
J Thomas: You're neglecting that there might be some positive side-effects for a small fraction of the people affected by the dust specks; in fact, there is some precedent for this. The resulting average effect is hard to estimate, but (considering that dust specks seem mostly to add entropy to the thought processes of the affected persons) it would likely still be negative.
Copying g's assumption that higher-order effects should be neglected, I'd take the torture. For each of the 3^^^3 persons, the choice looks as follows:
1.) A 1/(3^^^3) chance of being tort...
As I read this I knew my answer would be the dust specks. Since then I have been mentally evaluating various methods for deciding on the ethics of the situation and have chosen the one that makes me feel better about the answer I instinctively chose.
I can tell you this though. I reckon I personally would choose max five minutes of torture to stop the dust specks event happening. So if the person threatened with 50yrs of torture was me, I'd choose the dust specks.
What if it were a repeatable choice?
Suppose you choose dust specks, say, 1,000,000,000 times. That's a considerable amount of torture inflicted on 3^^^3 people. I suspect that you could find the number of repetitions equivalent to torturing each of those 3^^^3 people for 50 years, and that number would be smaller than 3^^^3. In other words, choose the dust specks enough times, and more people would be tortured, effectively, for longer than if you chose the 50-year torture an equivalent number of times.
If that math is correct, I'd have to go with the torture, not the dust specks.
Kyle wins.
Absent using this to guarantee the nigh-endless survival of the species, my math suggests that 3^^^3 beats anything. The problem is that the speck rounds down to 0 for me.
There is some minimum threshold below which it just does not count, like saying, "What if we exposed 3^^^3 people to radiation equivalent to standing in front of a microwave for 10 seconds? Would that be worse than nuking a few cities?" I suppose there must be someone in 3^^^3 who is marginally close enough to cancer for that to matter, but no, that rounds down to 0...
Wow. The obvious answer is TORTURE, all else equal, and I'm pretty sure this is obvious to Eliezer too. But even though there are 26 comments here, and many of them probably know in their hearts torture is the right choice, no one but me has said so yet. What does that say about our abilities in moral reasoning?
Given that human brains are known not to be able to intuitively process even moderately large numbers, I'd say the question can't meaningfully be asked - our ethical modules simply can't process it. 3^^^3 is too large - WAY too large.
I'm unconvinced that the number is too large for us to think clearly. Though it takes some machinery, humans reason about infinite quantities all the time and arrive at meaningful conclusions.
My intuitions strongly favor the dust speck scenario. Even if we forget 3^^^3 and just say that an infinite number of people will experience the speck, I'd still favor it over the torture.
Robin is absolutely wrong, because different instances of human suffering cannot be added together in any meaningful way. The cumulative effect when placed on one person is far greater than the sum of many tiny nuisances experienced by many. Whereas small irritants such as a dust mote do not cause "suffering" in any standard sense of the word, the sum total of those motes concentrated at one time and placed into one person's eye could cause serious injury or even blindness. Dispersing the dust (either over time or across many people) mitigates...
The obvious answer is TORTURE, all else equal, and I'm pretty sure this is obvious to Eliezer too.
That is the straightforward utilitarian answer, without any question. However, it is not the common intuition, and even if Eliezer agrees with you he is evidently aware that the common intuition disagrees, because otherwise he would not bother blogging it. It's the contradiction between intuition and philosophical conclusion that makes it an interesting topic.
Robin's answer hinges on "all else being equal." That condition can tie up a lot of loose ends, it smooths over plenty of rough patches. But those ends unravel pretty quickly once you start to consider all the ways in which everything else is inherently unequal. I happen to think the dust speck is a 0 on the disutility meter, myself, and 3^^^3*0 disutilities = 0 disutility.
I believe that ideally speaking the best choice is the torture, but pragmatically, I think the dust speck answer can make more sense. Of course it is more intuitive morally, but I would go as far as saying that the utility can be higher for the dust specks situation (and thus our intuition is right). How? the problem is in this sentence: "If neither event is going to happen to you personally," the truth is that in the real world, we can't rely on this statement. Even if it is promised to us or made into a law, this type of statements often won't ...
Robin, could you explain your reasoning. I'm curious.
Humans get barely noticeable "dust speck equivalent" events so often in their lives that the number of people in Eliezer's post is irrelevant; it's simply not going to change their lives, even if it's a gazillion lives, even with a number bigger than Eliezer's (even considering the "butterfly effect", you can't say if the dust speck is going to change them for the better or worse -- but with 50 years of torture, you know it's going to be for the worse).
Subjectively for these people, ...
@Robin,
"But even though there are 26 comments here, and many of them probably know in their hearts torture is the right choice, no one but me has said so yet."
I thought that Sebastian Hagen and I had said it. Or do you think we gave weasel answers? Mine was only contingent on my math being correct, and I thought his was similarly clear.
Perhaps I was unclear in a different way. By asking if the choice was repeatable, I didn't mean to dodge the question; I meant to make it more vivid. Moral questions are asked in a situation where many people a...
Hmm, thinking some more about this, I can see another angle (not the suffering angle, but the "being prudent about unintended consequences" angle):
If you had the choice between very very slightly changing the life of a huge number of people or changing a lot the life of only one person, the prudent choice might be to change the life of only one person (as horrible as that change might be).
Still, with the dust speck we can't really know if the net final outcome will be negative or positive. It might distract people who are about to have genius ide...
Would you prefer that one person be horribly tortured for fifty years without hope or rest, or that 3^^^3 people get dust specks in their eyes?
The number of milliseconds in 50 years is about 1.6*10^12, so its square is about 2.5*10^24.
Would you rather one person be tortured for a millisecond (then no ill effects), or that 3^^^3/10^24 people get a dust speck per second for 50 centuries?
OK, so the utility/effect doesn't scale when you change the times. But even if each 1% added dust/torture time made things ten times worse, when you reduce the dust-speckled population to reflect that it's still countless universes worth of people.
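A quick check of the millisecond arithmetic above, in Python (assuming a 365.25-day year; this is only a sanity check of the two figures used in the rescaling):

```python
ms_in_50_years = 50 * 365.25 * 24 * 3600 * 1000
print(f"{ms_in_50_years:.2e}")       # about 1.6e12 milliseconds
print(f"{ms_in_50_years**2:.2e}")    # about 2.5e24 -- the divisor used above
```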
I'm with Tomhs. The question has less value as a moral dilemma than as an opportunity to recognize how we think when we "know" the answer. I intentionally did not read the comments last night so I could examine my own thought process, and tried very hard to hold an open mind (my instinct was dust). It's been a useful and interesting experience. Much better than the brain teasers, which I can generally get because I'm on heightened alert when reading El's posts. Here being on alert simply allowed me to try to avoid immediately giving in to my bias.
Averaging utility works only when the law of large numbers starts to play a role. It's a good general policy, as stuff subject to it happens all the time, enough to give sensible results over the human/civilization lifespan. So, if Eliezer's experiment is a singular event and similar events don't happen frequently enough, the answer is 3^^^3 specks. Otherwise, torture (as in this case, similar frequent enough choices would lead to a tempest of specks in anyone's eye, which is about 3^^^3 times worse than 50 years of torture, for each and every one of them).
Benquo, your first answer seems equivocal, and so did Sebastian's on a first reading, but now I see that it was not.
Torture,
Consider three possibilities:
(a) A dust speck hits you with probability one, (b) You face an additional probability 1/(3^^^3) of being tortured for 50 years, (c) You must blink your eyes for a fraction of a second, just long enough to prevent a dust speck from hitting you in the eye.
Most people would pick (c) over (a). Yet 1/(3^^^3) is such a small number that by blinking your eyes one more time than you normally would, you increase your chances of being captured by a sadist and tortured for 50 years by more than 1/(3^^^3). Thus, (b) must be better than (c). Consequently, most people should prefer (b) to (a).
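A minimal sketch of that expected-disutility comparison, in Python. All of the numbers are illustrative assumptions (3^^^3 itself can't be represented, so a merely huge stand-in is used); the only point is the ordering (b) < (c) < (a):

```python
from fractions import Fraction

HUGE = 10**100                  # stand-in for 3^^^3 (any huge number works here)
torture = 10**12                # assumed disutility of 50 years of torture
speck = 1                       # assumed disutility of one dust speck
blink = Fraction(1, 100)        # assumed disutility of one extra blink

a = speck                       # (a) a certain dust speck
b = Fraction(torture, HUGE)     # (b) a 1/HUGE chance of the torture
c = blink                       # (c) a blink to avoid the speck

print(a > c)   # True: the blink is preferred to the speck
print(c > b)   # True under these assumptions: (b)'s expected harm is far tinier still
# So a > c > b in disutility, i.e. (b) should be preferred to (a) -- the argument above.
```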
There isn't any right answer. Answers to what is good or bad is a matter of taste, to borrow from Nietzsche.
To me the example has messianic quality. One person suffers immensely to save others from suffering. Does the sense that there is a 'right' answer come from a Judeo-Christian sense of what is appropriate. Is this a sort of bias in line with biases towards expecting facts to conform to a story?
Also, this example suggests to me that the value pluralism of Cowen makes much more sense than some reductive approach that seeks to create one objective me...
Why is this a serious question? Given the physical unreality of the situation - the putative existence of 3^^^3 humans and the ability to actually create the option in the physical universe - why is this question taken seriously while something like "is it better to kill Santa Claus or the Easter Bunny?" is considered silly?
Fascinating, and scary, the extent to which we adhere to established models of moral reasoning despite the obvious inconsistencies. Someone here pointed out that the problem wasn't sufficiently defined, but then proceeded to offer examples of objective factors that would appear necessary to evaluation of a consequentialist solution. Robin seized upon the "obvious" answer that any significant amount of discomfort, over such a vast population, would easily dominate, with any conceivable scaling factor, the utilitarian value of the torture of a si...
The hardships experienced by a man tortured for 50 years cannot compare to a trivial experience massively shared by a large number of individuals -- even on the scale that Eli describes. There is no accumulation of experiences, and it cannot be conflated into a larger meta dust-in-the-eye experience; it has to be analyzed as a series of discrete experiences.
As for larger social implications, the negative consequence of so many dust specked eyes would be negligible.
Eliezer wrote "Wow. People sure are coming up with interesting ways of avoiding the question."
I posted earlier on what I consider the more interesting question of how to frame the problem in order to best approach a solution.
If I were to simply provide my "answer" to the problem, with the assumption that the dust in the eyes is likewise limited to 50 years, then I would argue that the dust is to be preferred to the torture, not on a utilitarian basis of relative weights of the consequences as specified, but on the bigger-picture view th...
Eliezer, are you suggesting that declining to make up one's mind in the face of a question that (1) we have excellent reason to mistrust our judgement about and (2) we have no actual need to have an answer to is somehow disreputable?
As for your link to the "motivated stopping" article, I don't quite see why declining to decide on this is any more "stopping" than choosing a definite one of the options. Or are you suggesting that it's an instance of motivated continuation? Perhaps it is, but (as you said in that article) the problem with ...
What happens if there aren't 3^^^3 instanced people to get dust specks? Do those specks carry over such that person #1 gets a 2nd speck and so on? If so, you would elect to have the person tortured for 50 years for surely the alternative is to fill our universe with dust and annihilate all cultures and life.
Robin, of course it's not obvious. It's only an obvious conclusion if the global utility function from the dust specks is an additive function of the individual utilities, and since we know that utility functions must be bounded to avoid Dutch books, we know that the global utility function cannot possibly be additive -- otherwise you could break the bound by choosing a large enough number of people (say, 3^^^3).
From a more metamathematical perspective, you can also question whether 3^^^3 is a number at all. It's perfectly straightforward to construct a p...
Eliezer, are you suggesting that declining to make up one's mind in the face of a question that (1) we have excellent reason to mistrust our judgement about and (2) we have no actual need to have an answer to is somehow disreputable?
Yes, I am.
Regarding (1), we pretty much always have excellent reason to mistrust our judgments, and then we have to choose anyway; inaction is also a choice. The null plan is a plan. As Russell and Norvig put it, refusing to act is like refusing to allow time to pass.
Regarding (2), whenever a tester finds a user input that cr...
Fascinating question. No matter how small the negative utility in the dust speck, multiplying it with a number such as 3^^^3 will make it way worse than torture. Yet I find the obvious answer to be the dust speck one, for reasons similar to what others have pointed out - the negative utility rounds down to zero.
But that doesn't really solve the problem, for what if the harm in question was slightly larger? At what point does it cease rounding down? I have no meaningful criteria to give for that one. Obviously there must be a point where it does cease doing...
"Regarding (1), we pretty much always have excellent reason to mistrust our judgments, and then we have to choose anyway; inaction is also a choice. The null plan is a plan. As Russell and Norvig put it, refusing to act is like refusing to allow time to pass."
This goes to the crux of the matter, why to the extent the future is uncertain, it is better to decide based on principles (representing wisdom encoded via evolutionary processes over time) rather than on the flat basis of expected consequences.
Would you condemn one person to be horribly tortured for fifty years without hope or rest, to save every qualia-experiencing being who will ever exist one blink?
Is the question significantly changed by this rephrasing? It makes SPECKS the default choice, and it changes 3^^^3 to "all." Are we better able to process "all" than 3^^^3, or can we really process "all" at all? Does it change your answer if we switch the default?
Would you force every qualia-experiencing being who will ever exist to blink one additional time to save one person from being horribly tortured for fifty years without hope or rest?
> For those who would pick TORTURE, what about Vassar's universes of agonium? Say a googolplex-persons' worth of agonium for a googolplex years.
If you mean would I condemn all conscious beings to a googolplex of torture to avoid universal annihilation from a big "dust crunch" my answer is still probably yes. The alternative is universal doom. At least the tortured masses might have some small chance of finding a solution to their problem at some point. Or at least a googolplex years might pass leaving some future civilization free to prosper. ...
> Would you condemn one person to be horribly tortured for fifty years without hope or rest, to save every qualia-experiencing being who will ever exist one blink?
That's assuming you're interpreting the question correctly. That you aren't dealing with an evil genie.
You never said we couldn't choose who specifically gets tortured, so I'm assuming we can make that selection. Given that, the once agonizingly difficult choice is made trivially simple. I would choose 50 years of torture for the person who made me make this decision.
Since I chose the specks -- no, I probably wouldn't pay a penny; avoiding the speck is not even worth the effort to decide to pay the penny or not. I would barely notice it; it's too insignificant to be worth paying even a tiny sum to avoid.
I suppose I too am "rounding down to zero"; a more significant harm would result in a different answer.
"For those who would pick SPECKS, would you pay a single penny to avoid the dust specks?"
To avoid all the dust specks, yeah, I'd pay a penny and more. Not a penny per speck, though ;)
The reason is to avoid having to deal with the "unintended consequences" of being responsible for that very very small change over such a large number of people. It's bound to have some significant indirect consequences, both positive and negative, on the far edges of the bell curve... the net impact could be negative, and a penny is little to pay to avoid responsibility for that possibility.
The first thing I thought when I read this question was that the dust specks were obviously preferable. Then I remembered that my intuition likes to round 3^^^3 down to something around twenty. Obviously, the dust specks are preferable to the torture for any number at all that I have any sort of intuitive grasp over.
But I found an argument that pretty much convinced me that the torture was the correct answer.
Suppose that instead of making this choice once, you will be faced with the same choice 10^17 times for the next fifty years (This number was chosen...
"... whenever a tester finds a user input that crashes your program, it is always bad - it reveals a flaw in the code - even if it's not a user input that would plausibly occur; you're still supposed to fix it. "Would you kill Santa Claus or the Easter Bunny?" is an important question if and only if you have trouble deciding. I'd definitely kill the Easter Bunny, by the way, so I don't think it's an important question."
I write code for a living; I do not claim that it crashes the program. Rather the answer is irrelevant as I don't thin...
By "pay a penny to avoid the dust specks" I meant "avoid all dust specks", not just one dust speck. Obviously for one speck I'd rather have the penny.
what about Vassar's universes of agonium? Say a googolplex-persons' worth of agonium for a googolplex years.
To reduce suffering in general rather than your own (it would be tough to live with), bring on the coddling grinders. (10^10^100)^2 is a joke next to 3^^^3.
Having said that, it depends on the qualia-experiencing population of all existence compared to the numbers affected, and whether you change existing lives or make new ones. If only a few googolplex-squared people-years exist anyway, I vote dust.
I also vote to kill the bunny.
For those who would pick TORTURE, what about Vassar's universes of agonium? Say a googolplex-persons' worth of agonium for a googolplex years.
Torture, again. From the perspective of each affected individual, the choice becomes:
1.) A (10^(10^100))/(3^^^3) chance of being tortured for 10^(10^100) years.
2.) A dust speck with probability 1.
(or very slightly different numbers if the 10^(10^100) people exist in addition to the 3^^^3 people; the difference is too small to be noticeable)
I'd still take the former. (10^(10^100))/(3^^^3) is still so close to zero that there'...
Eliezer, it's the combination of (1) totally untrustworthy brain machinery and (2) no immediate need to make a choice that I'm suggesting means that withholding judgement is reasonable. I completely agree that you've found a bug; congratulations, you may file a bug report and add it to the many other bug reports already on file; but how do you get from there to the conclusion that the right thing to do is to make a choice between these two options?
When I read the question, I didn't go into a coma or become psychotic. I didn't even join a crazy religion or ...
Let's suppose we measure pain in pain points (pp). Any event which can cause pain is given a value in [0, 1], with 0 being no pain and 1 being the maximum amount of pain perceivable. To calculate the pp of an event, assign a value to the pain, say p, and then multiply it by the number of people who will experience the pain, n. So for the torture case, assume p = 1, then:
torture: 1*1 = 1 pp
For the speck-in-the-eye case, suppose it causes the least amount of pain greater than no pain possible. Denote this by e. Assume that the dust speck causes e amount of ...
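A minimal sketch of the pain-point bookkeeping defined above, in Python. The value of e and the population are illustrative assumptions (3^^^3 can't be represented, so a smaller stand-in is used); the general fact it shows is that for any fixed e > 0, a large enough population makes the specks' total exceed 1 pp:

```python
def pain_points(per_person_pain: float, num_people: int) -> float:
    """Pain of one event (a value in [0, 1]) times the number of people who feel it."""
    return per_person_pain * num_people

torture_pp = pain_points(1.0, 1)        # p = 1 (maximum pain), one person
e = 1e-12                               # assumed: least pain greater than no pain
specks_pp = pain_points(e, 10**100)     # stand-in population for 3^^^3

print(torture_pp, specks_pp)            # 1.0 vs 1e+88: the specks dominate here
```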
"Wow. People sure are coming up with interesting ways of avoiding the question."
My response was a real request for information - if this is a pure utility test, I would select the dust specks. If this were done to a complex, functioning society, adding dust specks into everyone's eyes would disrupt a great deal of important stuff - someone would almost certainly get killed in an accident due to the distraction, even on a planet with only 10^15 people and not 3^^^3.
Eliezer, in your response to g, are you suggesting that we should strive to ensure that our probability distribution over possible beliefs sum to 1? If so, I disagree: I don't think this can be considered a plausible requirement for rationality. When you have no information about the distribution, you ought to assign probabilities uniformly, according to Laplace's principle of indifference. But the principle of indifference only works for distributions over finite sets. So for infinite sets you have to make an arbitrary choice of distribution, which violates indifference.
"For those who would pick SPECKS, would you pay a single penny to avoid the dust specks?"
Yes. Note that, for the obvious next question, I cannot think of an amount of money large enough such that I would rather keep it than use it to save a person from torture. Assuming that this is post-Singularity money which I cannot spend on other life-saving or torture-stopping efforts.
"You probably wouldn't blind everyone on earth to save that one person from being tortured, and yet, there are (3^^^3)/(10^17) >> 7*10^9 people being blinded for ea...
My algorithm goes like this:
there are two variables, X and Y.
Adding a single additional dust speck to a person's eye over their entire lifetime increases X by 1 for every person this happens to.
A person being tortured for a few minutes increases Y by 1.
I would object to most situations where Y is greater than 1. But I have no preferences at all with regard to X.
See? Dust specks and torture are not the same. I do not lump them together as "disutility". To do so seems to me a preposterous oversimplification. In any case, it has to be argued that...
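A minimal sketch of the two-counter bookkeeping described above, in Python (the function name, threshold, and example numbers are assumptions made only for illustration):

```python
def judge(x: int, y: int) -> str:
    """x: +1 per person who gets one extra lifetime dust speck.
    y: +1 per few-minute episode of torture."""
    if y > 1:                      # "I would object to most situations where Y > 1"
        return "objectionable"
    return "no preference"         # X carries no weight at all, however large

print(judge(x=10**100, y=0))       # 'no preference' -- specks never register
print(judge(x=0, y=2))             # 'objectionable'
```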
I am not convinced that this question can be converted into a personal choice where you face the decision of whether to take the speck or a 1/3^^^3 chance of being tortured. I would avoid the speck and take my chances with torture, and I think that is indeed an obvious choice.
I think a more apposite application of that translation might be:
If I knew I was going to live for 3^^^3+50*365 days, and I was faced with that choice every day, I would always choose the speck, because I would never want to endure the inevitable 50 years of torture.
The difference is that framing the question as a one-off individual choice obscures the fact that in the example proffered, the torture is a certainty.
1/3^^^3 chance of being tortured... If I knew I was going to live for 3^^^3+50*365 days, and I was faced with that choice every day, I would always choose the speck, because I would never want to endure the inevitable 50 years of torture.
That wouldn't make it inevitable. You could get away with it, but then you could get multiple tortures. Rolling 6 dice often won't get exactly one "1".
Tom McCabe wrote:
The probability is effectively much greater than that, because of complexity compression. If you have 3^^^^3 people with dust specks, almost all of them will be identical copies of each other, greatly reducing abs(U(specks)). abs(U(torture)) would also get reduced, but by a much smaller factor, because the number is much smaller to begin with.
Is there something wrong with viewing this from the perspective of the affected individuals (unique or not)? For any individual instance of a person, the probability of directly experiencing the tortu...
because of complexity compression. If you have 3^^^^3 people with dust specks, almost all of them will be identical copies of each other, greatly reducing abs(U(specks)).
If so, I want my anti-wish back. Evil Genie never said anything about compression. No wonder he has so many people to dust. I'm complaining to GOD Over Djinn.
If they're not compressed, surely a copy will still experience qualia? Does it matter that it's identical to another? If the sum experience of many copies is weighted as if there was just one, then I'm officially converting from infinite set agnostic to infinite set atheist.
"Bayesianism, Infinite Decisions, and Binding" replies to Vann McGee's "An airtight Dutch book", defending the permissibility of an unbounded utility function.
An option that dominates in finite cases will always provably be part of the maximal option in finite problems; but in infinite problems, where there is no maximal option, the dominance of the option for the infinite case does not follow from its dominance in all finite cases.
If you allow a discontinuity where the utility of the infinite case is not the same as the limit of the utilities of t...
It is clearly not so easy to have a non-subjective determination of utility.
After some thought I pick the torture. That is because the concept of 3^^^3 people means that no evolution will occur while that many people live. The one advantage to death is that it allows for evolution. It seems likely that we will have evolved into much more interesting life forms long before 3^^^3 of us have passed.
What's the utility of that?
Recovering Irrationalist:
True: my expected value would be 50 years of torture, but I don't think that changes my argument much.
Sebastian:
I'm not sure I understand what you're trying to say. (50*365)/3^^^3 (which is basically the same thing as 1/3^^^3) days of torture wouldn't be anything at all, because it wouldn't be noticeable. I don't think you can divide time to that extent from the point of view of human consciousness.
I don't think the math in my personal utility-estimation algorithm works out significantly differently depending on which of the cas...
I'll go ahead and reveal my answer now: Robin Hanson was correct, I do think that TORTURE is the obvious option, and I think the main instinct behind SPECKS is scope insensitivity.
Some comments:
While some people tried to appeal to non-linear aggregation, you would have to appeal to a non-linear aggregation which was non-linear enough to reduce 3^^^3 to a small constant. In other words it has to be effectively flat. And I doubt they would have said anything different if I'd said 3^^^^3.
If anything is aggregating nonlinearly it should be the 50 years of torture, to which one person has the opportunity to acclimate; there is no individual acclimatization to the dust specks because each dust speck occurs to a different person. The only person who could be "acclimating" to 3^^^3 is you, a bystander who is insensitive to the inconceivably vast scope.
Scope insensitivity - extremely sublinear aggregation by individuals considering bad events happening to many people - can lead to mass defection in a multiplayer prisoner's dilemma even by altruists who would normally cooperate. Suppose I can go skydiving today but this causes the world to get warmer by 0.000001 degree Celsius...
Rounding to zero is odd. In the absence of other considerations, you have no preference whether or not people get a dust speck in their eye?
It is also in violation of the structure of the thought experiment - a dust speck was chosen as the least bad bad thing that can happen to someone. If you would round it to zero, then you need to choose slightly worse thing - I can't imagine your intuitions will be any less shocked by preferring torture to that slightly worse thing.
While some people tried to appeal to non-linear aggregation, you would have to appeal to a non-linear aggregation which was non-linear enough to reduce 3^^^3 to a small constant.
Sum(1/n^2, 1, 3^^^3) < Sum(1/n^2, 1, inf) = (pi^2)/6
So an algorithm like "order utilities from least to greatest, then sum with a weight of 1/n^2, where n is their position in the list" could pick dust specks over torture while recommending that most people not go sky diving (as their benefit is outweighed by the detriment to those less fortunate).
This would mean that scope insensitivity, beyond a certain point, is a feature of our morality rather than a bias; I am not sure what my opinion of this outcome should be.
That said, while giving an answer to the one problem that some seem more comfortable with, and to the second that everyone agrees on, I expect there are clear failure modes I haven't thought of.
Edited to add:
This of course holds for weights of 1/n^a for any a>1; the most convincing defeat of this proposition would be showing that weights of 1/n (or 1/(n log(n))) drop off quickly enough to lead to bad behavior.
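A minimal sketch of the proposed aggregation rule, in Python. The utility values and population size are illustrative assumptions (3^^^3 is far too large to represent, so a small stand-in is used), and "utilities" is read here as welfare values, so the worst-off person lands in the top-weighted slot:

```python
def weighted_welfare(utilities):
    """Order utilities from least (worst off) to greatest; weight position n by 1/n^2."""
    ordered = sorted(utilities)
    return sum(u / (n * n) for n, u in enumerate(ordered, start=1))

speck, torture = -1e-6, -1e6          # assumed utilities of a speck / 50 years of torture
population = 10**6                    # stand-in for 3^^^3 people

specks_world = weighted_welfare([speck] * population)
torture_world = weighted_welfare([torture] + [0.0] * (population - 1))

# Because sum(1/n^2) < pi^2/6, the specks total only about -1.64e-6,
# while the torture keeps its full weight, so this rule picks SPECKS:
print(specks_world > torture_world)   # True
```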
If anything is aggregating nonlinearly it should be the 50 years of torture, to which one person has the opportunity to acclimate; there is no individual acclimatization to the dust specks because each dust speck occurs to a different person
I find this reasoning problematic, because in the dust specks there is effectively nothing to acclimate to... the amount of inconvenience to the individual will always be smaller in the speck scenario (excluding secondary effects, such as the individual being distracted and ending up in a car crash, of course).
Which exa...
Well, as long as we've gone to all the trouble to collect 85 comments on this topic, this seems like a great chance for a disagreement case study. It would be interesting to collect stats on who takes what side, and to relate that to their various kinds of relevant expertise. For the moment I am disturbed by the fact that Eliezer and I seem to be in a minority here, but comforted a bit by the fact that we seem to know decision theory better than most. But I'm open to new data on the balance of opinion and the balance of relevant expertise.
The diagnosis of scope insensitivity presupposes that people are trying to perform a utilitarian calculation and failing. But there is an ordinary sense in which a sufficiently small harm is no wrong. A harm must reach a certain threshold before the victim is willing to bear the cost of seeking redress. Harms that fall below the threshold are shrugged off. And an unenforced law is no law. This holds even as the victims multiply. A class action lawsuit is possible, summing the minuscule harms, but our moral intuitions are probably not based on those.
Now, this is considerably better reasoning - however, there was no clue that this was a decision that would be selected over and over by countless people. Had it been worded "you among many have to make the following choice...", I could agree with you. But the current wording implied that it was a once-a-universe sort of choice.
The choice doesn't have to be repeated to present you with the dilemma. Since all elements of the problem are finite - not countless, finite - if you refuse all actions in the chain, you should also refuse the start of t...
Actually, that was a poor example because taxing one penny has side effects. I would rather save one life and everyone in the world poked with a stick with no other side effects, because I put a substantial probability on lifespans being longer than many might anticipate. So even repeating this six billion times to save everyone's life at the price of 120 years of being repeatedly poked with a stick, would still be a good bargain.
Where there are no special inflection points, a bad repeated action should be a bad individual action, a good repeated action ...
Robin: dare I suggest that one area of relevant expertise is normative philosophy for-@#%(^^$-sake?!
It's just painful -- really, really, painful -- to see dozens of comments filled with blinkered nonsense like "the contradiction between intuition and philosophical conclusion" when the alleged "philosophical conclusion" hinges on some ridiculous simplistic Benthamite utilitarianism that nobody outside of certain economics departments and insular technocratic computer-geek blog communities actually accepts! My model for the torture cas...
dozens of comments filled with blinkered nonsense like "the contradiction between intuition and philosophical conclusion" when the alleged "philosophical conclusion" hinges on some ridiculous simplistic Benthamite utilitarianism that nobody outside of certain economics departments and insular technocratic computer-geek blog communities actually accepts!
You've quoted one of the few comments which your criticism does not apply to. I carry no water for utilitarian philosophy and was here highlighting its failure to capture moral intuition.
all types of pleasures and pains are commensurable such that for all i, j, given a quantity of pleasure/pain experience i, you can find a quantity of pleasure/pain experience j that is equal to (or greater or less than) it. (i.e. that pleasures and pains exist on one dimension)
Is a consistent and complete preference ordering without this property possible?
"An option that dominates in finite cases will always provably be part of the maximal option in finite problems; but in infinite problems, where there is no maximal option, the dominance of the option for the infinite case does not follow from its dominance in all finite cases."
From Peter's proof, it seems like you should be able to prove that an arbitrarily large (but finite) utility function will be dominated by events with arbitrarily large (but finite) improbabilities.
"Robin Hanson was correct, I do think that TORTURE is the obvious opti...
Eliezer: "It's wrong when repeated because it's also wrong in the individual case. You just have to come to terms with scope sensitivity."
But determining whether or not a decision is right or wrong in the individual case requires that you be able to place a value on each outcome. We determine this value in part by using our knowledge of how frequently the outcomes occur and how much time/effort/money it takes to prevent or assuage them. Thus knowing the frequency that we can expect an event to occur is integral to assigning it a value in the fi...
Where there are no special inflection points, a bad repeated action should be a bad individual action, a good repeated action should be a good individual action. Talking about the repeated case changes your intuitions and gets around your scope insensitivity, it doesn't change the normative shape of the problem (IMHO).
Hmm, I see your point. I can't help like feeling that there are cases where repetition does matter, though. For instance, assuming for a moment that radical life-extension and the Singularity and all that won't happen, and assuming that we co...
Constant, my reference to your quote wasn't aimed at you or your opinions, but rather at the sort of view which declares that the silly calculation is some kind of accepted or coherent moral theory. Sorry if it came off the other way.
Nick, good question. Who says that we have consistent and complete preference orderings? Certainly we don't have them across people (consider social choice theory). Even to say that we have them within individual people is contestable. There's a really interesting literature in philosophy, for example, on the incommensura...
Who says that we have consistent and complete preference orderings?
Who says you need them? The question wasn't to quantify an exact balance. You just need to be sure enough to make the decision that one side outweighs the other for the numbers involved.
By my values, all else equal, for all x between 1 millisecond and fifty years, 10^1000 people being tortured for time x is worse than one person being tortured for time x*2. Would you disagree?
So, 10^1000 people tortured for (fifty years)/2 is worse than one person tortured for fifty years.
Then, 10^2000 peo...
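A minimal sketch of where that halving chain leads, in Python. The 50-year figure follows the post; the premise that each step (multiply the population by 10^1000, halve the duration) makes things worse is the one quoted above, and the one-millisecond stopping point is an arbitrary assumption:

```python
SECONDS_50_YEARS = 50 * 365 * 24 * 3600

duration = float(SECONDS_50_YEARS)   # start: one person tortured for 50 years
people = 1
steps = 0
while duration > 0.001:        # stop once the torture is down to about a millisecond
    duration /= 2
    people *= 10**1000         # premise: each such step is strictly worse than the last
    steps += 1

print(steps)                   # 41 halvings suffice
print(len(str(people)))        # 41,001 digits' worth of people -- still tiny next to 3^^^3
```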
Since Robin is interested in data... I chose SPECKS, and was shocked by the people who chose TORTURE on grounds of aggregated utility. I had not considered the possibility that a speck in the eye might cause a car crash (etc) for some of those 3^^^3 people, and it is the only reason I see for revising my original choice. I have no accredited expertise in anything relevant, but I know what decision theory is.
I see a widespread assumption that everything has a finite utility, and so no matter how much worse X is than Y, there must be a situation in which it...
Eliezer, a problem seems to be that the speck does not serve the function you want it to in this example, at least not for all readers. In this case, many people see a special penny because there is some threshold value below which the least bad bad thing is not really bad. The speck is intended to be an example of the least bad bad thing, but we give it a badness rating of one minus .9-repeating.
(This seems to happen to a lot of arguments. "Take x, which is y." Well, no, x is not quite y, so the argument breaks down and the discussion follow...
Okay, here's the data: I choose SPECKS, and here is my background and reasons.
I am a cell biologist. That is perhaps not relevant.
My reasoning is that I do not think that there is much meaning in adding up individual instances of dust specks. Those of you who choose TORTURE seem to think that there is a net disutility that you obtain by multiplying epsilon by 3^^^3. This is obviously greater than the disutility of torturing one person.
I reject the premise that there is a meaningful sense in which these dust specks can "add up".
You can think in...
Mitchell, I acknowledge the defensibility of the position that there are tiers of incommensurable utilities. But to me it seems that the dust speck is a very, very small amount of badness, yet badness nonetheless. And that by the time it's multiplied to ~3^^^3 lifetimes of blinking, the badness should become incomprehensibly huge just like 3^^^3 is an incomprehensibly huge number.
One reason I have problems with assigning a hyperreal infinitesimal badness to the speck, is that it (a) doesn't seem like a good description of psychology (b) leads to total lo...
"The notion of sacred values seems to lead to irrationality in a lot of cases, some of it gross irrationality like scope neglect over human lives and "Can't Say No" spending."
Could you post a scenario where most people would choose the option which unambiguously causes greater harm, without getting into these kinds of debates about what "harm" means? Eg., where option A ends with shooting one person, and option B ends with shooting ten people, but option B sounds better initially? We have a hard enough time getting rid of irrationality, even in cases where we know what is rational.
Eliezer: Why does anything have a utility at all? Let us suppose there are some things to which we attribute an intrinsic utility, negative or positive - those are our moral absolutes - and that there are others which only have a derivative utility, deriving from the intrinsic utility of some of their consequences. This is certainly one way to get incommensurables. If pain has intrinsic disutility and inconvenience does not, then no finite quantity of inconvenience can by itself trump the imperative of minimizing pain. But if the inconvenience might give rise to consequences with intrinsic disutility, that's different.
Dare I say that people may be overvaluing 50 years of a single human life? We know for a fact that some effect will be multiplied by 3^^^3 by our choice. We have no idea what strange and unexpected existential side effects this may have. It's worth avoiding the risk. If the question were posed with more detail, or specific limitations on the nature of the effects, we might be able to answer more confidently. But to risk not only human civilization, but ALL POSSIBLE CIVILIZATIONS, you must be DAMN SURE you are right. 3^^^3 makes even incredibly small doubts significant.
I wonder if my answers make me fail some kind of test of AI friendliness. What would the friendly AI do in this situation? Probably write poetry.
For Robin's statistics:
Given no other data but the choice, I would have to choose torture. If we don't know anything about the consequences of the blinking or how many times the choice is being made, we can't know that we are not causing huge amounts of harm. If the question deliberately eliminated these unknowns - i.e., the badness was limited to an eyeblink that does not immediately result in some disaster for someone or blindness for another, and you really are the one and only person making the choice ever - then I'd go with the dust. But these qualific...
@Paul, I was trying to find a solution that didn't assume "b) all types of pleasures and pains are commensurable such that for all i, j, given a quantity of pleasure/pain experience i, you can find a quantity of pleasure/pain experience j that is equal to (or greater or less than) it. (i.e. that pleasures and pains exist on one dimension).", but rather established it for the case at hand. Unless it's specifically stated in the hypothetical that this is a true 1-shot choice (which we know it isn't in the real world, as we make analogous choices a...
Eliezer -- I think the issues we're getting into now require discussion that's too involved to handle in the comments. Thus, I've composed my own post on this question. Would you please be so kind as to approve it?
Recovering irrationalist: I think the hopefully-forthcoming-post-of-my-own will constitute one kind of answer to your comment. One other might be that one can, in fact, prefer huge dust harassment to a little torture. Yet a third might be that we can't aggregate the pain of dust harassment across people, so that there's some amount of single-person dust harassment that will be worse than some amount of torture, but if we spread that out, it's not.
For Robin's statistics:
Torture on the first problem, and torture again on the followup dilemma.
relevant expertise: I study probability theory, rationality and cognitive biases as a hobby. I don't claim any real expertise in any of these areas.
I think one of the reasons I finally chose specks is that, unlike what is implied, the suffering does not simply "add up": 3^^^3 people getting one dust speck in their eye is most certainly not equal to one person getting 3^^^3 dust specks in his eyes. It's not "3^^^3 units of disutility, total"; it's one unit of disutility per person.
That still doesn't really answer the "one person for 50 years or two people for 49 years" question, though - by my reasoning, the second option would be preferable, while obviously the first optio...
It is my impression that human beings almost universally desire something like "justice" or "fairness." If everybody had the dust speck problem, it would hardly be perceived as a problem. If one person is being tortured, both the tortured person and others perceive unfairness, and society has a problem with this.
Actually, we all DO get dust motes in our eyes from time to time, and this is not a public policy issue.
In fact relatively small numbers of people ARE being tortured today, and this is a big problem both for the victims and for people who care about justice.
Beyond the distracting arithmetic lesson, this question reeks of Christianity, positing a situation in which one person's suffering can take away the suffering of others.
For the moment I am disturbed by the fact that Eliezer and I seem to be in a minority here, but comforted a bit by the fact that we seem to know decision theory better than most. But I'm open to new data on the balance of opinion and the balance of relevant expertise.
It seems like selection bias might make this data much less useful. (It applied it my case, at least.) The people who chose TORTURE were likely among those with the most familiarity with Eliezer's writings, and so were able to predict that he would agree with them, and so felt less inclined to respond. Also, voicing their opinion would be publicly taking an unpopular position, which people instinctively shy away from.
Paul: Yet a third might be that we can't aggregate the pain of dust harassment across people, so that there's some amount of single-person dust harassment that will be worse than some amount of torture, but if we spread that out, it's not.
My induction argument covers that. As long as, all else equal, you believe:
A googolplex people being dust speckled every second of their life without further ill effect is worse than one person being horribly tortured for the shortest period experiencable
I don't think this is directly comparable, because the disutility of additional dust specking to one person in a short period of time probably grows faster than linearly - if I have to blink every second for an hour, I'll probably get extremely frustrated on top of the slight discomfort of the specks themselves. I would say that one person getting specked every second of their life is significantly worse than a couple billion people getting specked once.
the disutility of additional dust specking to one person in a short period of time probably grows faster than linearly
That's why I used a googolplex people to balance the growth. All else equal, do you disagree with: "A googolplex people dust specked x times during their lifetime without further ill effect is worse than one person dust specked for x*2 times during their lifetime without further ill effect" for the range concerned?
one person getting specked every second of their life is significantly worse than a couple billion people getting specked once.
I agree. I never said it wasn't.
Have to run - will elaborate later.
All else equal, do you disagree with: "A googolplex people dust specked x times during their lifetime without further ill effect is worse than one person dust specked for x*2 times during their lifetime without further ill effect" for the range concerned?
I agree with that. My point is that agreeing that "A googolplex people being dust speckled every second of their life without further ill effect is worse than one person being horribly tortured for the shortest period experiencable" doesn't oblige me to agree that "A few billion* googolplexes of people being dust specked once without further ill effect is worse than one person being horribly tortured for the shortest period experiencable".
Just thought I'd comment that the more I think about the question, the more confusing it becomes. I'm inclined to think that if we consider the max utility state of every person having maximal fulfilment, and a "dust speck" as the minimal amount of "unfulfilment" from the top a person can experience, then two people experiencing a single "dust speck" is not quite as bad as a single person two "dust specks" below optimal. I think the reason I'm thinking that is that the second speck takes away more proportionally than ...
I agree with that. My point is that agreeing that "A googolplex people being dust speckled every second of their life without further ill effect is worse than one person being horribly tortured for the shortest period experiencable" doesn't oblige me to agree that "A few billion* googolplexes of people being dust specked once without further ill effect is worse than one person being horribly tortured for the shortest period experiencable".
Neither would I, you don't need to. :-)
The only reason I can pull this off is because 3^^^3 is such...
OK, without reading the above comments... (I did read a few of them, including Robin Hanson's first comment - don't know if he weighed in again).
Dust specks over torture.
The apparatus of the eye handles dust specks all day long. I just blinked. It's quite possible there was a dust speck in there somewhere. I just don't see how that adds up to anything, even if a very large number is invoked. In fact, with a very large number like the one described, it is likely that human beings would evolve more efficient tear ducts, or faster blinking, or something like t...
Recovering irrationalist: in your induction argument, my first stab would be to deny the last premise (transitivity of moral judgments). I'm not sure why moral judgments have to be transitive.
Next, I'd deny the second-to-last premise (for one thing, I don't know what it means to be horribly tortured for the shortest period possible -- part of the tortureness of torture is that it lasts a while).
Eliezer, both you and Robin are assuming the additivity of utility. This is not justifiable, because it is false for any computationally feasible rational agent.
If you have a bounded amount of computation to make a decision, we can see that the number of distinctions a utility function can make is in turn bounded. Concretely, if you have N bits of memory, a utility function using that much memory can distinguish at most 2^N states. Obviously, this is not compatible with additivity of disutility, because by picking enough people you can identify more disti...
It's truly amazing the contortions many people have gone through rather than appear to endorse torture. I see many attempts to redefine the question, categorical answers that basically ignore the scalar, and what Eliezer called "motivated continuation".
One type of dodge in particular caught my attention. Paul Gowder phrased it most clearly, so I'll use his text for reference:
...depends on the following three claims: a) you can unproblematically aggregate pleasure and pain across time, space, and individuality,
"Unproblematically&quo...
Recovering irrationalist: in your induction argument, my first stab would be to deny the last premise (transitivity of moral judgments). I'm not sure why moral judgments have to be transitive.
I acknowledged it won't hold for every moral. There are some pretty barking ones out there. I say it holds for choosing the option that creates less suffering. For finite values, transitivity should work fine.
Next, I'd deny the second-to-last premise (for one thing, I don't know what it means to be horribly tortured for the shortest period possible -- part of the tort...
Recovering irrationalist, I hadn't thought of things in precisely that way - just "3^^4 is really damn big, never mind 3^^7625597484987" - but now that you point it out, the argument by googolplex gradations seems to me like a much stronger version of the arguments I would have put forth.
It only requires 3^^5 = 3^(3^7625597484987) to get more googolplex factors than you can shake a stick at. But why not use a googol instead of a googolplex, so we can stick with 3^^4? If anything, the case is more persuasive with a googol because a googol is mor...
Tom, your claim is false. Consider the disutility function
D(Torture, Specks) = [10 * (Torture/(Torture + 1))] + (Specks/(Specks + 1))
Now, with this function, disutility increases monotonically with the number of people with specks in their eyes, satisfying your "slight aggregation" requirement. However, it's also easy to see that going from 0 to 1 person tortured is worse than going from 0 to any number of people getting dust specks in their eyes, including 3^^^3.
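A quick numerical check of that claim (just a sketch, with a mere googol standing in for 3^^^3, which no machine can represent):

    # Bounded disutility function from above; a googol of specks stands in for 3^^^3.
    def D(torture, specks):
        return 10.0 * torture / (torture + 1) + specks / (specks + 1)

    print(D(1, 0))         # 5.0  -- one person tortured, nobody specked
    print(D(0, 10**100))   # ~1.0 -- even a googol of specks stays below 5.0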
The basic objection to this kind of functional form is that it's not additive. Howe...
Again, not everyone agrees with the argument that unbounded utility functions give rise to Dutch books. Unbounded utilities only admit Dutch books if you do allow a discontinuity between infinite rewards and the limit of increasing finite rewards, but you don't allow a discontinuity between infinite planning and the limit of increasing finite plans.
Oh geez. Originally I had considered this question uninteresting so I ignored it, but considering the increasing devotion to it in later posts, I guess I should give my answer.
My justification, but not my answer, depends upon how the change is made.
-If the offer is made to all of humanity before being implemented ("Do you want to be the 'lots of people get specks race' or the 'one guy gets severe torture' race?") I believe people could all agree to the specks by "buying out" whoever eventually gets the torture. For an immeasurabl...
the argument by googolplex gradations seems to me like a much stronger version of the arguments I would have put forth.
You just warmed my heart for the day :-)
But why not use a googol instead of a googolplex
Shock and awe tactics. I wanted a featureless big number of featureless big numbers, to avoid wiggle-outs, and scream "your intuition ain't from these parts". In my head, FBNs always carry more weight than regular ones. Now you mention it, their gravity could get lightened by incomprehensibility, but we're already counting to 3^^^3.
Googol is better. Fewer readers will have to google it.
@Neel.
Then I only need to make the condition slightly stronger: "Any slight tendency to aggregation that doesn't beg the question." Ie, that doesn't place a mathematical upper limit on disutility(Specks) that is lower than disutility(Torture=1). I trust you can see how that would be simply begging the question. Your formulation:
D(Torture, Specks) = [10 * (Torture/(Torture + 1))] + (Specks/(Specks + 1))
...doesn't meet this test.
Contrary to what you think, it doesn't require unbounded utility. Limiting the lower bound of the range to (say) 2 * ...
With so many so deep in reductionist thinking, I'm compelled to stir the pot by asking how one justifies the assumption that the SPECK is a net negative at all, aggregate or not, extended consequences or not? Wouldn't such a mild irritant, over such a vast and diverse population, act as an excellent stimulus for positive adaptations (non-genetic, of course) and likely positive extended consequences?
A brilliant idea, Jef! I volunteer you to test it out. Start blowing dust around your house today.
Hrm... Recovering's induction argument is starting to sway me toward TORTURE.
More to the point, that and some other comments are starting to sway me away from the thought that the disutility of single dust speck events per person becomes sublinear as the number of people experiencing it increases (with total population held constant)
I think if I made some errors, they were partly caused by "I really don't want to say TORTURE", and partly caused by my mistaking the exact nature of the nonlinearity. I maintain "one person experiencing two dust specks"...
"A brilliant idea, Jef! I volunteer you to test it out. Start blowing dust around your house today."
Although only one person, I've already begun, and have entered in my inventor's notebook some apparently novel thinking on not only dust, but mites, dog hair, smart eyedrops, and nanobot swarms!
Tom, if having an upper limit on disutility(Specks) that's lower than disutility(Torture=1) is begging the question in favour of SPECKS, then why isn't *not* having such an upper limit begging the question in favour of TORTURE?
I find it rather surprising that so many people agree that utility functions may be drastically nonlinear but are apparently completely certain that they know quite a bit about how they behave in cases as exotic as this one.
Tom, if having an upper limit on disutility(Specks) that's lower than disutility(Torture=1) is begging the question in favour of SPECKS, then why isn't *not* having such an upper limit begging the question in favour of TORTURE?
It should be obvious why. The constraint in the first one is neither argued for nor agreed on and by itself entails the conclusion being argued for. There's no such element in the second.
I think we may be at cross purposes; my apologies if we are and it's my fault. Let me try to be clearer.
Any particular utility function (if it's real-valued and total) "begs the question" in the sense that it either prefers SPECKS to TORTURE, or prefers TORTURE to SPECKS, or puts them exactly equal. I don't see how this can possibly be considered a defect, but if it is one then all utility functions have it, not just ones that prefer SPECKS to TORTURE.
Saying "Clearly SPECKS is better than TORTURE, because here's my utility function and it sa...
g: that's exactly what I'm saying. In fact, you can show something stronger than that.
Suppose that we have an agent with rational preferences, and who is minimally ethical, in the sense that they always prefer fewer people with dust specks in their eyes, and fewer people being tortured. This seems to be something everyone agrees on.
Now, because they have rational preferences, we know that a bounded utility function consistent with their preferences exists. Furthermore, the fact that they are minimally ethical implies that this function is monotone in the...
I have argued in previous comments that the utility of a person should be discounted by his or her measure, which may be based on algorithmic complexity. If this "torture vs specks" dilemma is to have the same force under this assumption, we'd have to reword it a bit:
Would you prefer that the measure of people horribly tortured for fifty years increases by x/3^^^3, or that the measure of people who get dust specks in their eyes increases by x?
I argue that no one, not even a superintelligence, can actually face such a choice. Because x is at most ...
A consistent utilitarian would choose the torture, but I don't think it's the moral choice.
Let's bring this a little closer to home. Hypothetically, let's say you get to live your life again 3^^^3 times. Would you prefer to have an additional dust speck in your eye in each of your future lives, or else be tortured for 50 years in a single one of them?
Any takers for the torture?
I'll take it, as long as it's no more likely to be one of the earliest lives. I don't trust any universe that can make 3^^^3 of me not to be a simulation that would get pulled early.
Hrm... Recovering's induction argument is starting to sway me toward TORTURE.
Interesting. The idea of convincing others to decide TORTURE is bothering me much more than my own decision.
I hope these ideas never get argued out of context!
Cooking something for two hours at 350 degrees isn't equivalent to cooking something at 700 degrees for one hour.
I'd rather accept one additional dust speck per lifetime in 3^^^3 lives than have one lifetime out of 3^^^3 lives involve fifty years of torture.
Of course, that's me saying that, with my single life. If I actually had that many lives to live, I might become so bored that I'd opt for the torture merely for a change of pace.
Recovering: chuckles no, I meant that thinking about that, and rethinking about the actual properties of what I'd consider to be a reasonable utility function, led me to reject my earlier claim of the specific nonlinearity that led to my assumption that as you increase the number of people that receive a speck, the disutility is sublinear; I now believe it to be linear. So a huge bigbigbigbiggigantaenormous number of specks would, of course, eventually have to have more disutility than the torture. But since to get to that point Knuth arrow notation had to be i...
I'd take it.
I find your choice/intuition completely baffling, and I would guess that far less than 1% of people would agree with you on this, for whatever that's worth (surely it's worth something.) I am a consequentialist and have studied consequentialist philosophy extensively (I would not call myself an expert), and you seem to be clinging to a very crude form of utilitarianism that has been abandoned by pretty much every utilitarian philosopher (not to mention those who reject utilitarianism!). In fact, your argument reads like a reductio ad absurdum ...
No Mike, your intuition for really large numbers is non-baffling, probably typical, but clearly wrong, as judged by another non-Utilitarian consequentialist (this item is clear even to egoists).
Personally I'd take the torture over the dust specks even if the number was just an ordinary incomprehensible number like, say, the number of biological humans who could live in artificial environments that could be built in one galaxy (about 10^46, given a 100-year life span and a 300W terminal entropy dump into a 3K background from 300K; that's a large budge...
So, if additive utility functions are naive, does that mean I can swap around your preferences at random like jerking around a puppet on a string, just by having a sealed box in the next galaxy over where I keep a googol individuals who are already being tortured for fifty years, or already getting dust specks in their eyes, or already being poked with a stick, etc., which your actions cannot possibly affect one way or the other?
It seems I can arbitrarily vary your "non-additive" utilities, and hence your priorities, simply by messing with the nu...
Michael Vassar:
Well, in the prior comment, I was coming at it as an egoist, as the example demands.
It's totally clear to me that a second of torture isn't a billion billion billion times worse than getting a dust speck in my eye, and that there are only about 1.5 billion seconds in a 50 year period. That leaves about a 10^10 : 1 preference for the torture.
I reject the notion that each (time,utility) event can be calculated in the way you suggest. Successive speck-type experiences for an individual (or 1,000 successive dust specks for 1,000,000 individua...
To continue this business of looking at the problem from different angles:
Another formulation, complementary to Andrew Macdonald's, would be: Should 3^^^3 people each volunteer to experience a speck in the eye, in order to save one person from fifty years of torture?
And with respect to utility functions: Another nonlinear way to aggregate individual disutilities x, y, z... is just to take the maximum, and to say that a situation is only as bad as the worst thing happening to any individual in that situation. This could be defended if one's assignment of ...
I find it positively bizarre to see so much interest in the arithmetic here, as if knowing how many dust flecks go into a year of torture, just as one knows that sixteen ounces go into one pint, would inform the answer.
What happens to the debate if we absolutely know the equation:
3^^^3 dust flecks = 50 years of torture
or
3^^^3 dust flecks = 600 years of torture
or
3^^^3 dust flecks = 2 years of torture?
The nation of Nod has a population of 3^^^3. By amazing coincidence, every person in the nation of Nod has $3^^^3 in the bank. (With a money supply like that, those dollars are not worth much.) By yet another coincidence, the government needs to raise revenues of $3^^^3. (It is a very efficient government and doesn't need much money.) Should the money be raised by taking $1 from each person, or by simply taking the entire amount from one person?
I take $1 from each person. It's not the same dilemma.
----
Ri: The idea of convincing others to decide TORTURE is bothering me much more than my own decision.
PK: I don't think there's any worry that I'm off to get my "rack winding certificate" :P
Yes, I know. :-) I was just curious about the biases making me feel that way.
individual living 3^^^3 times...keep memories and so on of all previous lives
3^^^3 lives worth of memories? Even at one bit per life, that makes you far from human. Besides, you're likely to get tortured in googolplexes of those lif...
Andrew Macdonald asked:
Any takers for the torture?
Assuming the torture-life is randomly chosen from the 3^^^3 sized pool, definitely torture. If I have a strong reason to expect the torture life to be found close to the beginning of the sequence, similar considerations as for the next answer apply.
Recovering irrationalist asks:
OK here goes... it's this life. Tonight, you start fifty years being loved at by countless sadistic Barney the Dinosaurs. Or, for all 3^^^3 lives you (at your present age) have to singalong to one of his songs. BARNEYLOVE or SONGS?
...
Cooking something for two hours at 350 degrees isn't equivalent to cooking something at 700 degrees for one hour.
Caledonian has made a great analogy for the point that is being made on either side. May I over-work it?
They are not equivalent, but there is some length of time at 350 degrees that will burn as badly as 700 degrees. In 3^^^3 seconds, your lasagna will be ... okay, entropy will have consumed your lasagna by then, but it turns into a cloud of smoke at some point.
Correct me if I am wrong here, but I don't think there is any length of time at 75 ...
Zubon, we could formalize this with a tiered utility function (one not order-isomorphic to the reals, but containing several strata each order-isomorphic to the reals).
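A minimal sketch of what such a tiered comparison might look like, assuming just two strata (a torture tier ranked above a speck tier):

    # Tuples compare lexicographically in Python, so no quantity of speck-harm
    # ever outweighs any nonzero quantity of torture-harm under this ordering.
    def tiered_disutility(torture_person_seconds, speck_count):
        return (torture_person_seconds, speck_count)

    many_specks = tiered_disutility(0, 10**100)   # a googol of specks, no torture
    brief_torture = tiered_disutility(1, 0)       # one person-second of torture

    print(max(many_specks, brief_torture))        # (1, 0): the torture ranks as worse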
But then there is a magic penny, a single sharp divide where no matter how many googols of pieces you break it into, it is better to torture 3^^^3 people for 9.99 seconds than to torture one person for 10.01 seconds. There is a price for departing from the simple utility function, and reasons to prefer certain kinds of simplicity. I'll admit you can't slice it down further than the essentially ...
...except that, if I'm right about the biases involved, the Speckists won't be horrified at each other.
If you trade off thirty seconds of waterboarding for one person against twenty seconds of waterboarding for two people, you're not visibly treading on a "sacred" value against a "mundane" value. It will rouse no moral indignation.
Indeed, if I'm right about the bias here, the Speckists will never be able to identify a discrete jump in utility across a single neuron firing, even though the transition from dust speck to torture can be br...
Assuming that there are 3^^^3 distinct individuals in existence, I think the answer is pretty obvious -- pick the torture. However, since we cannot possibly hope to visualize so many individuals, it's a pointlessly large number. In fact, I would go as low as saying that one quadrillion human beings with dust specks in their eyes outweighs one individual's 50 years of torture. Consider: one quadrillion seconds of minute but noticeable pain versus a scant fifty years of tortured hell. One quadrillion seconds is about 31,709,792 years. Let's just go with 32 mil...
My initial reaction (before I started to think...) was to pick the dust specks, given that my biases made the suffering caused by the dust specks morally equivalent to zero, and 0^^^3 is still 0.
However, given that the problem stated an actual physical phenomenon (dust specks), and not a hypothetical minimal annoyance, then you kind of have to take the other consequences of the sudden appearance of the dust specks into consideration, don't you?
If I was omnipotent, and I could make everyone on Earth get a dust speck in their eye right now, how many car acc...
I have a question/answer in relation to this post that seems to be off-topic for the forum. Click on my name if interested.
Since I would not be one of the people affected, I would not consider myself able to make that decision alone. In fact, my preferences are irrelevant in that situation, even if I consider the situation to be obvious.
To have a situation with 3^^^3 people, we must have at least that many people capable of existing in some meaningful way. I assume we cannot query them about their preferences in any meaningful (omniscient) way. As I cannot choose who will be tortured or who gets dust specks, I have to make a collective decision.
I think that my solution would be to take t...
Evolution seems to have favoured the capacity for empathy (the specks choice) over the capacity for utility calculation, even though utility calculation would have been a 'no brainer' for the brain capacity we have.
The whole concept reminds me of the Turing test. Turing, as a mathematician, just seems to have completely failed to understand that we don't assign rationality, or sentience, to another object by deduction. We do it by analogy.
I know that this is only a hypothetical example, but I must admit that I'm fairly shocked at the number of people indicating that they would select the torture option (as long as it wasn't them being tortured). We should be wary of the temptation to support something unorthodox for the effect of: "Hey, look at what a hardcore rationalist I can be." Real decisions have real effects on real people.
And we should be wary to select something orthodox for fear of provoking shock and outrage. Do you have any reason to believe that the people who say they prefer TORTURE to SPECKS are motivated by the desire to prove their rationalist credentials, or that they don't appreciate that their decisions have real consequences?
Jeffrey, on one of the other threads, I volunteered to be the one tortured to save the others from the specks.
As for "Real decisions have real effects on real people," that's absolutely correct, and that's the reason to prefer the torture. The utility function implied by preferring the specks would also prefer lowering all the speed limits in the world in order to save lives, and ultimately would ban the use of cars. It would promote raising taxes by a small amount in order to reduce the amount of violent crime (including crimes involving torture...
Following your heart and not your head - refusing to multiply - has also wrought plenty of havoc on the world, historically speaking. It's a questionable assertion (to say the least) that condoning irrationality has less damaging side effects than condoning torture.
"Following your heart and not your head - refusing to multiply - has also wrought plenty of havoc on the world, historically speaking. It's a questionable assertion (to say the least) that condoning irrationality has less damaging side effects than condoning torture."
I'm not really convinced that multiplication of the dust-speck effect is relevant. Subjective experience is restricted to individuals, not collectives. To me, this specific exercise reduces to a simpler question: Would it be better (more ethical) to torture individual A for 50 years,...
Jeffrey wrote: To me, this specific exercise reduces to a simpler question: Would it be better (more ethical) to torture individual A for 50 years, or inflict a dust speck on individual B? Gosh. The only justification I can see for that equivalence would be some general belief that badness is simply independent of numbers. Suppose the question were: Which is better, for one person to be tortured for 50 years or for everyone on earth to be tortured for 49 years? Would you really choose the latter? Would you not, in fact, jump at the chance to be the single ...
Jeffrey, do you really think serial killing is no worse than murdering a single individual, since "Subjective experience is restricted to individuals"?
In fact, if you kill someone fast enough, he may not subjectively experience it at all. In that case, is it no worse than a dust speck?
"Suppose the question were: Which is better, for one person to be tortured for 50 years or for everyone on earth to be tortured for 49 years? Would you really choose the latter? Would you not, in fact, jump at the chance to be the single person for 50 years if that were the only way to get that outcome rather than the other one?"
My criticism was for this specific initial example, which yes did seem "obvious" to me. Very few, if any, ethical opinions can be generalized over any situation and still seem reasonable. At least by my definiti...
I can see myself spending too much time here, so I'm going to finish up and y'all can have the last word. I'll admit that it's possible that one or more of you actually would sacrifice yourself to save others from a dust speck. Needless to say, I think it would be a huge mistake on your part. I definitely wouldn't want you to do it on my behalf, if for nothing more than selfish reasons: I don't want it weighing on my conscience. Hopefully this is a moot point anyway, since it should be possible to avoid both unwanted dust specks and unwanted torture (eg. via a Friendly AI). We should hope that torture dies away with the other tragedies of our past, and isn't perpetuated into our not-yet-tarnished future.
I know you're all getting a bit bored, but I'm curious what you think about a different scenario:
What if you have to choose between (a) for the next 3^^^3 days, you get one extra speck in your eye per day beyond the normal amount, and for 50 years you're placed in stasis, or (b) you get the normal amount of specks in your eyes, but during the next 3^^^3 days you'll pass through 50 years of atrocious torture.
Everything else is considered equal in the other cases, including the fact that (i) your total lifespan will be the same in both cases (more than 3^^^3 days), (ii) th...
OK, I see I got a bit long-winded. The interesting part of my question is if you'd take the same decision if it's about you instead of others. The answer is obvious, of course ;-)
The other details/versions I mentioned are only intended to explore the "contour of the value space" of the other posters. (I'm sure Eliezer has a term for this, but I forget it.)
Bogdan's presented almost exactly the argument that I too came up with while reading this thread. I would choose the specks in that argument and also in the original scenario (as long as I am not committing to the same choice being repeated an arbitrary number of times, and I am not causing more people to crash their cars than I cause not to crash their cars; the latter seems like an unlikely assumption, but thought experiments are allowed to make unlikely assumptions, and I'm interested in the moral question posed when we accept the assumption). Based on ...
I came across this post only today, because of the current comment in the "recent comments" column. Clearly, it was an exercise that drew an unusual amount of response. It further reinforces my impression of much of the OB blog, posted in August, and denied by email.
I think you should ask everyone until you have at least 3^^^3 people whether they would consent to having a dust speck fly into their eye to save someone from torture. When you have enough people just put dust specks into their eyes and save the others.
The question is, of course, silly. It is perfectly rational to decline to answer. I choose to try to answer.
It is also perfectly rational to say "it depends". If you really think "a dust speck in 3^^^3 eyes" gives a uniquely defined probability distribution over different subsets of possibilityverse, you are being ridiculous. But let's pretend it did - let's pretend we had 3^^^^3 parallel Eliezers, standing on flat golden surfaces in 1G and one atmosphere, for just long enough to ask each other enough questions to define the prob...
Tim: You're right - if you are a reasonably attractive and charismatic person. Otherwise, the question (from both sides) is worse than the dust speck.
(Asking people also puts you in the picture. You must like to spend eternity asking people a silly question, and learning all possible linguistic vocalizations in order to do so. There are many fewer vocalizations than possible languages, and many fewer possible human languages than 3^^^3. You will be spending more time going from one person of the SAME language to another, at 1 femtosecond per journey, than ...
Torture is not the obvious answer, because torture-based suffering and dust-speck-based suffering are not scalar quantities with the same units.
To be able to make a comparison between two quantities, the units must be the same. That's why we can say that 3 people suffering torture for 49.99 years is worse than 1 person suffering torture for 50 years. Intensity * Duration * Number of People gives us units of PainIntensity-Person-Years, or something like that.
Yet torture-based suffering and dust-speck-based suffering are not measured in the same units. Consequ...
There is a false choice being offered, because every person in every lifetime is going to experience getting something in their eye. I get a bug flying into my eye on a regular basis whenever I go running (3 of them the last time!), and it'll probably have happened thousands of times to me by the end of my life. It's pretty much a certainty of human experience (although I suppose it's statistically possible for some people to go through life without ever getting anything in their eyes).
Is the choice being offered to make all humanity's eyes for all eternity immune to small inconveniences such as bugs, dust or eyelashes? Otherwise we really aren't being offered anything at all.
Doesn't "harm", to a consequentialist, consist of every circumstance in which things could be better, but aren't ? If a speck in the eye counts, then why not, for example, being insufficiently entertained ?
If you accept consequentialism, isn't it morally right to torture someone to death so long as enough people find it funny ?
It seems that many, including Yudkowsky, answer this question by making the most basic mistake, i.e. by cheating - assuming facts not in evidence.
We don't know anything about (1) the side-effects of picking SPECKS (such as car crashes); and definitely don't know that (2) the torture victim can "acclimate". (2) in particular seems like cheating in a big way - especially given the statement "without hope or rest".
There's nothing rational about posing a hypothetical and then adding in additional facts in your answer. However, that's a great way to avoid the question presented.
The obvious answer is that torture is preferable.
If you had to pick for yourself between a 1/3^^^3 chance of 50 years of torture and the dust speck, you would pick the torture.
We actually do this every day: we eat foods that can poison us rather than be hungry, we cross the road rather than stay at home, etc.
Imagine there is a safety improvement to your car that costs 0.0001 cents but will save you from an event that will happen once in 1000 universe lifetimes. Would you pay for it?
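Read as an expected-value question, with every number below hypothetical just to make the comparison concrete:

    # Rough expected-value sketch of the car example; all figures are made up.
    universe_lifetime_years = 1.4e10
    p_per_year = 1 / (1000 * universe_lifetime_years)   # "once in 1000 universe lifetimes"
    years_of_driving = 50
    harm_avoided_in_cents = 1_000_000                   # assume avoiding the event is "worth" $10,000
    expected_benefit = p_per_year * years_of_driving * harm_avoided_in_cents
    print(expected_benefit)   # ~3.6e-6 cents, well under the 0.0001-cent price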
Very-Related Question: Typical homeopathic dilutions are 10^(-60). On average, this would require giving two billion doses per second to six billion people for 4 billion years to deliver a single molecule of the original material to any patient.
Could one argue that if we administer a homeopathic pill of vitamin C in the above dilution to every living person for the next 3^^^3 generations, the impact would be a humongous amount of flu-elimination?
If anyone convinces me that the answer is yes, I might accept being a Torturer. Otherwise, I assume that the negligibility of the speck, plus people's resilience, would leave no lasting effects. Disutility would vanish in milliseconds. If they wouldn't even notice or remember the specks after a while, it'd equate to zero disutility.
It's not that I can't do the maths. It's that the evil of the speck seems too diluted to do harm.
Just like homeopathy is too diluted to do good.
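An order-of-magnitude check of that dilution arithmetic (the dose size below is an assumption, roughly 1e24 solvent molecules per dose):

    # Checking the "two billion doses per second to six billion people for
    # 4 billion years" figure against a 10^(-60) dilution.
    molecules_per_dose = 1e24
    dilution = 1e-60
    seconds = 4e9 * 365.25 * 24 * 3600   # four billion years, in seconds
    total_doses = 2e9 * 6e9 * seconds    # two billion doses/second to six billion people

    print(total_doses)                                   # ~1.5e36 doses
    print(total_doses * molecules_per_dose * dilution)   # ~1.5 molecules delivered, in total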
I doubt anybody's going to read a comment this far down, but what the heck.
Perhaps going from nothing to a million dust specks isn't a million times as bad as going from nothing to one dust speck. One thing is certain though: going from nothing to a million dust specks is exactly as bad as going from nothing to one dust speck plus going from one dust speck to two dust specks etc.
If going from nothing to one dust speck isn't a millionth as bad as nothing to a million dust specks, it has to be made up somewhere else, like going from 999,999 to a million dust...
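To make that telescoping point concrete, a small sketch with an assumed concave per-person disutility d(n) = sqrt(n):

    # The total disutility is exactly the sum of the step-by-step increments,
    # so if the early steps are "almost nothing", later steps must make it up.
    import math

    def d(n):
        return math.sqrt(n)

    increments = [d(k) - d(k - 1) for k in range(1, 1_000_001)]
    print(sum(increments))   # ~1000.0
    print(d(1_000_000))      # 1000.0 -- identical by construction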
Ask yourself this to make the question easier: what would you prefer, getting 3^^^3 dust specks in your eye or being hit with a spiked whip for 50 years?
You must live long enough to feel the 3^^^3 specks in your eye, and each one lasts a fraction of a second. You can feel nothing else but that speck in your eye.
So, it boils down to this question. Would you rather be whipped for 50 years or get specks in your eye for over a googolplex of years?
If I could possible put a marker of the utility of bad that a speck of dust in the eye is, and compare that to...
In the real world the possibility of torture obviously hurts more people than just the person being tortured. By theorizing about the utility of torture you are actually subjecting possibly billions of people to periodic bouts of fear and pain.
Forgive me if this has been covered before. The internet here is flaking out and it makes it hard to search for answers.
What is the correct answer to the following scenario: Is it preferable to have one person be tortured if it gives 3^^^3 people a minuscule amount of pleasure?
The source of this question was me pondering the claim, "Pain is temporary; a good story lasts forever."
So, I'm very late into this game, and not through all the sequences (where the answer might already be given), but still, I am very interested in your positions (probably nobody answers, but who knows):
I'd gladly get a speck of dust in my eye as many times as I can, and I'm sure those 3^^^3 people would join me, to keep one guy from being tortured for 50 years.
"Would you prefer that one person be horribly tortured for fifty years without hope or rest, or that 3^^^3 people get dust specks in their eyes?
I think the answer is obvious. How about you?"
Yes, Eliezer, the answer is obvious. The answer is that this is a false dilemma, and that I should go searching for the third alternative, with neither 3^^^3 dust specks nor 50 years of torture. These are not optimal alternatives.
Construct a thought experiment in which every single one of those 3^^^3 is asked whether he would accept a dust speck in the eye to save someone from being tortured, take the answers as a vote. If the majority would deem it personally acceptable, then acceptable it is.
Interesting question. I think a similar real-world situation is when people cut in line.
Suppose there is a line of 100 people, and the line is moving at a rate of 1 person per minute.
Is it ok for a new person to cut to the front of the line, because it only costs each person 1 extra minute, or should the new person stand at the back of the line and endure a full 100 minute wait?
Of course, not everyone in line endures the same wait duration; a person near the front will have a significantly shorter wait than a person near the back. To address that issue o...
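For what it's worth, the aggregate arithmetic comes out the same either way; only who bears the wait differs. A rough sketch:

    # Total extra person-minutes of waiting under each option.
    people_in_line = 100
    extra_minutes_each = 1

    cost_if_cutting = people_in_line * extra_minutes_each   # everyone waits 1 extra minute
    cost_if_waiting = 100                                    # the newcomer alone waits 100 minutes

    print(cost_if_cutting, cost_if_waiting)   # 100 100 person-minutes in both cases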
I would prefer the dust motes, and strongly. Pain trumps inconvenience.
And yet...we accept automobiles, which kill tens of thousands of people per year, to avoid inconvenience. (That is, automobiles in the hands of regular people, not just trained professionals like ambulance drivers.) But it's hard to calculate the benefits of having a vehicle.
Reducing the national speed limit to 30mph would probably save thousands of lives. I would find it unconscionable to keep the speed limit high if everyone were immortal. At present, such a measure would trade lives for parts of lives, and it's a matter of math to say which is better...though we could easily rearrange our lives to obviate most travel.
Idea 1: dust specks, because on a linear scale (which seems to be always assumed in discussions of utility here) I think 50 years of torture is more than 3^^^3 times worse than a dust speck in one's eye.
Idea 2: dust specks, because most people arbitrarily place bad things into incomparable categories. The death of your loved one is deemed to be infinitely worse than being stuck in an airport for an hour. It is incomparable; any amount of 1 hour waits are less bad than a single loved one dying.
I think it might be interesting to reflect on the possibility that among the 3^^^3 dust speck victims there might be a smaller-but-still-vast number of people being subjected to varying lengths of "constantly-having-dust-thrown-in-their-eyes torture". Throwing one more dust speck at each of them is, up to permuting the victims, like giving a smaller-but-still-vast number of people 50 years of dust speck torture instead of leaving them alone.
(Don't know if anyone else has already made this point - I haven't read all the comments.)
These ethical questions become relevant if we're implementing a Friendly AI, and they are only of academic interest if I interpret them literally as a question about me.
If it's a question about me, I'd probably go with the dust specks. A small fraction of those people will have time to get to me, and of those, none are likely to bother me if it's just a dust speck. If I were to advocate the torture, the victim or someone who knows him might find me and try to get revenge. I just gave you a data point about the psychology of one unmodifie...
I wonder if some people's aversion to "just answering the question" as Eliezer notes in the comments many times has to do with the perceived cost of signalling agreement with the premises.
It's straightforward to me that answering should take the question at face value; it's a thought experiment, you're not being asked to commit to a course of action. And going by the question as asked the answer for any utilitarian is "torture", since even a very small increment of suffering multiplied by a large enough number of people (or an infinite...
I choose the specks. My utility function u(what happens to person 1, what happens to person 2, ..., what happens to person N) doesn't equal f_1(what happens to person 1) + f_2(what happens to person 2) + ... + f_N(what happens to person N) for any choice of f_1, ..., f_N, not even when allowing them to be different; in particular, u(each of n people gets one speck in their eye) approaches a finite limit as n approaches infinity, and this limit is less negative than u(one person gets tortured for 50 years)
I spent quite a while thinking about this one, and here is my "answer".
My first line of questioning is "can we just multiply and compare the sufferings?" Well, no. Our utility functions are complicated. We don't even fully know them. We don't exactly know what are terminal values, and what are intermediate values in them. But it's not just "maximize total happiness" (with suffering being negative happiness). My utility function also values things like fairness (it may be because I'm a primate, but still, I value it). The ...
Let me attempt to shut up and multiply.
Let's make the assumption that a single second of torture is equivalent to 1 billion dust specks to the eye. Since that many dust specks is enough to sandblast your eye, it seems a reasonable approximation.
This means that 50 years of this torture is equivalent to giving 1 single person (50 * 365.25 * 24 * 60 * 60 * 1,000,000,000) dust specks to the eye.
According to Google's calculator,
(50 * 365.25 * 24 * 60 * 60 * 1,000,000,000) / (3^39) = 0.389354356
(50 * 365.25 * 24 * 60 * 60 * 1,000,000,000) / (3^38) = 1.16806307
Ergo, If someone co...
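The same arithmetic in code, keeping the one-billion-specks-per-second assumption:

    # Checking the numbers above.
    seconds_in_50_years = 50 * 365.25 * 24 * 60 * 60       # ~1.58e9 seconds
    total_specks = seconds_in_50_years * 1_000_000_000     # ~1.58e18 speck-equivalents

    print(total_specks / 3**39)   # ~0.389 -- less than 3^39 specks
    print(total_specks / 3**38)   # ~1.168 -- more than 3^38 specks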
I tentatively like to measure human experience with logarithms and exponentials. Our hearing is logarithmic, loudness-wise, hence the unit dB. Human experiences are rarely linear, thus it is almost never true that f(x*a) = f(x)*a.
In the above hypothetical, we can imagine the dust specks and the torture. If we propose that NO dust speck ever does anything other than cause mild annoyance -- that none ever enters the eye of a driver who blinks at an inopportune time and crashes -- then I would propose we can say: awfulness(pain) = k^pain.
A dust speck causes approxi...
My utility function has non-zero terms for preferences of other people. If I asked each one of the 3^^^3 people whether they would prefer a dust speck if it would save someone a horrible fifty-year torture, they (my simulation of them) would say YES in 20*3^^^3-feet letters.
The mathematical object to use for the moral calculations needs not be homologous to real numbers.
My way of seeing it is that the speck of dust, barely noticeable, will be strictly smaller than torture no matter how many instances of the speck of dust happen. That's just how my 'moral numbers' operate. The speck of dust equals A > 0, the torture equals B > 0, and A*N < B holds for any finite N. I forbid infinities (the number of distinct beings is finite).
If you think that's necessarily irrational you have a lot of mathematics to learn. You can start with...
Choosing TORTURE is making a decision to condemn someone to fifty years of torture, while knowing that 3^^^3 people would not want you to do so, would beg you not to, would react with horror and revulsion if/when they knew you did it. And you must do it for the sake of some global principle or something. I'd say it puts one at least into Well-intentioned Extremist / KnightTemplar category, if not outright villain.
If an AI had made a choice like that, against known wishes of practically everyone, I'd say it was rather unfriendly.
ADDED: Detailed
People who choose torture, if the question was instead framed as the following would you still choose torture?
"Assuming you know your lifespan will be at least 3^^^3 days, would you choose to experience 50 years worth of torture, inflicted a day at a time at intervals spread evenly across your life span starting tomorrow, or one dust speck a day for the next 3^^^3 days of your life?"
Common sense tells me the torture is worse. Common sense is what tells me the earth is flat. Mathematics tells me the dust specks scenario is worse. I trust mathematics and will damn one person to torture.
This "moral dilemma" only has force if you accept strict Bentham-style utilitarianism, which treats all benefits and harms as vectors on a one-dimensional line, and cares about nothing except the net total of benefits and harms. That was the state of the art of moral philosophy in the year 1800, but it's 2012 now.
There are published moral philosophies which handle the speck/torture scenario without undue problems. For example if you accepted Rawls-style, risk-averse choice from a position where you are unaware whether you will be one of the speck...
The dust speck is a slight irritation. Hearing about someone being tortured is a bigger irritation. Also, pain depends greatly on concentration. Something that hurts "twice as much" is actually much worse: let's say it is a hundred times worse. Of course this levels off (it is a curve) at some point, but in this case that is not a problem, as we can say that the torture is very close to the physical maximum and the specks are very close to the physical minimum of pain. The difference between the speck and the torture is immense. Difference in time...
At first, I picked the dust specks as being the preferable answer, and it seemed obvious. What eventually turned me around was when I considered the opposite situation -- with GOOD things happening, rather than BAD things. Would I prefer that one person experience 50 years of the most happiness realistic in today's world, or that 3^^^3 people experience the least good, good thing?
I was very surprised to find that a supporter of the Complexity of Value hypothesis and the author who warns against simple utility functions advocates torture using simple pseudo-scientific utility calculus.
My utility function has constraints that prevent me from doing awful things to people, unless it would prevent equally awful things done to other people. That this is a widely shared moral intuition is demonstrated by the reaction in the comments section. Since you recognize the complexity of human value, my widely-shared preferences are presumably v...
No. One of those actions, or something different, happens if I take no action. Assuming that neither the one person nor the 3^^^3 people have consented to allow me to harm them, I must choose the course of action by which I harm nobody, and the abstract force harms people.
If you instead offer me the choice where I prevent the harm (and that the 3^^^3+1 people all consent to allow me to do so), then I choose to prevent the torture.
My maximal expected utility is one in which there is a universe in which I have taken zero additional actions without the conse...
How bad is the torture option?
Let's say a human brain can have ten thoughts per second; or the rate of human awareness is ten perceptions per second. Fifty years of torture is roughly 1.6 billion seconds, so it means about sixteen billion tortured thoughts, or perceptions of torture.
Let's say a human brain can distinguish twenty logarithmic degrees of discomfort, with the lowest being "no discomfort at all", the second-lowest being a dust speck, and the highest being torture. In other words, a single moment of torture is 2^19 = 524288 times worse than a dust speck; and a dust speck is the smallest discomfort possible. Let's call a unit of discomfort a "dol" (from Latin dolor).
In other words, the torture option means about 1.6 × 10^10 moments × 2^19 dols; whereas the dust-specks option means 3^^^3 moments × 1 dol.
The assumptions going into this argument are the speed of human thought or perception, and the scale of human discomfort or pain. These are not accurately known today, but there must exist finite limits — humans do not think or perceive infinitely fast; and the worst unpleasantness we can experience is not infinitely bad. I have assumed a log scale for discomfort because we use log scal...
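The same arithmetic in code (the ten-perceptions-per-second rate and the twenty-step doubling scale are the assumptions above):

    # Totaling the torture option in "dols" under the stated assumptions.
    seconds = 50 * 365.25 * 24 * 3600       # ~1.58e9 seconds of torture
    torture_moments = 10 * seconds          # ~1.6e10 tortured perceptions
    dols_per_moment = 2 ** 19               # 524288, top of the 20-step scale

    print(torture_moments * dols_per_moment)   # ~8.3e15 dols, versus 3^^^3 x 1 dol for the specks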
I think I have to go with the dust specks. Tomorrow, all 3^^^3 of those people will have forgotten entirely about the speck of dust. It is an event nearly indistinguishable from thermal noise. People, all of them everywhere, get dust specks in their eyes just going about their daily lives with no ill effect.
The torture actually hurts someone. And in a way that's rather non-recoverable. Recoverability plays a large part in my moral calculations.
But there's a limit to how many times I can make that trade. 3^^^3 people is a LOT of people, and it doesn't take ...
If you ask me the slightly different question, where I choose between 50 years of torture applied to one man, or between 3^^^3 specks of dust falling one each into 3^^^3 people's eyes and also all humanity being destroyed, I will give a different answer. In particular, I will abstain, because my moral calculation would then favor the torture over the destruction of the human race, but I have a built-in failure mode where I refuse to torture someone even if I somehow think it is the right thing to do.
But that is not the question I was asked. We could also have the man tortured for fifty years and then the human race gets wiped out BECAUSE the pan-galactic cataclysm favors civilizations who don't make the choice to torture people rather than face trivial inconveniences.
Consider this alternate proposal:
Hello Sir and/or Madam:
I am trying to collect 3^^^3 signatures in order to prevent a man from being tortured for 50 years. Would you be willing to accept a single speck of dust into your eye towards this goal? Perhaps more? You may sign as many times as you are comfortable with. I eagerly await your response.
Sincerely,
rkyeun
PS: Do you know any masochists who might enjoy 50 years of torture?
BCC: 3^^^3-1 other people.
If asked independently whether or not I would take a dust speck in the eye to spare a stranger 50 years of torture, I would say "sure". I suspect most people would if asked independently. It should make no difference to each of those 3^^^3 dust speck victims that there are another (3^^^3)-1 people who would also take the dust speck if asked.
It seems then that there are thresholds in human value. Human value might be better modeled by surreals than reals. In such a system we could represent the utility of 50 years of torture as -Ω and represe...
The other day, I got some dirt in my eye, and I thought "That selfish bastard, wouldn't go and get tortured and now we all have to put up with this s#@$".
I don't see that it's necessary -- or possible, for that matter -- for me to assign dust specks and torture to a single, continuous utility function. On a scale of disutility that includes such events as "being horribly tortured," the disutility of a momentary irritation such as a dust speck in the eye has a value of precisely zero -- not 0.000...0001, but just plain 0, and of course, 0 x 3^^^3 = 0.
Furthermore, I think the "minor irritations" scale on which dust specks fall might increase linearly with the time of exposure, and would c...
Incidentally, I think that if you pick "dust specks," you're asserting that you would walk away from Omelas; if you pick torture, you're asserting that you wouldn't.
Bravo, Eliezer. Anyone who says the answer to this is obvious is either WAY smarter than I am, or isn't thinking through the implications.
Suppose we want to define Utility as a function of pain/discomfort on the continuum of [dust speck, torture] and including the number of people afflicted. We can choose whatever desiderata we want (e.g. positive real valued, monotonic, commutative under addition).
But what if we choose as one desideratum, "There is no number n large enough such that Utility(n dust specks) > Utility(50 yrs torture)." What doe...
To me, this experiment shows that absolute utilitarianism does not make a good society. Conversely, a decision between, say, person A getting $100 and person B getting $1 or both of them getting $2 shows absolute egalitarianism isn't satisfactory either (assuming simple transfers are banned). Perhaps the inevitable realization...is that some balance between them is possible, such as the weighted sum (sum indicating utilitarianism) with more weight applied to those who have less (this indicating egalitarianism) can provide such a balance?
I would suggest the answer is fairly obviously that one person be horribly tortured for 50 years, on the grounds that the idea "there exists 3^^^3 people" is incomprehensible cosmic horror even before you add in the mote of dust.
I have been reading Less Wrong for about 6 months, but this is my first post. I'm not an expert but an interested amateur. I read this post about 3 weeks ago and thought it was a joke. After working through replies and following links, I get that it is a serious question with serious consequences in the world today. I don't think my comments duplicate others already in the thread, so here goes…
Let’s call this a “one blink” discomfort (it comes and goes in a blink) and let’s say that on average each person gets one every 10 minutes during their waking hours. In re...
I definitely think it is obvious what Eliezer is going for: 3^^^3 people getting dust specks in their eyes being the favorable outcome. I understand his reasoning, but I'm not sure I agree with the simple Benthamite way of calculating utility. Popular among modern philosophers is preference utilitarianism, where the preferences of the people involved are what constitute utility. Now consider that each of those 3^^^3 people has a preference that people not be tortured. Assuming that the negative utility each individual computes for someone being tortured is ...
This question reminds me of the dilemma posed to medical students. It went something like this:
If the opportunity presented itself to secretly, with no chance of being caught, 'accidentally' kill a healthy patient who is seen as wasting their life (smoking, drinking, not exercising, lack of goals etc.) in order to harvest his/her organs to save 5 other patients, should you go ahead with it?
From a utilitarian perspective, it makes perfect sense to commit the murder. The person who introduced me to the dilemma also presented the rationale for saying '...
I used to think that the dust specks was the obvious answer. Then I realized that I was adding follow-on utility to torture (inability to do much else due to the pain) but not the dust specks (car crashes etc due to the distraction). It was also about then that I changed from two-boxing to one-boxing, and started thinking that wireheading wasn't so bad after all. Are opinions to these three usually correlated like this?
I would suggest that torture has greater and greater disutility the larger the size of the society. So given a specific society of a specific size, the dust specks can never add up to more suffering than the torture; the greater the number of dust specks possible, the greater the disutility of the torture, and the torture will always add up to worse.
If you're comparing societies of different size, it may be that the society with the dust specks has as much disutility as the society with the torture, but this is no longer a choice between dust specks and to...
There are many ways of approaching this question, and one that I think is valuable and which I can't find any mention of on this page of comments is the desirist approach.
Desirism is an ethical theory also sometimes called desire utilitarianism. The desirist approach has many details for which you can Google, but in general it is a form of consequentialism in which the relevant consequences are desire-satisfaction and desire-thwarting.
Fifty years of torture satisfies none and thwarts virtually all desires, especially the most intense desires, for fifty ...
Forgive me for posting on such an old topic, but I've spent the better part of the last few days thinking about this and had to get my thoughts together somewhere. But after some consideration, I must say that I side with the "speckers" as it were.
Let us do away with "specks of dust" and "torture" notions in an attempt to avoid arguing the relative value one might place on either event (i.e. - "rounding to 0/infinity"), and instead focus on the real issue. Replace torture with "event A" as the single most h...
3^^^3 people? ...
I can see what point you were trying to make....I think.
But I happen to have a significant distrust of classic utilitarianism: if you sum up the happiness of a society with a finite chance of lasting forever, and subtract the sum of all the pain... you get infinity minus infinity, which is at best conditionally convergent. The simplest patch is to insert a very, very, VERY tiny factor, reducing the weight of future societal happiness in your computation... Any attempt to translate to so many people ...places my intuition in charge of setting the summa...
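To make that "tiny factor" patch concrete, here is a minimal sketch assuming a geometric discount; the discount value and the per-period numbers are invented for illustration:

# Sketch of the "tiny factor" patch: weight utility at time t by gamma**t.
# With any gamma < 1 the discounted total converges even over an infinite future,
# so "infinity minus infinity" never arises. All values are invented.

gamma = 1 - 1e-9                 # the very, very, VERY tiny discount per period (assumed)
happiness_per_period = 1.0       # assumed
pain_per_period = 0.3            # assumed

# Closed form of the sum over t >= 0 of gamma**t * (happiness - pain) = (h - p) / (1 - gamma)
discounted_net_utility = (happiness_per_period - pain_per_period) / (1 - gamma)
print(discounted_net_utility)    # finite (though enormous) instead of undefined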
It seems to me that preference utilitarianism neatly reconciles the general intuitive view against torture with a mathematical utilitarian position. If a proportion p of those 3^^^3 people have a moral compunction against people being tortured, and the remainder are indifferent to torture but have a very slight preference against dust specks, then as long as p is not very small, the overall preference would be for dust specks (and if p was very small, then the moral intuitions of humanity in general have completely changed and we shouldn't be in a position to make any decisions anyway). Is there something I'm missing?
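A minimal numerical sketch of that argument, working per capita so the common factor of 3^^^3 people cancels out; the proportion and both weights are invented for illustration:

# Per-capita version of the comparison above. All values are invented.

p = 0.5                       # assumed fraction with a preference against anyone being tortured
anti_torture_weight = 1e-3    # assumed per-person disutility of the torture happening at all
speck_weight = 1e-6           # assumed per-person disutility of one dust speck

torture_disutility_per_capita = p * anti_torture_weight
speck_disutility_per_capita = speck_weight

# As long as p * anti_torture_weight exceeds speck_weight, the specks are preferred,
# no matter how large the shared population factor is.
assert torture_disutility_per_capita > speck_disutility_per_capita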
"...Some may think these trifling matters not worth minding or relating; but when they consider that tho' dust blown into the eyes of a single person, or into a single shop on a windy day, is but of small importance, yet the great number of the instances in a populous city, and its frequent repetitions give it weight and consequence, perhaps they will not censure very severely those who bestow some attention to affairs of this seemingly low nature. Human felicity is produc'd not so much by great pieces of good fortune that seldom happen, as by little advantages that occur every day."
--Benjamin Franklin
Would you prefer that one person be horribly tortured for fifty years without hope or rest, or that 3^^^3 people get dust specks in their eyes?
I would prefer that 3^^^3 people get dust specks in their eyes, because that means that we either figured out how to escape the death of our universe, or expanded past our observable universe. [/cheating]
I have mixed feelings on this question. On the one hand, I agree that scope insensitivity should be avoided, and utility should count linearly over organisms. But at the same time, I'm not really sure the dust specks are even ... bad. If I could press a button to eliminate dust specks from the world, then (ignoring instrumental considerations, which would obviously dominate) I'm not sure whether I would bother.
Maybe I'm not imagining the dust specks as being painful, whereas Eliezer had in mind more of a splinter that is slightly painful. Or we can imagine...
Would it change anything if the subjects were extremely cute puppies?
Would it change anything if the subjects were extremely cute puppies with eyes so wide and innocent that even the hardest lumberjack would swoon?
If the dust specks could cause deaths I would refuse to choose either. If I somehow still did, I would pick the dust specks anyway, because I know that I myself would rather die in an accident caused by a dust particle than be tortured for even ten years.
"The Lord Pilot shouted, fist held high and triumphant: "To live, and occasionally be unhappy!"" (three worlds collide) dust specks are just dust specks - in a way its helpful to sometimes have these things.
But does the thing change if you don't distribute the dust specks one per person, but ten per second per person?
I think I've seen some other comments bring it up, but I'll say it again. I think people who go for the torture are working off a model of linear discomfort addition, in which case the badness of the torture would have to be as bad as 3^^^3 dust particles in the eye to justify taking the dust. However, I'd argue that it's not linear. Two specks of dust are worse than twice as bad as one speck. 3^^^3 people getting specks in their eyes is unimaginably less bad than one person getting 3^^^3 specks (a ridiculous image, considering that's throwing universes into a dude's eye). So the speck very well may be less than 1/(3^^^3) as bad as torture (a rough sketch of this non-linearity follows below).
Even so, I doubt it. So a purely utilitarian calculation probably does suggest torturing the one guy.
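A rough sketch of that non-linear idea; the quadratic harm curve and the stand-in number are made up purely for illustration:

# Illustrative sketch of non-linear aggregation: let the harm of k specks to a
# single person grow faster than linearly, say like k**2 (the exponent is made up).

def harm(specks_for_one_person):
    return specks_for_one_person ** 2

N = 10**6                         # stand-in for a very large number of people/specks
spread_out = N * harm(1)          # N people, one speck each  -> N units of harm
concentrated = harm(N)            # one person, N specks      -> N**2 units of harm

assert concentrated > spread_out  # concentrating the specks is vastly worse under this curve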
I think the problem here is the way the utility function is chosen. Utilitarianism is essentially a formalization of the reward signals in our heads. It is a heuristic way of quantifying what we expect a healthy human (one that can grow up and survive in a typical human environment and has an accurate model of reality) to want. All of this only converges roughly to a common utility because we have evolved to have the same needs, which are necessarily pro-life and pro-social (since otherwise our species wouldn't be present today).
Utilitarianism crudely abstract...
I think the reason people are hesitant to choose the dust speck option is that they view the number 3^^^3 as being insurmountable. It's a combo chain that unleashes a seemingly infinite number of points in the "Bad events I have personally caused" category on their scoreboard. And I get that. If the torture option is a thousand bad points, and the dust speck is 1/1000th of a point for each person, then the math clearly states that torture is the better option (see the rough sketch below).
But the thing is that you unleash that combo chain every day.
Every time you burn a piece...
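The scoreboard arithmetic above, spelled out with the point values the comment assumes; 3^^^3 itself is far too large to compute, so a deliberately tiny stand-in lower bound is used:

# Point values as assumed in the comment; the lower bound is a stand-in.

torture_points = 1000            # assumed badness of the torture
points_per_speck = 1 / 1000      # assumed badness of one speck

lower_bound_on_3_up_up_up_3 = 10 ** 100   # vastly smaller than 3^^^3, yet already enough
assert lower_bound_on_3_up_up_up_3 * points_per_speck > torture_points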
To me it is immediately obvious that torture is preferable. Judging by the comments, I'm in the minority.
I think the best counterargument I came across against this line of reasoning turns on the fact that there might not be 3^^^3 moral beings in mindspace.
There might not be 3^^^3 moral beings in mindspace, and instantiating someone more than once might not create additional value. So there's probably something here to consider. I still would choose torture with my current model of the world, but I'm still confused about that point.
Fun Fact: the vast majority of those 3^^^3 people would have to be duplicates of each other because that many unique people could not possibly exist.
The answer is obvious once you do the math.
I think most people read the statement above as if it says either "torture one person a lot" or "torture a lot of people very little". That is not what it says at all, because 3^^^3 (an exponential tower of 3s some 7,625,597,484,987 layers tall) is closer to the idea of infinity than to the idea of "a lot".
If you were to divide up those 3^^^3 dust particles and send them through the eyes of anything with eyes since the dawn of time, it would be no minor irritant. You wouldn't be just blinding everything ever. Nor is it just like sandblasting...
I've taught my philosophy students that "obvious" is a red flag in rational discourse.
It often functions as "I am not giving a logical or empirical argument here, and am trying to convince you that none is needed" (really, why?) and "If you disagree with me, you should maybe be concerned about being stupid or ignorant for not seeing something obvious; a disagreement with my unfounded claim needs careful reasoning and arguments on your part, so it may be better to be quiet, lest you be laughed at." It so often functions as a trick to get people to overlook an...
I drop the number into a numbers-to-words converter and get "seven trillion six hundred twenty-five billion five hundred ninety-seven million four hundred eighty-four thousand nine hundred eighty-seven". (I don't do it by hand, because a script that somebody tested is likely to make fewer errors than me). Google says there are roughly 7 billion people on earth at the moment. Does that mean that each person gets roughly 1089 dust specks, or that everyone who's born gets one dust speck until the 7 trillion and change speck quota has been met? I ask because i...
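For scale, a quick sketch (purely illustrative): 7,625,597,484,987 is 3^^3, the height of the tower, not 3^^^3 itself, which no converter could handle.

# 7,625,597,484,987 is 3^^3 = 3**(3**3) -- the *height* of the tower -- not 3^^^3
# itself, which is a tower of 3s that many layers tall and cannot be written out.

three_up_up_3 = 3 ** (3 ** 3)
print(three_up_up_3)             # 7625597484987
# Even the next layer, 3 ** 7625597484987, already has several trillion digits,
# so dividing the full 3^^^3 speck count among 7 billion people gives each person
# unimaginably more than ~1089 specks.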
I'm fairly certain Eliezer ended with "the choice is obvious" to spark discussion, and not because it's actually obvious, but let me go ahead and justify that: this is not an obvious choice, even though there is a clear, correct answer (torture).
There are a few very natural intuitions that we have to analyze and dispel in order to get off the dust specks.
The first such intuition is that a single dust speck is literally zero harm. If that's the case, 3^^^3 * 0 = 0, and the torture is worse. The issue with this is twofold.
First, why does it have to be torture on the...
Gee, tough choice. We either spread the suffering around so that it’s not too intense for anyone or we scapegoat a single unlucky person into oblivion. I think you’d have to be a psychopath to torture someone just because “numbers go BRRR.”
The answer is obvious, and it is SPECKS.
I would not pay one cent to stop 3^^^3 individuals from getting a dust speck in their eyes.
Both answers assume this is an all-else-equal question. That is, we're comparing two kinds of pain against one another. (If we're trying to figure out what the consequences would be if the experiment happened in real life - for instance, how many people will get a dust speck in their eye while driving a car - the answer is obviously different.)
I'm not sure what my ultimate reason is for picking SPECKS. I don't believe there are any ethical theo...
Strongly disagree.
Utilitarianism did not fall from a well of truth, nor was it derived from perfect rationality.
It is an attempt by humans, fallible humans, to clarify and spell out pre-existing, grounding ethical belief, and then turn this clarification into very simple arithmetic. All this arithmetic rests on the attempt to codify the actual ethics, and then see whether we got them right. Ideally, we would end up in a scenario that reproduces our ethical intuitions, but more precisely and quickly, where you look at the result and go “yes, tha...
Well, 3^^^3 dust specks in people's eyes imply that order of magnitude of people existing, which is an... interesting world, and sounds like good news on its own. While 3^^^3 dust specks in the same person's eyes imply that they and the whole Earth get obliterated by a relativistic sand jet that promptly collapses into a black hole, so yeah.
But way-too-literal interpretations aside, I would say this argument is why I don't think total sum utilitarianism is any good. I'd rather pick a version like "imagine you're born as a random sentient in this universe, would...
First, I wanted to suggest a revision to 3^^^3 people getting dust in their eyes: everyone alive today and everyone who is ever born henceforth, as well as all their pets, will get the speck. That just makes it easier to conceive of.
In any case, I would choose the speck simply on behalf of the rando who would otherwise get torture. I'd want to let everyone know that I had to choose and so we all get a remarkably minor annoyance in order to avoid "one person" (assuming no one can know who it will be) getting tortured. This would happen only if there were a strong motivation to stop. The best option is not presented: collect more information.
This thought experiment is unrealistic; there is not, and never will be, a population of 3^^^3 homogeneous agents to consider. In common realistic variants the considerations end up dominated by questions like "who is being tortured?" and "what do the dust specks interfere with?".
"What's the worst that can happen?" goes the optimistic saying. It's probably a bad question to ask anyone with a creative imagination. Let's consider the problem on an individual level: it's not really the worst that can happen, but would nonetheless be fairly bad, if you were horribly tortured for a number of years. This is one of the worse things that can realistically happen to one person in today's world.
What's the least bad, bad thing that can happen? Well, suppose a dust speck floated into your eye and irritated it just a little, for a fraction of a second, barely enough to make you notice before you blink and wipe away the dust speck.
For our next ingredient, we need a large number. Let's use 3^^^3, written in Knuth's up-arrow notation:
3^^^3 is an exponential tower of 3s which is 7,625,597,484,987 layers tall. You start with 1; raise 3 to the power of 1 to get 3; raise 3 to the power of 3 to get 27; raise 3 to the power of 27 to get 7625597484987; raise 3 to the power of 7625597484987 to get a number much larger than the number of atoms in the universe, but which could still be written down in base 10, on 100 square kilometers of paper; then raise 3 to that power; and continue until you've exponentiated 7625597484987 times. That's 3^^^3. It's the smallest simple inconceivably huge number I know.
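For the notation-minded, a small recursive sketch of up-arrow notation; it can evaluate 3^3 and 3^^3, but 3^^^3 is of course far beyond any possible computation:

# Recursive definition of Knuth's up-arrow notation (illustrative sketch only).
# up(a, n, b) stands for a followed by n up-arrows followed by b.

def up(a, n, b):
    if n == 1:
        return a ** b                        # one arrow is ordinary exponentiation
    if b == 0:
        return 1
    return up(a, n - 1, up(a, n, b - 1))     # peel off one arrow and recurse

print(up(3, 1, 3))   # 3^3  = 27
print(up(3, 2, 3))   # 3^^3 = 7625597484987
# up(3, 3, 3) would be 3^^^3, a tower of 7,625,597,484,987 threes -- do not try it.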
Now here's the moral dilemma. If neither event is going to happen to you personally, but you still had to choose one or the other:
Would you prefer that one person be horribly tortured for fifty years without hope or rest, or that 3^^^3 people get dust specks in their eyes?
I think the answer is obvious. How about you?