For those not familiar with the topic, Torture vs. Dustspecks asks the question: "Would you prefer that one person be horribly tortured for fifty years without hope or rest, or that 3^^^3 people get dust specks in their eyes?"
Most of the discussion I have seen on the topic adopts one of two assumptions in deriving its answer. The first I think of as the 'linear additive' answer: torture is the proper choice for the utilitarian consequentialist, because a single person can only suffer so much over a fifty-year window, compared to the incomprehensible number of individuals who each suffer only minutely. The second I think of as the 'logarithmically additive' answer, which inverts the conclusion on the grounds that forms of suffering are not equal and cannot be added as simple 'units'.
What I have never yet seen is something akin to the notion expressed in Ursula K. Le Guin's The Ones Who Walk Away From Omelas. If you haven't read it, I won't spoil it for you.
I believe that any metric of consequence which takes into account only suffering when choosing between "torture" and "dust specks" misses the point. There are consequences to such a choice that extend beyond the suffering inflicted: moral responsibility, the standards of behavior that either choice makes acceptable, and so on. A solution to the question which ignores these elements might be useful in revealing one's views about the nature of cumulative suffering, but beyond that it is of no value in making practical decisions -- it cannot be, because 'consequence' extends beyond the mere instantiation of a given choice (the exact pain inflicted by either scenario) into the kind of society that such a choice would produce.
While I myself tend more towards the 'logarithmic' than the 'linear' additive view of suffering, even if I stipulate the linear additive view, I still cannot agree with the conclusion of torture over the dust specks, for the same reason I do not condone torture even in the "ticking time bomb" scenario: I cannot accept the culture/society that would permit such torture to exist. To arbitrarily select one individual for maximal suffering in order to spare others a negligible amount would require a legal or moral framework that accepted such choices, and this violates the principle of individual self-determination -- a principle I have seen Less Wrong's community spend a great deal of time trying to incorporate into Friendliness solutions for AGI. We as a society already implement something similar to this economically: we accept taxing everyone, even according to a graduated scheme. What we do not accept is enslaving 20% of the population to provide for the needs of the State.
If there is a flaw in my reasoning here, please enlighten me.
I don't understand what you mean by "practically infinitesimal". Are you saying the negative utility incurred by a dust speck is zero? Also, what do you mean by "nearly... infinite"? A quantity is either finite or infinite.
You've completely lost me. If X is the negative utility of N dust specks, and Y the negative utility of fifty years of torture, then the first sentence implies that X > Y. Then the second sentence defines a second kind of negative utility, Z, due to other consequences. It goes on to imply that X + Z < Y. All quantities involved are positive (i.e., the units involved are antiutilons), so there's a contradiction somewhere, unless I've misread something.
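The contradiction can be made explicit with a toy numeric check. The values below are arbitrary placeholders consistent with the stated inequalities, not real utility estimates: once X > Y and Z > 0 are granted, X + Z < Y cannot also hold.

```python
# Toy check of the inequality argument. X, Y, Z are placeholder
# antiutilon values (all positive), not actual utility estimates.
X = 100.0  # negative utility of N dust specks
Y = 90.0   # negative utility of fifty years of torture
Z = 5.0    # negative utility of the other consequences

assert X > Y and Z > 0
# Adding a positive Z to the larger side cannot flip the inequality,
# so "X + Z < Y" contradicts "X > Y":
assert X + Z > Y
print("X + Z > Y holds whenever X > Y and Z > 0")
```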
Nearly zero. That's part of the hypothesis: that it be the smallest possible unit of suffering. If the logarithmic scale of quantification for forms of suffering holds true, then forms of suffering at the maximal end of the scale would be practically infinite by comparison.
Correct, but a number can be unimaginably large without being infinite; it is merely very large. 3^^^3, for example.