solipsist comments on What are your contrarian views? - Less Wrong Discussion
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Define?
Meaning, in this scenario, I prefer 3^^^3 specks to 50 years of torture for one person.
I think that my objection is that the analysis sneaks in an ontological assumption: sensory experiences are comparable across a huge range. I'm not very sure that's true.
What does it mean for something to be incomparable? You can't just not decide.
Sensory experiences that reliably change utility functions are hard to reason about.
I'm not sure what you mean. Are you saying that since torture will destroy someone's mind, it's vastly worse than a dust speck, and exactly how much worse is nigh impossible to tell?
It can't be that hard to tell. Maybe you're not sure whether or not it's in the range of ten thousand dust specks to a quintillion dust specks, but it seems absurd to be so confused about it that you don't even know if it's worse than 3^^^3 dust specks.
What's your reasoning? I expect serious attempts at an answer to have to cope with questions such as —
Even if questions such as these can't be given precise answers, it should be possible to give some sort of bounds for them, and it's possible that those bounds are narrow enough to make the answer obvious.
You want a scientific scale for measuring pain? Take your pick.
Not only is there no universally standardized measure of pain; more to the point, the reason I'm pro-specks is that I don't believe pain distributed over separate brains is summable. It does not scale.
Elsewhere EY argued that a billion skydivers, each increasing the atmosphere's temperature by a thousandth of a degree, would individually not care about the effect, but collectively kill us all. The reason why the analogy doesn't apply is that all the skydivers are in the same atmosphere, whereas the specks are not hurting the same consciousness. Unless the pain is communicable (via hive mind or what have you), it will still be roundable to zero. You could have as many specks as you like, each of them causing the mildest itch in one eye, and it would still not surpass the negative utility from torturing one person.
Edited to add: I still don't have a clear idea of how infinitely many specks would change the comparison, but infinities don't tend to occur in real life.
If the problem was solvable that easily, it wouldn't be a problem.
Just slightly change the definition of "speck" (or reinterpret the intent of the original definition): let a speck be "an amount of pain just slightly above the threshold where the pain no longer rounds down to zero for an individual". Now would you prefer specks for a huge number of people to torture for one person?
I'm already taking "speck" to have that meaning. Even raising the threshold (say, 3^^^3 people stubbing their toe against the sidewalk with no further consequences), my preference stands.
If you're already taking "speck" to have that meaning, then your statement "Unless the pain is communicable (via hive mind or what have you), it will still be roundable to zero." would no longer be true.
Granted. Let's take an example of pain that would be decidedly not roundable to zero. Say, 3^^^3 paper cuts, with no further consequences. Still preferable to torture.
Presumably, you still think that large amounts of pain can be added up.
In that case, that must have a threshold too; something that causes a certain amount of pain cannot be added up, while something that causes a very very slightly greater amount of pain can add up. That implies that you would prefer 3^^^3 people having pain at level 1 to one person having pain of level 1.00001, as long as 1 is not over the threshold for adding up but 1.00001 is. Are you willing to accept that conclusion?
(Incidentally, for a real world version, replace "torture" with "driving somewhere and accidentally running someone over with your car" and "specks" with "3^^^3 incidences of not being able to do something because you refuse to drive". Do you still prefer specks to torture?)
As I stated before, doctors can't agree on how to quantify pain, and I'm not going to attempt it either. This does not prevent us from comparing lesser and greater pains, but there are no discrete "pain units" any more than there are utilons.
I would choose the certain risk of one traffic victim over 3^^^3 people unable to commute. But this example has a lot more ramifications than 3^^^3 specks. The lack of further consequences (and of aggregation capability) is what makes the specks preferable despite their magnitude. A more accurate comparison would be choosing between one traffic victim and 3^^^3 drivers annoyed by a paint scratch.
(What I'm about to say is I think the same as Jiro has been saying, but I have the impression that you aren't quite responding to what I think Jiro has been saying. So either you're misunderstanding Jiro, in which case another version of the argument might help, or I'm misunderstanding Jiro, in which case I'd be interested in your response to my comments as well as his/hers :-).)
It seems to me pretty obvious that one can construct a scale that goes something like this:
with, say, at most a million steps on the scale from the stubbed toe to 50 years' torture, and with the property that any reasonable person would prefer N people suffering problem n+1 to (let's say) (1000N)^2 people suffering problem n. So, e.g. if I have to choose between a million people getting 13 months' torture and a million million million people getting 12 months' torture, I pick the former.
(Why not just say "would prefer 1 person suffering problem n+1 to 1000000 people suffering problem n"? Because you might take the view that large aggregates of people matter sublinearly, so that 10^12 stubbed toes aren't as much worse than 10^6 stubbed toes as 10^6 stubbed toes are than 1. The particular choice of scaling in the previous paragraph is rather arbitrary.)
If so, then we can construct a chain: 1 person getting 50 years' torture is less bad than 10^6 people getting 49 years, which is less bad than 10^18 people getting 48 years, which is less bad than [... a million steps here ...] which is less bad than [some gigantic number] getting stubbed toes. That final gigantic number is a lot less than 3^^^3; if you replace (1000N)^2 with some faster-growing function of N then it might get bigger, but in any case it's finite.
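The growth of that chain can be checked directly. A minimal sketch, assuming (as above) a million-step scale, a starting population of one person at the 50-years-of-torture end, and the (1000N)^2 multiplier per step:

```python
# Chain argument sketch, under stated assumptions: a scale with
# 1,000,000 steps, starting from N = 1 person at the worst end,
# with the head-count rule N -> (1000*N)**2 at each step down.

STEPS = 10**6

# If N' = (1000*N)**2 then log10(N') = 2*log10(N) + 6, so starting
# from log10(N_0) = 0 the closed form is log10(N_k) = 6*(2**k - 1).
# Check the closed form against the recurrence for the first steps
# quoted above (10**6 people, then 10**18 people, ...):
N = 1
for k in range(1, 6):
    N = (1000 * N) ** 2
    assert N == 10 ** (6 * (2**k - 1))

# The final head-count is astronomically large...
log10_final = 6 * (2**STEPS - 1)   # exact big integer
print(len(str(log10_final)))       # 301031 -> N itself has ~10**301030 digits

# ...yet still tiny next to 3^^^3. Already 3^^5 = 3**(3**(3**27)) has
# roughly 10**(3.6e12) digits, dwarfing our ~10**301030 digits, and
# 3^^^3 is a power tower of 3s some 7.6 trillion levels high.
```

So the final gigantic number, while beyond any physical quantity, is still finite and falls far short of 3^^^3, exactly as the argument requires.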
If you want to maintain that TORTURE is worse than SPECKS in view of this sort of argument, I think you need to do one of the following:
Incidentally, for my part I am uncertain about TORTURE versus SPECKS on two grounds.

(1) I do think it's possible that for really gigantic numbers of people badness stops depending on numbers, or starts depending only really really really weakly on numbers, so weakly that you need a lot more arrows to make a number large enough to compensate -- precisely on the grounds that when the exact same life is duplicated many times its (dis)value might be a slowly growing function of the number of duplicates.

(2) The question falls far outside the range of questions on which my moral intuitions are (so to speak) trained. I've never seriously encountered any case like it (with the outlandishly large numbers that are required to make it work), nor have any of my ancestors whose reproductive success indirectly shaped my brain. And, while it would indeed be nice to have a consistent and complete system of ethics that gives a definite answer in every case and never contradicts itself, in practice I bet I don't have one. In cases like this I think it's reasonable to mistrust both whatever answers emerge directly from my intuitions (SPECKS is better!) and the answers I get from far-out-of-context extrapolations of other intuitions (TORTURE is better!).
[EDITED immediately after posting, to fix a formatting screwup.]
(Small nitpick: the pain from "a multiply-fractured leg" may bother you longer than "an hour of expertly applied torture", but the general idea behind the scale is clear.)
In this case I'd choose as you do, just as in Jiro's example:
The problem with these scenarios, however, is that they introduce a new factor: they compare magnitudes of pain that are too close to each other. This applies not only to the amount of pain, but also to the number of people:
I'd rather be tortured for 12 than 13 months if those were my only options, but after having had both experiences I would barely be able to tell the difference. If you want to pose this problem to someone with enough presence of mind to tell the difference, you're no longer torturing humans.
(If psychological damage is cumulative, one month may or may not make the difference between PTSD and total lunacy. Of course, if at the end of the 12 months I'm informed that I still have one more month to go, then I will definitely care about the difference. But let's assume a normal, continuous torture scenario, where I wouldn't be able to keep track of time.)
This is why,
runs into a Sorites problem that is more complex than EY's blunt solution of nipping it in the bud.
In another thread (can't locate it now), someone argued that moral considerations about the use of handguns were transparently applicable to the moral debate on nuclear weapons, and I didn't know how to present the (to me) super-obvious case that nuclear weapons are on another moral plane entirely.
You could say my objection to your 50 Shades of Pain has to do with continuity and with the meaningfulness of a scale over very large numbers. Such a quantitative scale would necessarily include several qualitative transitions, and the absurd results of ignoring them are what happens when you try to translate a subjective, essentially incommunicable experience into a neat progression of numbers.
(You could remove that obstacle by asking self-aware robots to solve this thought experiment, and they would be able to give you a precise answer about which pain is numerically worse, but in that case the debate wouldn't be relevant to us anymore.)
This entire thought experiment rests on two underlying assumptions: a moral theory under which one cannot choose between 2 persons being tortured for 25 years and 1 person being tortured for 50 years, which is regrettable, and a decision theory under which small questions can quickly escalate to blackmail and torture, which is appalling.