dlthomas comments on [SEQ RERUN] Torture vs. Dust Specks - Less Wrong
If the dust speck has a slight tendency to be bad, 3^^^3 wins.
If it does not have a slight tendency to be bad, it is not "the least bad bad thing that can happen to someone" - pick something worse for the thought experiment.
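To make the arithmetic concrete: the claim is just that any strictly positive expected disutility per speck, multiplied across 3^^^3 people, swamps any fixed disutility for the torture. Here is a minimal sketch in Python, with made-up magnitudes for the speck and the torture (3^^^3 itself is a power tower of roughly 7.6 trillion threes and cannot be represented directly, so the comparison uses a stand-in that is already vastly smaller):

```python
from fractions import Fraction

def up_arrow(b: int, n: int, k: int) -> int:
    """Knuth's up-arrow: b (n arrows) k. Feasible only for tiny
    arguments; 3^^^3 = 3^^(3^^3) is a tower of ~7.6 trillion
    threes and cannot be written out, let alone stored."""
    if n == 1:
        return b ** k
    result = b
    for _ in range(k - 1):
        result = up_arrow(b, n - 1, result)
    return result

print(up_arrow(3, 2, 3))  # 3^^3 = 3^27 = 7625597484987

# Made-up magnitudes for illustration: a speck costs 10^-100
# utilons in expectation, fifty years of torture costs 10^15.
speck = Fraction(1, 10**100)
torture = Fraction(10**15)

# People needed for specks to outweigh the torture: a mere 10^115,
# already dwarfed by 3^300, never mind 3^^3 or 3^^^3.
people_needed = torture / speck
print(people_needed <= 3**300)  # True
```

However you set the two made-up numbers, as long as the speck's expected disutility is strictly positive, 3^^^3 buries the ratio.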
Only if you agree to follow EY in consolidating the many different utilities of every possible outcome into one all-encompassing number - something of which I have yet to be convinced, but that is beside the point, I suppose.
Sure, if you pick something with a guaranteed negative utility and you think that there should be one number to bind them all, I grant your point.
However, this is not how the problem appears to me. A single speck in the eye carries such an insignificant utility that there is no way to estimate its effect without knowing a lot more about the problem.
Basically, I am uncomfortable with the following somewhat implicit assumptions, all of which are required to pick torture over nuisance:
- that all the different utilities of an outcome can be collapsed into a single number;
- that a dust speck has a guaranteed negative utility for everyone it strikes;
- that these utilities add across people, diverging without bound as the number of people grows.
A breakdown in any of these assumptions would mean needless torture of a human being, and I do not have enough confidence in EY's theoretical work to stake my decision on it.
If you have a preference for some outcomes versus other outcomes, you are effectively assigning a single number to those outcomes. The method of combining these is certainly a viable topic for dispute - I raised that point myself quite recently.
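The standard formal backing for that first sentence - my gloss, not anything cited in the thread - is the von Neumann-Morgenstern theorem: any preference ordering over lotteries that is complete, transitive, continuous, and satisfies the independence axiom is represented by a real-valued utility function, unique up to positive affine transformation:

$$ A \preceq B \iff u(A) \le u(B), \qquad u\!\left(\sum_i p_i O_i\right) = \sum_i p_i\, u(O_i). $$

Rejecting the single number therefore means rejecting at least one of those four axioms, which would be the more productive place to push.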
It was quite explicitly made a part of the original formulation of the problem.
Considering the assumptions you are unwilling to make:
As I've been saying, there quite clearly seem to be things that fall in the realm of both "I am confident this is typically a bad thing" and "it runs counter to my intuition that I would prefer torture to this, regardless of how many people it applies to".
I addressed this at the top of this post.
I think it's clear that there must be some means of combining individual preferences into moral judgments, if there is a morality at all. I am not certain that it can be done with the utility numbers alone. I am reasonably certain that the combination is monotonic - I cannot conceive of a situation where we would prefer some people to be less happy just for the sake of their being less happy. What is needed here is more than monotonicity, however: the aggregate must diverge as a fixed individual utility is spread across arbitrarily many people. I raise this point here, and at this point I think it is the closest thing to a reasonable attack on Eliezer's argument.
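To pin down the distinction with a toy formalization (my own, hedged accordingly): write $W(\varepsilon, n)$ for the aggregate disutility of $n$ people each suffering disutility $\varepsilon$. Monotonicity only requires $W$ to be nondecreasing in each argument; the torture argument additionally needs

$$ \lim_{n \to \infty} W(\varepsilon, n) = \infty \quad \text{for every fixed } \varepsilon > 0, $$

which additive aggregation, $W(\varepsilon, n) = n\varepsilon$, satisfies. But a monotonic yet bounded aggregator such as

$$ W(\varepsilon, n) = \sum_{k=1}^{n} \frac{\varepsilon}{2^{k}} < \varepsilon $$

blocks the conclusion: under it, no number of specks ever aggregates past a single person's speck, so torture loses for every $n$. Additivity, not mere monotonicity, is what forces divergence.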
On balance, I think Eliezer is likely to be correct; my doubts are not strong enough that I would stake some percentage of 3^^^3 utilons on them, and I would presently pick torture if I were truly confronted with this situation and didn't have more time to discuss, debate, and analyze. Given that there is insufficient matter in the universe to make 3^^^3 dust specks, much less the eyes for them to fly into, I am supremely confident that I won't be confronted with this choice any time soon.