dlthomas comments on [SEQ RERUN] Torture vs. Dust Specks - Less Wrong

Post author: MinibearRex 11 October 2011 03:58AM (4 points)




Comment author: dlthomas 11 October 2011 10:26:38PM 4 points

If the dust speck has a slight tendency to be bad, 3^^^3 wins.

If it does not have a slight tendency to be bad, it is not "the least bad bad thing that can happen to someone" - pick something worse for the thought experiment.
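The arithmetic behind "3^^^3 wins" can be sketched numerically. The particular disutility magnitudes below are placeholder assumptions of mine, not anything from the thread; the point is only that any fixed nonzero speck disutility, multiplied by a number anywhere near 3^^^3's league, swamps any fixed torture disutility:

```python
import math

def up_arrow(a, n, b):
    """Knuth's up-arrow a {n arrows} b - safe only for tiny inputs."""
    if n == 1:
        return a ** b
    result = a
    for _ in range(b - 1):
        result = up_arrow(a, n - 1, result)
    return result

# 3^^3 = 3^(3^3) = 3^27 is already about 7.6 trillion.
three_ua_3 = up_arrow(3, 2, 3)
print(three_ua_3)  # 7625597484987

# Placeholder disutility magnitudes (my assumptions, log10 scale):
log10_speck = -12.0    # one dust speck: 1e-12 units of disutility
log10_torture = 9.0    # fifty years of torture: 1e9 units

# 3^^^3 itself is a power tower of 3s some 7.6 trillion levels high and
# cannot be computed directly, but even 3^^4 = 3^(3^^3), vastly smaller,
# has log10(3^^4) = 3^^3 * log10(3), about 3.6e12. Under straight
# summation, the specks beat the torture by trillions of orders of
# magnitude.
log10_specks_total = three_ua_3 * math.log10(3) + log10_speck
print(log10_specks_total > log10_torture)  # True
```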

Comment author: shminux 11 October 2011 11:31:02PM 0 points

> If the dust speck has a slight tendency to be bad, 3^^^3 wins.

Only if you agree to follow EY in consolidating many different utilities in every possible case into one all-encompassing number, something I have yet to be convinced of, but that is beside the point, I suppose.

> If it does not have a slight tendency to be bad, it is not "the least bad bad thing that can happen to someone" - pick something worse for the thought experiment.

Sure, if you pick something with a guaranteed negative utility and you think that there should be one number to bind them all, I grant your point.

However, this is not how the problem appears to me. A single speck in the eye has such an insignificant utility that there is no way to estimate its effects without knowing a lot more about the problem.

Basically, I am uncomfortable with the following somewhat implicit assumptions, all of which are required to pick torture over nuisance:

  • a tiny utility can be reasonably well estimated, even up to a sign
  • zillions of those utilities can be combined into one single number using a monotonic function
  • these utilities do not interact in any way that would make their combination change sign
  • the resulting number is invariably useful for decision making

A breakdown in any of these assumptions would mean needless torture of a human being, and I do not have enough confidence in EY's theoretical work to stake my decision on it.

Comment author: dlthomas 11 October 2011 11:57:58PM 1 point

> Only if you agree to follow EY in consolidating many different utilities in every possible case into one all-encompassing number, something I have yet to be convinced of, but that is beside the point, I suppose.

If you have a preference for some outcomes versus other outcomes, you are effectively assigning a single number to those outcomes. The method of combining these is certainly a viable topic for dispute - I raised that point myself quite recently.

> Sure, if you pick something with a guaranteed negative utility and you think that there should be one number to bind them all, I grant your point.

> However, this is not how the problem appears to me. A single speck in the eye has such an insignificant utility that there is no way to estimate its effects without knowing a lot more about the problem.

It was quite explicitly made a part of the original formulation of the problem.

Considering the assumptions you are unwilling to make:

> • a tiny utility can be reasonably well estimated, even up to a sign

As I've been saying, there quite clearly seem to be things that fall in the realm of "I am confident this is typically a bad thing" and "it runs counter to my intuition that I would prefer torture to this, regardless of how many people it applied to".

> • the resulting number is invariably useful for decision making

I addressed this at the top of this post.

> • zillions of those utilities can be combined into one single number using a monotonic function
> • these utilities do not interact in any way that would make their combination change sign

I think it's clear that there must be some means of combining individual preferences into moral judgments, if there is a morality at all. I am not certain that it can be done with the utility numbers alone. I am reasonably certain that it is monotonic - I cannot conceive of a situation where we would prefer some people to be less happy just for the sake of their being less happy. What is needed here is more than just monotonicity, however - the combination must also diverge as the number of people grows without bound, with each person's utility held fixed. I raise this point here, and at this point think this is the closest to a reasonable attack on Eliezer's argument.
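To make the gap between monotonicity and divergence concrete, here is a toy aggregator - entirely my own illustration, with made-up numbers, not anything proposed in the thread - that is monotonic in the number of sufferers yet convergent, so that no headcount of specks ever outweighs the torture:

```python
# Toy bounded aggregator: person k's disutility is weighted by 2^-k,
# so the total grows with every additional sufferer (monotonic) but
# never exceeds a single person's disutility (convergent).

def bounded_total(per_person, n_people):
    # sum over k = 1..n of per_person * 2^-k = per_person * (1 - 2^-n)
    return per_person * (1 - 0.5 ** n_people)

speck, torture = 1e-12, 1e9   # assumed disutility magnitudes

# Monotonic: more specked people is always (slightly) worse...
print(bounded_total(speck, 100) > bounded_total(speck, 10))  # True
# ...but convergent: no number of people pushes specks past torture.
print(bounded_total(speck, 10**6) < torture)  # True
```

Under straight summation the total diverges and the specks eventually win; under any bounded scheme like this one they never do - which is why the divergence assumption, not monotonicity alone, is doing the work in Eliezer's argument.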

On balance, I think Eliezer is likely to be correct; I do not have sufficient worry that I would stake some percent of 3^^^3 utilons on the contrary, and would presently pick torture if I were truly confronted with this situation and didn't have more time to discuss, debate, and analyze. Given that there is insufficient stuff in the universe to make 3^^^3 dust specks, much less the eyes for them to fly into, I am supremely confident that I won't be confronted with this choice any time soon.