MarkusRamikin comments on A (small) critique of total utilitarianism - Less Wrong

Post author: Stuart_Armstrong 26 June 2012 12:36PM


Comment author: MarkusRamikin 27 June 2012 08:56:59AM 2 points

There's no fundamental reason why value should be linear in number of dust specks

Yeah, that has always been my main problem with that scenario.

There are different ways to sum multiple sources of something. Consider series vs. parallel electrical circuits: the total output depends greatly on how the individual voltage sources (or resistors, or whatever) combine.
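As a toy illustration of how the combining rule, not just the components, determines the total (the resistor values here are made up for the example):

```python
def series_resistance(resistors):
    # In series, resistances simply add: R_total = R1 + R2 + ...
    return sum(resistors)

def parallel_resistance(resistors):
    # In parallel, reciprocals add: 1/R_total = 1/R1 + 1/R2 + ...
    return 1 / sum(1 / r for r in resistors)

rs = [100.0, 100.0, 100.0]
print(series_resistance(rs))    # 300.0 -- grows linearly with each component
print(parallel_resistance(rs))  # ~33.3 -- shrinks as more paths are added
```

Same three components, wildly different totals; the analogous question is which combining rule applies to suffering across minds.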

When it comes to suffering: suffering only exists in consciousness, and each point of consciousness - each mind involved - experiences its own dust speck individually. There is no conscious mind in that scenario who directly experiences the totality of the dust specks and suffers accordingly. It is in no way obvious to me that the "right" way to consider the totality of that suffering is to just add it up. Perhaps it is. But unless I missed something, no one arguing for torture so far has actually shown it (as opposed to just assuming it).

Suppose we make this about (what starts as) a single person. Suppose that you, yourself, are going to be copied into that humongous number of copies. And you are given a choice: before that happens, you will be tortured for 50 years; or you will be unconscious for 50 years, but after the copying each of your copies will get a dust speck in the eye. Either way you get copied; that's not part of the choice. After that, whatever your choice, you will be able to continue with your lives.

In that case, I don't care about doing the "right" math that will make people call me rational, I care about being the agent who is happily NOT writhing in pain with 50 years more of it ahead of him.

EDIT: come to think of it, assume the copying template is taken from you before the 50 years start, so we don't have to consider memories and the lasting psychological effects of torture. My answer remains the same: even if in the future I won't remember the torture, I don't want to go through it.

Comment author: TheOtherDave 27 June 2012 01:48:51PM 0 points

As far as I know, TvDS doesn't assume that value is linear in dust specks. As you say, there are different ways to sum multiple sources of something. In particular, there are many ways to sum the experiences of multiple individuals.

For example, the whole problem evaporates if I decide that people's suffering only matters to the extent that I personally know those people. In fact, much less ridiculous problems also evaporate... e.g., in that case I also prefer that thousands of people suffer so that I and my friends can live lives of ease, as long as the suffering hordes are sufficiently far away.

It is not obvious to me that I prefer that second way of thinking, though.

Comment author: David_Gerard 27 June 2012 03:27:26PM 2 points

e.g., in that case I also prefer that thousands of people suffer so that I and my friends can live lives of ease, as long as the suffering hordes are sufficiently far away.

It is arguable (in terms of revealed preferences) that first-worlders typically do prefer that. This requires a slightly non-normative meaning of "prefer", but a very useful one.

Comment author: TheOtherDave 27 June 2012 03:34:42PM 2 points

Oh, absolutely. I chose the example with that in mind.

I merely assert that "but that leads to thousands of people suffering!" is not a ridiculous moral problem for people (like me) who reveal such preferences to consider, and it's not obvious that a model that causes the problem to evaporate is one that I endorse.

Comment author: private_messaging 27 June 2012 03:47:07PM 0 points

Well, it sure uses linear intuition. 3^^^3 is bigger than the number of distinct states; it's far past the point where you are only adding exactly-duplicated dust-speck experiences, so you could reasonably expect the total to flatten out.
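To make the flattening-out concrete (a toy sketch; the aggregation rules and numbers are invented for illustration, and a million copies stands in for 3^^^3): under a duplicate-insensitive rule, each distinct experience counts once, so exact copies stop adding anything.

```python
def linear_total(experiences):
    # Linear aggregation: every instance adds its full disutility.
    return sum(d for _, d in experiences)

def distinct_total(experiences):
    # Duplicate-insensitive aggregation: each distinct experience
    # counts once, no matter how many exact copies of it exist.
    return sum({name: d for name, d in experiences}.values())

# A million exact copies of the same one-speck experience:
copies = [("speck-in-left-eye", 1.0)] * 10**6
print(linear_total(copies))    # 1000000.0 -- keeps growing with copies
print(distinct_total(copies))  # 1.0 -- flattens out immediately
```

Once every distinct speck-experience already occurs somewhere, the duplicate-insensitive total is constant no matter how many more minds you add.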

One could go perverse and proclaim that one treats duplicates the same, but then if there's a button which, when pressed, replaces everyone's mind with the mind of the happiest person, you should press it.

I think the stupidity of utilitarianism is the belief that morality is about states, rather than about dynamic processes and state transitions. A simulation of a pinprick slowed down 1,000,000 times is not ultra-long torture. Murder is a form of irreversible state transition. Morality as it exists is about state transitions, not about states.

Comment author: Mark_Lu 27 June 2012 04:29:11PM -1 points

I think the stupidity of utilitarianism is the belief that morality is about states, rather than about dynamic processes and state transitions.

"State" doesn't have to mean "frozen state" or something similar, it could mean "state of the world/universe". E.g. "a state of the universe" in which many people are being tortured includes the torture process in it's description. I think this is how it's normally used.

Comment author: private_messaging 27 June 2012 04:38:20PM -1 points

Well, if you coherently take it that the transitions have value, rather than the states, then you arrive at a morality that regulates the transitions the agent should try to make happen, ending up with a morality that is more about means than about ends.

I think it's simply that pain feels like a state rather than a dynamic process, so utilitarianism treats it as a state; while doing something feels like a dynamic process, so utilitarianism doesn't treat it as a state and is only concerned with the difference in utilities.

Comment author: TheOtherDave 27 June 2012 04:05:21PM 0 points

It isn't clear to me what the phrase "exactly-duplicated" is doing there. Is there a reason to believe that each individual dust-speck-in-eye event is exactly like every other? And if so, what difference does that make? (Relatedly, is there a reason to believe that each individual moment of torture is different from all the others? If it turns out that it's not, does that imply something relevant?)

In any case, I certainly agree that one could reasonably expect the negvalue of suffering to flatten out no matter how much of it there is. It seems unlikely to me that fifty years of torture is anywhere near the asymptote of that curve, though... for example, I would rather be tortured for fifty years than be tortured for seventy years.

But even if it somehow is at the asymptotic limit, we could recast the problem with ten years of torture instead, or five years, or five months, or some other value that is no longer at that limit, and the same questions would arise.
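One way to cash out that approaches-a-limit intuition (purely illustrative; the saturating curve and the scale constant are arbitrary assumptions, not anything argued in the thread): let per-person disutility be strictly increasing in duration but bounded above, e.g. D(t) = 1 - exp(-t/k).

```python
import math

def disutility(years_of_torture, scale=100.0):
    # Saturating (bounded) disutility: strictly increasing with
    # duration, but never exceeds 1 no matter how long it lasts.
    return 1.0 - math.exp(-years_of_torture / scale)

# More torture is still worse: seventy years beats fifty...
assert disutility(70) > disutility(50)

# ...and 50 years is nowhere near the asymptote (disutility 1.0),
# so recasting the problem with shorter torture keeps it alive.
print(round(disutility(50), 3))  # 0.393 with scale=100
print(round(disutility(5), 3))   # 0.049 -- even further from the limit
```

On such a curve "fifty years is at the asymptote" would require a much smaller scale constant, and the recast versions of the problem would still sit on the steep part of the curve.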

So, no, I don't think the TvDS problem depends on intuitions about the linear-additive nature of suffering. (Indeed, the more I think about it the less convinced I am that I have such intuitions, as opposed to approaches-a-limit intuitions. This is perhaps because thinking about it has changed my intuitions.)