NancyLebovitz comments on The "Intuitions" Behind "Utilitarianism" - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (193)
I assert that the use of 3^^^3 in a moral argument is to _avoid_ the effort of multiplying.
Yes, that's what I said. If the quantities were close enough to have to multiply, the case would be open for debate even to utilitarians.
Demonstration: what is 3^^^3 times 6?
3^^^3, or as close as makes no difference.
What is 3^^^3 times a trillion to the trillionth power?
3^^^3, or as close as makes no difference.
...that's kinda the point.
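A quick sketch of why this holds. 3^^^3 itself is far too large to compute with directly, so this illustration works one level down, with 3^^5 (already an unimaginably large tower), and compares digit counts in log space. The specific numbers here are my own worked example, not from the original thread:

```python
from math import log10

# 3^^4 = 3^(3^27). Its number of digits is:
d4 = 3**27 * log10(3)              # about 3.6 trillion digits

# 3^^5 = 3^(3^^4), so its digit count d5 = 3^^4 * log10(3) is itself
# a number with roughly d4 digits. We can only hold its logarithm:
log10_d5 = d4 + log10(log10(3))    # log10 of the digit count of 3^^5

# Multiplying 3^^5 by a trillion to the trillionth power,
# (10^12)^(10^12) = 10^(1.2e13), adds only 1.2e13 digits.
log10_added = log10(12e12)         # about 13.08

# The relative change in digit count is 10^(13.08 - 3.6e12):
# indistinguishable from zero. And 3^^^3 towers vastly higher still.
print(log10_added, log10_d5)
```

So even at tower height five, a trillion-to-the-trillionth-power factor doesn't move the needle; 3^^^3 is a tower of height 3^^^^3... no, of height 3^^4-and-beyond levels past anything writable, and the effect is only more extreme.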
So it seems you have two intuitions. One is that you like certain kinds of "feel good" feedback that aren't necessarily mathematically proportional to the quantifiable consequences. Another is that you like mathematical proportionality.
Er, no. One intuition is that I like to save lives - in fact, as many lives as possible, as reflected by my always preferring a larger number of lives saved to a smaller number. The other "intuition" is actually a complex compound of intuitions, that is, a rational verbal judgment, which enables me to appreciate that any non-aggregative decision-making will fail to lead to the consequence of saving as many lives as possible given bounded resources to save them.
I'm feeling a bit of despair here... it seems that no matter how I explain that this is how you have to plan if you want the plans to work, people just hear, "You like neat mathematical symmetries." Optimal plans are neat because optimality is governed by laws and the laws are math - it has nothing to do with liking neatness.
50 years of being tortured is not (50 × 365 × 24 × 3600)-times worse than 1 second of torture. It is much (non-linearly) worse than that.
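For concreteness, the linear multiplier in that parenthetical (ignoring leap years) works out to about 1.6 billion:

```python
# Seconds in 50 years, using 365-day years as in the comment above.
seconds = 50 * 365 * 24 * 3600
print(seconds)  # 1576800000, i.e. roughly 1.6e9
```

The claim is that the true badness ratio exceeds this ~1.6e9 figure — but, as the reply below argues, not by anything remotely approaching 3^^^3.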
Utilitarianism does not assume that multiple experiences to the same person aggregate linearly.
Yes, I agree that it is non-linearly worse.
It is not infinitely worse. Just non-linearly worse.
The non-linearity factor is nowhere within a trillion to the trillionth power galaxies as large as 3^^^3.
If it were, no human being would ever think about anything except preventing torture or goals of similar importance. You would never take a single moment to think about putting an extra pinch of salt in your soup, if you felt a utility gradient that large. For that matter, your brain would have to be larger than the observable universe to feel a gradient that large.
I do not think people understand the largeness of the Large Number here.