dhasenan comments on Average utilitarianism must be correct? - Less Wrong

2 Post author: PhilGoetz 06 April 2009 05:10PM


Comment author: ArisKatsaris 16 February 2014 03:03:55AM 1 point [-]

You are averaging based on the population at the start of the experiment. In essence, you are counting dead people in your average, like Eliezer's offhand comment implied he would.

I consider every moment of living experience to be of equal weight. You may call that "counting dead people" if you want, but that's only because when considering the entire timeline I consider every living moment -- given a single timeline, there are no living people vs. dead people, there are just people living at different times. If you calculate the global population, it doesn't matter what country you live in -- if you calculate the utility of a fixed timeline, it doesn't matter what time you live in.
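The aggregation being described can be sketched in a few lines. This is only an illustrative toy, not anything from the thread: the people, lifespans, and utility numbers are invented, and the point is just that summing over every living moment of a fixed timeline weights each moment equally regardless of *when* it occurs.

```python
# Toy sketch: aggregate utility over a fixed timeline by summing every
# person-moment with equal weight. All names and numbers are invented.

# person -> list of per-moment utilities over that person's lifespan
timeline = {
    "Alice": [1.0, 1.0, 1.0],  # alive during three moments
    "Bob":   [2.0, 2.0],       # alive during two moments only
}

# Every living moment counts once, whether its person is "currently"
# alive or long dead from any particular vantage point:
total = sum(u for moments in timeline.values() for u in moments)
moment_count = sum(len(moments) for moments in timeline.values())
average = total / moment_count
```

On this accounting, a person's moments are not discounted for lying in the past or the future; the timeline is evaluated as a whole.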

But the main thing I'm not sure you get is that I believe preferences are valid also when they concern the future, not just the present.

If 2014 Carl wants the state of the world to be X in 2024, that's still a preference to be counted, even if Carl ends up dead in the meantime. That Carl severely does NOT want to be dead in 2024 means there's a heavy disutility penalty in his 2014 utility function if he nonetheless ends up dead in 2024.

Of course, if we counted their preferences, that would be a conservatizing force that we could never get rid of

If e.g. someone wants to be buried at sea because he loves the sea, I consider it good that we bury him at sea.
But if he wants to be buried at sea only because he believes such a ritual is necessary for his soul to be resurrected by the god Poseidon, his preference depends on false beliefs -- it doesn't represent true terminal values, and those are the ones I'm concerned about.

If conservatism is motivated by, e.g., wrong epistemic beliefs or by fear, rather than by genuinely different terminal values, it likewise should not modify our actions, provided we're acting from an epistemically superior position (we know what they didn't).

Comment author: [deleted] 16 February 2014 03:32:30AM *  -1 points [-]

I think you're arguing against my argument against a position you don't hold, but which I called by a term that sounds to you like your position.

Assuming you have a function that yields the utility that one person has at one particular second, what do you want to optimize for?
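The question matters because the two obvious aggregates over such a function come apart. A minimal sketch, with invented utility values, of how total and average utilitarianism disagree about adding a life barely worth living:

```python
# Toy comparison of the two candidate objectives over a set of
# person-level utilities. The numbers are invented for illustration.

def total(utilities):
    return sum(utilities)

def average(utilities):
    return sum(utilities) / len(utilities)

base = [5.0, 5.0]            # two well-off people
with_extra = base + [1.0]    # add a life barely worth living

# Total utilitarianism calls the addition an improvement;
# average utilitarianism calls it a loss.
assert total(with_extra) > total(base)      # 11.0 > 10.0
assert average(with_extra) < average(base)  # ~3.67 < 5.0
```

So the answer to "what do you want to optimize for?" pins down which side of the original post's argument one is actually on.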

And maybe I should wait until I'm less than 102 degrees Fahrenheit to continue this discussion.