I disagree, primarily on the grounds of when you take the measure of utility.
Do you disagree with just my real-world application, or also with my coinflip example?
It is not at all clear to me that short-term effects like those you describe produce long-term average effects that can be calculated, or that those effects would even have the desired sign.
Let's say you have two choices: one is "+500 utilons and then other stuff"; the other is "-500 utilons and then other stuff", where you don't know anything about the nature of "other stuff". Why can you not cancel out the unknowns? Your best information about both unknowns is identical, is it not?
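A minimal numerical sketch of that cancellation, assuming (as stipulated) that "other stuff" is drawn from the same distribution for both choices; the zero-mean Gaussian used for it here is an invented stand-in, not part of the original argument:

```python
import random

random.seed(0)

def avg_total(known_utilons, trials=200_000):
    """Average total utility: the known part plus unknown 'other stuff',
    modeled (an assumption) as the same zero-mean draw for both choices."""
    total = 0.0
    for _ in range(trials):
        other_stuff = random.gauss(0, 1000)  # identical distribution either way
        total += known_utilons + other_stuff
    return total / trials

gap = avg_total(+500) - avg_total(-500)
print(round(gap))  # close to 1000: the unknowns cancel in expectation
```

By linearity of expectation, identically distributed unknowns contribute the same expected value to both options, so only the known +500/-500 terms survive in the difference.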
How does giving a random person chocolate for no reason affect them over the course of their whole life?
On average better than torturing them would. Do you disagree?
Do you disagree with just my real-world application, or also with my coinflip example?
Both.
Let's say you have two choices: one is "+500 utilons and then other stuff"; the other is "-500 utilons and then other stuff", where you don't know anything about the nature of "other stuff". Why can you not cancel out the unknowns? Your best information about both unknowns is identical, is it not?
Too clean; money is not utilons. I think I can see part of the problem. The standard definition of utility seems to contain the time el...
I've been doing thought experiments involving a utilitometer: a device capable of measuring the utility of the universe, including sums-over-time and counterfactuals (what-if extrapolations), for any given utility function, even generic statements such as, "what I value." Things this model ignores: nonutilitarianism, complexity, contradictions, unknowability of true utility functions, inability to simulate and measure counterfactual universes, etc.
Unfortunately, I believe I've run into a pathological mindset from thinking about this utilitometer. Given the abilities of the device, you'd want to input your utility function and then take a sum-over-time from the beginning to the end of the universe and start checking counterfactuals ("I buy a new car", "I donate all my money to nonprofits", "I move to California", etc) to see if the total goes up or down.
It seems quite obvious that the sum at the end of the universe is the measure that makes the most sense, and I can't see any reason for taking a measure at the end of an action, as is done in typical discussions of utility. Here's an example: "The expected utility from moving to California is negative due to the high cost of living and the fact that I would not have a job." But a sum over all time might show that the utility was positive because I meet someone, or do something, or learn something that improves the rest of my life; without the utilitometer, I would have missed all of those add-on effects. The device allows me to fill in all of the unknown details and unintended consequences.
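The gap between the two measurement points can be sketched with an invented per-period utility stream (the numbers are illustrative, not drawn from the example):

```python
# Per-period utilons for "I move to California" (invented numbers):
# up-front costs, then a long tail of small gains from add-on effects.
stream = [-50, -20, -10] + [15] * 20

end_of_action = sum(stream[:3])  # measured right after the move
sum_over_time = sum(stream)      # the utilitometer's sum over all time
print(end_of_action, sum_over_time)  # the two measures disagree in sign
```

Truncating the sum at the end of the action captures only the costs; the full sum includes the downstream effects that reverse the verdict.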
Where this thinking becomes a problem is when I realize I have no such device, but desperately want one, so I can incorporate the unknown and the unintended and know what path I should be taking to maximize my life, rather than having the short, narrow view of the future I have now. In essence, it places higher utility on 'being good at calculating expected utility' than on almost any other action I could take. If I could just build a true utilitometer that measures everything, the expected utility would be enormous! ("push button to improve universe"). And even incremental steps along the way could have amazing payoffs.
Given that a utilitometer as described is impossible, thinking about it has still altered my values to place steps toward creating it above other, seemingly more realistic options (buying a new car, moving to California, etc). I previously asked the question, "How much time and effort should we put into improving our models and predictions, given we will have to model and predict the answer to this question?" and acknowledged it was circular and unanswerable. The pathology comes from entering the circle and starting a feedback loop; anything less than perfect prediction means wasting the entire future.