VAuroch comments on Rationality Quotes September 2014 - Less Wrong

Post author: jaime2000 03 September 2014 09:36PM


Comment author: VAuroch 05 September 2014 02:37:19AM *  2 points [-]

Yes, I am, by definition, because the util rewards, being in utilons, must factor in everything I care about, including the potential regret.

Unless your bets don't cash out as

Bet 1: If the coin lands heads, you will receive 9 utils, and if it lands tails, you will receive 11 utils

and

Bet 2: If the coin lands heads, you will receive -90 utils, and if it lands tails, you will receive 110 utils.

If it means something else, then the precise wording could make the decision different.
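On that reading, the two bets tie exactly in expectation, which is what drives the indifference claim. A minimal sketch (assuming a fair coin, probability 0.5 each way):

```python
# Expected utility of a two-outcome bet, assuming a fair coin (p = 0.5).
def expected_utility(heads_utils, tails_utils, p_heads=0.5):
    return p_heads * heads_utils + (1 - p_heads) * tails_utils

bet1 = expected_utility(9, 11)     # Bet 1: 9 utils on heads, 11 on tails
bet2 = expected_utility(-90, 110)  # Bet 2: -90 utils on heads, 110 on tails

print(bet1, bet2)  # both come out to 10.0
```

Since the rewards are stipulated to be in utilons, nothing outside this arithmetic (regret included) is left to break the tie.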

Comment author: Lumifer 05 September 2014 03:16:49AM 1 point [-]

util rewards, being in utilons, must factor in everything I care about, including the potential regret.

It's not quite the potential regret that is the issue, it is the degree of uncertainty, aka risk.

Do you happen to have any links to a coherent theory of utilons?

Comment author: VAuroch 05 September 2014 07:37:02AM 4 points [-]

I'm pretty strongly cribbing off the end of So8res's MMEU rejection. Part of what I got from that piece is that precisely quantifying utilons may be noncomputable, and even if it is computable it is currently intractable, but that doesn't matter. We know that we almost certainly will not, and possibly cannot, actually be offered a precise bet in utilons, but in principle that doesn't change the appropriate response if we were to be offered one.

So there is definitely higher potential for regret with the second bet, since losing a lot when I could otherwise have gained a lot would reduce my utility in that case. But for the statement 'you will receive -90 utilons' to be true, it would already have to include the consideration of my regret. So I should not add additional compensation for the regret; it's factored into the problem statement.

Which boils down to me being unintuitively indifferent; even the slight discomfort of being indifferent when intuition says I shouldn't be is itself factored into the calculations.

Comment author: Lumifer 05 September 2014 02:55:01PM 1 point [-]

We know that we almost certainly will not and possibly cannot actually be offered a precise bet in utilons

That makes it somewhat of an angels-on-the-head-of-a-pin issue, doesn't it?

I am not convinced that utilons automagically include everything -- it seems to me they wouldn't be consistent between different bets in that case (and, of course, each person has his own personal utilons which are not directly comparable to anyone else's).

Comment author: VAuroch 05 September 2014 07:55:20PM 4 points [-]

If utilons don't automagically include everything, I don't think they're a useful concept. The concept of a quantified reward which includes everything is useful because it removes room for debate; a quantified reward that includes mostly everything doesn't have that property, and doesn't seem any more useful than denominating things in $.

That makes it somewhat of an angels-on-the-head-of-a-pin issue, doesn't it?

Maybe, but the point is to remove object-level concerns about the precise merits of the rewards and put the question in a situation where you are arguing purely about the abstract issue. It is a convenient way to say 'all things being equal, and ignoring all outside factors', encapsulated as a fictional substance.

Comment author: Lumifer 05 September 2014 08:18:58PM 1 point [-]

If utilons don't automagically include everything, I don't think they're a useful concept.

Utilons are the output of the utility function. Will you, then, say that a utility function which doesn't include everything is not a useful concept?

And I'm still uncertain about the properties of utilons. What operations are defined for them? Comparison, probably, but what about addition? Multiplication by a probability? Under which transformations are they invariant?

It all feels very hand-wavy.

a situation where you are arguing purely about the abstract issue

Which, of course, often has the advantage of clarity and the disadvantage of irrelevance...

Comment author: nshepperd 08 September 2014 02:05:31AM *  2 points [-]

And I'm still uncertain about the properties of utilons. What operations are defined for them? Comparison, probably, but what about addition? Multiplication by a probability? Under which transformations are they invariant?

The same properties as of utility functions, I would assume. Which is to say, you can compare them, and take a weighted average over any probability measure, and also take a positive global affine transformation (ax+b where a>0). Generally speaking, any operation that's covariant under a positive affine transformation should be permitted.
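To illustrate the invariance claim (a sketch, with made-up lotteries and utilities): applying any positive affine transformation ax+b, a>0, to a utility function leaves the ranking of every lottery unchanged, which is why those are the permitted operations.

```python
# Utility functions are equivalent up to positive affine transformation:
# u'(x) = a*u(x) + b with a > 0 ranks every lottery the same way as u.
def lottery_value(utilities, probabilities):
    # Weighted average of outcome utilities under a probability measure.
    return sum(p * u for p, u in zip(probabilities, utilities))

def affine(utilities, a, b):
    return [a * u + b for u in utilities]

u = [0.0, 5.0, 12.0]          # utilities of three outcomes
lottery_a = [0.5, 0.5, 0.0]   # coin flip between outcomes 0 and 1
lottery_b = [0.0, 0.0, 1.0]   # outcome 2 for sure

before = lottery_value(u, lottery_a) < lottery_value(u, lottery_b)
u2 = affine(u, a=3.0, b=-7.0)  # any a > 0 would do
after = lottery_value(u2, lottery_a) < lottery_value(u2, lottery_b)
print(before, after)  # the comparison is preserved
```

The same check fails for transformations that are not order-preserving over expectations, such as squaring, which is the sense in which only affine-covariant operations are meaningful.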

Comment author: VAuroch 05 September 2014 08:55:48PM *  2 points [-]

Will you, then, say that a utility function which doesn't include everything is not a useful concept?

Yes, I think I agree. However, this is another implausible counterfactual, because the utility function is, as a concept, defined to include everything; it is the function that takes world-states and determines how much you value each world. And yes, it's very hand-wavy, because understanding what any individual human values is not meaningfully simpler than understanding human values overall, which is one of the Big Hard Problems. When we understand the latter, the former can become less hand-wavy.

It's no more abstract than is Bayes' Theorem; both are in principle easy to use and incredibly useful, and in practice require implausibly thorough information about the world, or else heavy approximation.

The utility function is generally considered to map to the real numbers, so utilons are real-valued and all appropriate transformations and operations are defined on them.

Comment author: Lumifer 08 September 2014 01:39:18AM 1 point [-]

the utility function is, as a concept, defined to include everything; it is the function that takes world-states and determines how much you value that world.

Some utility functions value world-states. But it's also quite common to call a "utility function" something that shows/tells/calculates how much you value something specific.

The utility function is generally considered to map to the real numbers

I am not sure of that. Utility functions often map to ranks, for example.
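The disagreement here is the standard cardinal-versus-ordinal distinction: a rank-valued (ordinal) utility function only preserves the order of outcomes, so the expected-value arithmetic used above is not meaningful on it. A sketch of the difference, with made-up outcomes for illustration:

```python
# Cardinal utilities carry magnitude; ordinal "utilities" are just ranks.
cardinal = {"apple": 1.0, "banana": 2.0, "cherry": 100.0}
ordinal  = {"apple": 1,   "banana": 2,   "cherry": 3}

# Both agree on every pairwise comparison of sure outcomes:
print(cardinal["cherry"] > cardinal["banana"] > cardinal["apple"])  # True
print(ordinal["cherry"]  > ordinal["banana"]  > ordinal["apple"])   # True

# But a 50/50 gamble between apple and cherry is valued differently:
print((cardinal["apple"] + cardinal["cherry"]) / 2)  # 50.5 -- beats banana
print((ordinal["apple"] + ordinal["cherry"]) / 2)    # 2.0  -- ties banana
```

So whether utilons support probability-weighted averages depends on which kind of utility function is meant, which is exactly what is in dispute.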

Comment author: VAuroch 09 September 2014 07:00:23AM 0 points [-]

But it's also quite common to call a "utility function" something that shows/tells/calculates how much you value something specific.

I'm not familiar with that usage. Could you point me to a case in which the term was used that way? Naively, if I saw that phrasing I would most likely consider it akin to a mathematical "abuse of notation", where it actually referred to "the utility of the world in which <X> exists over the otherwise-identical world in which <X> does not exist", but where the subtleties are not relevant to the example at hand and are taken as understood.

I am not sure of that. Utility functions often map to ranks, for example.

Could you provide an example of this as well? In the cases where someone specifies the output of a utility function, I've always seen it be real or rational numbers. (Intuitively, world-states should be finite, like the universe, and therefore map to the rationals rather than the reals, but this isn't important.)

Comment author: Lumifer 09 September 2014 06:45:16PM 0 points [-]

Could you point me to a case in which the term was used, that way?

Um, Wikipedia?