orthonormal comments on Winning the Unwinnable - Less Wrong

4 Post author: JRMayne 21 January 2010 03:01AM




Comment author: MrHen 21 January 2010 05:03:29AM 3 points [-]

If the expected value for buying all of the tickets is positive, wouldn't the expected value of any particular ticket be positive? Does the math require you to buy all of the tickets?

A small example:

5 numbers that each cost $1 with payouts of $4 for 1st pick and $2 for 2nd pick. Any ticket has a 1/5 chance of paying $4, a 1/5 chance of paying $2, and a 3/5 chance of paying $0.

.2 × $4 + .2 × $2 + .6 × $0 = $1.20

Buying all of the tickets will give you $6 for spending $5, which is a return of $1.20 per dollar invested (a profit of $0.20 per dollar). So... what am I missing? It seems like if it was good for you to spend $41 million, it was good for you to spend $1. Is it a matter of risk management or something like that? This isn't really my area of expertise.
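MrHen's arithmetic can be checked with a quick sketch (a hypothetical Python check, not part of the original thread; the ticket prices and payouts are the ones in the example above):

```python
# Five $1 tickets; payouts are $4 (1st pick), $2 (2nd pick), $0 otherwise.
payouts = [4, 2, 0, 0, 0]
cost = 1

# Each ticket is equally likely to land on each outcome.
ev_per_ticket = sum(payouts) / len(payouts)
total_profit = sum(payouts) - len(payouts) * cost

print(ev_per_ticket)  # 1.2: expected return per $1 ticket
print(total_profit)   # 1: guaranteed profit from buying all five tickets
```

Both figures agree with the comment: each ticket has positive expected value, and buying every ticket locks in the $1 profit with no risk.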

Comment author: orthonormal 21 January 2010 05:33:00AM *  5 points [-]

As expected value ≠ expected utility, it's not the case that you should always buy a ticket if expected value is positive. It's a standard result that people actually treat the utility of wealth roughly logarithmically: i.e. it's better to have a net worth of $1,000,000,000 than $100,000,000, but that difference is small compared to the difference between a net worth of $100,000,000 and one of $1000.

To simplify the lottery situation in the case of extreme probabilities and payouts, say that Omega offers a lottery only to you (no worries about split jackpots), in which there are exactly 1,000,000 tickets, each costing $1, and among them there is one winning ticket that pays out $2,000,000.

Now if you can scrounge up a million dollars to buy every ticket, you make a tidy $1 million profit (less interest from your backers) with zero risk, so the expected utility is very positive for this strategy.

If, however, you can only get $100,000 together, you shouldn't buy any tickets (unless you're a millionaire to start), since the utility to you of a 90% chance of losing $100,000 (and having a pretty crappy life being so far in debt) outweighs the utility of a 10% chance of winning $2 million (and a nice standard of living).
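The asymmetry orthonormal describes can be made concrete under a logarithmic utility assumption. A sketch (the $150,000 bankroll is a hypothetical figure chosen so wealth stays positive after losing $100,000, since log utility is undefined for debt; the lottery parameters are the ones from the comment above):

```python
import math

# Omega's lottery: 1,000,000 tickets at $1 each; one ticket pays $2,000,000.
N, PRICE, JACKPOT = 1_000_000, 1, 2_000_000
bankroll = 150_000  # hypothetical starting wealth

def expected_log_utility(n_tickets):
    """E[log(wealth)] after buying n_tickets, assuming u(w) = log(w)."""
    p_win = n_tickets / N
    wealth_win = bankroll - n_tickets * PRICE + JACKPOT
    wealth_lose = bankroll - n_tickets * PRICE
    return p_win * math.log(wealth_win) + (1 - p_win) * math.log(wealth_lose)

# Buying 100,000 tickets has positive expected *value* (+$100,000)...
print(0.1 * JACKPOT - 100_000 * PRICE)
# ...but lower expected log-*utility* than buying nothing:
print(expected_log_utility(0), expected_log_utility(100_000))
```

The 10% chance of the jackpot doesn't compensate, in log terms, for the 90% chance of being left with only $50,000, which is the shape of orthonormal's argument.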

Comment author: Douglas_Knight 21 January 2010 05:58:20AM 6 points [-]

It's a standard result that people actually treat the utility of wealth roughly logarithmically

or is it just a standard assumption? I've never heard anything more precise than declining marginal utility.

Comment author: Nick_Tarleton 21 January 2010 04:49:17PM *  4 points [-]

Pretty sure it's the standard result that people don't consistently assign utilities to levels of wealth.

Comment author: bill 21 January 2010 02:25:34PM 6 points [-]

Logarithmic u-functions have an uncomfortable requirement: you must be indifferent between your current wealth and a 50-50 shot at doubling or halving it (e.g. doubling or halving every paycheck/payment you get for the rest of your life). Most people I know don't like that deal.
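The indifference bill points to is exact under u(w) = log(w), since 0.5·log(2w) + 0.5·log(w/2) = log(w) (the ±log 2 terms cancel). A quick numerical check (the starting wealth is an arbitrary example value):

```python
import math

w = 100_000  # any positive starting wealth gives the same cancellation

keep = math.log(w)
gamble = 0.5 * math.log(2 * w) + 0.5 * math.log(w / 2)

# The +0.5*log(2) and -0.5*log(2) terms cancel, so the two are equal.
print(abs(keep - gamble))
```

So a strictly log-utility agent shrugs at the double-or-halve coin flip, which is exactly why widespread refusal of that deal suggests people are more risk-averse than log.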

Comment author: magfrump 21 January 2010 04:45:47PM 0 points [-]

I'm confused about what is uncomfortable about this, or what function of wealth you would measure utility by.

Naively it seems that logarithmic functions would be more risk-averse than the nth-root functions I have seen Robin Hanson use. How would a u-function be more sensitive to current wealth?

Comment author: Technologos 21 January 2010 09:01:25PM *  1 point [-]

I think the uncomfortable part is that bill's (and my) experience suggests that people are even more risk-averse than logarithmic functions would indicate.

I'd suggest that any consistent utility function for humans (prospect theory notwithstanding) lies somewhere between log(x) and log(log(x))... If I were given the option of a 50-50 chance of squaring my wealth or taking its square root, I would opt for the gamble.
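The square-or-sqrt gamble separates these two candidate utility functions cleanly: log utility strictly prefers it (since 0.5·log(w²) + 0.5·log(√w) = 1.25·log(w)), while log-log utility is exactly indifferent. A sketch under an arbitrary example wealth:

```python
import math

w = 1_000_000  # hypothetical wealth, > 1 so all the logs are defined

def gamble_gain(u):
    """Expected utility of the 50-50 square/sqrt gamble, minus standing pat."""
    return 0.5 * u(w ** 2) + 0.5 * u(math.sqrt(w)) - u(w)

log_gain = gamble_gain(math.log)                            # strictly positive
loglog_gain = gamble_gain(lambda x: math.log(math.log(x)))  # cancels to zero
print(log_gain, loglog_gain)
```

Taking the gamble is thus consistent with a utility function at or near log(x); refusing it would place you closer to log(log(x)) or beyond.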

Comment author: Blueberry 21 January 2010 04:58:52PM -1 points [-]

That's only a requirement for risk-neutral people. Most people you know are not risk-neutral.

Comment author: Technologos 21 January 2010 08:25:38PM 2 points [-]

Logarithmic utility functions are already risk-averse by virtue of their concavity. The expected value of a 50% chance of doubling or halving is a 25% gain.

Comment deleted 21 January 2010 08:36:37PM [-]
Comment author: Cyan 21 January 2010 08:39:40PM 2 points [-]

I would say that such a person doesn't have preferences representable by a utility function.

Comment author: thomblake 21 January 2010 08:42:54PM *  1 point [-]

That's just plain false. Risk-aversion is a valid preference, and can be included as a term in a utility function (at slight risk of circularity, but that's not really a problem).

ETA: well, the stated units were utils, so risk-aversion should be included, so I think you're correct.

Comment deleted 21 January 2010 08:43:36PM [-]
Comment author: Cyan 21 January 2010 09:11:02PM 0 points [-]

I don't think opportunities to make choices are usually considered to be in the domain of a utility function. (If I'm wrong, educate me. I'd appreciate it.)

Comment author: thomblake 21 January 2010 08:47:47PM 1 point [-]

Nitpick: you put the values in utiles, which should include risk-aversion. If you put the values in dollars or something, I would agree.

Comment author: orthonormal 21 January 2010 06:34:57AM 0 points [-]

Hmm, good question. Quick Google search doesn't turn up anything...

Comment author: MrHen 21 January 2010 05:55:45AM 0 points [-]

Got it. This totally answered my question.