I use the phrase 'clever argument' deliberately: I have reached a conclusion that contradicts the usual wisdom around here, and want to check that I didn't make an elementary mistake somewhere.

Consider a lottery ticket that costs $100 for a one-in-ten-thousand chance of winning a million dollars: expected value, $100. I can take this deal or leave it; of course, a realistic ticket actually costs $100 plus epsilon, where epsilon covers the arranger's profit, which makes it a bad deal.

But now consider this deal in terms of time. Suppose I've got a well-paid job in which it takes me an hour to earn that $100. Suppose further that I work 40 hours a week, 50 weeks a year, and that my living expenses are a modest $40k a year, making my yearly savings $160k. Then, with 4% interest on my $160k yearly savings, it would take me about 5.5 years to accumulate that million dollars, or 11,000 hours of work. Also note that with these assumptions, once I have my million I don't need to work any more: 4% interest on a million is $40k a year, exactly covering my living expenses.
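A quick sketch of this arithmetic in Python (the end-of-year deposits and annual compounding are simplifying assumptions of mine, not essential to the argument):

```python
# Years and hours of work needed to save $1,000,000, assuming $160k
# is deposited at the end of each year and earns 4% compounded annually.
wage = 100.0                                # dollars per hour
hours_per_year = 40 * 50                    # 2,000 hours
deposit = wage * hours_per_year - 40_000    # $160k saved per year

balance, full_years = 0.0, 0
while (balance + deposit) * 1.04 < 1_000_000:
    balance = (balance + deposit) * 1.04
    full_years += 1

extra_hours = (1_000_000 - balance) / wage
print(full_years, full_years * hours_per_year + extra_hours)
# -> 5 full years plus ~987 hours: ~10,987 hours, about 5.5 working years
```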

It seems to me that, given the assumptions above, I could view the lottery deal as paying one hour of my life for a one-in-ten-thousand chance to win 11,000 hours: expected value, 1.1 hours. (Note that leisure hours when young are probably worth more, since you'll be in better health to enjoy them; but this is not necessary to the argument.)

Of course it is possible to adjust the numbers. For example, I could scrimp and save during my working years and make my living expenses only $20k; in that case it would take me less than 5 years to accumulate the million, and the ticket goes back to being a bad deal. Alternatively, if I spend more than $40k a year, it takes longer to accumulate the million; in this case my standard of living drops when I retire to live off my 4% interest, but the lottery ticket becomes increasingly attractive in terms of hours of life.

I think, and I could be mistaken, that the reason this works is that the rate at which I'm indifferent between money and time changes with my stock of money. Since I work 8 hours a day at $100 an hour, we can reasonably conclude that I'm *roughly* indifferent between an hour and $100 at my current wealth. But I'm obviously not indifferent to the point that I'd work 24 hours a day for $2400, nor 0 hours a day for $0. Further, once I have accumulated my million dollars (or more generally, enough money to live off the interest), my indifference level becomes much higher: you'd have to offer me far more money per hour to get me to work.

Notice that I'm postulating a very sharp dropoff here: I'm happy to work for $100 an hour until the moment my savings account hits seven digits, and then I'm not willing to work at all. It seems possible that the argument no longer works if you allow a more gradual change in indifference, but on the other hand "save to X dollars and then retire" doesn't seem like a psychologically unrealistic plan either.

Am I making any obvious mistakes? Of course it may well be the case that the actual lottery tickets for sale in the real world do not match the wages-and-savings situations of real people in such a way that they have positive expected value; that's an empirical question. But it does seem in-principle possible for an epsilon chance at one-over-epsilon dollars paid out right away to be of positive expected value after converting to expected hours of life, even though it's neutral in expected dollars. Am I mistaken?


Edit: Wei Dai found the problem. Briefly, the $100 added to my savings would cut more than 1.1 hours off the time I had to work at the end of the 5.5 years.

35 comments

Not sure if this is the crux of the issue, but if the lottery fails to pay off, you lose the interest on the $100 in addition to the $100 itself, and in order to get to $1 million, you'd have to make up for it by working more than 1.1 hour at the end of 5.5 years. Is there a way to change the numbers to make your argument work out even after taking this into account?

That seems like a possible candidate for the problem: perhaps my round figure of 5.5 years was a bit too round. Let us do the calculations. For simplicity I'm going to assume that interest works like this: on December 31st of each year, I put my earnings for the year into a savings account, which then grows by a factor of 1.04. This is not realistic, of course, but it does seem that the answer ought not to depend on the details of the interest model. So, consider how much I have to work in the case where I buy the ticket and don't win. At the end of the first year, I put $159,900 in my account, which grows to $166,296 by the magic of 4% interest. Then it grows thus:

End of 2nd year: $339,347.84
End of 3rd year: $519,321.75
End of 4th year: $706,494.62
End of 5th year: $901,154.41

and I need to work another 988.46 hours to reach the million. Now consider the case where I didn't buy the ticket. Then I have $901,276.07 at the end of the fifth year, and must work another 987.24 hours. The difference is 1.22 hours, which, as you suggested, is larger than the 1.1 expected hours the ticket pays.

A cross-check: If I buy the ticket, then there's a 99.99% chance that I have to work those (5 years + 988.46 hours), which is 10987.36 expected hours. If I don't, there's a 100% chance that I have to work (5 years + 987.24) hours, or 10987.24 expected hours. So the ticket is costing me 0.12 hours, the same as we found above.
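For anyone who wants to re-run these numbers, here is a small Python sketch of the comparison under the same simplified end-of-year interest model:

```python
# Hours of work needed to reach $1,000,000 under the toy interest model:
# deposits go in on December 31st, then the balance grows by 4%.
def hours_to_million(first_year_deposit, deposit=160_000, wage=100.0):
    balance = first_year_deposit * 1.04
    years = 1
    while (balance + deposit) * 1.04 < 1_000_000:
        balance = (balance + deposit) * 1.04
        years += 1
    return years * 2_000 + (1_000_000 - balance) / wage

bought_and_lost = hours_to_million(159_900)  # ~10,988.46 hours
no_ticket = hours_to_million(160_000)        # ~10,987.24 hours
print(bought_and_lost - no_ticket)           # ~1.22 hours extra if the ticket loses
print(0.9999 * bought_and_lost - no_ticket)  # ~0.12 expected hours lost by buying
```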

I repeated the calculation on the assumption that my expenses are $35k or $45k a year, and both cases come out even worse for the lottery-buyer, which surprises me a bit. I don't see an obvious way to change the numbers, then; although I could still fall back on the argument that leisure hours at age 30 are worth more than at age 35. It presumably does not take a whole lot of weighting to overcome the 0.12-hour disadvantage of the lottery. But my initial numbers were clearly flawed because I didn't calculate precisely enough what was needed at the end of the period. Thanks for clearing up my confusion.

The general principle is that utility doesn't equal money, and you are interested in expected utility, not expected money. If, for example, you need $50000 for an operation that would save your life, then certainty of $40000 is much worse than 1% chance at $50000, which is only $500 in expected money, but more than certain $40000's worth in expected utility. Insurance is a trade where both sides gain positive expected utility, even though the (expected) balance of money must necessarily be zero.
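A toy Python illustration of the gap between expected money and expected utility, with made-up utility numbers:

```python
# Made-up step utility: $50k buys a life-saving operation, and survival
# dominates any other use of the money.
def utility(dollars):
    return 1.0 if dollars >= 50_000 else 0.0

certain = utility(40_000)          # 0.0: $40k cannot buy the operation
gamble = 0.01 * utility(50_000)    # 0.01 expected utility
print(0.01 * 50_000)               # $500 expected money, far below $40k
print(gamble > certain)            # True: better in utility, worse in money
```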

Am I making any obvious mistakes?

Well, you've used a value for the likelihood of winning the lottery that's dramatically higher than it is in real life, and it's a lot harder to make the numbers come out in favor of buying the lottery tickets when you keep the winnings on the same order and decrease the chances of winning by more than two orders of magnitude.

The cost of a lottery ticket is a lot more than the jackpot times the likelihood of winning.

[anonymous]

OK, so let's look at actual numbers.

I earn around £28,000 per year. Currently, I can't save any money - in fact I have to find extra sources of income to meet expenses for me and my wife.

Accounting for holiday time, and so on, I earn £15 an hour, give or take, so a £1 lottery ticket is worth four minutes of my work time.

Currently, I do not have any savings, or any prospect of saving any money. Thus, I expect to work for at least the next thirty-five years, or 64,400 hours.

If I won the average £2,053,984 prize for the UK National Lottery jackpot, I would not only be able to retire, thus saving those 64,400 hours; if I stuck the money in a bank at 5% interest I would also get £102,699.20 a year - nearly four times my current salary. So I could actually be saving 239,631 'hours' over that 35-year period.

The odds of winning the national lottery jackpot are 1 in 13,983,816, which would mean my expected winning (only counting the jackpot, not the lower prizes) would be 0.017 hours, or 1.02 minutes.
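The same arithmetic in Python, using the figures from this comment (wage, jackpot, interest rate, and odds all as assumed above):

```python
wage = 15.0                                 # pounds per hour
jackpot = 2_053_984                         # average UK jackpot, as above
interest_income = jackpot * 0.05            # £102,699.20 a year
hours_saved = interest_income * 35 / wage   # ~239,631 'hours' of wages
p_win = 1 / 13_983_816                      # jackpot odds

print(hours_saved * p_win * 60)   # ~1.03 minutes expected per ticket
print(60 / wage)                  # 4 minutes of work to earn the £1 ticket
```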

That suggests that for me, at a relatively high wage, the lottery is still irrational to play. However, it also suggests that for anyone whose earnings are below £3.82 per hour (for example someone working a minimum wage job but also doing unpaid overtime or an intern) it could be rational to play the lottery.

If I won the average £2,053,984 prize for the UK National Lottery jackpot, I would not only be able to retire, thus saving those 64,400 hours; if I stuck the money in a bank at 5% interest I would also get £102,699.20 a year - nearly four times my current salary. So I could actually be saving 239,631 'hours' over that 35-year period.

I'm not sure how it works in the UK, but in the US, the advertised jackpot figures are lies - they actually give you an annuity which pays out a total of that amount over its lifetime, and then tax the payouts.

[anonymous]

It's different in the UK - it's paid out as a tax-free lump sum.

That suggests that for me, at a relatively high wage, the lottery is still irrational to play. However, it also suggests that for anyone whose earnings are below £3.82 per hour (for example someone working a minimum wage job but also doing unpaid overtime or an intern) it could be rational to play the lottery.

British pounds currently convert at about 1.6 USD to 1 GBP. I'm somewhat disturbed by the notion that $24/hr is described as a "relatively high" wage, as that works out to roughly a $50k/year salary. Wikipedia indicates you're pretty accurate, however.

That's... enlightening, actually. Also, very fascinating.

[anonymous]

The difference is that in the UK we don't have to pay health insurance and so on, and most people don't have to travel significant distances by US standards, so the cost of living is significantly lower (food and rent costs are higher, but the travel and health insurance savings more than make that up). It'd still be solidly average household earnings for the US, anyway.

(The reason I can't save anything despite being on a high wage is that I have debts that need paying off, and have to support my wife who's too unwell to work. Most people on my income outside London would be living very comfortably indeed).

The reason I can't save anything despite being on a high wage is that I have debts that need paying off,

Here's the thing: I have a below-average income, and I earn more -- currencies converted -- than you do. To your equivalent of 24 USD/hr I earn (as sole earner of my household) 27 USD/hr (I also do not pay out-of-pocket for health insurance, and my cost of mileage is far less than yours [gasoline/petrol taxes are fractional here compared to yours]). Now, I also live in a part of the US where the average 1,400 sq. ft. home costs ~$50,000 USD. And I'm in a metropolitan area at that.

So I'm experiencing a touch of culture shock here.

[anonymous]

You don't have a below average income then - according to http://en.wikipedia.org/wiki/Personal_income_in_the_United_States , "The overall median personal income for all individuals over the age of 18 was $24,062 ($32,140 for those age 25 or above) in the year 2005. The overall median income for all 155 million persons over the age of 15 who worked with earnings in 2005 was $28,567" - I can't find a mean, or data more recent than that, but it can't have changed that much. Unless you're only working 26-hour weeks, you're earning more than average for the US - significantly so.

In fact, according to that table, a $50,000 salary, which you were thinking of as low, would be in the top 25% of income for the US...

And yes, your cost-of-mileage would be lower, assuming I was driving. But in the UK we have a far more efficient, and cheaper, public transport system, and most people live significantly closer to our places of work than in the US. I work from home quite a bit of the time, but in a week where I have to travel in to work every day, my total transport costs for the week would be £8.50 (five return train journeys at £1.70 each). People in metropolitan areas, especially, are as likely to cycle or walk to work as to drive, if not more so.

How did you decide to use individual income, rather than household? You both seem to support other people.

[anonymous]

Because it was in the context of me saying that £28,000 p.a. is a good wage in my original comment. £28,000 is a pretty good income for an individual, but not a wonderful one for two people, and that was the comment Logos01 originally picked up on.

So I'm experiencing a touch of culture shock here.

And consider that you're still talking about the Anglo-American world. If British salaries shocked you like that, do I want to know what you think about Greek ones? When I first started working (just after getting my master's degree and completing my standard year of military conscription), my yearly salary was about 14,000 euros, after taxes.

In Slovakia a high school teacher receives €600 a month, before taxes. To get above €1,200 a month, one has to be a manager, financial specialist, IT specialist, etc. On the other hand, we get 25 paid vacation days.

Sure, we need to be careful not to compare apples with oranges. For example, do people receive free healthcare for their taxes? This may be difficult to evaluate; for example, in Slovakia there is nominally free health care, but sometimes one has to pay additional costs, or pay for higher quality or sooner treatment, and it can get pretty expensive. Also the tax system is complicated (income tax, property tax, value-added tax, special taxes, mandatory insurances, a tax for having an employee, media tax, etc.) -- I suppose this is intentional, to prevent average people from finding out exactly how much they pay.

(Data source, unfortunately in Slovak only.)

Let's make sure we're comparing apples to apples here.

Mandatory paid vacation days, by country

US -- None.
UK -- 20 (not counting public holidays, again for apples-apples purposes)
Greece -- 20 and up, depending on seniority

The average US worker works a lot more hours than the average UK or Greek worker does.

The hours worked by a US worker (about 1,800 a year) are lower than the hours worked by Greeks (about 2,100 a year), and higher than those of UK workers (approximately 1,650 hours a year). Only the Koreans work more hours a year than Greeks, so the number of hours put in is not very relevant to the yearly salary.

source: OECD statistics

The cost of a lottery ticket is a lot more than the jackpot times the likelihood of winning.

In lotteries with rollover jackpots, lottery tickets can become positive expected value... as long as you don't take into account the possibility of having to split the jackpot with another winner.

But that possibility is present regardless of whether the jackpot accumulates or not, or am I missing something?

Certainly lotteries with an oversized jackpot will sell more tickets, thus increasing the risk, but it doesn't seem obvious to me that the player pool will increase fast enough to keep the EV down.

Well I suppose that it depends on the rules of that specific lottery, but I suspect that most U.S. state lotteries with large jackpots do split the total prize between all winning tickets. It's just that the odds of having to split your prize are a lot harder to know than the odds of winning the jackpot at all.

I picked numbers that are easy to work with to illustrate a theoretical point. Whether any actual lottery can do the same thing for the particular situation of any real person is a separate issue. Again, the part of the lottery ticket's price that covers overhead and profit is only relevant in the context of real lotteries. The question I'm asking is whether it is possible for a bad or neutral deal in dollars to become a good deal in hours of life.

I believe you left out the opportunity cost of spending the $100 on a ticket instead of letting it accrue 4% interest. That is, you compared $100 in today's dollars to $100 in 2017 dollars, but it should've been $124 in 2017 dollars.
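A quick check of that figure in Python:

```python
# $100 compounded at 4% a year for 5.5 years
print(100 * 1.04 ** 5.5)   # ~124.07
```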

I am reasonably convinced that I did not do so; when I said "roughly 5.5 years to accumulate a million", I was taking compound interest into account.

Let's try a different angle:

Then, with 4% interest on my $160k yearly savings, it would take me about 5.5 years to accumulate that million dollars, or 11,000 hours of work.

So over 5.5 years, you theoretically earned $1,220,000: a million in savings plus $40k of living expenses for 5.5 years. The effective hourly wage is $110.90.

At an effective hourly wage of $110, your expected lottery ticket return is 1.0 hours, not 1.1.
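A Python sketch of this arithmetic; the rescaling in the last line is a guess at how the ~1.0-hour figure was obtained, not something stated above:

```python
# Total output over 5.5 years: $1,000,000 saved plus 5.5 years of
# $40,000 living expenses, spread over 11,000 hours of work.
effective_wage = (1_000_000 + 5.5 * 40_000) / 11_000   # ~$110.91/hour
# Revaluing the 11,000-hour payoff at the effective wage instead of
# the nominal $100/hour (an assumption on my part):
print(11_000 * 100 / effective_wage / 10_000)          # ~0.99 hours
```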

Ah, but now you are neglecting the additional value of having the million dollars now, instead of 5.5 years from now. At 4% interest this neatly cancels out the additional $220k.

I don't believe so, but maybe someone smarter than me can explain this. The magic 4%-of-a-million = $40k value should indeed factor in, but it shouldn't dominate the expected value to the degree you suggest.

I don't think this works, because you're submitting to the same probabilistic calculation while muddying the waters, so somewhere you must be missing something which pushes the expected value back in line and makes it a bad decision. There's also the issue that no real lottery has payoffs where the expected value is at par.

A better argument is to note that we lose money with little utility gained all the time, because we don't track our accounts very well unless we're within some factor of the edge of them. In that case, buying lottery tickets is a good bet up to edge * factor, because they have a higher expected value than simply losing the money altogether.

somewhere you must be missing something

Fine, but where and what? I've already gone through the math a couple of times looking for errors; the purpose of posting here was to get fresh eyes on the problem.

I see that Wei_Dai has come up with the answer. I was unwilling to dedicate the time to it, as your example was needlessly overgrown. I should have simply said nothing save that. My apologies.

The problem is that you are betting away the money needed for your living expenses, as well as that which you would put towards savings.

If you go the saving route, then 12 minutes of each hour go to living expenses and 48 minutes to your savings, which after 5.5 years will total 10,000 hours. If you bet the hour on a lottery ticket and lose, then you don't have the 12 minutes required for living expenses.
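In other words (a trivial sketch, assuming the $200k gross and $40k expenses from the original post):

```python
expenses_minutes = 60 * 40_000 / 200_000   # 12 minutes per hour to expenses
savings_minutes = 60 - expenses_minutes    # 48 minutes per hour to savings
print(expenses_minutes, savings_minutes)
```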

Something else to consider is where the money you are spending on the lottery is going. Where I live that means state governments and specifically the school system. I am required to give those organizations money in a variety of ways but the lottery is voluntary. The state schools treat children horribly and the state itself does a lot of other horrible things.

I am not voluntarily giving money to those organizations.

One could also argue that the state and its schools would be able to do a much better job if they were better funded, arriving at the opposite conclusion.

The problems that I see in the state and its schools are not problems that can be solved with more money.

It seems certain that both types of problems (problems of under-funding, and "other") make a significant contribution to the failings of nation-states and schools.

Any problems caused by insufficient funding pale in comparison to the problem of how we treat the students. It would be a longish post for me to describe the problems I have with the public schools in the US, never mind the state itself, but if you are interested I will post it.