So the jackpot in the Ohio lottery is around 25 million dollars, the chance of winning it is roughly one in 14 million, and tickets cost a dollar apiece, which naively puts a ticket's expected value at about 25/14 ≈ $1.79. It appears to me that roughly a quarter million tickets are sold each drawing; so, supposing you win, the probability of someone else also winning is 1 - (1 - 1/14,000,000)^250,000 ≈ 2%, which does not significantly reduce the expected value of a ticket. So, unless I'm making a silly mistake somewhere, buying lottery tickets has positive expected value. (I find this counterintuitive; where are all the economists who should be picking up this free money? But I digress.)
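For what it's worth, here is a quick back-of-the-envelope check of those numbers in Python. All the figures are just the rough assumptions above, and it ignores taxes, the annuity discount, and the smaller prize tiers:

```python
# Rough check of the figures above: $25M jackpot, 1-in-14M odds,
# $1 tickets, ~250k tickets sold per drawing (all assumed numbers).
jackpot = 25e6          # advertised jackpot, dollars
p_win = 1 / 14e6        # chance a single ticket hits the jackpot
tickets_sold = 250_000  # rough guess at other tickets sold per drawing

# Probability that at least one *other* ticket also hits the jackpot.
p_shared = 1 - (1 - p_win) ** tickets_sold
print(f"chance of having to split: {p_shared:.1%}")   # ~1.8%, i.e. about 2%

# Expected value of a $1 ticket, crudely treating a shared win as a
# 50/50 split with one other winner, and ignoring taxes/annuity/minor prizes.
ev = p_win * jackpot * (1 - p_shared / 2) - 1
print(f"expected value per $1 ticket: {ev:+.2f}")     # roughly +$0.77
```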
I pointed this out to my wife, and said that it might be worth putting a dollar into it; and she very cogently asked, "Then why not make it 100 dollars?" Why not, indeed! Is there any sensible way of deciding how much to put into an option that has a positive expected value but a very low chance of paying off?
The Kelly Criterion describes how to bet if your utility goes up logarithmically with your wealth. It's not a foolproof decision theory.
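Out of curiosity, here is what Kelly actually prescribes for the numbers in the original post. This is only a sketch: it treats the ticket as a single binary bet on the full jackpot, ignores splitting and the smaller prizes, and the $50,000 bankroll is a made-up figure:

```python
# Kelly fraction for a simple binary bet: f* = p - (1 - p) / b,
# where p is the win probability and b is the net payout per dollar staked.
p = 1 / 14e6            # probability a $1 ticket wins (post's figure)
b = 25e6 - 1            # net dollars won per dollar staked (jackpot minus the $1)
f_star = p - (1 - p) / b
print(f"Kelly fraction of bankroll: {f_star:.2e}")    # ~3.1e-08

bankroll = 50_000       # hypothetical bankroll
print(f"Kelly stake: ${bankroll * f_star:.4f}")       # a fraction of a cent
```

Since tickets aren't divisible, even a single $1 ticket is already a large over-bet by Kelly's lights for any realistic bankroll, which is one possible answer to "why not 100 dollars".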
I was under the impression that, for infinitely repeated play, no matter what your actual utility function is (as long as it is increasing and the total amount of money is bounded), it turns out that the optimal single-turn strategy "looks like" betting with a logarithmic utility function; hence the Kelly Criterion.
I don't know much about this, though, so I could be mistaken.
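That said, the log-growth part of the claim is easy to see numerically. Here's a toy sketch of repeated play with a made-up favorable bet (an even-money coin flip with a hypothetical win probability of 0.6, so the Kelly fraction is 2p - 1 = 0.2; the lottery's own odds are far too small to simulate directly). Betting the Kelly fraction ends up far ahead of both smaller and larger fractions in the long run:

```python
import random

def simulate(fraction, p=0.6, rounds=10_000, seed=0):
    """Repeatedly bet `fraction` of current wealth on an even-money
    coin flip that wins with probability `p`; return final wealth."""
    rng = random.Random(seed)
    wealth = 1.0
    for _ in range(rounds):
        stake = wealth * fraction
        wealth += stake if rng.random() < p else -stake
    return wealth

# Kelly fraction for this bet is 2p - 1 = 0.2; compare under- and over-betting.
for f in (0.05, 0.2, 0.5, 0.9):
    print(f"fraction {f:.2f} -> final wealth {simulate(f):.3e}")
```

Using the same seed for every fraction means each strategy faces the identical sequence of coin flips, so the comparison isolates the bet-sizing choice; the over-bettors get ground down toward zero while the Kelly bettor's wealth grows fastest.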