I think you're making the wrong comparisons. If you buy $1 worth, you get p(win) U(jackpot) + (1-p(win)) U(-$1), which is more-or-less p(win) U(jackpot) + U(-$1); this is a good idea if p(win) U(jackpot) > -U(-$1). But under usual assumptions (diminishing marginal utility of money), -U(-$2) > -2U(-$1): the second dollar lost hurts more than the first. This adds up to normality; you shouldn't actually spend all your money. :)
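The concavity step can be checked numerically. Here is a minimal sketch using log utility over wealth, which is an assumed (though standard) concave utility function; the argument above only needs concavity, and the starting wealth of $100 is an arbitrary choice for illustration:

```python
import math

# Assumed log utility over total wealth; `wealth` is an arbitrary example value.
wealth = 100.0

def U(delta):
    """Change in utility from gaining or losing `delta` dollars."""
    return math.log(wealth + delta) - math.log(wealth)

# Concavity makes the second dollar lost hurt more than the first:
print(-U(-2), -2 * U(-1))  # the first number is (slightly) larger
assert -U(-2) > -2 * U(-1)
```

The gap is tiny at $1 stakes but grows quickly, which is why "buy one ticket" and "buy 100 tickets" are not the same bet scaled up.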
Of course you are right, silly mistake.
(Not really important nitpick:) The dollar is spent once the ticket is bought and isn't returned even if you win, so the term should be just U(-$1), not (1-p(win)) * U(-$1).
So the jackpot in the Ohio lottery is around 25 million dollars, and the chance of winning it is roughly one in 14 million, with tickets at a dollar apiece. It appears to me that roughly a quarter million tickets are sold each drawing; so, supposing you win, the probability of someone else also winning is 1 - (1 - 1/14e6)^{250000} ≈ 2%, which does not significantly reduce the expected value of a ticket. So, unless I'm making a silly mistake somewhere, buying lottery tickets has positive expected value. (I find this counterintuitive; where are all the economists who should be picking up this free money? But I digress.)
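The arithmetic above can be sketched in a few lines. The jackpot, odds, and ticket-sales figures are the rough numbers quoted here, not verified data, and the expected-value line crudely assumes at most a two-way split:

```python
# Figures quoted above (rough estimates, not verified lottery data).
jackpot = 25e6          # advertised jackpot, dollars
p_win = 1 / 14e6        # chance a single ticket wins
tickets_sold = 250_000  # other tickets sold per drawing

# Probability that, given you win, at least one other ticket also hits:
p_split = 1 - (1 - p_win) ** tickets_sold
print(f"P(someone else also wins) = {p_split:.1%}")  # about 1.8%

# Expected value of a $1 ticket, assuming at most a two-way split:
ev = p_win * jackpot * ((1 - p_split) + p_split / 2) - 1
print(f"EV per $1 ticket = ${ev:.2f}")  # positive, roughly $0.77
```

Even ignoring the split entirely, $25e6 / 14e6 ≈ $1.79 per $1 ticket, so the ~2% split risk barely dents the conclusion (though taxes and the lump-sum discount, not modeled here, would).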
I pointed this out to my wife, and said that it might be worth putting a dollar into it; and she very cogently asked, "Then why not make it 100 dollars?" Why not, indeed! Is there any sensible way of deciding how much to put into an option that has a positive expected value, but very low chance of payoff?