Actually, I'm using it to refer to something with high expected utility, a low probability of success, and a third criterion: you are uncertain about what the probability really is. A sweepstakes with 100 tickets gives a 1% chance of winning. A sweepstakes with only 2 tickets, but where you think there's a 98% chance that the person running it is a fraudster, also gives a 1% chance of winning, yet that seems fundamentally different from the first case.
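The arithmetic behind the two cases can be sketched as a mixture over scenarios. The 98%/2% fraud split is from the comment; treating the fraud case as a guaranteed loss and the honest 2-ticket case as a 50% win chance are added assumptions made to make the numbers come out:

```python
# Illustrative sketch: two sweepstakes with the same point estimate (~1%)
# but very different structure behind that estimate.

def mixture_win_probability(scenarios):
    """Expected win probability over (probability-of-scenario, win-chance) pairs."""
    return sum(p_scenario * p_win for p_scenario, p_win in scenarios)

# Case 1: an honest 100-ticket sweepstakes, win chance 1/100.
case1 = mixture_win_probability([(1.0, 1 / 100)])

# Case 2: 2 tickets, but a 98% chance the organiser is a fraudster.
# Assumed: win chance 0 if fraud, 1/2 if honest.
case2 = mixture_win_probability([(0.98, 0.0), (0.02, 1 / 2)])

print(case1, case2)  # both come out to roughly 0.01
```

Both estimates collapse to the same 1% figure, which is exactly why the comment's third criterion (uncertainty about the probability itself) is not captured by the point estimate alone.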
you are uncertain about what the probability really is
I think this is a misunderstanding of the idea of probability. The real world is either one way or the other: either we will actually win the sweepstakes or we won't. Probability enters the picture in our heads, telling us how likely we think a certain outcome is and how much weight to give it when making decisions. As such, I don't think it makes sense to talk about uncertainty about what a probability really is, except in the case of a lack of introspection.
Also, going back to Robby's post:
...
I'm currently unconvinced either way on this matter. However, enough arguments have been raised that I think it is worth every reader's time to think carefully about it.
http://nothingismere.com/2014/11/12/inhuman-altruism-inferential-gap-or-motivational-gap/