I'll first explain how I think about expected outcomes, because I'm not sure my definition matches the widely accepted one.
If I have a 50% chance to win $10, I picture two alternative universes whose only difference is that in one of them I win $10 and in the other I win nothing. Then I treat the 50% chance as a 100% chance of being in both of them, divided by two. If winning $10 would save me from 1 hour of work, then divided by two that's 30 minutes of work. In virtually all cases involving small sums of money, you can simply multiply the probability by the money (here, 0.5 × $10 = $5). The exceptions are cases analogous to this one: I'm dying of an illness, I can't afford treatment, I have all the money I need except for the last $10, and there is no other way to obtain it; then that $10 is worth far more than its face value. By the same multiplication, a 30% chance to save 10 people's lives is the same as saving 3 lives.
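The linear version of the idea is just a probability-weighted sum. A minimal sketch in Python (the expected_value helper is made up for illustration):

```python
def expected_value(outcomes):
    # outcomes: list of (probability, payoff) pairs whose probabilities sum to 1.
    # Each "alternative universe" contributes its payoff, weighted by how likely
    # it is to be the universe you end up in.
    return sum(p * payoff for p, payoff in outcomes)

# 50% chance to win $10, 50% chance to win nothing:
print(expected_value([(0.5, 10), (0.5, 0)]))  # 5.0

# 30% chance to save 10 lives, treated linearly:
print(expected_value([(0.3, 10), (0.7, 0)]))  # 3.0
```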
Even if you have no idea what I'm talking about, you can at least see evidence of my problem: I find it hard to explain this idea to people, and impossible for some.
I'm not even sure the idea is correct. I once posted it on a math forum asking for evidence, but I didn't find any. So, can someone confirm whether it's true, and give some evidence either way?
And my main question: how can I explain this in a way that people will understand as easily as possible?
(It's possible that what I mean isn't clear. I'll check this thread later, and if that turns out to be the case I'll edit this post to add more examples, clarify, and simplify.)
To be risk neutral about the money itself, you need to be extremely rich or to be betting very tiny amounts (and at the very least you'd need $500,000 on hand to make the bet at all).
If you are risk neutral in the logarithm of your money (which is more realistic), the payoff is a > 0, and you start off with n > 0 in assets, then your expected utility after betting x is 0.5 ln(n-x) + 0.5 ln(n-x+a), so the change in utility is 0.5 ln(n-x) + 0.5 ln(n-x+a) - ln(n). This is zero when 0.5 ln(n-x) + 0.5 ln(n-x+a) = ln(n), which has the solution x = 0.5a + n - 0.5 sqrt(a^2 + 4n^2).
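For anyone who wants the algebra spelled out, here is the same break-even equation solved step by step (a sketch: exponentiate both sides, substitute y = n - x, and take the positive root of the quadratic):

```latex
\begin{align*}
\tfrac{1}{2}\ln(n-x) + \tfrac{1}{2}\ln(n-x+a) &= \ln(n) \\
(n-x)(n-x+a) &= n^2 \\
y^2 + ay - n^2 &= 0 \qquad (y = n - x) \\
y &= \tfrac{1}{2}\bigl(-a + \sqrt{a^2 + 4n^2}\bigr) \\
x \;=\; n - y &= \tfrac{1}{2}a + n - \tfrac{1}{2}\sqrt{a^2 + 4n^2}
\end{align*}
```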
Plugging in a $1,000,000 payoff and $100,000 in assets, you should be willing to bet up to about $90,000, which is somewhat surprising. Usually the utility function is not logarithmic, though: if you were to lose a lot of money, you would have to deal with all the logistics of moving to a cheaper apartment or the like, so people would be willing to bet less. Actually, (rational) people just compare two imagined outcomes directly to each other rather than converting each to a real number first. This works better when you can only partially imagine an outcome, since you can imagine both outcomes equally partially; estimating two utilities and then comparing them runs into problems when the estimates are necessarily approximate.
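A quick numerical sanity check of that $90,000 figure, assuming the closed-form solution above (the function name is just for illustration):

```python
import math

def breakeven_bet(payoff, assets):
    # Largest stake x at which a log-utility bettor is indifferent:
    # 0.5*ln(n - x) + 0.5*ln(n - x + a) = ln(n)
    a, n = payoff, assets
    return 0.5 * a + n - 0.5 * math.sqrt(a**2 + 4 * n**2)

a, n = 1_000_000, 100_000
x = breakeven_bet(a, n)
print(f"break-even bet: ${x:,.0f}")  # break-even bet: $90,098

# Check that expected log-utility at x really equals the starting utility ln(n):
print(math.isclose(0.5 * math.log(n - x) + 0.5 * math.log(n - x + a),
                   math.log(n)))  # True
```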
edit: reddit is designed for people who use italics more often than algebra.