I think the argument is wrong. Proof by counterexample: say you have $10 and you value your life at $5. Note that, per the terms of the question, if you get shot you lose the value of the money as well as the value of your life. Then:
Case 1: expected value of playing the game with 4 bullets = -4/6 × ($10 + $5) = -$10; 3 bullets: -3/6 × ($10 + $5) = -$7.5; delta = $2.5.
Case 2: ... 2 bullets: -2/6 × ($10 + $5) = -$5; 0 bullets: $0; delta = $5.
So you should pay up to $2.5 to take the option in Case 1, but up to $5 in Case 2.
I'm pretty sure this generalises, and the two amounts coincide only in special cases.
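For concreteness, the deltas above can be checked with a short Python sketch (the `ev` helper is just illustrative; the $15 stake is the $10 money plus the $5 life valuation from this example):

```python
# Expected value of playing Russian roulette with `bullets` loaded
# chambers out of 6, when dying destroys both your $10 and the $5
# you value your life at (stake = $15 total).
def ev(bullets, stake=15.0):
    return -(bullets * stake) / 6

delta_case1 = ev(3) - ev(4)  # value of going from 4 bullets to 3
delta_case2 = ev(0) - ev(2)  # value of going from 2 bullets to 0
print(delta_case1, delta_case2)  # 2.5 5.0
```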
Case 1: expected value of playing the game with 4 bullets = -4/6 × ($10 + $5) = -$10; 3 bullets: -3/6 × ($10 + $5) = -$7.5; delta = $2.5.
[...]
So you should pay up to $2.5 to take the option in Case 1
This does not follow. Rather, you should be willing to pay any amount $X such that -3/6 × ($10 + $5) - 3/6 × $X > -$10. (Since your money vanishes when you die, the fee is effectively only lost in the branch where you survive, which has probability 3/6.) Solving gives $X < $5, so you are in fact willing to pay as much as $5 in Case 1 as well.
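This break-even calculation can be sketched numerically (a minimal illustration under the same $10/$5 assumptions; `max_fee` is a hypothetical helper, not anything from the original argument):

```python
# Maximum fee X worth paying to go from `before` bullets to `after`
# bullets in a 6-chamber gun, given that your money vanishes at death
# (stake = $10 money + $5 life = $15).
# Break-even condition: -after/6 * stake - (1 - after/6) * X = -before/6 * stake
def max_fee(before, after, stake=15.0):
    p_survive = 1 - after / 6          # fee is only lost if you survive
    return (before - after) * stake / 6 / p_survive

print(max_fee(4, 3), max_fee(2, 0))  # 5.0 5.0
```

Both cases give the same $5 maximum, in line with the Zeckhauser-Jeffrey claim.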
Imagine you're playing Russian roulette. Case 1: a six-shooter contains four bullets, and you're asked how much you'll pay to remove one of them. Case 2: a six-shooter contains two bullets, and you're asked how much you'll pay to remove both of them. Steven Landsburg describes an argument by Richard Zeckhauser and Richard Jeffrey saying you should pay the same amount in both cases, provided that you don't have heirs and all your remaining money magically disappears when you die. What do you think?