I’ve recently heard a number of people arguing for “fanaticism” when it comes to longtermism. Basically, if a cause area has even a minuscule probability of positively affecting the long-term future of humanity (and thus influencing an effectively unbounded number of lives), we should fund/support that cause even at the expense of near-term projects with high probability of success. If this is so, I have trouble seeing why Pascal’s Wager (or the even less probable Pascal’s Mugging) shouldn’t hold. I know most people (even religious people) don’t believe Pascal’s argument is valid, but most arguments against it I’ve read would seem to also exclude low-probability longtermist causes from being valid. What am I missing here?
First and most important thing that I want to say here is that fanaticism is sufficient for longtermism, but not necessary. The ">10^36 future lives" estimate means that longtermism would be worth pursuing even at fanatically low probabilities of success, but in fact the state of things seems much better than that: X-risk is badly neglected, so a longtermist career should be expected to do much better than the break-even point of reducing X-risk by something like 10^-30%.
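To make the break-even arithmetic explicit (the 10^4-lives figure for a strong near-term career is my own illustrative benchmark, back-solved from the 10^-30% above):

$$
\underbrace{10^{36}}_{\text{future lives}} \times \underbrace{10^{-32}}_{\text{risk reduction}} \;=\; \underbrace{10^{4}}_{\text{near-term lives saved}},
$$

so against that benchmark, a longtermist career wins in expectation whenever it reduces X-risk by more than $10^{-32}$, i.e. $10^{-30}\%$.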
Second thing is that Pascal's Wager in particular kind of shoots itself in the foot by going infinite rather than merely very large. Since the expected value of an infinite reward is infinite regardless of the probability it's multiplied by, there's an informal cancellation argument that basically says: for any deity that promises heaven for doing X and hell for doing Y, there's some other possible deity that offers heaven for Y and hell for X.
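To see the cancellation in symbols (a sketch; $p$ and $q$ are whatever nonzero credences I assign to the original deity and to a "reversed" one, respectively):

$$
q > 0 \;\implies\; EV(Y) \;\ge\; q \cdot \infty \;=\; \infty \;=\; EV(X),
$$

so as soon as the reversed deity gets any credence at all, Y's expected value matches X's and the wager can no longer favor X. (Throw in hells worth $-\infty$ and both sides become $\infty - \infty$, which is simply undefined.)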
Third and final thing: I haven't actually seen this anywhere else, but here's my current solution for Pascal's Mugging. Any expected-value (EV) maximizing agent is going to have a probability distribution over how much money the mugger really has to offer. If I'm willing to assign nonzero probability (p > 0) to every possible sum, then my distribution has to drop off for higher and higher values so that the total integrates to 1. As long as this distribution drops off faster than 1/x as the offer increases, arbitrarily large offers are overwhelmed by their vast implausibility and their EV becomes arbitrarily small.
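Here's a minimal numeric sketch of that argument. The power-law prior $p(x) \propto x^{-\alpha}$ is my own stand-in for a prior with a definite drop-off rate, not anything canonical; "faster than 1/x" corresponds to $\alpha > 1$.

```python
# Sketch: how fast the prior over a mugger's real offer must fall for the
# mugging to fail. The power-law form p(x) ~ x^(-alpha) is an illustrative
# assumption; "faster than 1/x" corresponds to alpha > 1.

def offer_prior(x: float, alpha: float) -> float:
    """Unnormalized prior density that the mugger can really pay out x."""
    return x ** -alpha

def ev_contribution(x: float, alpha: float) -> float:
    """EV contributed by an offer of size x: payoff times plausibility."""
    return x * offer_prior(x, alpha)

for alpha in (1.0, 1.5):
    evs = [ev_contribution(10.0 ** k, alpha) for k in range(1, 7)]
    print(f"alpha={alpha}:", " ".join(f"{ev:.1e}" for ev in evs))

# alpha=1.0 (exactly 1/x): every offer contributes EV 1.0, so bigger claims
#   never lose value and the mugger can always name a large enough number.
# alpha=1.5 (faster than 1/x): EV falls like x^(-1/2), so arbitrarily large
#   offers are overwhelmed by their implausibility.
```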
Agree that there is no such guarantee. Minor nitpick: the distribution in question is in my mind, not out there in the world. If the world really did have a distribution of muggers' cash that dropped off slower than 1/x, the universe would consist almost entirely of muggers' wallets (in expectation).
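A one-line version of that nitpick: if the real-world density $p(x)$ of muggers' available cash satisfied $p(x) \ge c/x$ beyond some threshold $x_0$ (i.e., dropped off slower than $1/x$), then

$$
\mathbb{E}[\text{cash}] \;=\; \int_0^\infty x\,p(x)\,dx \;\ge\; \int_{x_0}^\infty x \cdot \frac{c}{x}\,dx \;=\; \int_{x_0}^\infty c\,dx \;=\; \infty,
$$

so the expected cash behind any given mugging would be infinite.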
But even without any guarantee about my mental probability distribution, I think my argument does establish that not every possible EV agent is susceptible to Pascal's Mugging. That suggests that in the search for a formalization of an ideal decision-making algorithm, formulations of EV that pass this check are still on the table.