I’ve recently heard a number of people arguing for “fanaticism” when it comes to longtermism. Basically, if a cause area has even a minuscule probability of positively affecting the long-term future of humanity (and thus influencing an effectively unbounded number of lives), we should fund/support that cause even at the expense of near-term projects with a high probability of success. If this is so, I have trouble seeing why Pascal’s Wager (or the even less probable Pascal’s Mugging) shouldn’t hold. I know most people (even religious people) don’t believe Pascal’s argument is valid, but most of the arguments against it that I’ve read would also seem to exclude low-probability longtermist causes from being valid. What am I missing here?
Some of those longtermist arguments are invalid for exactly the same reason.
One of the problems with Pascal's Wager (and Pascal's Mugging) is that you don't actually know even the sign of the result of your actions.
Sure, you've got a book that says something about the disposition of the afterlife, but many books have been written (and many remain unwritten), and you don't know which one is correct. The action that earns you a good afterlife under one book earns you eternal torture under another. Likewise, the mugger says that invisible multitudes will be tortured if and only if you don't hand over the money, but how trustworthy is such a person, even if they really do have that power? Isn't it just as likely that they'll treat success on this occasion as evidence that they should keep pulling the same trick until they end up torturing all those people anyway? Maybe they'll use all this money (or your acquiescence in other ways) to build their power to torture even more people in the future.
Likewise, efforts made on behalf of uncountable future multitudes might have the opposite of their intended effect. The future is hard to predict, and except for some really obvious and immediate issues, you should expect the correlation between intended and actual outcomes to be near zero.
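To put that in toy expected-value terms (the symbols here are purely illustrative, not anything from the original wager): suppose an intervention has probability $p$ of producing an astronomically good outcome worth $+V$ and probability $q$ of producing an astronomically bad outcome worth $-V$. Then

$$
\mathbb{E}[\text{value}] \approx p\,V - q\,V = (p - q)\,V .
$$

An enormous $V$ only helps if you have a genuine reason to believe $p > q$. When the sign of $(p - q)$ is anyone's guess, multiplying by a huge $V$ doesn't rescue the argument; it just amplifies your ignorance.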
If your proposed actions have some easily predictable negative outcomes that are not outweighed by known positive outcomes with similar or better credence, then you almost certainly shouldn't do them.