Multiply what by that zero? There are so many things you might mean by that, and if even one of them made any sense to me I'd just assume that was it; as it stands I have no idea. Not a very helpful comment.
Well, suppose you're doing an expected utility calculation, and the utility of outcome 1 is U1, the utility of outcome 2 is U2, and so on.
Then your expected utility looks like (some stuff)*U1 + (some other stuff)*U2, and so on. The stuff in parentheses is usually the probability of outcome N occurring, but some systems might include a correction based on collective decision-making or something, and that's fine.
Now suppose that U1=0. Then your expected utility looks like (some stuff)*0 + (some other stuff)*U2, and so on. That's equal to (the other stuff)*U2, etc, because you just multiplied the first term by 0. So the zero is in there. You've just multiplied by it.
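To make it concrete, here's a minimal sketch (the probabilities and utilities are hypothetical values, chosen just for illustration): a zero-utility outcome still contributes a term to the expected-utility sum, it's just a term multiplied by zero, so dropping it changes nothing.

```python
probs = [0.2, 0.5, 0.3]    # "some stuff" for outcomes 1..3 (hypothetical values)
utils = [0.0, 10.0, -4.0]  # U1 = 0, plus U2 and U3

# Full sum, including the (0.2 * 0) term:
eu_full = sum(p * u for p, u in zip(probs, utils))

# Same sum with the zero-utility term dropped -- identical,
# because that term was multiplied by 0:
eu_dropped = sum(p * u for p, u in zip(probs, utils) if u != 0)

assert eu_full == eu_dropped
```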
A technical report of the Future of Humanity Institute (authored by me), on why anthropic probability isn't enough to reach decisions in anthropic situations. You also have to choose your decision theory, and take into account your altruism towards your copies. These components can co-vary while leaving your ultimate decision the same - typically, EDT agents using SSA will reach the same decisions as CDT agents using SIA, and altruistic causal agents may decide the same way as selfish evidential agents.
Anthropics: why probability isn't enough
This paper argues that the current treatment of anthropic and self-locating problems over-emphasises the importance of anthropic probabilities, and ignores other relevant and important factors, such as whether the various copies of the agent in question consider themselves to be acting in a linked fashion, and whether they are mutually altruistic towards each other. These issues, generally irrelevant for non-anthropic problems, come to the forefront in anthropic situations and are at least as important as the anthropic probabilities: indeed they can erase the difference between different theories of anthropic probability, or increase their divergence. These considerations suggest reinterpreting decisions, rather than probabilities, as the fundamental objects of interest in anthropic problems.
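The EDT+SSA versus CDT+SIA equivalence can be illustrated with a toy Sleeping-Beauty-style bet (a sketch with illustrative numbers of my own, not taken from the paper): heads gives one awakening, tails gives two, and at each awakening the agent may pay a cost c to win a reward r if the coin landed tails.

```python
def cdt_sia_value(c, r):
    # CDT agent with SIA probabilities: among awakenings, P(tails) = 2/3
    # (two tails-awakenings vs one heads-awakening); each awakening's
    # decision is treated as causally independent.
    p_tails = 2 / 3
    return p_tails * (r - c) + (1 - p_tails) * (-c)

def edt_ssa_value(c, r):
    # EDT agent with SSA probabilities: P(tails) = 1/2; the two copies'
    # decisions are linked, so on tails the bet is effectively taken twice
    # (the agent cares equally about its copies' payoffs).
    p_tails = 1 / 2
    return p_tails * 2 * (r - c) + (1 - p_tails) * (-c)

# The raw expected values differ, but the *decision* (accept iff value > 0)
# agrees: both accept exactly when 2r > 3c.
for c in (1.0, 2.0, 3.0):
    for r in (1.0, 2.0, 4.0):
        assert (cdt_sia_value(c, r) > 0) == (edt_ssa_value(c, r) > 0)
```

Both agents accept exactly when 2(r - c) > c, so the anthropic probabilities and the decision-theoretic linking trade off against each other, which is the point: the decision, not the probability, is the invariant.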