This all seems like exploiting ambiguity about what your conditional probabilities are conditional on.
Conditional on "you will be around a supercritical ball of enriched uranium and alive to talk about it," things get weird, because that's such a low-probability event to begin with. I suspect I'd still favor theories that involve some kind of unknown/unspecified physical intervention, rather than "the neutrons all happened to miss," but we should notice that we're conditioning on a very low probability event, and things will look strange under that conditioning.
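To make that concrete, here's a toy Bayes update (the numbers are invented purely for illustration): once we condition on survival, a hypothesis with a tiny prior but a non-negligible chance of producing survivors overwhelms the "lucky fluke" hypothesis.

```python
# Toy Bayes update with made-up numbers, illustrating how conditioning on an
# extremely improbable survival event can flip which hypothesis we favor.

priors = {
    "neutrons happened to miss": 0.999,      # mundane physics, very high prior
    "unknown physical intervention": 0.001,  # exotic, very low prior
}

# P(you survive and are around to talk about it | hypothesis)
likelihood_of_survival = {
    "neutrons happened to miss": 1e-12,      # astronomically unlikely fluke
    "unknown physical intervention": 0.5,    # whatever intervened plausibly saves you
}

evidence = sum(priors[h] * likelihood_of_survival[h] for h in priors)
posteriors = {h: priors[h] * likelihood_of_survival[h] / evidence for h in priors}

for h, p in posteriors.items():
    print(f"P({h} | survived) = {p:.6f}")
# The exotic hypothesis dominates once we condition on survival, even though
# its prior was tiny -- which is why the conditioning step matters so much.
```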
IMO, the anthropic principle boils down to "notice when you are trying to compute a probability conditional on your own existence, and act accordingly."
A really simple example, where the mistake is obvious, is "isn't it amazing that we live on a planet that's just the right distance from its star (etc.) to support life?" No, this can't be amazing. The question presupposes a "we" who live on some planet, so we're looking for something like P(we live on a habitable planet | we are inhabiting a planet and able to ask the question), which is essentially 1 no matter how rare habitable planets are.
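Here's a minimal simulation of that selection effect (the rates are made up): however rare habitable planets are unconditionally, conditioning on the existence of someone asking the question pushes the probability to 1.

```python
# A minimal sketch of the anthropic selection effect, with invented parameters:
# planets are rarely habitable, and observers only arise on habitable ones.

import random

random.seed(0)
N = 1_000_000
P_HABITABLE = 1e-3             # hypothetical fraction of "just right" planets
P_OBSERVER_IF_HABITABLE = 0.1  # hypothetical chance observers actually arise there

planets = []
for _ in range(N):
    habitable = random.random() < P_HABITABLE
    observer = habitable and random.random() < P_OBSERVER_IF_HABITABLE
    planets.append((habitable, observer))

p_habitable = sum(h for h, _ in planets) / N
observed = [(h, o) for h, o in planets if o]
p_habitable_given_observer = sum(h for h, _ in observed) / len(observed)

print(f"P(habitable)            ~ {p_habitable:.5f}")             # tiny
print(f"P(habitable | observer) = {p_habitable_given_observer}")  # exactly 1.0
```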