Sorry if this seems incomplete - I thought I'd fire this off as a discussion post now and return to it with a more well-rounded post later.
Less Wrongers are used to thinking of uncertainty as best represented by a probability - or perhaps by a log odds ratio, stretching from minus infinity to infinity. But when I argue with people about, for example, cryonics, it appears that many of them consider that some possibilities simply don't appear on this scale at all: that we should not sign up for cryonics because no belief about its chances of working can be justified. Rejecting this category seems to me one of the key foundational ideas of this community, but as far as I know the only article specifically discussing it is "I don't know", which doesn't make a devastatingly strong case. What other writing discusses this idea?
I think there are two key arguments against this position. First, you have to make a decision anyway, and "no belief" uncertainty gives no guidance on how to make it. Second, "no belief" is treated as disconnected from the probability line, so at some point evidence must cause a discontinuous jump from "no belief" to some level of confidence. This discontinuity seems very unnatural. How can evidence add up to a discontinuous jump - what happened to all the evidence before the jump?
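To make the second argument concrete, here is a minimal sketch (the probabilities and evidence strengths are made up for illustration) of Bayesian updating in log-odds space. Each piece of evidence simply adds its log likelihood ratio to the current belief, so confidence moves smoothly; there is no natural point at which a jump from "no belief" to "some belief" could occur.

```python
import math

def log_odds(p):
    """Convert a probability to log odds (natural log)."""
    return math.log(p / (1 - p))

def update(prior_log_odds, log_likelihood_ratio):
    """Bayesian update in log-odds space: evidence simply adds."""
    return prior_log_odds + log_likelihood_ratio

# Start from a weakly held prior and accumulate several modest pieces
# of evidence (each a hypothetical log likelihood ratio of 0.5).
belief = log_odds(0.1)
for evidence in [0.5, 0.5, 0.5]:
    belief = update(belief, evidence)

# Convert back to a probability; every intermediate value along the way
# is a well-defined point on the same scale.
probability = 1 / (1 + math.exp(-belief))
```

On this picture, every state of knowledge, however ignorant, is a point somewhere on the line, and evidence moves it continuously.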
In discussions with a friend, who expressed great discomfort in talking about cryonics, I finally extracted the confession that he had no emotional or social basis for considering it. None of his friends or family had done it, and it was not part of any of the accepted rituals he had grown up with -- there was an emotional void around it that placed it outside the range of options he was able to think about. It was "other", alien, of such a nature that mere rational evaluation could not be applied.
He's in his 70s, so this issue is more than just academic. He understands that by rejecting cryonics he is embracing his own death. He does not believe in an afterlife. He becomes emotionally perturbed when I discuss cryonics precisely because I am persuasive about its technical feasibility.
Perhaps this observation isn't germane to the present thread, as this seems an emotional response rather than a response driven by "no belief." But perhaps "no belief" has an emotional component, as in "I don't want to have a belief. If I had a belief, then I'd have to take an unpleasant action."