This is categorically invalid. Humans are not Bayesian belief networks.
You have a point in saying that "12.2485%" is an unlikely number to give for your degree of belief in something, although you could construct a scenario in which it is reasonable (e.g. you put 122485 red balls in a bag...). And it's also fair to say that casually putting a number on your degree of belief is often unwise when that number is plucked from thin air - if you are just using "90%" to mean "strong belief", for example. The point that belief is not binary stands in any case.
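For concreteness, here is a minimal sketch of that bag scenario. The comment trails off before finishing the setup, so this assumes a total of 1,000,000 balls - the number that makes the credence come out to exactly 12.2485%:

```python
# Hypothetical completion of the bag scenario (the original comment elides
# the rest): 122,485 red balls out of an assumed 1,000,000 balls in total.
red_balls = 122_485
total_balls = 1_000_000

# Credence that a uniformly random draw is red.
p_red = red_balls / total_balls
print(f"{p_red:.6f}")  # 0.122485
print(f"{p_red:.4%}")  # 12.2485%
```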
We're not discussing "what should you believe" -- we are discussing "what should you hold to be true."
Those are one and the same! If that's the real source of your disagreement with everyone here, it's a doozy.
And that, sir, categorically IS binary. A thing is either true or not true. If you affirm it to be true, you are assigning it a fixed binary state.
Perhaps this will help make the point clear. In fact, I'm sure it will - it deals with this exact confusion. Please, if you don't read any of these other links, look at that one!
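To illustrate the distinction being argued over (a toy sketch of my own, not taken from the linked post): the proposition itself is binary - it is either true or false - but a reasoner's credence in it is a continuous quantity that Bayes' theorem moves around without ever pinning it to exactly 0 or 1. The likelihood numbers below are arbitrary:

```python
def bayes_update(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Posterior P(H | E) via Bayes' theorem."""
    numerator = p_e_given_h * prior
    return numerator / (numerator + p_e_given_not_h * (1.0 - prior))

credence = 0.5  # start agnostic about the binary proposition H
for _ in range(3):
    # Each piece of evidence is assumed twice as likely if H is true.
    credence = bayes_update(credence, p_e_given_h=0.8, p_e_given_not_h=0.4)
    print(f"{credence:.4f}")
# Prints 0.6667, 0.8000, 0.8889 - credence climbs but never reaches 1.
```

On this picture, affirming something "to be true" just means your credence is high; it does not convert the credence itself into a binary state.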
LessWrongers as a group are often accused of talking about rationality without putting it into practice (for an elaborated discussion of this, see Self-Improvement or Shiny Distraction: Why Less Wrong is anti-Instrumental Rationality). This behavior is particularly insidious because it is self-reinforcing: it will attract more armchair rationalists to LessWrong, who will in turn reinforce the trend in an affective death spiral, until LessWrong is a community of utilitarian apologists akin to the internet communities of anorexics who congratulate each other on their weight loss. It will be a community where, instead of discussing practical ways to "overcome bias" (the original intent of the sequences), we discuss arcane decision theories, who gets to be in our CEV, and the most rational birthday presents (sound familiar?).
A recent attempt to counter this trend, or at least make us feel better about it, was a series of discussions on "leveling up": accomplishing a set of practical, well-defined goals to increment your rationalist "level". It's hard to see how these goals fit into a long-term plan to achieve anything besides self-improvement for its own sake. Indeed, the article begins by priming us with a renaissance-man-inspired quote, and it stands in stark contrast to articles emphasizing practical altruism such as "efficient charity".
So what's the solution? I don't know. However, I can tell you a few things about the solution, whatever it may be:
Whatever you decide to do, be sure it follows these principles. If none of your plans align with these guidelines, then construct a new one, on the spot, immediately. Just do something: every moment you sit idle, hundreds of thousands are dying and billions are suffering. Under your judgement, your plan can self-modify in the future to overcome its flaws. Become an optimization process; shut up and calculate.
I declare Crocker's rules on the writing style of this post.