> One workaround would be to assign high confidence only to beliefs for which you have read n academic papers on the subject. For example, only assign 90% confidence if you've read ten academic papers.
Easy hack:
I don't think many people do this as such, but there are less self-aware versions of the same procedure that do happen in practice. For example, if you hang out on any reasonably intellectual partisan blog, links to related papers will probably come your way pretty often. If you read them as they arrive and update as suggested, in fairly short order you'll have read enough to assign high confidence to your preexisting opinions -- yet those opinions will never be seriously challenged, because all the information involved has been implicitly screened for compatibility before it gets anywhere near your head.
Your second criterion helps, but I don't think it's sufficient: it's very easy to convince yourself that you understand the strongest opposing arguments as long as you've been exposed to simplified or popularized versions of them, which, to a first approximation, is true of everyone with opinions on controversial issues.
So you're playing the credence game, and you're getting a pretty good sense of which level of confidence to assign to your beliefs. Later, when you're discussing politics, you wonder how you can calibrate your political beliefs as well (beliefs of the form "policy X will result in outcome Y"). Here there's no easy way to assess whether a belief is true or false, in contrast to the trivia questions in the credence game. Moreover, it's very easy to become mindkilled by politics. What do you do?
In the credence game, you get direct feedback that allows you to learn about your internal proxies for credence, i.e., emotional and heuristic cues about how much to trust yourself. With political beliefs, however, there is no such feedback. One workaround would be to assign high confidence only to beliefs for which you have read n academic papers on the subject. For example, only assign 90% confidence if you've read ten academic papers.
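The "direct feedback" in the credence game can be made concrete with a standard calibration measure. A minimal sketch using the Brier score (the scoring rule and the example numbers are my illustration, not something specified here):

```python
def brier_score(forecasts):
    """Mean squared error between stated credence and outcome.

    forecasts: list of (credence, outcome) pairs, where credence is a
    probability in [0, 1] and outcome is 1 (the claim was true) or 0
    (it was false). Lower is better.
    """
    return sum((p - o) ** 2 for p, o in forecasts) / len(forecasts)

# A calibrated player: 90% credences that come true nine times in ten.
calibrated = [(0.9, 1)] * 9 + [(0.9, 0)]

# An overconfident player: 99% credences with the same hit rate.
overconfident = [(0.99, 1)] * 9 + [(0.99, 0)]

print(brier_score(calibrated))     # lower score (better calibrated)
print(brier_score(overconfident))  # higher score (overconfident)
```

The point of the sketch is the feedback loop: after each round you can compare your score against what a well-calibrated player would get, which is exactly the signal that political beliefs lack.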
To account for mindkilling, use a second criterion: assign high confidence only to beliefs for which you are ideologically Turing-capable (i.e., able to pass an ideological Turing test). As a proxy for an actual ideological Turing test, you should be able to accurately restate your opponent's position, or to state the strongest counterargument to your own.
In sum, to calibrate your political beliefs, only assign high confidence to beliefs which satisfy extremely demanding epistemic standards.
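Taken together, the two criteria amount to a cap on allowable confidence. A minimal sketch of that rule (the ten-paper threshold comes from the text above; the function name and the fallback cap of 0.6 are my own illustrative choices):

```python
def max_confidence(papers_read, passes_turing_test, n=10):
    """Cap on the confidence you may assign to a political belief.

    papers_read: number of academic papers read on the subject.
    passes_turing_test: whether you can accurately restate the opposing
    position (a proxy for passing an ideological Turing test).
    High (90%) confidence requires meeting BOTH criteria; the fallback
    cap of 0.6 is an arbitrary illustrative choice, not from the text.
    """
    if papers_read >= n and passes_turing_test:
        return 0.9
    return 0.6

print(max_confidence(12, True))   # 0.9
print(max_confidence(12, False))  # 0.6
```

Note that the conjunction matters: plenty of reading with no ability to steelman the other side (or vice versa) still leaves you at the lower cap.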