how important the problem is relative to other problems, and what ethical theory to use when deciding whether a policy is good or bad.
Apart from those two issues, the other points you bring up are the domain of experts. Unless we are experts ourselves, or have strong relevant information about the biases of experts, the rational thing to do is to defer to expert beliefs. We can widen the uncertainty somewhat (we can confidently expect overconfidence :-), and maybe add a very small systematic bias in one direction (to reflect possible social or political biases - the correction has to be very small, as our ability to reliably estimate these factors is very poor).
I might still complain about it falling afoul of anti-politics norms, but at least it would help create the impression that the debate was about ideas rather than tribes.
Excessive anti-politics norms are a problem here - because the issue has become tribalised, we're no longer willing to defend the rational position, or we caveat it far too much.
Presumably most of those whose opinions fall outside of whatever the acceptable range is have those opinions either because they believe they have some relevant piece of expertise, or because they believe they have some relevant information about the biases of specific experts, or because they don't believe that their ability to estimate systematic bias is in fact "very poor", or even because they disagree with you about what the experts think. This seems like the sort of information people might falsely convince themselves that they have, but at...
Theism is often a default test of irrationality on Less Wrong, but I propose that global warming denial would make a much better candidate.
Theism is a symptom of excess compartmentalisation, of not realising that absence of evidence is evidence of absence, of belief in belief, of privileging the hypothesis, and similar failings. But these are not intrinsically huge problems. Indeed, someone with a mild case of theism can have the same anticipations as someone without, and update on evidence in the same way. If they have moved their belief beyond refutation, then in theory it fails to constrain their anticipations at all; and often this is the case in practice.
Contrast that with someone who denies the existence of anthropogenic global warming (AGW). This has all the signs of hypothesis privileging, but also reeks of fake justification, motivated skepticism, massive overconfidence (if they are truly ignorant of the facts of the debate), and simply the raising of politics above rationality. If I knew someone was a global warming skeptic, then I would expect them to be wrong in their beliefs and their anticipations, and to refuse to update when evidence worked against them. I would expect their judgement to be much more impaired than a theist's.
Of course, reverse stupidity isn't intelligence: simply accepting AGW doesn't make one more rational. I work in England, in a university environment, so my acceptance of AGW is the default position and not a sign of rationality. But if someone is in a milieu that discourages belief in AGW (one stereotype being heavily Republican areas of the US) and has risen above this, then kudos to them: their acceptance of AGW is indeed a sign of rationality.