Cross-posted from the EA forum. I asked for questions for this test here on LW about a year ago. Thanks to those who contributed.
Rationally, your political values shouldn't affect your factual beliefs. Nevertheless, that often happens. Many factual issues are politically controversial - typically because the true answer makes a certain political course of action more plausible - and on those issues, many partisans tend to disregard politically uncomfortable evidence.
This sort of political bias has been demonstrated in a large number of psychological studies. For instance, Yale professor Dan Kahan and his collaborators showed in a fascinating experiment that on politically controversial questions, people are quite likely to commit mathematical mistakes that help them retain their beliefs, but much less likely to commit mistakes that would force them to give up those beliefs. Examples like this abound in the literature.
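To make the kind of mathematical mistake involved concrete: Kahan's task (as I understand the published design) asks subjects to judge from a 2×2 table whether a treatment worked, where the correct answer requires comparing improvement *rates* rather than raw counts. The sketch below uses numbers of the sort reported in that study, but treat them as illustrative rather than authoritative:

```python
def improvement_rate(improved: int, worsened: int) -> float:
    """Fraction of cases that improved."""
    return improved / (improved + worsened)

# Illustrative 2x2 table, roughly of the kind used in the study:
# Treatment group: 223 improved, 75 worsened
# Control group:   107 improved, 21 worsened
treatment = improvement_rate(223, 75)   # ~0.75
control = improvement_rate(107, 21)     # ~0.84

# The tempting shortcut is to compare raw counts (223 > 107) and
# conclude the treatment helps. The correct comparison is the rates,
# which here point the other way.
print("treatment helps" if treatment > control else "treatment does not help")
```

The finding is that partisans get the rate comparison right or wrong depending on whether the correct answer is politically congenial, not just on their numeracy.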
Political bias is likely to be a major cause of misguided policies in democracies (even the main one, according to economist Bryan Caplan). Absent some special reason not to, people without expert knowledge defer to the scientific consensus on technical issues. Thus, they do not interfere with the experts, who normally get things right. On politically controversial issues, however, they often let their political bias win over science and evidence, which means they'll end up with false beliefs. And in a democracy, voters' systematically false beliefs more often than not translate into misguided policy.
Can we reduce this kind of political bias? I’m fairly hopeful. One reason for optimism is that debiasing generally seems to be possible to at least some extent. This optimism of mine was strengthened by participating in a CFAR workshop last year. Political bias seems not to be fundamentally different from other kinds of biases and should thus be reducible too. But obviously one could argue against this view of mine. I’m happy to discuss this issue further.
Another reason for optimism is that it seems that the level of political bias is actually lower today than it was historically. People are better at judging politically controversial issues in a detached, scientific way today than they were in, say, the 14th century. This shows that progress is possible. There seems to be no reason to believe it couldn’t continue.
A third reason for optimism is that there seems to be a strong norm against political bias. Few people are consciously and intentionally politically biased. Instead, most people seem to believe themselves to be politically rational, and hold that as a very important value (or so I believe). They fail to see their own biases because of the bias blind spot: the tendency to recognize bias in others while remaining blind to it in ourselves.
Thus if you could somehow make it salient to people that they are biased, they would actually want to change. And if others saw how biased they are, the incentives to debias would be even stronger.
There are many ways in which you could make political bias salient. For instance, you could meticulously go through political debaters' arguments and point out fallacies, as I have done on my blog. I will post more about that later. Here, however, I want to focus on another method, namely a political bias test which I have constructed with ClearerThinking, run by EA member Spencer Greenberg. Since learning how the test works might make you answer a bit differently, I will not explain its workings here, but instead refer either to the explanatory sections of the test, or to the Vox.com article by Jess Whittlestone (also an EA member).
Our hope is of course that people taking the test might start thinking more about both their own biases and the problem of political bias in general. We want this important topic to be discussed more. Our test is produced for the American market, but hopefully it could work as a generic template for bias tests in other countries (akin to the Political Compass or Voting Advice Applications).
Here is a guide for making new bias tests (where the main criticisms of our test are also discussed). Also, we hope that the test could inspire academic psychologists and political scientists to construct full-blown scientific political bias tests.
This does not mean, however, that we think that such bias tests in themselves will get rid of the problem of political bias. We need to attack the problem of political bias from many other angles as well.
I just took the test. Looking at the official answers, I have to say that the test probably says more about the test makers' bias than my own. For example, a number of the answers cite "scientific consensus", which is a philosophically dubious concept, especially in areas like global warming and GMOs where there is reasonable suspicion that the "consensus" is politically manufactured. Even worse is "economic consensus", a.k.a. "the economists we cherry-picked all agree".
It doesn't help that some of the economics questions are ambiguous. Take "Did the Obama administration's 2009 stimulus package reduce or increase unemployment?" Are we including the effects on the economy of borrowing the money to pay for the stimulus?
Another example is the World Giving Index. While the answer, that the US gives more than European states, is probably true, the fact that the index has the US tied with Myanmar is extremely strong evidence that the index is BS.
Whether or not the index produces that effect seems to be a fairly objective question. If conservatives get this right but biased liberals get it wrong, this indeed shows bias.