How many times have you heard a claim from a somewhat reputable source like "only 28 percent of Americans are able to name one of the constitutional freedoms, yet 52 percent are able to name at least two Simpsons family members"?
Mark Liberman over at Language Log wrote up a post showing how, even when such claims are based on actual studies, the methodology can be biased to exaggerate ignorance:
The way it works is that the survey designers craft a question like the following (asked at a time when William Rehnquist was the Chief Justice of the United States):
"Now we have a set of questions concerning various public figures. We want to see how much information about them gets out to the public from television, newspapers and the like….
What about William Rehnquist – What job or political office does he NOW hold?"

The answers to such open-ended questions are recorded — as audio recordings and/or as notes taken by the interviewer — and these records are coded, later on, by hired coders.
The survey designers give these coders very specific instructions about what counts as right and wrong in the answers. In the case of the question about William Rehnquist, the criteria for an answer to be judged correct were mentions of both "chief justice" and "Supreme Court". These terms had to be mentioned explicitly, so all of the following (actual answers) were counted as wrong:
Supreme Court justice. The main one.
He’s the senior judge on the Supreme Court.
He is the Supreme Court justice in charge.
He’s the head of the Supreme Court.
He’s top man in the Supreme Court.
Supreme Court justice, head.
Supreme Court justice. The head guy.
Head of Supreme Court.
Supreme Court justice head honcho.

Similarly, the technically correct answer ("Chief Justice of the United States") would also have been scored as wrong (I'm not certain whether it actually occurred or not in the survey responses).
If, every time you heard a claim of the form "Only X% of Americans know Y", you thought "there's something strange about that", then you get 1 rationality point. If you thought "I don't believe that", then you get 2 rationality points.
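To make that coding rule concrete, here's a minimal sketch of a scorer that, like the survey's coders, requires both literal phrases before crediting an answer. The function and the answer list are my own illustration, not the survey's actual coding procedure:

```python
# Sketch of the strict coding rule described above: an answer counts as
# correct only if it contains BOTH literal phrases "chief justice" and
# "supreme court". (A reconstruction for illustration, not the survey's
# actual coding software.)

def scored_correct(answer: str) -> bool:
    text = answer.lower()
    return "chief justice" in text and "supreme court" in text

answers = [
    "Supreme Court justice. The main one.",
    "He's the head of the Supreme Court.",
    "Supreme Court justice head honcho.",
    "Chief Justice of the United States",  # the technically correct title
    "Chief Justice of the Supreme Court",  # a phrasing that passes
]

for a in answers:
    print(f"{str(scored_correct(a)):5}  {a}")
```

Every substantively correct paraphrase, and even the official title, gets coded as wrong; only the one phrasing that happens to contain both keywords survives.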
Neat! I'll put less confidence in such surveys now. HOWEVER! Many of the questions in such surveys are plain-ol' 50/50, and I have no idea how they could be very biased.
As an example, here is a scan from Delli Carpini and Keeter's What Americans Know About Politics and Why It Matters. You'll notice that, in table 2.7, only 42% of Americans knew that the Soviets suffered more deaths than Americans during World War 2. Seems like a coin flip to me, unless they asked an open-ended "Who had the most deaths during World War 2?" and ignored all answers besides the US and the USSR. I still think Americans are pretty durn ignorant of most political and historical matters. (Myself included, for many of the questions. I have no idea who my state's congressmen are, and I don't really care.)
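If it really was a two-way US-vs-USSR choice, blind guessing alone would get 50%, which makes 42% slightly worse than chance. A quick sanity check of that intuition (the sample size below is hypothetical; the book's actual n isn't quoted here):

```python
# Back-of-the-envelope check of the "coin flip" intuition.
from math import sqrt

n = 1000           # hypothetical number of respondents
p_observed = 0.42  # share answering "the Soviets" correctly
p_chance = 0.50    # blind guessing on a two-way US-vs-USSR choice

# Standard error of a proportion under the chance hypothesis, and how
# many standard errors the observed share falls below chance.
se = sqrt(p_chance * (1 - p_chance) / n)
z = (p_observed - p_chance) / se
print(f"SE = {se:.3f}, z = {z:.1f}")
# With n = 1000, 42% sits about 5 standard errors below a coin flip:
# on that reading, respondents weren't merely ignorant, they leaned
# toward the wrong answer.
```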
But then, I've never been one to compare results like these to modern cultural knowledge, as the Simpsons claim at the top does; I see that comparison as irrelevant. Asking about fresh memory vs. deep memory doesn't tell you about political knowledge per se. Responses should be compared against questions of similar difficulty.