I'm not sure where you're going with this
It was a trick question (sorry!) - all media are primary sources. From the Wikipedia article:
"Primary" and "secondary" are relative terms, with sources judged primary or secondary according to specific historical contexts and what is being studied.
...
For example, encyclopedias are generally considered tertiary sources, but Pliny's Naturalis Historia, originally published in the 1st century, is a primary source for information about the Roman era.
Not all of the books I mentioned are primary sources about WWII - the ones you mentioned are primary sources in other subjects. For example, WWII for Dummies is a primary source for a study of the For Dummies series, and Rise and Fall of the Third Reich is a primary source for how history was generally written in the second half of the twentieth century (e.g., from the perspective of nations more so than of a random person).
So where I'm going with this is to say that reversed stupidity isn't intelligence, but information about stupidity is still information about a topic, just as legitimate information about a topic remains legitimate information even when others are stupid about that topic. Knowledge of stupidity is a type of information no different from any other; that is how it indirectly affects our knowledge of whatever topic the stupid are being stupid about.
In his recent CATO article, Reversed Stupidity Is Not Intelligence, Eliezer writes:
I think the statement makes a correct point - don't dismiss an idea just because a few proponents are stupid - but is too strong as written. In some cases, we can derive information about the truth of a proposition by psychoanalyzing reasons for believing it.
There are certain propositions that people are likely to assert regardless of whether or not they are true. Maybe they're useful for status disputes, or part of a community membership test, or just synchronize well with particular human biases. "X proves the existence of God" commonly gets asserted whether or not X actually proves the existence of God. Anything that supports one race, gender, political party, or ideology over another is also suspect. Let's call these sorts of propositions "popular claims". Some true propositions might be popular claims, but popular claims are popular whether or not they are true.
Some popular claims are surprising. Without knowing anything about modern society, one might not predict that diluting chemicals thousands of times to cure diseases, or claiming the government is hiding alien bodies, would be common failure modes. You don't know these are popular claims until you hear them.
If a very large group of people make a certain assertion, and you always find it to be false, you now have very good evidence that it's a popular claim, a proposition that people will very often assert even if it's false.
Normally, when someone asserts a proposition, you assume they have good evidence for it - in Bayesian terms, the probability that they would assert it is higher if there is evidence than if there is not evidence. But people assert popular claims very often even when there is no evidence for them, so someone asserting a popular claim provides no (or little) evidence for it, leaving you back at whatever its prior is.
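The discounting described above can be made concrete with a toy Bayes calculation (a sketch only; the numbers are invented for illustration, not taken from the post):

```python
def posterior(prior, p_assert_given_true, p_assert_given_false):
    """Bayes' rule: P(proposition is true | someone asserted it)."""
    num = p_assert_given_true * prior
    den = num + p_assert_given_false * (1 - prior)
    return num / den

prior = 0.01  # hypothetical prior for some surprising proposition

# Ordinary claim: people rarely assert it unless they have evidence,
# so hearing the assertion moves you well above the prior.
ordinary = posterior(prior, p_assert_given_true=0.5, p_assert_given_false=0.01)

# Popular claim: people assert it at nearly the same rate whether or
# not it is true, so the assertion carries almost no evidence and you
# stay near the prior.
popular = posterior(prior, p_assert_given_true=0.5, p_assert_given_false=0.45)

print(round(ordinary, 3))  # ≈ 0.336, far above the 0.01 prior
print(round(popular, 3))   # ≈ 0.011, barely above the 0.01 prior
```

The only thing doing the work is the likelihood ratio: when false believers assert the claim almost as readily as informed believers, the ratio approaches 1 and the assertion stops being evidence.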
Time for an example: suppose two respected archaeologists (who happen to be Mormon) publish two papers on the same day. The first archaeologist claims to have found evidence that Native Americans are descended from ancient Israelites. The second archaeologist claims to have found evidence that Zulus are descended from Australian aborigines.
On the face of it, these two claims are about equally crazy-sounding. But I would be much more likely to pay attention to the claim that the Zulus are descended from aborigines. I know that the Mormons have a bias in favor of believing Indians are descended from Israelites, and probably whatever new evidence the archaeologist thinks she's found was just motivated by this same bias. But no one believes Zulus are descended from Australians. If someone claims they are, she must have some new and interesting reason to think so.
(To put it another way, we expect a Mormon to privilege the hypothesis of Israelite descent; her religion has already picked it out of hypothesis-space. We don't expect a Mormon to privilege the hypothesis of Australian descent, so it's more likely that she came to it honestly.)
If then I were to learn that there was a large community of Mormons who interpreted their scripture to say that Zulus were descended from Australians, I would consider it much more likely that the second archaeologist was also just parroting a religious bias, and I would no longer be quite as interested in reading her paper.
In this case, reversed stupidity is intelligence - learning that many people believed in an Australian-Zulu connection for religious reasons decreases my probability that the posited Australian-Zulu connection is real. I can never go lower than whatever my prior for an Australian-Zulu connection would be, but I can discount a lot of the evidence that might otherwise take me above my prior.
So in summary, a proposition asserted for stupid reasons can raise your probability that it is the sort of proposition that people assert for stupid reasons, which in turn can lower your probability that the next person to assert it will have smart reasons for doing so. Reversed stupidity can never bring the probability of an idea lower than its prior, but it can help you discount evidence that would otherwise bring it higher.
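The floor-at-the-prior property in the summary also falls straight out of Bayes' rule: as long as believers assert the claim at least as often as non-believers, the posterior never drops below the prior, and it collapses back to the prior as the two assertion rates converge (again a toy sketch with invented numbers):

```python
def posterior(prior, p_assert_given_true, p_assert_given_false):
    """Bayes' rule: P(proposition is true | someone asserted it)."""
    num = p_assert_given_true * prior
    return num / (num + p_assert_given_false * (1 - prior))

prior = 0.01

# As the assertion rate among false believers rises toward the rate
# among true believers, the assertion stops being evidence and the
# posterior falls back toward the prior -- but never below it.
for p_false in [0.05, 0.2, 0.4, 0.5]:
    p = posterior(prior, p_assert_given_true=0.5, p_assert_given_false=p_false)
    assert p >= prior
    print(p_false, round(p, 4))
```

At `p_false = 0.5` the likelihood ratio is exactly 1 and the posterior equals the prior: discovering that a claim is "popular" can neutralize testimony as evidence, but it can't turn the testimony into evidence against the claim.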