When some folks think "Science", they don't think "formalized careful thinking/writing things down/reproducibility," they think "competitor priesthood."
"Anti-science" seems like sloganeering.
I've previously remarked that
...One of the biggest ways by which scientists and science-minded people fail to communicate with laypeople, I think, is with the phrase “there’s no scientific evidence for X”.
Suppose that a layperson comes to a scientist and explains excitedly that he just heard of a new way to [raise children/lose weight/cure illnesses] that works great, and everyone should adopt that method at once!
The scientist replies that she has heard of that method too, but so far there’s no scientific evidence for it.
By that, the scientist means something roughly like “Well, there might be something to it, but the successes that you mention could very well just be coincidence, and it’s really really hard to figure out whether that thing actually works, even if we do lots of careful experiments. So although the thing that you mention could be worth looking into, we really don’t know whether it works yet, and most things like that actually turn out not to work when you do the experiments, so for now we should assume that it won’t work.”
What the layperson actually hears is “I’m going to be all stuck up and holier-than-thou and insist that we do this time-consuming and expensive exa…”
Not really surprising. If you tell most people that those who think as they do are ignorant savages, the expected response is not "Oh, we need to change our ways". The expected response is "Fuck you".
Calling anti-vaccination people "anti-science" is a transparently bad persuasion tactic. Leave a social line of retreat.
Also, it probably isn't even true that they're anti-science. More likely, their stances on science are inconsistent: they trust it to varying degrees in different situations, depending on the political and social implications of declaring belief.
In a geology course I took in undergrad, I came away with the impression that successfully locating fossil fuels for production requires understanding the mechanics of how fossils form and how long the organic material takes to fossilize, which explicitly requires deep time. Young Earth Creationism cannot be used as a model for providing the world's oil, and our professor made sure that we understood those implications.
Here's a relevant study, Negative persuasion via personal insult:
an individual directly insulted by a communicator attempting to persuade him will show a “boomerang effect” by increasing the extremity of his initial attitude position
Years and years ago, when I was trolling creationist discussion boards (not for lulz, but to deconvert), I tried to make mild claims that were nonetheless adequate to send the more extreme elements into frothy rages.
Those were more effective at deconversion than the actual valid arguments that provoked them.
At the risk of seeming to ride a hobby horse (which I don't), I post this:
Is there any risk that we (as a society) may lose science (or rather scientific literacy) in the medium run to religious or other anti-science factions?
This can actually happen, and has happened more than once in human history. As a data point, take this:
Frederick Starr: Lost Enlightenment
A very interesting account of the rise and fall of the Arab enlightenment in Central Asia.
First chapter here: http://press.princeton.edu/chapters/s10064.pdf
From that chapter:
...There is no more vexing…
Rationality could be defeated by one powerful enemy (e.g. religion), but also by a concentrated attack from many diverse enemies. These days I would be more worried about the latter.
Rationality is a common enemy of many beliefs. If you reject one specific truth, you must also reject the other specific truths it is connected with, then the general rules of reasoning, and finally the whole meta-level. And this is where it becomes dangerous: people with different false beliefs can be opponents at the object level, but allies at the meta-level; they may all agree that all this talk about "evidence" is bullshit... some of them because it offends their religious beliefs, others because it leads to politically unacceptable conclusions, yet others because it can be used to support sexism or racism, etc. Each of them wants to remove some specific conclusion; all of them want to stop the same algorithms for reasoning.
I don't think it's likely that science in the West could be destroyed these days by religion alone. But it could be destroyed by systematic attacks from all sides: religious people who hate hearing about evolution, paranoid people who hate hearing about ...
In a sense, most certainly yes! In the middle ages, each fiefdom was a small city-state, controlling in its own right not all that much territory. There certainly wasn't the concept of nationalism as we know it today. And even if some duke was technically subservient to a king, that king wasn't issuing laws that directly impacted the duke's land on a day to day basis.
This is unlike what we have today: We have countries that span vast areas of land, with all authority reporting back to a central government. Think of how large the US is, and think of the fact that the government in Washington DC has power over it all. That is a centralized government.
It is true that there are state governments, but they are weak. Too weak, in fact. In the US today, the federal government is the final source of authority. The president of the US has far more power over what happens in a given state than a king in the middle ages had over what happened in any feudal dukedom.
As a corollary, how many people trust science more because they hear some misrepresentation that quantum physics says X or Y?
This reminds me. Effective communication and persuasion skills seem like they should be core concepts of rationality. Does CFAR teach effective persuasion in their workshops? I was reading this interview with an FBI hostage negotiator, and the techniques he talks about seem like they would be pretty good skills to have; I wouldn't want to have to join the FBI just to get a structured environment where I can learn that stuff more effectively.
Well, seeing as irrelevant factors seem to have such a big influence on people's perceptions, it seems to me that the thing to do would be to talk about those factors in relation to anti-vaxxers. Less 'they're willfully ignorant of the research' and more 'they willfully kick their wives and beat their dogs'.
Paper by the Cultural Cognition Project: The culturally polarizing effect of the "anti-science trope" on vaccine risk perceptions
This is a great paper (indeed, I think many at LW would find the whole site enjoyable). I'll try to summarize it here.
Background: The pro-/anti-vaccine debate has been hot recently. Many pro-vaccine people say, "The science is strong, the benefits are obvious, the risks are negligible; if you're anti-vaccine then you're anti-science".
Methods: They showed experimental subjects an article basically saying the above.
Results: When reading such an article, a large number of people did not trust vaccines more, but rather, trusted the American Academy of Pediatrics less.
My thoughts: I will strive to avoid labeling anybody as being "anti-science" or "simply or willfully ignorant of current research", etc., even when speaking of hypothetical third parties on my Facebook wall. This holds for evolution, global warming, vaccines, etc.
///
Also included in the article: references to other research that shows that evolution and global warming debates have already polarized people into distrusting scientists, and evidence that people are not yet polarized over the vaccine issue.
If you intend to read the article yourself: I found it difficult to understand how the authors divided participants into the 4 quadrants (α, β, etc.). I will quote my friend, who explained it for me:
I was helped by following the link to where they first introduce that model.
The people in the top left (α) worry about risks to public safety, such as global warming. The people in the bottom right (δ) worry about socially deviant behaviors, such as could be caused by the legalization of marijuana.
People in the top right (β) worry about both public safety risks and deviant behaviors, and people in the bottom left (γ) don't really worry about either.