There's a lot of background mess in our mental pictures of the world. We try to be accurate on important issues, but a lot of the less important stuff we pick up from the media, the movies, and random impressions. And once these impressions are in our mental pictures, they just don't go away - until we find a fact that causes us to say "huh", and reassess.
Here are three facts that have recently caused that "huh" in me, and completely rearranged minor parts of my mental map. I'm sharing them here because that experience is a valuable one.
- Think "terrorist attack on Israel" - did the phrase "suicide bombing" spring to mind? If so, you're out of fashion: the last suicide bombing in Israel was in 2008 - a year in which dedicated suicide bombers managed the feat of killing a grand total of one victim. Suicide bombings haven't happened in Israel for over half a decade.
- Large-scale plane crashes seem to happen all the time, all over the world. They must happen at least a few times a year, in every major country, right? Well, if I'm reading this page right, the last airline crash in the USA that killed more than 50 people was... in 2001 (two months after 9/11). Nothing on that scale since then. And though there have been crashes en route to or from Spain and France since then, it seems that a major air crash in a western country is something that essentially never happens.
- The major cost of a rocket isn't the fuel, as I'd always thought. It seems that the Falcon 9 rocket costs $54 million per launch, of which fuel is only $0.2 million (or, as I prefer to think of it - I could sell my house to get enough fuel to fly to space). In the difference between those two prices lies the potential for private spaceflight to low-Earth orbit.
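Just how lopsided that fuel-to-launch ratio is can be made concrete with the two figures quoted above (taking them at face value):

```python
# Figures as cited above: $54M per Falcon 9 launch, of which
# roughly $0.2M is fuel.
launch_cost_musd = 54.0
fuel_cost_musd = 0.2
fuel_fraction = fuel_cost_musd / launch_cost_musd
print(f"fuel is about {fuel_fraction:.1%} of the launch cost")  # about 0.4%
```

In other words, well over 99% of the price of a launch is hardware and operations, not propellant - which is exactly the margin a reusable rocket could attack.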
The problem is that the choice to eat differently is itself a potential confounding factor (people who pick particular diets may differ from people who don't in very important ways), and any time you have to deal with, say, 10 factors and try to smooth them out, you have to question whether any signal you find is meaningful at all, especially when it is relatively small.
The study in particular notes:
[quote]Men and women in the top categories of red or processed meat intake in general consumed fewer fruits and vegetables than those with low intake. They were more likely to be current smokers and less likely to have a university degree [/quote]
At this point, you have to ask yourself whether you can even do any sort of reasonable analysis on this population. You're seeing clear differences between the sub-populations, and you can't just "compensate for them". If you take a sub-population which has numerous factors that increase its risk of some disease, "compensate" for those factors, and still see an elevated level of the disease, that isn't actually suggestive of anything at all, because you have no way of knowing whether your "compensation" actually compensated or not. Statistics is not magic; it cannot magically remove bias from data.
This is the problem with virtually all analysis like this, and it is why you should never, ever take studies like this at face value. Worse still, there's a good chance you're looking at the blue M&M problem: run enough analyses on a large population and you will find significant trends which are not really there. Indeed, different studies (noted in the paper) report different results: that study showed no increase in mortality or morbidity from red meat consumption, an American study showed an increase, and several vegetarian studies showed no difference at all.

Because of publication bias (positive results are more likely to be reported than negative ones), potential researcher bias (belief that a vegetarian diet is good for you is likelier than normal among people who study diet, because vegetarians are more interested in diets than the population as a whole), and the conflicting results across studies, I'd say that is pretty good evidence that there is no real effect and it is all nonsense. If I see five studies on diet, three saying one thing and two saying another, I'm going to stick with the null hypothesis, because it is far more likely that the three positive studies are the result of publication bias.
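The "run enough analyses and something will look significant" point is just arithmetic. As an illustration (the threshold of 0.05 and the figure of 20 subgroup comparisons are my own illustrative numbers, not taken from the study):

```python
# If each of 20 independent subgroup analyses tests a true null effect
# at the conventional p < 0.05 threshold, the chance of at least one
# spuriously "significant" finding is the family-wise error rate:
#   1 - (1 - alpha)^k
alpha, k = 0.05, 20
p_at_least_one_false_positive = 1 - (1 - alpha) ** k
print(f"{p_at_least_one_false_positive:.0%}")  # roughly 64%
```

So a study that slices its cohort twenty ways is more likely than not to turn up a "significant" trend purely by chance, even if nothing is going on.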