FiftyTwo comments on Minor, perspective changing facts - Less Wrong

38 Post author: Stuart_Armstrong 22 April 2013 07:01PM


Comments (157)


Comment author: FiftyTwo 22 April 2013 11:16:03PM *  15 points [-]

Looking at the relative calorie values of different foods radically changed my eating habits. Especially realising how high-calorie 'boring' things like rice and bread are, vs. 'fun' things like bacon. Good example chart here.

Comment author: Xachariah 24 April 2013 02:58:23AM 4 points [-]

Woah. Instead of potato chips, I could be eating an equal amount of bacon.

I'm never going to eat another potato chip again.

Comment author: Desrtopa 24 April 2013 03:17:20AM 2 points [-]

Well, the fact that they have about the same calories per gram doesn't mean that they're equally healthy. The fat in bacon is almost all saturated.

Bacon is probably more filling per calorie though, so you'd be less likely to gain weight snacking on bacon than potato chips.

Comment author: RomeoStevens 24 April 2013 08:35:26AM 3 points [-]

The evidence that saturated fat is bad for you is dubious. However there is good evidence that processed meats are bad for you, even though every test of possible causal pathways has failed so far.

Comment author: RichardKennaway 24 April 2013 12:40:43PM *  1 point [-]

However there is good evidence that processed meats are bad for you

Really? I heard something on the radio a few days ago about a study to that effect, and then I came across a blog post by someone apparently reputable that found little substance in the original paper, so what am I to make of that?

For that matter, what is unprocessed meat? Raw?

ETA: This is the study (open access), and "processed" means "having had its shelf life extended". From a brief glance at the paper, I don't think they did any sort of causal analysis beyond controlling for possible confounders such as the tendency of high consumers of red meat to smoke more. I don't care enough about this to study it in any more detail.

Comment author: RomeoStevens 24 April 2013 07:22:53PM 1 point [-]

Unprocessed means untreated with preservatives: smoking, salting, drying, potassium benzoate, etc. The evidence I'm referencing is a meta-review of epidemiological studies. The lack of a causal pathway refers to the failure to find anything when doing intervention studies on particular substances. So it could very well be that the epidemiological studies are all failing to properly control for confounding factors. Nutritional self-reporting is notoriously terrible: epidemiological studies often rely on spaced surveys, sometimes asking questions about food habits over an entire year, so it's unsurprising that people can't provide accurate information. Still, it is not zero evidence.

My own hypothesis is that the animal's diet has a lot more to do with the potential harm to you than currently realized. Animals with crappy diets are sickly. We likely have a natural aversion to eating sickly animals for a reason.

Comment author: TitaniumDragon 25 April 2013 09:54:43PM 3 points [-]

Uh, yeah. The reason for that is that sickly animals carry parasites. It is logical that we wouldn't want to eat parasite-ridden or diseased animals, because then WE get the parasites. If the animal is not parasite-ridden, there's no good reason to believe it would be unhealthy to eat.

My personal suspicion for the cause is underlying SES (socioeconomic status) factors (wealthy people tend to eat better, fresher food than the poor) as well as the simple issue of dietary selection: people who watch what they eat are also more likely to exercise and generally have healthier habits than those who are willing to eat anything.

Comment author: Desrtopa 25 April 2013 11:47:44PM *  1 point [-]

There might be some factors which the study is failing to control for, but from the link in the grandparent

Included in the analysis were 448,568 men and women without prevalent cancer, stroke, or myocardial infarction, and with complete information on diet, smoking, physical activity and body mass index

The study seems to control for the more obvious associated factors.

Also, the full text states that the consumption of red meat is associated with an increase in mortality when controlling for the confounders assessed in their study, with processed meat being associated with a greater increase, but poultry not being associated with an increase in mortality.

Comment author: TitaniumDragon 26 April 2013 07:16:12AM 2 points [-]

The problem is that the choice to eat differently itself is potentially a confounding factor (people who pick particular diets may not be like people who do not do so in very important ways), and any time you have to deal with, say, 10 factors, and try to smooth them out, you have to question whether any signal you find is even meaningful at all, especially when it is relatively small.

The study in particular notes:

Men and women in the top categories of red or processed meat intake in general consumed fewer fruits and vegetables than those with low intake. They were more likely to be current smokers and less likely to have a university degree

At this point, you have to ask yourself whether you can even do any sort of reasonable meta-analysis on the population. You're seeing clear differences between the populations and you can't just "compensate for them". If you take a sub-population which has numerous factors which increase their risk of some disease, and then "compensate" for those factors and still see an elevated level of the disease, it isn't actually suggestive of anything at all, because you have no way of knowing whether your "compensation" actually compensated for it or not. Statistics is not magic; it cannot magically remove bias from data.

This is the problem with virtually all analysis like this, and it is why you should never, ever believe studies like this.

Worse still, there's a good chance you're looking at the blue M&M problem: if you do enough meta-analysis of a large population, you will find significant trends which are not really there. Different studies (noted in the paper) indicate different results: that study showed no increase in mortality and morbidity from red meat consumption, an American study showed an increase, and several vegetarian studies showed no difference at all. Because of publication bias (positive results are more likely to be reported than negative results), potential researcher bias (belief that a vegetarian diet is good for you is likelier than normal in a population studying diet, because vegetarians are more interested in diets than the population as a whole), and the fact that we're looking at conflicting results from studies, I'd say that that is pretty good evidence that there is no real effect and it is all nonsense. If I see five studies on diet, and three of them say one thing and two say another, I'm going to stick with the null hypothesis, because it is far more likely that the three studies claiming an effect are the result of publication bias towards positive results.
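The worry that "compensation" may not actually compensate can be made concrete with a toy simulation. Everything below is hypothetical (the variable names and numbers are illustrative, not drawn from the study under discussion): an unmeasured lifestyle trait drives both the exposure and the outcome, and adjusting for only a noisy measured proxy of that trait leaves a spurious association behind.

```python
import math
import random
from collections import defaultdict

# Hypothetical simulation of residual confounding: the outcome never
# depends on the exposure, yet a "controlled" analysis finds an effect.
random.seed(0)

rows = []
for _ in range(50000):
    c = random.gauss(0, 1)                      # unmeasured confounder
    x = 1 if c + random.gauss(0, 1) > 0 else 0  # exposure (e.g. high meat intake)
    y = c + random.gauss(0, 1)                  # outcome driven only by c, never by x
    proxy = c + random.gauss(0, 1)              # noisy measured stand-in for c
    rows.append((x, y, proxy))

# "Control for" the proxy by stratifying on it, then average the
# within-stratum exposure contrasts (a crude analogue of adjustment).
strata = defaultdict(lambda: ([], []))
for x, y, p in rows:
    bin_ = max(-2, min(2, round(p)))            # coarse bins of the proxy
    strata[bin_][x].append(y)

diffs = [sum(y1) / len(y1) - sum(y0) / len(y0)
         for y0, y1 in strata.values() if y0 and y1]
adjusted_diff = sum(diffs) / len(diffs)
print(round(adjusted_diff, 2))  # clearly positive despite "adjustment"
```

Because the proxy measures the confounder with error, stratifying on it removes only part of the confounding, so the adjusted analysis still reports a substantial effect for an exposure that does nothing.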

Comment author: Desrtopa 26 April 2013 01:26:40PM 1 point [-]

At this point, you have to ask yourself whether you can even do any sort of reasonable meta analysis on the population. You're seeing clear differences between the populations and you can't just "compensate for them". If you take a sub-population which has numerous factors which increase their risk of some disease, and then "compensate" for those factors and still see an elevated level of the disease, it isn't actually suggestive of anything at all, because you have no way of knowing whether your "compensation" actually compensated for it or not. Statistics is not magic; it cannot magically remove bias from data.

Well, if you already know how much each of the associated factors contributes alone via other tests where you were able to isolate those variables, you can make an educated guess that their combined effect is no greater than the sum of their individual effects.

The presence of other studies that didn't show the same significant results weighs against it, but on the other hand such cases are certainly not unheard of with respect to associations that turn out to be real. The Cochrane Collaboration's logo comes from a forest plot of results for whether an injection of corticosteroids reduces the chance of early death in premature birth. Five out of seven studies failed to achieve statistical significance, but when their evidence was taken together, it achieved very high significance, and further research since suggests a reduction in mortality rate of between 30% and 50%.
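The pooling step behind a forest plot like that can be sketched with a standard fixed-effect (inverse-variance) meta-analysis. The study numbers below are hypothetical, chosen only to illustrate the arithmetic by which individually non-significant studies become jointly significant:

```python
import math

# Fixed-effect inverse-variance pooling: weight each study by the
# inverse of its variance, then combine. Numbers are hypothetical.

# (estimated log risk ratio, standard error) for five hypothetical studies
studies = [(-0.30, 0.20)] * 5          # each alone: z = -1.5, not significant

weights = [1.0 / se ** 2 for _, se in studies]
pooled = sum(w * y for (y, _), w in zip(studies, weights)) / sum(weights)
pooled_se = 1.0 / math.sqrt(sum(weights))
z = pooled / pooled_se

print(round(z, 2))  # -3.35: well past the usual 1.96 significance cutoff
```

Pooling shrinks the standard error by roughly the square root of the number of (equal-sized) studies, which is why five studies that each fall short of significance can together clear it decisively.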

While a study of the sort linked above certainly doesn't establish the truth of its findings with the confidence of its statistical significance, "never believe studies like this" doesn't leave you safe from a treatment-of-evidence standpoint, because even in the case of a real association, the data are frequently going to be messy enough that you'd be hard pressed to locate it statistically. You don't want to set your bar for evidence so high that, in the event that the association were real, you couldn't be persuaded to believe in it.