Critiquing Gary Taubes, Part 1: Mainstream Nutrition Science on Obesity

13 ChrisHallquist 25 December 2013 06:27PM

Related: Trusting Expert Consensus

Lately, I've been thinking a lot about whether we can find any clear exceptions to the general "trust the experts (when they agree)" heuristic. One example that keeps coming up—at least on LessWrong and related blogs—is Gary Taubes' claims about mainstream nutrition experts allegedly getting obesity horribly wrong.

Taubes is probably best-known for his book Good Calories, Bad Calories. I'd previously had a mildly negative impression of him from discussion of him on Yvain's old blog, particularly some of the other posts Yvain and other people linked from there, such as this discussion of Taubes' "carbohydrate hypothesis" and especially this discussion of Taubes' attempt to refute the standard calories-in/calories-out model of weight.

But I figured maybe the criticism of Taubes I'd read hadn't been fair to him, so I decided to read him for myself... and holy crap, Taubes turned out to be far worse than I expected. I decided to write a post explaining why, and then realized that, even if I were somewhat selective about the issues I focused on, I had enough material for a whole series of posts, which I'll be posting over the course of the next week.

The problem with Taubes is not that everything he says is wrong. Much of it is ludicrously wrong, but that's only one half of the problem. The other half is that he says a fair number of things mainstream nutrition science would agree with, but then hides this fact, and instead pretends those things are a refutation of mainstream nutrition science. So it's worth starting with a brief in-a-nutshell version of what mainstream nutrition science actually says about obesity.

(The following summary is drawn from a number of sources, including this, this, and this. Everything I'm about to say will be discussed in much greater detail in subsequent posts.)

Here it goes: people gain weight when they consume more calories than they burn. But both calorie intake and calorie expenditure are regulated by complicated mechanisms we don't fully understand yet. This means the causes of overweight and obesity* are also complicated and not fully understood. It is, however, worth watching out for foods with lots of added fat and sugar, if only because they're an easy way to consume way too many calories.

We currently don't have any great solutions to the problem of overweight and obesity. If you consume fewer calories than you burn, you will lose weight, but sticking to a diet is hard. It's relatively easy to lose weight in the short run, and it's possible to do so on a wide variety of diets, but only a small percentage of people keep the weight off over the long run.

As for low-carb diets, people do lose weight on them, but they do so because low-carb diets generally lead people to restrict their calorie intake even when they aren't actively counting calories. For one thing, it's hard to consume as many calories when you drastically restrict the range of foods you can eat. There's also some evidence that low-carb diets may have some advantages in terms of, say, warding off hunger, but the evidence is mixed. There's certainly no basis for claiming low-carb diets are a magic bullet for the problems of overweight and obesity.

The above points are not the only issues at stake in Taubes' writings on nutrition. Admittedly, he covers a huge amount of ground, from the relationship between sugar and diabetes to the relationship between fat intake and heart disease to the alleged dangers of extremely-low carbohydrate diets. However, I'll be focusing on his claims about the causes of and solutions to the problems of overweight and obesity, because that seems to be the main thing people talk about when they talk about Taubes supposedly showing how wrong mainstream experts can be.

I'll also focus heavily on how Taubes misrepresents the views of mainstream experts on obesity. In the next post, though, I'll be temporarily setting that issue aside in order to look at what Taubes is proposing as an alternative. This will involve examining some claims made by Dr. Robert Atkins, whose ideas Taubes champions.

*Note: if the use of "overweight" as a noun sounds weird to you, it does to me too, but I discovered as I researched this article that it's standard usage in the literature on the subject. I came to realize there's a good reason for this usage: it's inaccurate to talk about the problem solely in terms of "obesity," but constantly saying "the problem of people being overweight and obese" gets really wordy.

Next: Atkins Redux

Alien parasite technical guy

61 PhilGoetz 27 July 2010 04:51PM

Custers & Aarts have a paper in the July 2 Science called "The Unconscious Will: How the pursuit of goals operates outside of conscious awareness".  It reviews work indicating that people's brains make decisions and set goals without the brains' "owners" ever being consciously aware of them.

A famous early study is Libet et al. 1983, which claimed to find signals being sent to the fingers before people were aware of deciding to move them.  This is a dubious study; it assumes that our perception of time is accurate, whereas in fact our brains shuffle our percept timeline around in our heads before presenting it to us, in order to provide us with a sequence of events that is useful to us (see Dennett's Consciousness Explained).  Also, Trevena & Miller repeated the test, and also looked at cases where people did not move their fingers; and found that the signal measured by Libet et al. could not predict whether the fingers would move.

Fortunately, the flaws of Libet et al. were not discovered before it spawned many studies showing that unconscious priming of concepts related to goals causes people to spend more effort pursuing those goals; and those are what Custers & Aarts review.  In brief:  If you expose someone, even using subliminal messages, to pictures, words, etc., closely-connected to some goals and not to others, people will work harder towards those goals without being aware of it.

continue reading »

Why safety is not safe

48 rwallace 14 June 2009 05:20AM

June 14, 3009

Twilight still hung in the sky, yet the Pole Star was visible above the trees, for it was a perfect cloudless evening.

"We can stop here for a few minutes," remarked the librarian as he fumbled to light the lamp. "There's a stream just ahead."

The driver grunted assent as he pulled the cart to a halt and unhitched the thirsty horse to drink its fill.

It was said that in the Age of Legends, there had been horseless carriages that drank the black blood of the earth, long since drained dry. But then, it was said that in the Age of Legends, men had flown to the moon on a pillar of fire. Who took such stories seriously?

The librarian did. In his visit to the University archive, he had studied the crumbling pages of a rare book in Old English, itself a copy a mere few centuries old, of a text from the Age of Legends itself; a book that laid out a generation's hopes and dreams, of building cities in the sky, of setting sail for the very stars. Something had gone wrong - but what? That civilization's capabilities had been so far beyond those of his own people. Its destruction should have taken a global apocalypse of the kind that would leave unmistakable record both historical and archaeological, and yet there was no trace. Nobody had anything better than mutually contradictory guesses as to what had happened. The librarian intended to discover the truth.

Forty years later he died in bed, his question still unanswered.

The earth continued to circle its parent star, whose increasing energy output could no longer be compensated by falling atmospheric carbon dioxide concentration. Glaciers advanced, then retreated for the last time; as life struggled to adapt to changing conditions, the ecosystems of yesteryear were replaced by others new and strange - and impoverished. All the while, the environment drifted further from that which had given rise to Homo sapiens, and in due course one more species joined the billions-long roll of the dead. For what was by some standards a little while, eyes still looked up at the lifeless stars, but there were no more minds to wonder what might have been.

continue reading »

Macroeconomics, The Lucas Critique, Microfoundations, and Modeling in General

0 Matt_Simpson 06 June 2009 04:35AM

I posted this comment in reply to a post by David Henderson over at econlog, but first some context.

Matthew Yglesias writes:

...From an outside perspective, what seems to be going on is that economists have unearthed an extremely fruitful paradigm for investigation of micro issues. This has been good for them, and enhanced the prestige of the discipline. No such fruitful paradigm has actually emerged for investigation of macro issues. So the decision has been made to somewhat arbitrarily impose the view that macro models must be grounded in micro foundations. Thus, the productive progressive research program of microeconomics can “infect” the more troubled field of macro with its prestige...

...But as a methodological matter, it seems deeply unsound. As a general principle for investigating the world, we normally deem it desirable, but not at all necessary, that researchers exploring a particular field of inquiry find ways to “reduce” what they’re doing to a lower level....

continue reading »

Honesty: Beyond Internal Truth

40 Eliezer_Yudkowsky 06 June 2009 02:59AM

When I expect to meet new people who have no idea who I am, I often wear a button on my shirt that says:

SPEAK THE TRUTH,
EVEN IF YOUR VOICE TREMBLES

Honesty toward others, it seems to me, obviously bears some relation to rationality.  In practice, the people I know who seem to make unusual efforts at rationality, are unusually honest, or, failing that, at least have unusually bad social skills.

And yet it must be admitted and fully acknowledged, that such morals are encoded nowhere in probability theory.  There is no theorem which proves a rationalist must be honest - must speak aloud their probability estimates.  I have said little of honesty myself, these past two years; the art which I've presented has been more along the lines of:

SPEAK THE TRUTH INTERNALLY,
EVEN IF YOUR BRAIN TREMBLES

I do think I've conducted my life in such fashion, that I can wear the original button without shame.  But I do not always say aloud all my thoughts.  And in fact there are times when my tongue emits a lie.  What I write is true to the best of my knowledge, because I can look it over and check before publishing.  What I say aloud sometimes comes out false because my tongue moves faster than my deliberative intelligence can look it over and spot the distortion.  Oh, we're not talking about grotesque major falsehoods - but the first words off my tongue sometimes shade reality, twist events just a little toward the way they should have happened...

From the inside, it feels a lot like the experience of un-consciously-chosen, perceptual-speed, internal rationalization.  I would even say that so far as I can tell, it's the same brain hardware running in both cases - that it's just a circuit for lying in general, both for lying to others and lying to ourselves, activated whenever reality begins to feel inconvenient.

continue reading »

Mate selection for the men here

13 rhollerith 03 June 2009 11:05PM

The following started as a reply to a request for relationship advice (http://lesswrong.com/lw/zj/open_thread_june_2009/rxy) but is expected to be of enough general interest to justify a top-level post.  Sometimes it is beneficial to have older men in the conversation, and this might be one of those times.  (I am in my late 40s.)

I am pretty sure that most straight men strong in rationality are better off learning how the typical woman thinks than holding out for a long-term relationship with a woman as strong in rationality as he is. If you hold out for a strong female rationalist, you drastically shrink the pool of women you have to choose from -- and people with a lot of experience with dating and relationships tend to consider that a bad move.  A useful data point here is the fact (http://lesswrong.com/lw/fk/survey_results/cee) that 95%-97% of Less Wrongers are male.  If, on the other hand, women currently (*currently* -- not in some extrapolated future after you've sold your company and bought a big house in Woodside) find you extremely attractive or extremely desirable long-term-relationship material, well, then maybe you should hold out for a strong female rationalist if you are a strong male rationalist.

Here is some personal experience in support of the advice above to help you decide whether to follow the advice above.

My information is incomplete because I have never been in a long-term relationship with a really strong rationalist -- or even a scientist, programmer or engineer -- but I have been with a woman who has years of formal education in science (majored in anthropology, later took chem and bio for a nursing credential) and her knowledge of science did not contribute to the relationship in any way that I could tell.  Moreover, that relationship was not any better than the one I am in now, with a woman with no college-level science classes at all.

The woman I have been with for the last 5 years is not particularly knowledgeable about science and is not particularly skilled in the art of rationality.  Although she is curious about most areas of science, she tends to give up and to stop paying attention if a scientific explanation fails to satisfy her curiosity within 2 or 3 minutes.  If there is a strong emotion driving her inquiry, though, she will focus longer.  E.g., she sat still for at least 15 or 20 minutes on the evolutionary biology of zoonoses during the height of the public concern over swine flu about a month ago -- and was glad she did.  (I know she was glad she did because she thanked me for the explanation, and it is not like her to make an insincere expression of gratitude out of, e.g., politeness.)  (The strong emotion driving her inquiry was her fear of swine flu combined with her suspicion that perhaps the authorities were minimizing the severity of the situation to avoid panicking the public.)

Despite her having so much less knowledge of science and the art of rationality than I have, I consider my current relationship a resounding success: it is no exaggeration to say that I am more likely than not vastly better off than I would have been if I had chosen 5 years ago not to pursue this woman and to hold out for someone more rational.  She is rational enough to take care of herself and to be the most caring and the most helpful girlfriend I have ever had.  (Moreover, nothing in my ordinary conversations and interactions with her draws my attention to her relative lack of scientific knowledge or her relative lack of advanced rationalist skills in a way that evokes any regret or sadness in me.  Of course, if I had experienced a long-term relationship with a very strong female rationalist in the past, maybe I *would* experience episodes of regret or sadness towards the woman I am with now.)

Here are two more tips on mate selection for the straight men around here.

I have found that it is a very good sign if the woman either (1) assigns high social status to scientific ability or scientific achievement or finds scientific ability appealing in a man or (2) sees science as a positive force in the world.  The woman I am with now clearly and decisively meets criterion (1) but does not meet criterion (2).  Moreover, one of my most successful relationships was with a woman who finds science fiction very inspiring.  (I do not BTW.)  The salient thing about that was that she never revealed it to me, nor the fact that she definitely sees science as a positive force in the world.  (I pieced those two facts together after we broke up.)  The probable reason she never revealed them to me is that she thought they would clue me in to the fact that she found scientific ability appealing in a man, which in turn would have increased the probability that I would try to snow her by pretending to be better at science or more interested in science than I really was.  (She'd probably been snowed that way by a man before she met me: male snowing of prospective female sexual partners is common.)

By posting on a topic of such direct consequence to normal straight adult male self-esteem, I am making myself more vulnerable than I would be if I were posting on, e.g., regulatory policy.  Awareness of my vulnerability might cause someone to refrain from publicly contradicting what I just wrote.  Do not refrain from publicly contradicting what I just wrote!  The successful application of rationality and scientific knowledge to this domain has high expected global utility, and after considering the emotional and reputational risks to myself of having posted on this topic, I have concluded that I do not require any special consideration over and above what I would get if I had posted on regulatory policy.

And of course if you have advice to give about mate selection for the straight men around here, here is your chance.

(EDITED to avoid implying that all men are heterosexual.)

Probability distributions and writing style

2 dclayh 04 June 2009 06:17AM

In his recent post, rhollerith wrote,

I am more likely than not vastly better off than I would have been if <I had made decision X>

This reminded me of the slogan for the water-filtration system my workplace uses,

We're 100% sure it's 99.9% pure!

because both sentences make a claim and give an associated probability for it. Now in this second example, the actual version is better than the expectation-value-preserving "We're 99.9% sure it's 100% pure", because the actual version implies a lower variance in outcomes (and expectation values being equal, a lower variance is nearly always better).  But this leads to the question of why rhollerith didn't write something like "I am almost certainly at least somewhat better off than I would have been...". 
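The expectation/variance comparison above can be made concrete with a quick sketch. The numbers below model each slogan as a simple two-outcome distribution over purity; in particular, the assumption that the "0.1% failure" case means purity 0 is purely illustrative (it's what makes the two claims expectation-preserving, as the post stipulates), not anything the original slogans specify.

```python
def mean_var(outcomes):
    """Mean and variance of a discrete distribution given as (probability, value) pairs."""
    mean = sum(p * v for p, v in outcomes)
    var = sum(p * (v - mean) ** 2 for p, v in outcomes)
    return mean, var

# Claim A: "100% sure it's 99.9% pure" -- purity is always 0.999.
claim_a = [(1.0, 0.999)]

# Claim B: "99.9% sure it's 100% pure" -- purity 1.0 with probability 0.999;
# we assume purity 0 in the remaining 0.1% of cases (illustrative assumption
# chosen so that both claims have the same expected purity).
claim_b = [(0.999, 1.0), (0.001, 0.0)]

mean_a, var_a = mean_var(claim_a)
mean_b, var_b = mean_var(claim_b)

print(mean_a, var_a)  # 0.999 0.0
print(mean_b, var_b)  # 0.999 and roughly 0.000999
```

Both claims have expected purity 0.999, but claim A has zero variance while claim B has variance of about 0.000999 -- which is why, expectation values being equal, the actual slogan is the stronger one.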

So I ask: when writing nontechnically, do you prefer to give a modest conclusion with high confidence, or a strong conclusion with moderate confidence?  And does this vary with whether you're trying to persuade or merely describe?

(Also feel free to post other examples of this sort of statement from LW or elsewhere; I'd search for them myself if I had any good ideas on how to do so.)

My concerns about the term 'rationalist'

10 JamesCole 04 June 2009 03:31PM

 

I've noticed that here on Less Wrong people often identify themselves as rationalists (and this community as a rationalist one -- searching for 'rationalist' on the site returns exactly 1000 hits).  I'm a bit concerned that this label may not work in our favour.

Paul Graham recently wrote a nice essay Keep Your Identity Small in which he argued that identifying yourself with a label tends to work against reasonable -- rational, you might say -- discussions about topics that are related to it.  The essay is quite short and if you haven't read it I highly recommend doing so.

If his argument is correct, then identifying with a label like Rationalist may impede your ability to be rational.

My thinking is that once you identify yourself as an X, you have a tendency to evaluate ideas and courses of action in terms of how similar or different they appear to your prototypical notion of that label -- as a shortcut for genuinely thinking about them, instead of evaluating them on their own merits. 

Aside from the effect such a label may have on our own thinking, the term 'rationalist' may be bad PR.  In the wider world 'rational' tends to be a bit of a dirty word.  It has a lot of negative connotations.   

Outside communities like this one, presenting yourself as a rationalist is likely to get other people off on the wrong foot.  In many people's minds, it'd strike you out before you'd even said anything.  It's a great way for them to pigeonhole you.  

And we should be interested in embracing the wider world and communicating our views to others.

If I were to describe what we're about, I'd probably say something like that we're interested in knowing the truth, and want to avoid deluding ourselves about anything, as much as either of these things is possible.  So we're studying how to be less wrong.  I'm not sure I'd use any particular label in my description.

Interestingly, those goals I described us in terms of -- wanting truth, wanting to avoid deluding ourselves -- are not really what separates "us" from "them".  I think the actual difference is that we are simply more aware of the fact that there are many ways our thinking can be wrong and lead us astray.  

Many people really are -- or at least start out -- interested in the truth, but get led astray by flawed thinking because they're not aware that it is flawed.  Because flawed thinking begets flawed beliefs, the process can lead people onto systematic paths away from truth seeking.  But I don't think even those people set out in the first place to get away from the truth.

The knowledge our community has, of ways that thinking can lead us astray, is an important thing we have to offer, and something that we should try to communicate to others.  And I actually think a lot of people would be receptive to it, presented in the right way. 

 

With whom shall I diavlog?

9 Eliezer_Yudkowsky 03 June 2009 03:20AM

Bloggingheads.tv can't exactly call up, say, the President of France and get him to do a diavlog, but they have some street cred with mid-rank celebrities and academics.  With that in mind, how would you fill in this blank?

"I would really love to see a diavlog between Yudkowsky and ____________."

A social norm against unjustified opinions?

11 Kaj_Sotala 29 May 2009 11:25AM

A currently existing social norm basically says that everyone has the right to an opinion on anything, no matter how little they happen to know about the subject.

But what if we had a social norm saying that by default, people do not have the right to an opinion on anything? To earn such a right, they ought to have familiarized themselves with the topic. The familiarization wouldn't necessarily have to be anything very deep, but on the topic of e.g. controversial political issues, they'd have to have read at least a few books' worth of material discussing the question (preferably material from both sides of the political fence). In scientific questions where one needed more advanced knowledge, you ought to at least have studied the field somewhat. Extensive personal experience on a subject would also be a way to become qualified, even if you hadn't studied the issue academically.

The purpose of this would be to enforce epistemic hygiene. Conversations on things such as public policy are frequently overwhelmed by loud declarations of opinion from people who, quite honestly, don't know anything on the subject they have a strong opinion on. If we had in place a social norm demanding an adequate amount of background knowledge on the topic before anyone voiced an opinion they expected to be taken seriously, the signal/noise ratio might be somewhat improved. This kind of a social norm does seem to already be somewhat in place in many scientific communities, but it'd do good to spread it to the general public.

At the same time, there are several caveats. As I am myself a strong advocate on freedom of speech, I find it important to note that this must remain a *social* norm, not a government-advocated one or anything that is in any way codified into law. Also, the standards must not be set *too* high - even amateurs should be able to engage in the conversation, provided that they know at least the basics. Likewise, one must be careful that the principle isn't abused, with "you don't have a right to have an opinion on this" being a generic argument used to dismiss any opposing claims.
