This is the Dark Side root link. In my opinion it's a useful chunked concept, though maybe people should hyperlink here when they use the term, so it stays accessible to people who haven't read every post. At the very least, the FAQ builders should add this, if it's not there already.
I'm certainly not against using chunked concepts on here per se. But I think associating this community too closely with sci-fi/fantasy tropes could have deleterious consequences in the long run, as far as attracting diverse viewpoints and selling the ideas to people who aren't already predisposed to buying them. If Eliezer really wanted to proselytize by poeticizing, he should turn LW into the most hyper-rational, successful PUA community on the Internet, rather than the Star Wars-esque roleplaying game it seems to want to become.
I don't think it is enough to split the "dark arts" along the true/false axis. We also need to split along internal/external.
Consider the case that we have tried very hard to avoid being taken in by false arguments and have at long last reached a true and useful conclusion. Now what? We still have to remember our conclusion and refresh it so that it doesn't slowly fade from view. Harder still, if we want our life to change, we need to find emotional equivalents for our intellectual understanding.
So I think that there are good internal uses for the "dark arts". Once you have made your rational decision, find a slogan that sticks in the memory, and find motivational techniques with personal resonance, even if they are not entirely honest. Of course, if one has made a mistake in one's initial assessment, one is now digging oneself a very deep hole; they aren't called dark arts for nothing.
What the hell are the "dark arts"? Could we quit playing super-secret dress-up society around here for one day and just speak in plain English, using terms with known meanings?
People are irrational largely because they're stupid. I have yet to be convinced that "rationality" is something entirely distinct from intelligence itself, such that you can appeal to someone to become significantly more "rational" without simultaneously effecting the seemingly tougher feat of boosting IQ a standard deviation or so.
If Eliezer eats fewer calories than he expends, he's not going to die of hunger.
But he may spend large amounts of time in a state where physiological and psychological responses are screaming "eat more food!". This state is not conducive to a happy, productive life.
I won't dispute this. For some people, a calculated decision to remain overweight in today's world in order to focus on other things may be the best course of action.
Alternatively, if losing weight is that important to you, you can alter your environment so "today's world" doesn't make it so tempting to eat crappy foods. Your body can be screaming out "eat more food!" all it wants, but if you're living in a cabin in some remote corner of Alaska, there's only so much damage that can do.
Of course not, but you've contrived an odd corner-case that, in fact, doesn't exist in reality. I'm not sure what that goes to show.
Except that my counterfactual organism seems to more strongly resemble Eliezer Yudkowsky than does whatever model you're working from.
Oh come on. If Eliezer eats fewer calories than he expends, he's not going to die of hunger. I fully buy that will-power is a legitimate issue, but bringing up extreme cases like this to make your point doesn't enhance the conversation.
I can starve or think, not both at the same time.
I'm sure you've seen the psych research suggesting people have a finite amount of "willpower" they can exercise at a given time. It probably does make sense for some people to prioritize hard thinking (or other endeavors) over staying in top shape.
To expand on this:
Imagine a counterfactual organism that always preferentially stores X number of calories per day as fat, where X is equivalent to the calorie expenditure of running at top speed for over 24 hours, and does not increase muscle mass.
If the organism eats more than X calories, it gains weight. If it eats less than X calories, it will experience crippling lethargy and eventually die.
Obviously no such organism would be produced by natural selection, but assume the Least Convenient Possible World. Would advising such an organism "eat less, exercise more" enable it to lose weight?
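The thought experiment above can be put in toy-model form. This is a minimal sketch of the counterfactual organism, not a claim about real physiology: the 7,700 kcal/kg fat conversion is a standard rough estimate, and the value chosen for X is pure invention.

```python
KCAL_PER_KG_FAT = 7700  # rough standard estimate for body fat


def simulate(daily_intake_kcal, mandatory_storage_kcal, days):
    """Toy model: the organism always stores `mandatory_storage_kcal`
    (the X of the thought experiment) as fat before anything else.
    Eating below X is eventually lethal; no amount of dieting or
    exercise touches the stored fat."""
    fat_kg = 0.0
    for _ in range(days):
        fat_kg += mandatory_storage_kcal / KCAL_PER_KG_FAT
        if daily_intake_kcal < mandatory_storage_kcal:
            # Shortfall hits everything *except* fat storage.
            return fat_kg, "lethargy, then death"
    return fat_kg, "alive but heavier"


# X ~ a full day of running flat out: say 20,000 kcal (illustrative only)
print(simulate(2000, 20000, 30))   # "eat less" kills it; fat was still stored
print(simulate(25000, 20000, 30))  # eating above X just means steady gain
```

In this model the advice "eat less, exercise more" accomplishes nothing, because fat storage is decoupled from both intake and expenditure, which is exactly the point of invoking the Least Convenient Possible World.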
No... it... DOESN'T. I tried that. I ate a simple Paleo diet consisting of nothing but healthy foods; my staples were home-cooked turkey and bananas. I did it for months. I lost not a single pound.
You CANNOT BEGIN TO IMAGINE how much stuff that really truly seems like it ought to work simply DOES NOT WORK when you are metabolically disprivileged.
Are you saying it didn't work because it didn't curb your hunger or your desire for other, less healthy foods? Or it didn't work because you stuck to the diet of healthy foods and gained weight nonetheless? The latter seems hard to believe, though I suppose it's technically possible to accumulate an excess of calories via turkey and bananas...
I am questioning the value of diet and exercise. Thermodynamics is technically true but useless, barring the application of physical constraint or inhuman willpower to artificially produce famine conditions and keep them in place permanently. You, clearly, are one of the metabolically privileged, so let me assure you that I could try exactly the same things you do to control your weight and fail. My fat cells would keep the energy that yours release; a skipped meal you wouldn't notice would have me dizzy when I stand up; exercise that grows your muscle mass would do nothing for mine.
So, maybe staying thin requires Herculean effort for some. Why turn your back on that particular challenge? Elsewhere you seem to take a lot of pride in your determination to "save the world," which seems like no small feat. Don't try to lose weight -- lose weight!
Be careful about how you define those terms, as they may be idiosyncratic. "Rationalism" and "Empiricism" have long philosophical histories, and are typically seen as parallel, not-quite-rival schools of thought, with the rationalists striving to root all knowledge in a priori rational inquiry (Descartes' Meditations is the paradigm example). I'm not sure it's wise to flip that on its head by redefining such a common, well-denoted term.