What science gets wrong, more science sets right. (What religion gets wrong, by way of contrast, more religion rarely sets right.)
-- Dan Savage, American Savage, p. 152
The long term discussed in that article is multiple generations, and there's still evidence there that wealth does transfer to children and beyond (e.g., the Swedish doctors). It says little about the relative efficacy of social programs vs. direct cash grants in alleviating poverty today.
It is comfortable for richer people to think they are richer because of the moral failings of the poor. And that justifies a paternalistic approach to poverty relief using vouchers and in-kind support. But the big reason poor people are poor is because they don’t have enough money, and it shouldn’t come as a huge surprise that giving them money is a great way to reduce that problem—considerably more cost-effectively than paternalism.
-- Charles Kenny, "For Fighting Poverty, Cash Is Surprisingly Effective", Bloomberg Businessweek, June 3, 2013
If there’s a single lesson that life teaches us, it’s that wishing doesn’t make it so. Words and thoughts don’t change anything. Language and reality are kept strictly apart—reality is tough, unyielding stuff, and it doesn’t care what you think or feel or say about it. Or it shouldn’t. You deal with it, and you get on with your life.
Little children don’t know that. Magical thinking: that’s what Freud called it. Once we learn otherwise we cease to be children. The separation of word and thing is the essential fact on which our adult lives are founded.
--Professor Fogg in The Magicians by Lev Grossman, p. 248
I suspect the answer is that grading at U.S. colleges just isn't that important.
I've experienced this as well, in different contexts. It's depressing to watch birders, and even more commonly bird photographers, trample on protected habitat just to get a better look at a bird. That said, there's perhaps a fallacy here. It is absolutely true that some people value their personal comfort and wealth over broader values like environmental protection or the general health of the population, at least some of the time. It is also true that some people choose broader values like environmental protection or the general health of the population, even at some cost to their personal comfort and specific wants, at least some of the time.
Neither statement is true of all people, all of the time. The real questions we should ask are:
1) How many people, how much of the time?
2) Which people? And why?
3) What can we do to require less specific sacrifice in favor of the general good?
These questions are better asked about very specific cases. For instance, you'll get different answers depending on whether you're talking about reducing marine speed limits in Florida to protect manatees or installing smokestack scrubbers on coal-fired power plants.
Talking in generalities often avoids the hard work of quantification on real world problems in favor of ideologically motivated displays of tribal allegiance.
I've learned useful things from the sequences and CFAR training, but it's almost all instrumental, not epistemic. I suppose I am somewhat more likely to ask for an example when I don't understand what someone is telling me, and the answers have occasionally taught me things I didn't know; but that feels more like an instrumental technique than an epistemic one.
Basically, because it seems to me that if people had really huge amounts of epistemic rationality + competence + caring, they would already be impacting these problems. Their huge amounts of epistemic rationality and competence would allow them to find a path to high impact; and their caring would compel them to do it.
I agree with this, but I strongly disagree that epistemic rationality is the limiting factor in this equation. Looking at the world, I see massive lack of caring. I see innumerable people who care only about their own group, or their own interests, to the exclusion of others.
For example, many people give to ineffective local charities instead of more effective charities that put their money to work in the developing world, because they care more about the park down the street than about differently colored refugees far away. People care more about other people who are closer to them and more like them than they do about different people further away. Change that, and epistemic rationality will take care of itself.
Solutions for the problems that exist in the world today are not limited by competence or epistemic rationality. (Climate change denial is a really good example: it's pretty obvious that denial is politically and personally motivated and that the deniers are performing motivated reasoning, not seriously misinformed. Better epistemic rationality will not change their actions because they are acting rationally in their own self-interests. They're simply willing to damage future generations and poorer people to protect their interests over those of people they don't care about.)
Anna's argument here is a classic example of the fallacy of assuming your opponents are stupid or misinformed: that they simply need to be properly educated and then everyone will agree. This is rarely true. People disagree and cause the problems that exist in the world today because they have different values, not because they see the world incorrectly.
To the extent that people do see the world incorrectly, it is because epistemic rationality interferes with their values and goals, not because poor epistemic rationality causes them to have the wrong values and goals. That is, a lack of caring leads to poor epistemic rationality, not the other way around.
This is why I find CFAR to be a very low-effectiveness charity. It is attacking the wrong problem.
Sometimes a writer has no choice but to hedge a statement. Better still, the writer can qualify the statement—that is, spell out the circumstances in which it does not hold rather than leaving himself an escape hatch or being coy as to whether he really means it. If there is a reasonable chance that readers will misinterpret a statistical tendency as an absolute law, a responsible writer will anticipate the oversight and qualify the generalization accordingly. Pronouncements like “Democracies don’t fight wars,” “Men are better than women at geometry problems,” and “Eating broccoli prevents cancer” do not do justice to the reality that those phenomena consist at most of small differences in the means of two overlapping bell curves. Since there are serious consequences to misinterpreting those statements as absolute laws, a responsible writer should insert a qualifier like on average or all things being equal, together with slightly or somewhat. Best of all is to convey the magnitude of the effect and the degree of certainty explicitly, in unhedged statements such as “During the 20th century, democracies were half as likely to go to war with one another as autocracies were.” It’s not that good writers never hedge their claims. It’s that their hedging is a choice, not a tic.
-- Steven Pinker, Why Academics Stink at Writing (Behind Paywall)
Sorry, but it is. Simple test: open a page and view source. Do you see HTML or do you see a big chunk of obfuscated JavaScript?
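To make that test concrete, here's a rough sketch of the two kinds of source you might see (hypothetical markup and file names, invented for illustration):

```html
<!-- Server-rendered page: the content is in the source itself,
     and the browser can start painting text immediately. -->
<!DOCTYPE html>
<html>
  <head><title>An example post</title></head>
  <body>
    <article>
      <h1>An example post</h1>
      <p>The text of the post is right here in the HTML.</p>
    </article>
  </body>
</html>

<!-- JavaScript-rendered page: the source is an empty shell; nothing is
     visible until the browser downloads, parses, and executes the bundle,
     which then builds the page. -->
<!DOCTYPE html>
<html>
  <head><title>Loading...</title></head>
  <body>
    <div id="app"></div>
    <script src="/static/bundle.min.js"></script>
  </body>
</html>
```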
Browsers today are wicked fast at rendering HTML. They are ungodly slow on anything that replaces HTML with JavaScript. A text-heavy site such as LessWrong is very well served by pure HTML with a small scattering of JavaScript here and there. LessWrong 1.0's markup isn't perfect (too many divs and spans, too little semantic markup), but it is much better designed for speed than 2.0's.
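As a hypothetical illustration of the "too many divs and spans" complaint (not LessWrong's actual markup), the same comment could be written either way; the semantic version gives the browser, screen readers, and stylesheets real structure to work with:

```html
<!-- Div soup: structure is only implied by class names. -->
<div class="cw">
  <div class="hd"><span class="au">author</span></div>
  <div class="bd"><span>Comment text here.</span></div>
</div>

<!-- Semantic markup: the same content, but the elements carry meaning. -->
<article class="comment">
  <header><cite>author</cite></header>
  <p>Comment text here.</p>
</article>
```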