DSimon comments on How about testing our ideas? - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (113)
A young member, still learning, who calls reading papers "fun" without a second thought is already impressive progress compared to the epistemic attitude of most people around us, I'd say.
LW posters have noticed many times that the most instrumentally rational people, hailed for making the world better or at any rate leaving a mark on it (Page & Brin, Warren Buffett, Linus Torvalds, maybe Thiel; among politicians Gandhi, Churchill or Lee Kuan Yew - they wouldn't have got along! - and maybe some older figures like Alexander II of Russia or the people behind the Meiji Restoration...), rarely behave the way Eliezer or Traditional Rationality would want them to. They exploited peculiar factors: innate or unintentionally acquired advantages (genes, a lucky upbringing, broad life experience) that LW attempts to emulate through written advice and group meetings. Most of them haven't even heard of Bayes, or can't name a couple of fallacies! :)
At this stage, if an LW user actually uses the letter and spirit of LW materials to extract rent in some complicated, important area (education, career, interpersonal relations, "Luminosity", fighting akrasia) - well, that's a pleasant surprise, but one I'd assign a low prior probability. And some might not even pretend to heed the advice. E.g. my choice of education and career (the social sciences) directly contradicts the common LW wisdom that much of it is pure woo and will be made irrelevant in the transhuman world anyway. I can't even formulate a "rationalist" argument against that wisdom, beyond some vague guesses that principles of social organization and grand-scale value conflicts like farmers vs. foragers - what LW likes to dismiss as "politics" - might stay important even after we handle FAI, death or scarcity. For all the LW consensus knows, I might be insane for choosing to blow the next few years on empty talk instead of going the 80000 Hours route, or raising x-risk awareness by writing fiction, or something else suitably "rationalist".
Even our smallest real gains (openness, changing one's mind, "luminosity", intellectual rigor) are impressive, given just how ineffective or double-edged most deliberate attempts at instrumental rationality are. New Atheism, "pragmatic" politics (along the lines of moldbuggery), "PUA", theology-based intellectual traditions like the Jewish ones - all claim to make you wiser and more truth-oriented, and to equip you with better heuristics... yet all can have specific, awful, all-too-commonly-seen negative effects on their real audiences.
Ack, I just noticed some tribal blindness in myself here. Of the examples you list in your last paragraph:
I can immediately think of negative effects each of these ideas has on its audience, except for the first one, New Atheism. Of course (remarkable coincidence!) that happens to be the one I have a personal association with. Can you elaborate on the negative effects you were thinking of when you mentioned New Atheism?
The bad of New Atheism: Children playing with memetic weapons, with the safety off.
Impatience, overconfidence, a focus on signalling intelligence rather than persuading religious people, lack of empathy. Those are the problems that came immediately to mind when I thought about it. That's not to criticize all of New Atheism, though. I think I like the basic idea of it.
Yes, those all make sense, thank you.
And I am also a fan of at least a good subset of each of the other three examples. It's just good, as you say, to remember how fraught with nasty side effects this whole self-improvement thing can often be.