patrissimo comments on Self-Improvement or Shiny Distraction: Why Less Wrong is anti-Instrumental Rationality - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Why I read less wrong:
Item D is the most important to me, but LessWrong has not been very successful at it. EY rarely gives the posts that I think are important along those lines the coveted green button, nor does the LW readership vote them up highly.
I think that the most important purpose LW could serve would be to critically analyze the ideas EY has put forth, and discuss different possible paths to a better future. But, AFAIK, EY has not given the green button to any posts that look at his ideas critically. Most readers never see posts that don't get the green button. So LW doesn't serve that purpose well.
Self-improvement for me from LW does not usually come from the akrasia stuff. pjeby's website is more interesting for that, at least what I've looked at so far. (I read "Everything I Needed To Know About Life, I Learned From Supervillains" yesterday, and recommend it.) It comes more in finding specific errors in my reasoning or holes in my understanding, and calibrating.
EY's sequences and early posts are very different from the usual self-improvement stuff. I think people would benefit more from reading the sequences than from staying current on all the new posts (yet I do the latter instead of the former). I know people aren't reading them, because he has some good posts (old ones, backdated to before LW existed; maybe they were imported from OB) with only a couple of upvotes.
Well, here we come to the gap between "the stated intention of Less Wrong" and "what people actually use it for". This is surely a big part of the resolution of the gap I pointed out. If people are not using LW to increase their own rationality, then the site should be clearer about that. Perhaps I misread "refining the art of human rationality": I assumed the goal was "by making humans more rational". But if the goal is just to sit around and have a delightful intellectual wankfest about the deep nature of rationality, in isolation from how people actually use their brains and abilities in real life, then the site is being consistent :).
But this doesn't seem consistent with Eliezer's claim that "rationalists win". I've seen enough of life to know that winners spend their time building their many different kinds of muscles, not chatting on web forums.