orthonormal comments on Humans are not automatically strategic - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (266)
I'm disappointed at how few of these comments, particularly the highly-voted ones, are about proposed solutions, or at least proposed areas for research. My general concern about the LW community is that it seems much more interested in the fun of debating and analyzing biases, rather than the boring repetitive trial-and-error of correcting them.
Anna's post lays out a particular piece of poor performance which is of core strategic value to pretty much everyone - how to identify and achieve your goals - and which, according to me and many people and authors, can be greatly improved through study and practice. So I'm very frustrated by all the comments about the fact that we're just barely intelligent and debates about the intelligence of the general person. It's like if Eliezer posted about the potential for AI to kill us all and people debated how they would choose to kill us instead of how to stop it from happening.
Sorry, folks, but compared to the self-help/self-development community, Less Wrong is currently UTTERLY LOSING at self-improvement and life optimization. Go spend an hour reading Merlin Mann's site and you'll learn way more instrumental rationality than you do here. Or take a GTD class, or read a top-rated time-management book on Amazon.
Talking about biases is fun, working on them is hard. Do Less Wrongers want to have fun, or become super-powerful and take over (or at least save) the world? So far, as far as I can tell, LW is much worse than the Quantified Self & time/attention-management communities (Merlin Mann, Zen Habits, GTD) at practical self-improvement. Which is why I don't read it very often. When it becomes a rationality dojo instead of a club for people who like to geek out about biases, I'm in.
Interestingly, the people who seem most interested in the topic of instrumental rationality never seem to write a lot of posts here, compared to the people interested in epistemic rationality. Maybe that's because you're too busy "doing" to teach (or to ask good open questions), but I'm confident that's not true of all the I-Rationality crowd.
Of course, as an academic, I'm perfectly happy staying on the E-Rationality side.
Instrumental rationality is one of my primary interests here, but I don't post much -- the standard here is too high. All I have to offer is personal anecdotal evidence about various self-help / anti-akrasia techniques I tried on myself, and I always feel a bit guilty when posting them because unsubstantiated other-optimizing is officially frowned upon here. Attempting to extract any deep wisdom from these anecdotes would be generalizing from one example.
An acceptable way to post self-help on LW would be in the form of properly designed, properly conducted long-term studies of self-help techniques. However, designing and conducting such studies is a full-time job which ideally requires a degree in experimental psychology.
If that's true, we absolutely need to lower the bar for such posts. Three good sorts of posts that are not terribly difficult are: (1) a review of a good self-help book and what you personally took from it; (2) a few-sentence summary of an academic study on an income-boosting technique, a method for improving your driving safety, or other useful content, with a link to the same; or (3) a description of a self-intervention you tried and tracked the impact of, quantified-self style.
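For (3), a minimal sketch of what tracking a self-intervention might look like, quantified-self style. All names and numbers here are invented for illustration; a real write-up would use longer tracking periods and a proper significance test:

```python
# Hypothetical example: tracking one self-intervention (say, "meditate
# 10 minutes before work") against a baseline week. Data is made up.
from statistics import mean, stdev

baseline = [5, 6, 4, 5, 7, 5, 6]      # daily productivity self-ratings, week before
intervention = [6, 7, 7, 8, 6, 7, 8]  # same ratings, week of the intervention

def summarize(label, ratings):
    # Report the mean and spread of a week's ratings.
    print(f"{label}: mean={mean(ratings):.2f}, stdev={stdev(ratings):.2f}")

summarize("baseline", baseline)
summarize("intervention", intervention)
print(f"difference in means: {mean(intervention) - mean(baseline):+.2f}")
```

Even this crude before/after comparison is more evidence than a bare anecdote, and it gives readers something concrete to replicate.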
I have been thinking that LW really needs a categorization system for top-level posts; that would create a way to post on 'lighter' topics without feeling like you're not matching people's expectations.
Tags
Tags do not affect how the site is read by most people; some predefined categories, though, could be used to drive navigation.
I've had this very failure to communicate with Tom McCabe (so the evidence is mounting that the problem is with me, rather than all of you). Tags are categories, only with more awesome and fewer constraints. If "predefined categories can be used to drive navigation", then surely tags can be used to drive navigation, without having to be predefined.
Is the problem just that the commonly used tags need to be positioned differently in the site layout?
Tags are categories.
I think xamdam meant that there should be a category of "lighter" posts that people could opt out of (i.e., not see in their feed of new posts), so that they couldn't complain that those posts didn't live up to their expectations. Promotion already creates two tiers, but I'm not sure whether people read the front page or the new-posts feed.
Incidentally, I think people are using the tags too much for subject matter and not enough for indicating this kind of weight or type of post. For example, I don't see a tag for self-experimentation. If the tags were visible in the article editing mode, that would encourage people to reuse the same tags, which is important for making them function (though maybe retagging is the only way to go). If predefined tags were visible in the article editing mode, that would encourage posts on those topics; in particular, it could be used to indicate that some kinds of posts are acceptable, as in Anna's list above.
yes
Excellent (it was me).
Ideas in comments below:
When someone says they have anecdotes but want data, I hear an opportunity for crowdsourcing.
Perhaps a community blog is the wrong tool for this? What if we had a tool that supported tracking rationalist intervention efficacy? People could post specific interventions and others could report their personal results. Then the tool would allow for sorting interventions by reported aggregate efficacy. Maybe even just a simple voting system?
That seems like it could be a killer app for lowering the bar toward encouraging newcomers and data-poor interventions from getting posted and evaluated.
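A minimal sketch of how such a tool's core could work, assuming a simple thumbs-up/thumbs-down reporting model; the class and method names are hypothetical, and a real system would need accounts, spam resistance, and better statistics than a raw mean:

```python
# Hypothetical sketch of the proposed tool: users report whether an
# intervention helped them (+1) or not (-1), and interventions are
# ranked by aggregate reported efficacy. Names are illustrative only.
from collections import defaultdict

class InterventionTracker:
    def __init__(self):
        # Maps intervention name -> list of +1/-1 reports.
        self.reports = defaultdict(list)

    def report(self, intervention, helped):
        self.reports[intervention].append(1 if helped else -1)

    def ranking(self):
        # Sort by mean reported efficacy, breaking ties by report count.
        def score(item):
            votes = item[1]
            return (sum(votes) / len(votes), len(votes))
        return sorted(self.reports.items(), key=score, reverse=True)

tracker = InterventionTracker()
tracker.report("pomodoro", True)
tracker.report("pomodoro", True)
tracker.report("pomodoro", False)
tracker.report("GTD weekly review", True)
for name, votes in tracker.ranking():
    print(name, sum(votes) / len(votes))
```

Sorting by a bare mean over-ranks interventions with a single positive report, which is why the tie-break on report count (or a proper confidence-adjusted score) matters in practice.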
I think there is definitely some of that, and I've heard that from other LW "fringers" like myself - people who love the concept of rationality and support the philosophy of LW, but have no time to write posts because their lives are full to the brim with awesome projects.
One problem, I think, is that teaching and writing things up well and usefully is work. I spend time reading and writing blogs, and I do that in my "fun time" because it is fun. Careful writing about practical rationality would be work and would come out of my work time, which is very, very full. That suggests that to advance, we need people whose job it is to do this work. Which is part of what we see in the self-improvement world: people get paid to write books and run workshops, and while there is lots of crap out there, generally the result is higher-quality and more useful material.