I'd prefer more posts that aim to teach something the author knows a lot about, as opposed to an insight somebody just thought of. Even something less immediately related to rationality -- I'd love, say, posts on science, or how-to posts, at the epistemic standard of LessWrong. Also I'd prefer more "Show LessWrong" project-based posts.
Note: The following reflects my personal perceptions and feelings.
What bothers me is that Less Wrong isn't trying to reach the level of Timothy Gowers' Polymath Project, but at the same time acts as if it were on that level, showing no inclination to welcome lesser rationalists or less educated people who want to learn the basics.
One of the few people here who sometimes tries to actually tackle hard problems appears to be cousin_it. I haven't been able to follow many of his posts in full, but all of them have been very exciting and have actually introduced me to novel ideas and concepts.
Currently, most of Less Wrong is just boring. Many of the recent posts are superb, clearly written and show that the author put a lot of work into them. Such posts are important and necessary. But I wouldn't call them exciting or novel.
I understand that Less Wrong does not want to intimidate most of its possible audience by getting too technical. But why not combine both worlds by creating accompanying non-technical articles that explain the issue in question and at the same time teach people the maths?
I know that some people here are working on decision theoretic problems and other technical issues related to rationality. Why don't you talk about it here on Less Wrong? You could introduce each article with a non-technical description or write an accompanying article that teaches the basics that are necessary to understand what you are trying to solve.
Minor quibble: I think the terms "rational" and "irrational" (and "rationalist", "rationality", etc.) tend to be overused, sometimes as vague "good/bad" qualifiers (I've been guilty of that). As a rule of thumb, I'd recommend against using those terms unless
You're using them in a narrow technical sense, e.g. a rational utility-maximizing agent in economics, or
You're discussing "non-traditional mental skills" like changing one's mind, compensating for cognitive biases or dissolving confusion (i.e. not just being smart, open-minded and an atheist), or
You have above 10000 karma.
There's too much talk of Bayesianism in fuzzy conspiracy terms, and not enough "here's the maths; learn it."
1. Too much emphasis on "altruism" and treatment of "altruists" as a special class. (As opposed to the rest of us who "merely" enjoy doing cool things like theoretical research and art, but also need the world to keep existing for that to continue happening.) No one should have to feel bad about continuing to live in the world while they marginally help to save it.
2. Not enough high-status people, especially scientists and philosophers. Do Richard Dawkins and Daniel Dennett know about LW? If not, why not? Why aren't they here? What can we do about it? Why aren't a serious-looking design and the logo of an Oxford institute enough to gain credibility? (Exception that proves the rule: Scott Aaronson has LW on his blogroll, but he was reading OB before he was high-status, and so far as I am aware, hasn't ever commented on LW as opposed to OB.)
3. Too much downvoting for disagreement, or for making non-blatant errors.
4. It's not that there are too many meetup posts, it's that there are too few content posts by comparison.
5. I sometimes feel that LW is not quite nice enough (see point 3.). Visiting other internet forums quickly snaps me out of this and p...
Not enough high-status people, especially scientists and philosophers. Do Richard Dawkins and Daniel Dennett know about LW? If not, why not?
Well, to be blunt, arguing on public internet forums is not an effective way to accomplish anything much in practice. The only people who do it are those for whom the opportunity cost in time is low (and who are thus necessarily underachievers) and those who find it enjoyable enough to be worth the cost (but this is clearly negatively correlated with achievement and high status).
Also, arguing on the internet under one's real identity is a bad idea for anyone who isn't in one of these four categories: (1) those who already have absolute financial security and don't care what others will think of them, (2) those who instinctively converge towards respectable high-status opinions on all subjects, (3) those who can reliably exercise constant caution and iron self-discipline and censor themselves before writing anything unseemly, and (4) those who absolutely lack interest in any controversial topics whatsoever.
Not enough high-status people, especially scientists and philosophers.
High-status people tend to be those whose actions are optimized to maximize status. Participating on Internet forums is not an optimal way to gain status in general. (Of course it can be a good way to gain status within particular forums, but by high-status people you clearly meant people whose status is more widely recognized.)
(I disagree with Vladimir_M that "arguing on public internet forums is not an effective way to accomplish anything much in practice". In my experience it is a good way to get people interested in your ideas, further develop them and/or check them for correctness.)
Do Richard Dawkins and Daniel Dennett know about LW? If not, why not? Why aren't they here? What can we do about it? Why aren't a serious-looking design and the logo of an Oxford institute enough to gain credibility?
Probably not much we can do unless LW somehow gains widespread recognition among the public (but then we probably won't care so much about "not enough high status people"). I note that even the philosophers at FHI rarely participate here.
There are way too many amazing posts with very little karma and mediocre posts with large amounts of karma.
Not enough productive projects related to the site, like site improvements and art. The few that do show up get too little attention and karma.
Too much discussion about things like meetups, growing the community and converting people. Those things are important, but they don't belong on LW and should probably have their own site.
There is a category of religiously inspired posts that creep me out and set off cult alarms. It includes that post about staring from Scientology, the Transcendental Meditation stuff (which, while I found it interesting and perhaps useful, doesn't seem to belong on LW), and now, recently, these Mormon posts about growing organizations. shudder
There is a category of religiously inspired posts that creep me out and set off cult alarms. It includes that post about staring from Scientology, the Transcendental Meditation stuff (which, while I found it interesting and perhaps useful, doesn't seem to belong on LW), and now, recently, these Mormon posts about growing organizations. shudder
What I'm about to say has been said before, but it bears repeating. What exactly about all this stuff is setting off cult alarms for you? I had a similar problem with those posts as well, until I actually went and questioned the cult alarm in my head (which was a gut reaction) and realized that it might not be a rational reaction. Just because some scary group does something does not make it a bad thing, even if they're the only people that do it -- reversed stupidity is not intelligence. And a number of the suggested practices sound like good self-improvement techniques, free of religious baggage.
In general, when you're creeped out by something, you should try to figure out why you're being creeped out instead of merely accepting what the feeling suggests. Otherwise you could end up doing something bad that you wouldn't have done if you'd thought it through. Which is of course the basic purpose of the teachings on this site.
It is often said that one of our core values is being able to change your mind and admit when you are wrong. That process involves questioning. Am I wrong about X? As a community, we should not punish questioning, and yet my own experience suggests we do.
I find that Less Wrong is a conflation of about six topics:
These don't all seem to fit together entirely comfortably. Ideally, I'd split these into three more-coherent sections (singularitarianism and AI, philosophy and epistemic rationality, and applied rationality and community), each of which I think could probably be more effective as their own space.
If you try to do moral philosophy, you inevitably end up thinking a lot about people getting run over by trolleys and such. Also if you want to design good chairs, you need to understand people's butts really well. Though of course you're allowed to say it's a creepy job but still enjoy the results of that job :-)
I've heard that before, and I grant that there's some validity to it, but that's not all that's going on here. 90% of the time, torture isn't even relevant to the question the what-if is designed to answer.
The use of torture in these hypotheticals generally seems to have less to do with ANALYZING cognitive algorithms, and more to do with "getting tough" on cognitive algorithms. Grinding an axe or just wallowing in self-destructive paranoia.
If the point you're making really only applies to torture, fine. But otherwise, it tends to read like "Maybe people will understand my point better if I CRANK MY RHETORIC UP TO 11 AND UNCOIL THE FIREHOSE AND HALHLTRRLGEBFBLE"
There are a number of things that make me not want to self-identify as a lesswrong user, and not want to bring up lesswrong with people who might otherwise be interested in it, and this is one of the big ones.
As a relative newcomer I've found it quite hard to get a sense of the internal structure of Less Wrong and the ideas presented. Once you've looked at the 'top' posts and the obvious bits of the sequences, a lot of it is quite unstructured. Little things like references to the 'Bayesian conspiracy' or the paperclip AI turn up frequently without explanation, and are difficult to follow up on.
In roughly decreasing order of annoyance:
A varying degree of belief in utilitarianism (ranging from a confused arithmetic altruism to hardcore Benthamism) seems to be often taken for granted, and rarely challenged. The feeling I get when reading posts and comments that assume the above is very similar to what an atheist feels when frequenting a community of religious people. The fix is obvious, though: I should take the time to write a coherent, organised post outlining my issues with that.
A little Singularitarianism, specifically the assumption that self-improving AI = InstantGod®, and that donating to SIAI is the best possible EV for your cash. This isn't a big deal because they tend to be confined to their own threads. (Also, in the thankfully rare instance that someone brings up the Friendly AI Rapture even when it brings nothing to the conversation, I get to have fun righteously snarking at them, and usually get cheap karma too, perhaps from the other non-Singularitarians like me.) But it does make me feel less attached and sympathetic to other LessWrongers.
Of late, there's a lot of concern about what content should be on this site and about how to promote the site and its men...
I don't like meetup posts getting in the way of actually interesting content.
I don't like the heavily blog inspired structure - I want something more like a book of core ideas, and perhaps a separate forum for discussing them and extending the core. At the moment it's very hard to "work your way in".
It would be nice to know more about other users rather than just their karma.
Content seems quite light and of low value at the moment. I may well be contributing to this.
I don't like the overlap between SIAI and LW. I'd like a clearer distinction between the two projects even if the people are the same.
I miss MoR and wish EY would finish it.
I like being notified of valuable new content via email, it makes me sad LW doesn't offer this.
I am obsessed with group epistemology just enough to suggest the probably-bad idea that a much-better-written version of this post should maybe be posted to the main LW section, and maybe even once a month. This allows people who don't constantly check LW discussion to get in on the fun, and if we want to avoid evaporative cooling those are just the kind of people whose critiques we most want. Not only is this perhaps a good idea for group epistemology but it is also a good signal to the wider aspiring-to-sanity community.
We'll see what the response to this post is, and plan from there, or not.
(ETA: Perhaps a general (both positive and negative) feedback post with relatively lax comment quality expectations would be better; as User:atucker points out in another comment on this post, there is utility to be had in positive feedback.)
Before the discussion section was implemented, I envisioned it more as "episodes from rationality in everyday life" rather than the Top Posts Junior it has partly become. I think a large fraction of LWers are interested in discussing the low-hanging fruit of everyday life but have been discouraged by the response to those types of posts.
I dislike the discussion/main section divide. I have to check two places for recent comments/posts. Plus, every time I want to make a post I have to decide which section to post it in, and that seems to be a not insignificant mental cost. Actually I can't really tell which posts belong where, so I've ended up posting all of them in discussion "to be safe".
My chief complaint is that almost none of the other articles here are as engaging, compelling, or fun as Eliezer's sequences. Which I have finished reading. :(
I think that it would be helpful to make a "What do you like about Less Wrong" post. Mostly because phrasing things negatively frames them negatively, and knowing what people like is also helpful in making the site better.
I'd like to see more people questioning orthodox assumptions, and generally more radical arguments, yet without compromising LW standards of rigor. I feel like people are too afraid to stick their necks out and seriously argue something that goes against the majority/high-status views.
Ideally I'd like to see a version of the site where the upvotes and downvotes of people I tend to agree with affect what is displayed to me, rather than the upvotes and downvotes of everybody. I'd be happy if I saw votes made by 90% or maybe even 99% of the users, but there's a small minority of users whose votes I'd rather not see. These are people who use downvoting politically against specific users, instead of using it as a statement about specific articles.
Implementing this would require some math and perhaps significant CPU, so it may not be worthwhile.
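For what it's worth, here is a minimal sketch of what the math could look like. The data shapes and function names are assumptions for illustration, not anything LW actually exposes: weight each voter by how often they have agreed with you on posts you both voted on, then compute a personalized score from those weighted votes.

```python
# Hypothetical sketch of agreement-weighted karma; the vote dictionaries below
# are assumed representations, not LessWrong's actual internals.

def agreement_weights(my_votes, all_votes):
    """my_votes: {post_id: +1 or -1} for the viewing user.
    all_votes: {user_id: {post_id: +1 or -1}} for every other voter.
    Returns a weight in [0, 1] per voter: the fraction of shared posts
    on which that voter agreed with the viewing user."""
    weights = {}
    for voter, votes in all_votes.items():
        shared = set(votes) & set(my_votes)
        if not shared:
            weights[voter] = 0.5  # no overlap: neutral weight
        else:
            agreed = sum(votes[p] == my_votes[p] for p in shared)
            weights[voter] = agreed / len(shared)
    return weights

def personalized_score(post_id, all_votes, weights):
    """Karma for one post as seen by the viewing user: each vote is
    scaled by how much the viewer tends to agree with that voter."""
    return sum(
        weights.get(voter, 0.5) * votes[post_id]
        for voter, votes in all_votes.items()
        if post_id in votes
    )
```

The weights would presumably only need to be recomputed occasionally (say, nightly per active user), which is roughly where the "significant CPU" cost would land.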
One thing that bothers me about LW is that comments and discussion posts had their karma divided by ten (rather than some more moderate number). Surely that has to have taken away too much of the incentive to post good comments, as well as made karma less informative as the measure of general quality of thought that some are taking it for.
What bothers me is that the real agenda of the LessWrong/Singularity Institute folks is being obscured by all these abstract philosophical discussions. I know that Peter Thiel and other billionaires are not funding these groups for academic reasons -- this is ultimately a quest for power.
I've been told by Michael Anissimov personally that they are working on real, practical AI designs behind the scenes, but how often is this discussed here? Am I supposed to feel secure knowing that these groups are seeking the One Ring of Power, but it's OK because they'...
While I'm griping:
I have always been puzzled and somewhat disappointed by the reception of this post. Almost all the comments seemed to fall within the following two categories: either they totally didn't understand the post at all, or thought its main point was so utterly obvious that they had trouble understanding why I had bothered to write it.
There seemed to be very few people in the targeted intermediate group, where I myself would have been a year before: those for whom the main idea was a comprehensible yet slightly novel insight.
On the subject of powerful self-improving AI, there does not seem to be enough discussion of real-world limitations or chances for manual override on 1. the AI integrating computational power and, more importantly, 2. the AI manipulating the outside world with limited info and no dedicated or trustworthy manipulators, or manipulators weaker than a Nanotech God. I no longer believe that 1 is a major (or trustable!) limit on FOOM, since an AI may be run in rented supercomputers, eat the Internet, etc., but 2 seems not to be considered very much. I've seen some clai...
Or, what do you want to see more or less of from Less Wrong?
I'm thinking about community norms, content and topics discussed, karma voting patterns, et cetera. There are already posts and comment sections filled with long lists of proposed technical software changes/additions, let's not make this post another one.
My impression is that people sometimes make discussion posts about things that bother them, and sometimes a bunch of people will agree and sometimes a bunch of people will disagree, but most people don't care that much (or they have a life or something) and thus don't want to dedicate a post just to complaining. This post is meant to make it socially and cognitively easy to offer critique.
I humbly request that you list downsides of existing policies even when you think the upsides outweigh them, for all the obvious reasons. I also humbly request that you list a critique/gripe even if you don't want to bother explaining why you have that critique/gripe, and even in cases where you think your gripe is, ahem, "irrational". In general, I think it'd be really cool if we erred on the side of listing things which might be problems even if there's no obvious solution or no real cause for complaint except for personal distaste for the color green (for example).
I arrogantly request that we try to avoid impulsive downvoting and non-niceness for the duration of this post (and others like it). If someone wants to complain that Less Wrong is a little cultish without explaining why then downvoting them to oblivion, while admittedly kind of funny, is probably a bad idea. :)