Documents like these seem among the most important ones to get right.
If this is among the first essays a new user will see, then remember that they might have little buy-in to the site's philosophy and likely don't know any of the jargon. Furthermore, not all users will be native English speakers.
So my recommendations and feedback come from this perspective.
Regarding the writing:
Regarding the content:
Missing stuff:
Regarding the writing
Agreed for the most part. However, all of the things you mention are difficult to get right, and I presume it would take a good deal of the team's time to improve the writing quality. If so, the question becomes one of priorities: is it worth spending that time, or better to use it on something else? My impression is that it's probably worth spending a week or so on it and then iterating periodically for a few months afterwards in response to feedback.
Be more concise
I think you can actually be both concise and comprehensive. Have a "here's the quick version" section and then an "if you want more detail, here are the details" part that follows. Or maybe break it into two separate posts.
XKCD made the Simple Writer at one time
Thanks for pointing me to this. I had never seen it before and think it's so cool!
Here's how you can audition for a spot in our prestigious club
I don't get that impression.
Re: the section "How to get started": There must be some way for new users to actively participate that does not require hours or days of prep work.
I don't agree with that. Large requirements will definitely filter more people out, but it's not clear that that's a bad thing. Personally, my sense is that it's a good thing on balance.
Explain the karma system
This doesn't seem important enough to spend time on in this post. It seems more appropriate to have those questions addressed in the FAQs and perhaps have the post mention the FAQs as something to refer to.
I'd make it clear that while most people here aspire to be rational, we fail more often than not. The discourse level is generally significantly better than the average on Reddit, Twitter, or Discord, but certainly falls short of the ideals described in this draft.
I would also give a few examples of good/acceptable/bad posts and comments. Maybe link to some threads where people do it right, and where they do it wrong. I realize this is a lot of work, though.
Yeah, I wouldn't call my writing "rational", but it seems like I got rid of some bad habits that are frequent on other parts of the internet, and it is annoying when a new user unknowingly brings them here. I wish I could pinpoint them; that would probably be a useful list of do's and don'ts.
One such example is exaggeration. In many debates, exaggeration is a way to get attention. People are screaming at each other; you need to scream louder than average in order to be noticed. Here, hyperbole is more likely to make you seem stupid. We want a calibrated presentation of your case instead. If it is not the most important thing in the world, that is perfectly okay... unless you pretend that it is.
Similarly, humility. If you are not sure about something, that is okay as long as you admit it. The problem is when you write as if you are 100% sure of something, but make obvious mistakes.
Do not use CAPS LOCK, do not write clickbait titles... okay, this is probably obvious. It just seems to me that what the annoying stuff has in common is fighting for attention (by sacrificing to Moloch). The proper way to get attention is to write good content. -- Perhaps we should remind new users that there are mechanisms that reward this: in the short term, karma; in the long term, selecting the best articles of the year.
(Generally, trying to seem cool can backfire? Or maybe it's just because I am mostly noticing the unsuccessful attempts to seem cool?)
If you make a mistake, do not double down. Your article getting -10 karma should not motivate you to write three more articles on the same topic. You are not going to win that way. More likely, you will get banned.
We are not one convincing article away from joining your religion or your political cause.
One of the things I like least in comments is imputing (bad) motives: "Clearly you wrote this to X" or "Your purpose is to Y" etc.
Another thing I don't like is confidently paraphrasing what someone else said, in a way that's inevitably a misunderstanding or strawman. "You're clearly endorsing <politically disfavored concept, e.g. eugenics>. How dare you!". Trying to paraphrase others is good, if it's done in a spirit of curiosity: "Correct me if I'm wrong, but I understood you to say X.", or "As I understand it, you imply Y."
I like the idea of linking to concrete examples. If we go far enough into the archives, we presumably also aren't really making anyone particularly defensive by spotlighting some 10-year-old bad comment of theirs (and we should probably just quote it without linking to it).
The community believes there are ways of thinking such that, if you figure them out and adopt them, you can become a person who systematically arrives at true beliefs and good decisions more of the time than someone who didn't adopt those ways of thinking.
What does it matter what the community believes? This phrasing is a bit self-defeating; deferring to the community is not a way of thinking that helps with arriving at true beliefs and good decisions.
Also, I think references to what distinguishes rationality from truth and other good things would be useful in that section (those posts are not even in the original Sequences).
If you are joining a community and want to be accepted and welcomed, it matters what they believe, value, and are aiming to do. For that matter, knowing this might determine whether or not you want to be involved.
Or in other words, that line means to say "hey, this is what we're about"
I do like those posts quite a bit. Will add.
it matters what they believe
The phrasing is ambiguous between describing this fact and prescribing it, especially for new people joining the community, and the prescriptive reading is the connotation I'm objecting to. As an argument or way of thinking it's bad in connection with that sentence; the implication of its relevance in that particular sentence is incorrect. It's not bad to know that it's true, and it's not bad that it's true.
Notes:
To me it seems like a good idea to call out that we believe in a bunch of things that most people think are wacky. Intelligence explosion, cryonics, transhumanism, polyamory, circling. Better to filter out people who react strongly against those sorts of things from the get-go if you ask me.
I think it's good to point out that the LW audience is far more contrarian than the median, and that arguments from conformity or authority or public relations or the absurdity heuristic aren't received well. That said, I would not want to imply that there's a belief litmus test, and also expect that a significant fraction of LW members don't agree with / endorse / believe in at least one of these examples.
Agreed. However, I think you can sort of have your cake and eat it too here. I think you can:
I think 4 is a really good point, though; it didn't occur to me when I wrote my initial comment, so thanks for pointing it out. At the same time, I do still endorse the "filter out people who react strongly against it" part. If 1, 2, 3, and 4 are all made clear and someone, seeing that there's a lot of belief in wacky ideas, is still turned off, I expect that they wouldn't have been a good fit for the community anyway, so it's better to "fail fast".
I think it hits a lot of good notes, but I'm not sure if it's all of them we'd need, and at the same time, I'm worried it may be too long to hit a new user with all at once. I'm not sure what I'd cut. What would go in a TL;DR?
I maintain that the 12 Virtues of Rationality is a good summary but a poor introduction. They seemed pretty useless to me until after I had read a lot of the Sequences. Not beginner material.
Inferential distances and "scout mindset" might be worth mentioning.
I think Raising the Sanity Waterline (if you follow its links) is a great mini-Sequence on fundamentals. I'm not sure how much that overlaps with the Highlights, but it's probably shorter.
Hi, I’m a new user who stumbled across this so I figured it would be worth commenting. I came here via effective altruism and have now read a decent chunk of the Sequences so LW is not totally new to me as of reading this but still.
I definitely wish this introduction had been here when I first decided to take a look at LessWrong - it was a little confusing to figure out what the community was even supposed to be. The introductory paragraph is excellent for communicating what the core is that the community is built around, and the following sections seem super efficient at getting us up to speed on what LW is.
I find the How To Get Started section very confusing. It seems at first like a list of things you need to do before participating on the forum, but I guess it's supposed to be a rough progression of things you can do to become more of a LessWronger, considering it has "attend a meet-up" on there? The paragraph afterwards also doesn't make any sense to me - it says there's not a tonne but you'll probably be missing something on your first day… but it seems to me like it IS a tonne (the Sequences alone are really long!) and on your first day you won't have done ANY of them (except general reading). Maybe you meant to say that it's a list of possible things to get yourself clued up, but you don't need to do a ton of it?
Finally, I already commented with no idea it would be moderated so heavily, so including that info is definitely helpful - plus the information about standards of content is just generally super useful to know from the start anyway.
Overall this seems really good and gets the important questions answered quickly. Honestly there’s not anything I wish was there that isn’t, or anything that is there that seems unnecessary. Great work 👍
The LessWrong team is currently thinking a lot about what happens with new users: the bar for their contributions being accepted, how we deliver feedback and restrict low-quality contributions, and, most importantly, how we get them onboarded onto the site.
This is a draft of a document we'd present to new users to help them understand what LessWrong is about. I'm interested in early community feedback about whether I'm hitting the right notes here before investing a lot more in it.
This document also references another post that's more of a list of norms, akin to Basics of Rationalist Discourse, though (1) I haven't written that yet, and (2) I'm much less certain about the shape or nature of it. I'll share a post or draft about that soon too.
This document is aimed at new users but may also be a useful reference for established users. It elaborates on the about page.
The Core of LessWrong: Rationality
LessWrong is an online forum and community built around the goal of improving human reasoning and decision-making. The community believes there are ways of thinking such that, if you figure them out and adopt them, you can become a person who systematically[1] arrives at true beliefs and good decisions more of the time than someone who didn't adopt those ways of thinking. Around here, the short word for "systematically arriving at truth, etc." is rationality, and that's at the core of this site.
More than that, the LessWrong community shares a culture that encodes a bunch of built-up beliefs, opinions, concepts, and values about how to reason better. These give LessWrong a style pretty distinct from the rest of the internet.
Some of the features that set LessWrong apart:
Philosophical Heritage: The Sequences
Between 2006 and 2009, Eliezer Yudkowsky spent two years writing a sequence of blog posts that shared his philosophy/beliefs/models about rationality (collectively those blog posts are called The Sequences). In 2009, Eliezer founded LessWrong as a community forum for the people who were attracted to his ideas and worldview.
While not everyone on the site agrees with everything Eliezer says, The Sequences (also known as Rationality: AI to Zombies) is the foundational cultural/values document of LessWrong. To understand LessWrong and participate well (and also for your own reasoning ability), we strongly encourage you to read the Sequences.
Topics other than Rationality
We are interested in rationality not for the sake of rationality alone, but because we care about lots of other things too. LessWrong has rationality as a central focus, but site members are interested in discussing an extremely wide range of topics, albeit using our rationality toolbox/worldview.
Artificial Intelligence
If you found your way to LessWrong recently, it might be because of your interest in AI. For several reasons, the LessWrong community has a strong interest in AI, and specifically in causing powerful AI systems to be safe and beneficial.
Even if you found your way to LessWrong because of your interest in AI, it's important for you to be aware of the site's focus on rationality, as this shapes expectations we have of all users in their posting, commenting, etc.
How to get started
<TO-DO>
not necessarily a tonne of this, but if it's your first day on LessWrong, you'll be missing <something>
</TO-DO>
How to ensure your first post or comment is approved
This is a hard section to write. The new users who need to read it least are the most likely to spend time worrying about the below, and those who need it most are likely to ignore it. Don't stress too hard. If you submit something and we don't like it, we'll give you some feedback.
A lot of the below is written for the people who aren't putting in much effort at all, so we can at least say "hey, we did give you a heads up in multiple places".
There are a number of dimensions along which content submissions may be strong or weak. Strength in one place can compensate for weakness in another, but overall the moderators assess each first post/comment from new users on the following. If the first submission is lacking, it might be rejected and you'll get feedback on why.
Your first post or comment is more likely to be approved by moderators (and upvoted by general site users) if:
You demonstrate understanding of LessWrong rationality fundamentals. These are the kinds of things covered in The Sequences such as probabilistic reasoning, proper use of beliefs, being curious about where you might be wrong, avoiding arguing over definitions, etc.
You write a clear introduction. If your first submission is lengthy, i.e. a long post, it's more likely to get quickly approved if the site moderators can quickly understand what you're trying to say rather than having to delve deep into your post to figure it out. Once you're established on the site and people know that you have good things to say, you can pull off having a "literary" opening that doesn't start with the main point.
Address existing arguments on the topic (if applicable). Many topics have already been discussed at length on LessWrong, or have an answer strongly implied by core content on the site, e.g. the Sequences (which have considerable relevance to AI questions). Your submission is more likely to be accepted if it's clear you're aware of prior relevant discussion and are building upon it. It's not a big deal if you weren't aware; there's just a chance the moderator team will reject your submission and point you to relevant material.
This doesn't mean that you can't question positions commonly held on LessWrong, just that it's a lot more productive for everyone involved if you're able to respond to or build upon the existing arguments, e.g. showing why you think they're wrong.
Address the LessWrong audience
A recent trend is more and more people crossposting from their personal blogs, e.g. their Substack or Medium, to LessWrong. There's nothing inherently wrong with that (we welcome good content!), but many of these posts neither strike us as particularly interesting or insightful, nor demonstrate an interest in LessWrong's culture/norms or audience (as revealed by a very different style and not really responding to anyone on the site).
It's good (though not strictly necessary) when a post is written for the LessWrong audience and shows that by referencing other discussions on LessWrong (links to other posts are good).
Aim for a high standard if you're contributing on the topic of AI
As AI becomes higher and higher profile in the world, many more people are flowing to LessWrong because we host discussion of it. In order not to lose what makes our site uniquely capable of making good intellectual progress, we have particularly high standards for new users showing up to talk about AI. If we don't think your AI-related contribution is particularly valuable and it's not clear you've tried to understand the site's culture or values, then it's possible we'll reject it.
A longer list of guidelines on LessWrong can be found here [Link]
Don't worry about it too hard.
It's okay if we don't like your first submission; we can just give you feedback. In many ways, the bar isn't that high. As I wrote above, this document exists so that not being approved on your first submission doesn't come as a surprise. If you're writing a comment and not a 5,000-word post, don't stress about it.
If you do want to write something longer, there is a much lower bar for open threads, e.g. the general one [link] or AI one [link]. That's a good place to say "I have an idea about X, does LessWrong have anything on that already?"
Helpful Tips <to-do>
FAQ
Intercom
OpenThreads
LessWrong moderator's toolkit.
This means you won't necessarily do better on every occasion, but that on average you will.
As opposed to beliefs being for signaling group affiliation and having pleasant feelings.