Less Wrong is based on reddit code, which means we can create subreddits with relative ease.
Right now we have two subreddits, Main and Discussion. These are distinguished not by subject matter, but by whether a post is the type of thing that might be promoted to the front page (e.g. a particularly well-composed and useful post) or not (e.g. a meetup announcement).
As a result, almost everything is published to Discussion, and thus it is difficult for busy people to follow only the subjects they care about. More people will be able to engage if we split things into topic-specific subreddits, and make it easy to follow only what they care about.
To that end, we're building the code for a Dashboard thingie.
But we also need to figure out which subreddits to create, and we'd like community feedback about that.
We'll probably start small, with just 1–5 new subreddits.
Below are some initial ideas, to get the conversation started.
Idea 1
- Main: still the place for things that might be promoted.
- Applied Rationality: for articles about what Jonathan Baron would call descriptive and prescriptive rationality, for both epistemic and instrumental rationality (stuff about biases, self-improvement stuff, etc.).
- Normative Rationality: for articles about what Baron would call normative rationality, for both epistemic and instrumental rationality (examining the foundations of probability theory, decision theory, anthropics, and lots of stuff that is called "philosophy").
- The Future: for articles about forecasting, x-risk, and future technologies.
- Misc: Discussion, renamed, for everything that doesn't belong in the other subreddits.
Idea 2
- Main
- Epistemic Rationality: for articles about how to figure out the world, spanning the descriptive, prescriptive, and normative.
- Instrumental Rationality: for articles about how to take action to achieve your goals, spanning the descriptive, prescriptive, and normative. (One difficulty with the epistemic/instrumental split is that many (most?) applied rationality techniques seem to be relevant to both epistemic and instrumental rationality.)
- The Future
- Misc.
Thank you for avoiding inferential silence!
I apologize for any confusion: my comment is directed rather at the people running LessWrong, because I am all too aware that people in charge are more or less entirely incapable of legitimately taking criticism from a crowd. That is, I intended to stand out in a way that can't be ignored by the mindset of a manager who happens to value rationality. (Also, I was in an extremely jaded mood when I wrote that. <.<)
It's not that you should want such a thing; it's rather that, if MIRI understood online community evolution, it would want to encourage the existence of such things. I wouldn't expect to get my own special "subreddit" (not the same as a website) for only me and people who understand me well or agree with me, and I'd probably refuse such an offer if it were made. If MIRI never existed, there could be no such "MIRI simply isn't good enough" subsection. MIRI is useful as an organization because CSER wasn't good enough. In the same way, I don't see MIRI as nearly good enough to accomplish its goals of ensuring a positive future, so I see fit to evolve past it and create a better organization. Whether or not anyone can provide such a thing, would you not be interested in seeing something more advanced, effective, and useful than MIRI?