billswift comments on Elitism isn't necessary for refining rationality. - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
I don't think people who feel comfortable posting average YouTube comments are going to be welcome or useful at LessWrong, and there are a lot of people like that. I don't think this is a problem.
Raising the sanity waterline on a grand scale should eventually affect the comments on YouTube, but we're a long way from that.
That being said, I'd like to see more rationality materials for people of average intelligence, but that's another long-term possibility. Not only does there not seem to be huge interest in the project, figuring out simple explanations for new ideas is work, and it seems to be a relatively rare talent.
I only recently ran into a good simple explanation for Bayes-- that the more detailed a prediction becomes, the less likely it is to be true. And I got it from a woman who doesn't post on LW because she thinks the barriers to entry are too high. (It's possible that this explanation was on LW, and I didn't see it or it didn't register-- has anyone seen it here?)
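That "more detail, less likely" framing is just the conjunction rule: P(A and B) = P(A) * P(B given A), which can never exceed P(A). A minimal sketch, using the classic Linda problem with probabilities invented purely for illustration:

```python
# Conjunction rule: adding detail to a prediction can never raise its
# probability, since P(A and B) = P(A) * P(B|A) <= P(A).
# The numbers below are made up for illustration, not real data.

p_bank_teller = 0.05            # P(Linda is a bank teller)
p_feminist_given_teller = 0.30  # P(Linda is a feminist | bank teller)

# Probability of the more detailed prediction:
p_both = p_bank_teller * p_feminist_given_teller

# The detailed claim is strictly less likely than the simple one.
assert p_both <= p_bank_teller
```

The conjunction fallacy is rating the detailed version as more probable anyway, because the extra detail makes the story feel more plausible.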
There's some degree of natural sorting on LW-- I'm not the only person who doesn't read the more mathematical or technical material here, and I'm not commenting on that material, either.
I don't think having separate ranked areas is going to solve the problem of people living down to expectations.
That looks like a good way of explaining the conjunction and narrative fallacies, too-- both can be seen as adding details to a simpler argument. I wonder what other fallacies could be "generalized" similarly?
One thing I think we should be working on is a way of organizing the mass of fallacies and heuristics. There are too many to keep straight without some sort of organizing principles.