Academics already write textbooks, popular books, and articles intended for a lay audience.
Nevertheless, I think it's great if LW users want to compile and present facts that are well understood. I just don't think we have a strong comparative advantage there.
LW already has a reputation for exploring non-mainstream ideas. That attracts some people and repels others. If we tried to sanitize ourselves, we probably would not win back the people who have been repelled, and we might lose the interest of some of the people we've attracted.
A bit about our last few months:
We care a lot about AI Safety efforts in particular, and about otherwise increasing the odds that humanity reaches the stars.
Also, we[1] believe such efforts are bottlenecked more by our collective epistemology than by the number of people who verbally endorse or act on "AI Safety", or any other "spreadable viewpoint" disconnected from its derivation.
Our aim is therefore to find ways of improving both individual thinking skill and the modes of thinking and social fabric that allow people to think together, and to do this among the relatively small set of people tackling existential risk.
Existential wins and AI safety
Who we’re focusing on, why
Brier-boosting, not Signal-boosting