Many people are likely to stumble across the Wikipedia entries for topics of interest to those of us who frequent LessWrong: rationality, artificial intelligence, existential risks, decision theory, etc. These pages often shape one’s initial impressions of how interesting, important, or even credible a given topic is, and they can direct people towards productive resources (reading material, organizations like CFAR, notable figures such as Eliezer, etc.). As a result, improving the quality of the Wikipedia entries on these topics is an opportunity to invest relatively little effort for potentially substantial payoffs.
I have already decided to improve some of the pages, beginning with the rather sloppy page that’s currently serving as the entry for existential risks, though of course others are welcome to contribute and may be more suited to the task than I am:
https://en.wikipedia.org/wiki/Risks_to_civilization,_humans,_and_planet_Earth
The section on risks posed by AI, for instance, is notably inadequate, while the page includes a bizarre section referencing Mayan doomsday forecasts and Newton's predictions about the end of the world, neither of which is adequately distinguished from rigorous attempts to assess legitimate existential risks.
I’m also constructing a list of other pages that are, or may be, in need of updating, organized by my rough estimates of their relative importance (which I’m happy to share, modify, or discuss).
Turning this into a collaborative effort would be far more effective than doing it myself. If you think this is a worthwhile project and want to get involved, I’d definitely like to hear from you so we can figure out how best to coordinate our efforts.
I was honestly surprised to learn that there's no overarching Altruism or Charity or Humanitarianism WikiProject. Seems like the most obvious void LWers can fill, the best place to put resources into developing GAs and FAs (Good Articles and Featured Articles), a useful way to draw some Wikipedians (who already tend to be unorthodox altruists) to Effective Altruism, and a good framing mechanism for drumming up interest and goodwill in general.
If enough LWers want to get involved with Glacian's Wikipedia PR project, and enough agree that a WikiProject:Altruism is a useful way of organizing our collaboration, I'd propose that we have several editorial Task Forces for topics that aren't currently the focus of any group:
We could also regularly team up with existing WikiProjects that already cover areas that would have been natural Task Forces for WP:Altruism, building ties to those Wikipedians and infusing their projects with more energy, direction, and humanpower:
What are GAs and FAs?