Many people are likely to stumble across the Wikipedia entries for topics of interest to those of us who frequent LessWrong: rationality, artificial intelligence, existential risks, decision theory, etc. These pages often shape one’s initial impressions of how interesting, important, or even credible a given topic is, and they have the potential to direct people towards productive resources (reading material, organizations like CFAR, notable figures such as Eliezer, etc.). As a result, bringing these entries up to a better standard than some of them presently meet is an opportunity to invest relatively little effort for a potentially substantial payoff.

I have already decided to improve some of the pages, beginning with the rather sloppy page that’s currently serving as the entry for existential risks, though of course others are welcome to contribute and may be more suited to the task than I am:

https://en.wikipedia.org/wiki/Risks_to_civilization,_humans,_and_planet_Earth

The section on risks posed by AI, for instance, is notably inadequate, while the page includes a bizarre section referencing Mayan doomsday forecasts and Newton's predictions about the end of the world, neither of which is adequately distinguished from rigorous attempts to assess legitimate existential risks.

I’m also compiling a list of other pages that are, or may be, in need of updating, organized by my rough estimates of their relative importance (which I’m happy to share, modify, or discuss).

Turning this into a collaborative effort would be far more effective than doing it myself. If you think this is a worthwhile project and want to get involved, I’d definitely like to hear from you and figure out how best to coordinate our efforts.


I have already decided to improve some of the pages, beginning with the rather sloppy page that’s currently serving as the entry for existential risks

Ten bonus points for Doing The Work. You have already avoided the most common way these projects fail.

If you are going to do this, please keep in mind Wikipedia's most relevant policies and guidelines in this context: The conflict of interest guideline, the Neutral point of view policy, and the prohibition on original research.

I will certainly keep those considerations in mind. I don't intend to make any substantial revisions without having them reviewed. I have no explicit interest in promoting any particular agenda, but I am of course aware that my interests and affiliations could lead me to have particular biases or to over- or underemphasize certain issues; that's another reason why having multiple eyes on these pages makes more sense than going it alone. I don't want these pages to reflect my take on an issue. I just want the pages to be better.

I suggest making a list of LW- or EA-relevant articles on a Talk page, so everyone can quickly add to it. Sort them in three categories:

  • high priority: stuff pretty much everyone agrees needs urgent work, based on quality and importance
  • mid priority: stuff that should be gotten to eventually, or anything whose priority a number of LWers disagree about
  • low priority: important but high-quality articles LWers should keep tabs on in case they degrade, and otherwise unimportant articles LWers might be especially knowledgeable about

Then start working through the high priority list systematically, focusing effort on one article at a time. The existential risks article you mention seems like a fine place to start. If there's a ton of interest, bring an article all the way up to Featured Article status and try to get it shown off on Wikipedia's main page, as a way of spreading important ideas. Otherwise, just try to make them better resources. Perhaps coordinate with (and/or take over) WikiProjects focused on: Mathematics, Statistics, Cognitive Science, Psychology, Computing, Technology, Transhumanism, Futures studies.

I was honestly surprised to learn that there's no overarching Altruism or Charity or Humanitarianism WikiProject. Seems like the most obvious void LWers can fill, the best place to put resources into developing GAs and FAs, a useful way to draw some Wikipedians (who already tend to be unorthodox altruists) to Effective Altruism, and a good framing mechanism for drumming up interest and goodwill in general.

If enough LWers want to get involved with Glacian's Wikipedia PR project, and enough agree that a WikiProject:Altruism is a useful way of organizing our collaboration, I'd propose that we have several editorial Task Forces for topics that aren't currently the focus of any group:

  1. Charities TF (with WP Organizations)
  2. Existential Risk TF (with WP Disaster Management, WP Futures Studies and WP Extinction)
  3. Life Extension TF (with WP Transhumanism, WP Medicine and WP Death)
  4. Mathematics of Altruism TF (with WP Mathematics and WP Economics)
  5. Psychology of Altruism TF (with WP Psychology)

We could also regularly team up with existing WikiProjects that already cover areas that would have been natural Task Forces for WP:Altruism, building ties to those Wikipedians and infusing their projects with more energy, direction, and humanpower:

  1. WP Animal Rights
  2. WP Education (Rationality Activism)
  3. WP Environment (Environmentalism)
  4. WP Feminism
  5. WP Human Rights
  6. WP International Development
  7. WP Philosophy: Ethical Theory TF
  8. WP Transhumanism

developing GAs and FAs

What are GAs and FAs?

FAs = Featured Articles, the most heavily vetted, consistently high-quality, comprehensive articles on Wikipedia. An FA is a candidate for appearing, once only and for a single day, on the main page of Wikipedia, which is an excellent way to advertise a topic you like. You also literally get a gold star to commemorate the success.

GA = Good Article. A bronze medal for articles that are pretty good, but not up to FA snuff. Kind of a silly honor, but a good benchmark to shoot for if you just want to polish things up without doing lots of new research and repeated rounds of vetting. Or if you want a waystation before approaching FA. WikiProjects also keep tabs on the general quality of articles in their scope at lower levels.

Ah, thanks!

Before going for an FA, keep in mind that FA status is very rarely granted, the process is absurdly nitpicky, and it's mostly given to people who've already earned FAs or have participated in the process for a long time. I tried going for an FA for one or two of my best-researched articles, and the process was so frustrating that I never tried again. And this was back in 2008 or so, when the process was more reasonable.

(GA isn't too great any more compared to its original form, but it's a lot more doable.)

I'm reading through the names of the FAs on Wikipedia.

It looks as though the FAs are heavily weighted towards highly regional events like hurricanes, highly local historical places, and very specific things. I'd hazard a guess that at least half of all FAs are written by people with a close personal or hobbyist connection to the subject.

The mathematics section is looking pretty empty, as is the computing section. Maybe they'd like another article for those?

Note that those are also highly delineated and uncontroversial topics, which means they can pass FA review easily. FA isn't so much about 'is this a great article?' as 'can we find any excuse not to make this an FA?'; hence, crabbed uninteresting topics. Hurricanes aren't very controversial.

So articles on those subjects aren't especially good?

They're good in a very stereotypical, narrow, uncontroversial, lowest-common-denominator sort of way.

Good articles and featured articles, respectively. Part of the Wikipedia system for grading article quality.

That sounds good. I'll do that. I may take slightly longer than someone experienced with Wikipedia (I need to familiarize myself a bit with the specifics, but it looks pretty darn easy, so that shouldn't be an issue) but unless someone more capable than me wants to run with this, I'll retain my commitment to doing it myself.

There are a number of useful ideas on this forum worth a Wikipedia article. However, I am not sure how acceptable it is to reference primarily LW (or another single source) in a Wikipedia entry.

On the other hand, some concepts have spread beyond the site. For example, the term "steelmanning" as a step up from the principle of charity would be a good candidate for an article, as it has a number of references outside this community.

Wikipedia's notability standards are much higher for blogs than for academic sources. (Mainstream journalism falls somewhere in between the two.) A concept might be cited on fifteen thousand blogs and still not be counted as 'noteworthy' to the same degree as a concept that's occurred in fifteen substantial academic texts. Citing published books, MIRI research papers, and use of concepts in mainstream journalism + academia is your best bet.

Also, err on the side of consolidating information into a single article (e.g., 'steel man' as a section of the 'straw man' article rather than as a stand-alone) and building a Redirect to the main article, rather than constructing separate articles for every little neologism.
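For what it's worth, a redirect is just a one-line wikitext page. A hypothetical 'Steel man' redirect pointing into the 'Straw man' article might look like the following (the section anchor here is made up for illustration and would need to match an actual section heading):

  #REDIRECT [[Straw man#Steelmanning]]

Anyone who searches for the neologism then lands on the consolidated article rather than a stub.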

I am not sure how acceptable it is to reference primarily LW (or another single source) in a Wikipedia entry.

It would count as a blog and would be unlikely to pass muster unless the author was quite famous.


Taxonomic characterization

The taxonomic characterization allowed Kuhn to postulate his no-overlap principle: if the taxonomic categories are divisions in a logical sense, then the relations established between these concepts and the rest are necessarily hierarchical. It is precisely because of this type of relationship that changes in categories are holistic, since modifying one category necessarily implies modifying the surrounding categories, which explains why, once the change takes place, the taxonomies are no longer comparable: they are not isomorphic.

This characterization was already present in Kuhn’s writing alongside remnants of the semantic characterization, and he developed it in full towards the end of the 1980s as his taxonomic characterization. An advantage of this characterization is the view that the criteria allowing a concept to be identified with its referents are many and varied, so that agreement on criteria is not necessary for successful communication, only agreement on the categories that are implicated. Kuhn saw the relations between concepts as existing in a multidimensional space; the categories consist of partitions of this space and must coincide between communicators, although the criteria that establish a connection between this space and the associated referents need not.


Lots of work to be done on hyperlinking and re-categorising LessWrong-relevant articles.