History of Less Wrong

Less Wrong is a community resource devoted to refining the art of human rationality, sometimes known as rationalism. It was founded in 2009. Site activity reached a peak in 2011-13 and arguably a trough in 2016-17. This page mainly describes the history through 2016.

The view from 2016

As of 2016 the community is far less active than it once was. The forum's karmic barrier to entry stands, but submissions are down. The wiki has low traction and is potentially in need of streamlining around remaining activity rather than its former glories.

LessWrong developed from Overcoming Bias, an earlier group blog focused on human rationality, which began in November 2006, with artificial intelligence theorist Eliezer Yudkowsky and economist Robin Hanson as the principal contributors.

Prior to that, around 2001, Yudkowsky had created the SL4 mailing list and IRC channel, and on them he frequently expressed annoyance, frustration, and disappointment at his interlocutors' inability to think in ways he considered obviously rational. After failed attempts at teaching people to use Bayes' Theorem, he went largely quiet on SL4 to work on AI safety research directly. After discovering he was not able to make as much progress as he wanted, he changed tack to focus on teaching the rationality skills necessary to do AI safety research, until such time as there was a sustainable culture that would allow him to focus on AI safety research while also continuing to find and train new AI safety researchers.

In February 2009, Yudkowsky's posts were used as the seed material to create the community blog LessWrong, and Overcoming Bias became Hanson's personal blog. Some users were recruited via Eliezer's transhumanist SL4 mailing list (https://hpluspedia.org/wiki/SL4).

The site uses Reddit-style voting infrastructure. The ongoing community blog materials led to significant growth and interest over the following years. At its peak the site had over 15,000 pageviews a day.

Around 2013, many core members of the community stopped posting on Less Wrong, because of both increased growth of the Bay Area physical community and increased demands and opportunities from other projects. MIRI's support base grew to the point where Eliezer could focus on AI research instead of community-building, CFAR (the Center for Applied Rationality) worked on the development of new rationality techniques and rationality education mostly offline, and prominent writers left for their own blogs, where they could develop their own voice without asking whether it was within the bounds of Less Wrong. Collectively, some of this diaspora forms the 'rationalist movement'.

Some other prominent ideas to grow out of the Less Wrong community (through the actions of its members) include:

Less Wrong is still active, and activity can also be found in the diaspora communities:

Diaspora

Eliezer's withdrawal

At some time around June through October 2013, Eliezer stopped participating directly in the Less Wrong community, as did a number of other established contributors. This triggered many contributors removing themselves to their own blogs and communities, in what is commonly known as the Less Wrong diaspora.