A sequence is a series of posts on Less Wrong on the same topic, written to coherently and fully explore a particular thesis. See the Library page for a list of LessWrong sequences in their modern form.
From the old discussion page:
The Sequences page is probably the most important page on the Wiki. As such, speed of user experience is more important than a vague urge for abstract symmetry or consistency. Where the original sequence guides are blog posts, we want users - especially new users - to visit those original sequence guides immediately. We don't want to send them to a Wiki page that forwards to the sequence guide and force them to click twice or read the same content over again. User interface studies show that requiring one more click results in a significant drop-off in participation, and this is very debilitating when it comes to the Sequences page. We want the user to click and see something interesting and attractive as fast as possible.
Hence I'm rolling back various edits that increase the number of user clicks. I designed the page the way it is for a reason. --Eliezer Yudkowsky 18:43, 10 December 2009 (UTC)
Eliezer, that's causing problems. See the issue I've just moved from #215:
I've noticed that not all sequences have a page on the wiki. E.g. http://wiki.lesswrong.com/wiki/Sequences#The_Quantum_Physics_Sequence links to http://lesswrong.com/lw/r5/the_quantum_physics_sequence/ instead of a wiki page. I assume this is because the LW post already lists the articles in the sequence. This is a problem though as the article navigation code assumes there is a wiki page for each sequence and uses that to build the list of articles in the sequence. The result is articles in sequences without a wiki page don't have the sequence listed in the article navigation links. E.g. http://lesswrong.com/lw/r9/quantum_mechanics_and_personal_identity/
--Matt 04:04, 13 August 2010 (UTC)
As I've said on the tracker, it's not really so:
There is a wiki page for quantum physics sequence, it's just not linked from the main sequences page, because the post that is linked has abstracts, while the wiki page doesn't (and probably shouldn't, since it'll be just a copy-paste of what's already available). Here it is: http://wiki.lesswrong.com/wiki/The_Quantum_Physics_Sequence
The same goes for fun theory sequence: http://wiki.lesswrong.com/wiki/The_Fun_Theory_Sequence
--Vladimir Nesov 08:26, 13 August 2010 (UTC)
"Map and Territory contains some of the most important introductory posts and essays."
"If you don't read the sequences on Mysterious Answers to Mysterious Questions and Reductionism, nothing else on Less Wrong will make much sense."
This has bugged me somewhat since I first came here. "Map and Territory is 'introductory', suggesting that I'm best off reading it first, but at the same time it won't make much sense without MAMQ, but I should read it first for some reason anyway?". --Document 04:22, 6 January 2011 (UTC)
It wasn't until I'd already added links under each section heading that I noticed the "Alternative Formats" section at the bottom and discovered OneWhoFrogs's apparently more complete collection; sorry about that. I'd try to clean things up, but I'm up late, and I'm hoping it's still closer to ideal accessibility than before. (Edit: another thing that should be done is making links to summary posts more prominent, and possibly linking this.) --Document 08:18, 6 January 2011 (UTC)
I'm reverting all your edits, as they seem to be redundant (correct me if there's some material not included in the last section, I haven't actually checked; in that case, it should be added to the last section), and they distort the structure of the page (by interrupting the text that describes the sequences). --Vladimir Nesov 13:58, 16 January 2011 (UTC)
http://yudkowsky.net/rational/overcoming-bias notes that the posts may make more sense in chronological format, and for me, http://www.cs.auckland.ac.nz/~andwhay/postlist.html offers a better starting point than the "map and territory" sequence (i.e., it's just a bit more exciting to start reading). So I suggest that this be mentioned as an alternative on the wiki page, possibly in the introduction after the bit about "the most systematic way..."
Work in progress: User:Chriswaterguy #A communication sequence.
I don't know whether it should eventually be linked here, but it's possibly of interest to people besides myself. --Chriswaterguy (talk) 11:44, 8 December 2014 (AEDT)
__TOC__
Rationality: From AI to Zombies is an ebook collecting six books worth of essays on the science and philosophy of human rationality. It's one of the best places to start for people who want to better understand topics that crop up on Less Wrong, such as cognitive bias, the map-territory distinction, meta-ethics, and existential risk.
Castify makes selected Less Wrong content available as a podcast for a small fee (the posts are recorded by a professional voice actor). Currently they offer:
Promoted Posts:
Major Sequences:
Minor Sequences:
Essay:
The ebook can be downloaded on a "pay-what-you-want" basis from intelligence.org. There is also an audiobook version of the book available from Castify. Its six books in turn break down into twenty-six sections:
The original sequences were written by Eliezer Yudkowsky with the goal of creating a book on rationality. MIRI has since collated and edited the sequences into Rationality: From AI to Zombies. If you are new to Less Wrong, this book is the best place to start.
The most important technique that Less Wrong can offer you is actually updating on the evidence.
A guide to noticing motivated reasoning and overcoming confirmation bias.
The Machine in the Ghost. Essays on the general topic of minds, goals, and concepts.
Essays on science and the physical world.
The following collections of essays come from the original sequences, an earlier version of much of the material from Rationality: From AI to Zombies:
If you don't read the sequences on Mysterious Answers to Mysterious Questions and Reductionism, nothing else on Less Wrong will make much sense.
Long sequences that have been completed and organized into a guide.
A mega-sequence scattered over almost all of Less Wrong on the ultra-high-level penultimate technique of rationality: actually updating on the evidence.
Organized into eight subsequences.
The second core sequence of Less Wrong. How to take reality apart into pieces... and live in that universe, where we have always lived, without feeling disappointed about the Merely Real.
A non-mysterious introduction to quantum mechanics, intended to be accessible to anyone who can grok algebra and complex numbers.
Later sequences were written by people other than Eliezer Yudkowsky. Sequences of essays by Scott Alexander (Yvain) include:
Priming may be described as the capability of any random stimulus to commandeer your thinking and judgement for the next several minutes. Scared? Don't be. There exist ways to defend yourself against these kinds of intrusions, and there are even methods to harness them into useful testing mechanisms.
By Anna Salamon:
Decisions need to be modeled with some structure in order to be scrutinized and systematically improved; simply "intuiting" the answers to decision problems by ad-hoc methods is not conducive to thorough analysis. For this, we formulate decision theories. This sequence, themed with an analysis of Newcomb's problem, is a consolidated summary and context for the many decision theory discussions found on Less Wrong at the time of writing.
By Alicorn:
Luminosity, as used here, is self-awareness. A luminous mental state is one that you have and know that you have. It could be an emotion, a belief or alief, a disposition, a quale, a memory - anything that might happen or be stored in your brain. What's going on in your head?
And by Luke Muehlhauser (lukeprog):
This sequence summarizes scientifically backed advice for "winning" at everyday life: in one's productivity, in one's relationships, in one's emotions, etc. Each post concludes with footnotes and a long list of references from the academic literature.
This sequence explains how intuitions are used in mainstream philosophy and what the science of intuitions suggests about how intuitions should be used in philosophy.
This sequence explains and defends a naturalistic approach to metaethics.
By Kaj Sotala:
A sequence summarizing the content of Keith Stanovich's book What Intelligence Tests Miss.
An unfinished sequence summarizing the content of Robert Kurzban's book Why Everyone (Else) is a Hypocrite: Evolution and the Modular Mind.
Two abridged indexes of Eliezer's sequences are XiXiDu's guide and Academian's guide, the latter targeted at people who already have a science background.
The Sequences have been converted to eReader-compatible formats by several projects.
How to see through the many disguises of answers or beliefs or statements that remove curiosity without alleviating confusion.
Sequence guide: 37 Ways That Words Can Be Wrong
List of posts: A Human's Guide to Words
A series on the use and abuse of words; why you can't define a word any way you like; how human brains seem to process definitions. First introduces the Mind Projection Fallacy and the concept of how an algorithm feels from inside, which makes it a basic intro to key elements of the LW zeitgeist.
I think this page would be more useful if it linked to the individual sequences it lists.
As far as I've seen, there is no page that links to all sequences in order, which would be useful for working through them systematically.