If this has been discussed before, I ask for your patience and a pointer in the right direction.

I have been a lurker on LessWrong for a while, mostly just reading and only occasionally commenting. It didn't take long to realise that the sequences play a very important role in understanding much of what goes on here.

I have been trying to read them, but I keep getting frustrated. Apart from being extremely long, they are not easy to understand.

Take the first one I came to, "The Simple Truth".

It is a very long story, and the point is never really explained. Is it that truth is whatever helps you survive? If so, that seems obviously false.

It also took me quite a while to realise that all these posts were written by one person; that struck me as a bit odd for a "community" blog. So couldn't there be some work to improve the sequences while also making them more of a community effort?

Maybe:

* Some people could rewrite the key ones, and others could vote on them or suggest changes

* There could be summary posts alongside the sequences, listing their key claims

 

Any other suggestions?


I agree that some combination of rewriting and rearranging is called for. I think that most people here haven't thought of editing the sequences as allowed, because they're Eliezer's articles, promoted to high status. In actual fact, that page started as just a categorized list; using it as a suggested reading order came later, and it was never optimized for that purpose. The very notion of reading articles in category-sorted order is pretty stupid; it would be better to give new readers a taste of each of the topics Less Wrong has to offer, to maximize the chance that one of them will pull them in, and then go in depth about particular topics.

There should be particular emphasis on the first posts in a depth-first traversal, since those are what people will start with when told to read the sequences. The first article people will read, following this pattern, is The Simple Truth. And as a newcomer's introduction to Less Wrong, The Simple Truth is terrible. I mean, it's a good article, but it's much too long, indirect and sparse, and it's aimed at dispelling specific confused notions which most readers won't even have.

So let's fix it. Our goal is to choose the first few articles people read, in order to maximize the chance that they get hooked and keep reading. We can either pick high-quality historical posts, by any author, or write new articles specifically for this purpose. The very first article should get special attention, and be chosen carefully. After that, there should be one article about each major topic, before circling around to go into depth about any one topic. Many of the best articles for this purpose will come from the sequences, but there are also a lot of high-quality posts that aren't part of any sequence and should be considered. It's also probably a good idea to include a variety of authors, to make it clear that this is a community blog and not just Eliezer.

So please post (1) the one article that you think newcomers should read, to maximize the chance that they read more; and (2) articles you think should be in the first ten articles that a newcomer reads.


The order in which I would have liked to be introduced to Less Wrong (introduction, wisdom, insight, education, excitement, novelty, fun):

* Introduction
* Wisdom & Insight
* Education
* Excitement, Novelty, Fun
* More

I'm not sure Twelve Virtues of Rationality is the best place to start. To be honest, I was a bit confused reading it the first time, and it only made sense to me after I had spent some time on LessWrong getting used to Eliezer's writing style.

For myself (as I know it was for many others), I got here via Harry Potter and the Methods of Rationality. I'd say it's a great place to start many people off, but perhaps not the majority. Beyond that, what convinced me to start reading LessWrong was my interest in biases, and in particular being convinced that I, myself, am biased.

Thus I would propose starting off with a single post about some bias, especially one that convinces the reader that this is not an abstract experiment involving random test subjects. I think that Hindsight Devalues Science works excellently for this purpose, although it's obviously not written as an introductory essay.

Follow this up with some posts from Map and Territory, namely What Do We Mean by Rationality, What is Evidence, and The Lens That Sees Its Flaws, in that order, to give a basic introduction to what rationality actually is. This could be followed by one or two more posts from Mysterious Answers to Mysterious Questions, so why not start with the first two: Making Beliefs Pay Rent in Anticipated Experiences and Belief in Belief.

Now, you could finally digress to Twelve Virtues of Rationality and then maybe try your hand at the whole Map and Territory sequence (skipping over those posts you've already seen); alternatively, you could finish reading the Mysterious Answers to Mysterious Questions sequence first.

After this, I no longer provide any advice as to reading order. You could choose to follow the order provided by XiXiDu above. I provide the following as one order which would at least do better than picking articles at random:

Finish reading Mysterious Answers to Mysterious Questions if you haven't already done so.

The whole mega-sequence of How to Actually Change Your Mind contains a lot of pretty important stuff, but will take a while to read.

The rest of LessWrong. ;)


Summary:

And then follow with either of:

* Path a
* Path b


Which concludes my recommendation.

For anyone interested, I've made ebook variants for myself (epub, mobi, PDF, odt). They are far from awesome, but at least readable on an e-book reader. https://www.dropbox.com/sh/6agp4otiukejb0g/AACO-5V1J8i0USBWUFL9nw74a

For the record, it's not too far down my to-do list to write a few long posts that summarize certain sections of the LW canon but also say, "If you want the full argument and the full impact, read posts X, Y, and Z by Eliezer." The purpose of these posts would be to lower the entry bar for newcomers. Eliezer's books will do this, too, but lots of people do not read books even if they will read a book's worth of content in blog posts.

I suspect you'll see the first of my sequence-summarizing posts within a month.

Our goal is to choose the first few articles people read, in order to maximize the chance that they get hooked and keep reading.

I am doing a reordering of posts in the sequences with a different goal and according to a different method. I started yesterday.

I undertook to rewrite a book from the Harvard Negotiation Project as hyperlinks to LW articles with a few words in between, sort of in this style. It is not optimized to be an organized introduction to the material, but it does trace a book that was so designed. I hope to learn about the relationship between LW thought and HNP thought, and expect to do so from sections in which HNP has content that LW does not, or where LW would insert content. This is also a test of both thought systems.

So this would be for someone committed to reading more than one blog post, as the first will not be designed to stir curiosity for the second.

Thanks, this is a great suggestion; I think this would be more helpful.

The book Eliezer has been working on is the shorter, better version of the sequences.

I thought it was going to be longer and better, at least before he decided to split it into two books? Either way, looking forward to it.

No, shorter; the sequences as they are would be 2000 pages long in printed form.

My suggestion: LessWrong should have a prominently placed Random Article button.

My background: it would be hard for me to guess what fraction of the material I have read. I read some stuff, enthusiastically, when this community was at Overcoming Bias. Since then I have had one foot in this community. However, when I get enthusiastic, I find myself spending Far Too Long attempting to read the sequences; then I decide to put them aside again and do more productive things.

As a result, I am somewhat conflicted. I think the lesswrong philosophy is excellent. I link friends to lesswrong articles often, hoping that they will end up reading more than just the one article. Yet, for myself, I feel I "get the idea" and that reading old articles more systematically, while fun, is too hard and too time-consuming.

I get the urge to go back and read stuff every so often. Re-starting from the beginning of a sequence or the chronological list of articles is demoralizing, though; I'm just looking for something fun to read, not to start a project. Therefore, my proposal: a Random Article button.

My guess is that this would also be useful to other people not sure where to start.

(I understand that this would require a semi-significant amount of programming.)
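For what it's worth, a minimal sketch of what the core of such a feature might look like (purely illustrative, assuming a Flask-style app; the framework choice and the get_all_post_ids helper are my assumptions, not LessWrong's actual codebase):

```python
# Minimal sketch of a "random article" endpoint -- illustrative only.
import random

from flask import Flask, redirect

app = Flask(__name__)

def get_all_post_ids():
    # Hypothetical data-access helper; a real site would query its
    # posts table for published article IDs here.
    return ["p1", "p2", "p3"]

@app.route("/random")
def random_article():
    # Pick one published post uniformly at random and redirect to it.
    post_id = random.choice(get_all_post_ids())
    return redirect(f"/posts/{post_id}")
```

The interesting part is deciding which posts qualify (promoted only? above some karma threshold?), not the random pick itself.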

Oh, also:

More sorting options on the main blog and on pages for individual contributors would be nice. If I could see all posts (promoted, or of an individual) in chronological order (starting with the first, not the most recent), that would help.


There's a bit of background information that might answer some of your questions.

The reason that the sequences written by Eliezer Yudkowsky don't feel like the product of a community blog is that they were originally written for Overcoming Bias, which was an ordinary blog with a limited set of contributors, and were only later imported to LessWrong (which was created when those sequences were pretty much finished).

As far as I know, the story behind the sequences is that Eliezer Yudkowsky wanted to write a book about rationality. However, he had some trouble with it, so he decided to first write out all the material in the form of blog posts, of lower quality than a finished book would have. He is now working on that book, and the last public statement about it that I'm aware of is somewhere in this interview.

Also, LessWrong has a wiki that's supposed to contain succinct summaries of ideas presented in the articles, but I don't know how complete it is.

That helps explain a bit more why they are the way they are. But it suggests to me that they shouldn't play such a prominent role on the site, because they haven't been designed for the purpose they are now being used for.

Take the first one I came to, "The Simple Truth".

I agree that "The Simple Truth" is long and meandering. It would definitely benefit from being supplemented with a summary containing just the core claims and arguments, with all of the characters and satire stripped away.

But that article is not part of the sequences. It was written before Eliezer started blogging at Overcoming Bias (the blog from which LW branched off)*. It certainly isn't typical of the sequence posts. Almost none of them (if any) are anywhere close to that long.


*At least, I believe that this is so. Eliezer already mentions it in his third OB post.

I have noticed that my lack of math skills puts me at a great disadvantage. Telling people to go to Khan Academy and do all the exams is a bit extreme, but maybe someone would be able to write a convincing post about how truly understanding math helps further understanding in general, and on LessWrong specifically.

Other than that, I fully agree that we need a rewrite and reordering of the content to be more beginner-friendly. I am willing to spend time helping out here, but I do not know what to do.

Thanks for the reminder. I plan to rewrite Truly Part of You in a way that requires fewer prerequisites, since I think it's such a useful, universal heuristic that it nearly obviates a lot of the other material.

Take the first one I came to, "The Simple Truth".

What makes you think a single version will work for all audiences?

For example, the one you picked ("The Simple Truth") has been rewritten for people immersed in semiotics:

"Pragmatism and Umwelt-theory", by Alexei Sharov

and the result is clearer for them, but probably less clear for most people.

By the way,

I still haven't heard an explanation of what "The Simple Truth" is about. Maybe that requires a whole separate post.

Here Eliezer said:

"The Simple Truth" was generated by an exercise of this discipline to describe "truth" on a lower level of organization, without invoking terms like "accurate", "correct", "represent", "reflect", "semantic", "believe", "knowledge", "map", or "real".

or "correspond?"


You may need to read and understand something like the Truth entry at the Stanford Encyclopedia of Philosophy to grok the context and meaning of The Simple Truth.

So, is he defending one of these positions, or arguing against them all? Or saying the whole debate is pointless?

From what I read, he seems to be suggesting that truth is independent of what we believe, but I'm not sure what else he is saying, or what his argument is.

Here are the main points I understood:

The only way you can be sure your mental map accurately represents reality is by allowing a reality-controlled process to draw your mental map.

A sheep-activated pebble-tosser is a reality-controlled process that makes accurate bucket numbers (a toy sketch of this follows the list of points below).

The human eye is a reality-controlled process that makes accurate visual cortex images.

Natural human patterns of thought like essentialism and magical thinking are NOT reality-controlled processes and they don't draw accurate mental maps.

Each part of your mental map is called a "belief". The parts of your mental map that portray reality accurately are called "true beliefs".

Q: How do you know there is such a thing as "reality", and your mental map isn't all there is? A: Because sometimes your mental map leads you to make confident predictions, and they still get violated, and the prediction-violating thingy deserves its own name: reality.
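As a toy sketch of the pebble-tosser point above (my own illustration, not from the essay; all names are made up):

```python
# Toy model of a "reality-controlled process": the bucket's pebble count
# is driven by sheep actually crossing the gate, not by anyone's beliefs.

class PebbleBucket:
    def __init__(self):
        self.pebbles = 0  # the "map": one pebble per sheep currently out

    def sheep_leaves(self):
        # Update triggered by a real sheep passing the gate (the territory).
        self.pebbles += 1

    def sheep_returns(self):
        # Likewise triggered by the returning sheep itself.
        self.pebbles -= 1

bucket = PebbleBucket()
for _ in range(3):
    bucket.sheep_leaves()   # three sheep go out to pasture
bucket.sheep_returns()      # one comes back
assert bucket.pebbles == 2  # the count matches the sheep still outside

# A count set by guessing, rather than by sheep driving the updates,
# would not be reality-controlled and would drift away from the truth.
```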

Thanks, that helps a lot.

The only way you can be sure your mental map accurately represents reality is by allowing a reality-controlled process to draw your mental map.

Everything is reality, so that is a distinction that doesn't make a difference. All illusions and errors are produced by real processes. (Or is "reality" being used to mean "external reality"?)

The human eye is a reality-controlled process that makes accurate visual cortex images.

Sometimes. But being reality-controlled isn't a good criterion for when, since it is never false.

Natural human patterns of thought like essentialism and magical thinking are NOT reality-controlled processes and they don't draw accurate mental maps.

They are performed by real brains. If "reality-controlled" just means producing the right results, the whole argument is circular.

Each part of your mental map is called a "belief". The parts of your mental map that portray reality accurately are called "true beliefs".

Why is it important to taboo the words "accurate", "correct", "represent", "reflect", "semantic", "believe", "knowledge", "map", or "real", but not the word "correspond"?

By "reality-controlled", I don't just mean "external reality", I mean the part of external reality that your belief claims to be about.

Understanding truth in terms of "correspondence" brings me noticeably closer to coding up an intelligent reasoner from scratch than those other words do.

The simple truth is that brains are like maps, and true-ness of beliefs about reality is analogous to accuracy of maps about territory. This sounds super obvious, which is why Eliezer called it "The Simple Truth". But it runs counter to a lot of bad philosophical thinking, which is why Eliezer bothered writing it.

Understanding truth in terms of "correspondence" brings me noticeably closer to coding up an intelligent reasoner from scratch than those other words do.

If the correspondence theory cannot handle maths or morals, you will end up with a reasoner that cannot handle maths or morals.

The simple truth is that brains are like maps, and true-ness of beliefs about reality is analogous to accuracy of maps about territory.

You need to show that that simple theory also deals with the hard cases... because EY didn't.

But it runs counter to a lot of bad philosophical thinking, which is why Eliezer bothered writing it.

It's a piece of bad thinking that runs counter to philosophy. You don't show that something works in all cases by pointing out, however loudly or exasperatedly, that it works in the easy cases, where it is already well known to work.

Seems like first you objected that TST's lesson is meaningless, and now you're objecting that it's meaningful but limited and wrong. Worth noting that this isn't a back-and-forth argument about the same objection.

The rest of LW's epistemology sequences and meta-morality sequences explain why the foundations in TST also help understand math and morals.

I've read them, and, no, not really.

I think you can somewhat rescue the correspondence theory for math by combining claims like "this math is true" with claims like "this part of reality is well modeled by this math" to create factual claims. That approach should be enough for decision making. And you can mostly rescue the correspondence theory for morals by translating claims like "X is better than Y" into factual claims like "my algorithm prefers X to Y", since we have some idea of how algorithms might have preferences (to the extent they approximate vNM or something). I agree that both areas have unsolved mysteries, though.
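As a toy illustration of that translation for morals (my own sketch; the utility numbers are invented):

```python
# Toy illustration: read "X is better than Y" as the factual, checkable
# claim "my algorithm assigns X higher utility than Y" (vNM-style).

def utility(outcome: str) -> float:
    # Made-up utility assignment for illustration.
    return {"X": 2.0, "Y": 1.0}.get(outcome, 0.0)

def prefers(a: str, b: str) -> bool:
    # "a is better than b" becomes utility(a) > utility(b) -- a claim
    # about this algorithm rather than a free-floating "ought".
    return utility(a) > utility(b)

assert prefers("X", "Y")  # "my algorithm prefers X to Y", now factual
```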

I think you can somewhat rescue the correspondence theory for math by combining claims like "this math is true" with claims like "this part of reality is well modeled by this math" to create factual claims

Then you would have some other form of truth in place before you started considering what true maths corresponds to. Which may be what you mean by "somewhat rescue": embedding correspondence as one component of a complex theory. But no one is really saying that correspondence is 100% wrong; the debate is more about whether a simple theory covers all cases.

And you can mostly rescue the correspondence theory for morals by translating claims like "X is better than Y" into factual claims like "my algorithm prefers X to Y", since we have some idea of how algorithms might have preferences (to the extent they approximate vNM or something).

Why should I go to jail for going against your preferences... why not the other way round? Getting some sort of naturalised "should" or "ought" out of preferences is the easy bit. What you need to solve morality, to handle specifically moral oughts, is a way to resolve conflicts between the preferences of individuals.


I don't see what your central point is. Is it "the lessons that the author is attempting to teach in The Simple Truth are not positive contributions to add to one's philosophy"?

Without having your own central point, it's easy to just argue against each of my statements individually because they all have caveats.

[This comment is no longer endorsed by its author]