You may need to read and understand something like the Truth entry at the Stanford Encyclopedia of Philosophy to grok the context and meaning of The Simple Truth.
So, is he defending one of these positions, arguing against them all, or saying the whole debate is pointless?
From what I read he seems to be suggesting that truth is independent of what we believe, but I'm not sure what else he is saying, or what his argument is.
By the way, I still haven't heard an explanation of what "The Simple Truth" is about. Maybe that requires a whole separate post.
There's a bit of background information that might answer some of your questions.
The reason that the sequences written by Eliezer Yudkowsky don't feel like the product of a community blog is that they were originally written for Overcoming Bias, which was an ordinary blog with a limited set of contributors, and only later imported to LessWrong (which was created when those sequences were pretty much finished).
As far as I know, the story behind the sequences is that Eliezer Yudkowsky wanted to write a book about rationality. However, he had some trouble with it, so he decided to first write out all the material in the form of blog posts of lower quality than a finished book would have. He is now working on that book, and the last public statement about it that I'm aware of is somewhere in this interview.
Also, LessWrong has a wiki that's supposed to contain succinct summaries of the ideas presented in the articles, but I don't know how complete it is.
That helps explain a bit more why they are the way they are. But it suggests to me that they shouldn't play such a prominent role on the site, because they haven't been designed for the purpose they are now being used for.
I agree that some combination of rewriting and rearranging is called for. I think that most people here haven't thought of editing the sequences as allowed, because they're Eliezer's articles, promoted to high status. In actual fact, that page started as just a categorized list; using it as a suggested reading order came later, and it was never optimized for that purpose. The very notion of reading articles in category-sorted order is pretty stupid; it would be better to give new readers a taste of each of the topics Less Wrong has to offer, to maximize the chance that one of them will pull them in, and then go in depth about particular topics.
There should be particular emphasis on the first posts in a depth-first traversal, since those are what people will start with when told to read the sequences. The first article people will read, following this pattern, is The Simple Truth. And as a newcomer's introduction to Less Wrong, The Simple Truth is terrible. I mean, it's a good article, but it's much too long, indirect and sparse, and it's aimed at dispelling specific confused notions which most readers won't even have.
So let's fix it. Our goal is to choose the first few articles people read, in order to maximize the chance that they get hooked and keep reading. We can either pick high-quality historical posts, by any author, or write new articles specifically for this purpose. The very first article should get special attention, and be chosen carefully. After that, there should be one article about each major topic, before circling around to go into depth about any one topic. Many of the best articles for this purpose will come from the sequences, but there are also a lot of high-quality posts that aren't part of any sequence that should be considered. It's also probably a good idea to include a variety of authors, to make it clear that this is a community blog and not just Eliezer.
So please post (1) the one article that you think newcomers should read, to maximize the chance that they read more; and (2) articles you think should be in the first ten articles that a newcomer reads.
Thanks, this is a great suggestion; I think this approach would be more helpful.
"The normativity of logic is: “If you want to be speaking the same language as everyone else, don’t say things like ‘The ball is all green and all blue at the same time in the same way.’”"
You surely don't mean this: everyone else is logical, so why not me?
For a start, is everyone else logical? And even if they are, is that the best justification we have for logic?
"But philosophers still argue about ... theism ... as if these weren’t settled questions."
If this is really what you think, then why do you continue with your blog?
The science analogy is based on the assumption that an unbiased scientist is unable to choose what to do. How is this assumption justified? In situations where the analogy applies, realising that you have no time to think and must pick one random way to win the prize is a pretty standard rational decision, no bias involved.
You only have no time to think if your main priority is winning the prize. If you are interested in holding true beliefs, then you can take longer. However, our current system tends to reward those who get there first, not those who maximize their chances of being correct.
The former president of South Africa denied that HIV caused AIDS. Biases matter.
Ok. Clearly you only read the title, and not my actual post. I didn't say no biases matter, just that they might not always be a bad thing.
Having lots of people making leaps in different directions might also make science progress faster overall.
Yes, but some of this might be in the wrong direction. We have plenty of examples where scientists have gone with incorrect theories...
Of course. Most of it will be in the wrong direction; that's the point. It might not be best for you, but maybe it will be the best thing for the group.
Here are the main points I understood:
The only way you can be sure your mental map accurately represents reality is by allowing a reality-controlled process to draw your mental map.
A sheep-activated pebble-tosser is a reality-controlled process that makes accurate bucket numbers.
The human eye is a reality-controlled process that makes accurate visual cortex images.
Natural human patterns of thought like essentialism and magical thinking are NOT reality-controlled processes and they don't draw accurate mental maps.
Each part of your mental map is called a "belief". The parts of your mental map that portray reality accurately are called "true beliefs".
Q: How do you know there is such a thing as "reality", and your mental map isn't all there is? A: Because sometimes your mental map leads you to make confident predictions, and they still get violated, and the prediction-violating thingy deserves its own name: reality.
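The pebble-and-bucket parable behind the second point can be sketched as a toy simulation (the function name and event encoding here are my own, just to illustrate what "reality-controlled" means: the bucket's count is caused by the sheep themselves, not by anyone's opinion about the sheep):

```python
def reality_controlled_count(sheep_events):
    """Track sheep with pebbles: each sheep leaving the fold tosses a
    pebble into the bucket; each sheep returning removes one.  Because
    every change to the count is triggered by an actual sheep, the
    bucket stays an accurate map of how many sheep are out."""
    pebbles = 0
    for event in sheep_events:
        if event == "leave":
            pebbles += 1      # a sheep passes out of the fold
        elif event == "return":
            pebbles -= 1      # a sheep comes back
    return pebbles

# Reality: 7 sheep leave in the morning, only 6 come back at night.
events = ["leave"] * 7 + ["return"] * 6
print(reality_controlled_count(events))  # 1 sheep is still out in the field
```

If instead the shepherd just wrote down a number he felt confident about, the count could drift arbitrarily far from the flock; the point of the parable is that accuracy comes from the causal link, not from the shepherd's sincerity.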
Thanks, that helps a lot.