Less Wrong is a large community of very smart people with a wide spectrum of expertise, and I think relatively little of that value has been tapped.
Like my post The Best Textbooks on Every Subject, this is meant to be a community-driven post. The first goal is to identify topics the Less Wrong community would like to read more about. The second goal is to encourage Less Wrongers to write on those topics. (Respecting, of course, the implicit and fuzzy guidelines for what should be posted to Less Wrong.)
One problem is that those with expertise on a subject don't necessarily feel competent to write a front-page post on it. If that's the case, please comment here explaining that you might be able to write one of the requested posts, but you'd like a writing collaborator. We'll try to find you one.
Rules
You may either:
- Post the title of the post you want someone to write for Less Wrong. If the title itself isn't enough to specify the content, include a few sentences of explanation. "How to Learn a Language Quickly" probably needs no elaboration, but "Normative Theory and Coherent Extrapolated Volition" certainly does. Do not post two proposed post titles in the same comment, because that will confuse voting. Please put the title in bold.
- or: Vote for a post title that has already been suggested, indicating that you would like to read that post, too. Vote with karma ('Vote Up' or 'Vote Down' on the comment that contains the proposed post title).
I will regularly update the list of suggested Less Wrong posts, ranking them in descending order of votes (like this).
The List So Far (updated 02/11/11)
- (35) Conversation Strategies for Spreading Rationality Without Annoying People
- (32) Smart Drugs: Which Ones to Use for What, and Why
- (30) A Survey of Upgrade Paths for the Human Brain
- (29) Trusting Your Doctor: When and how to be skeptical about medical advice and medical consensus
- (25) Rational Homeschool Education
- (25) Field Manual: What to Do If You're Stranded in a Level 1 (Base Human Equivalent) Brain in a pre-Singularity Civilization
- (20) Entrepreneurship
- (20) Detecting And Bridging Inferential Distance For Teachers
- (19) Detecting And Bridging Inferential Distance For Learners
- (18) Teaching Utilizable Rationality Skills by Exemplifying the Application of Rationality
- (13) Open Thread: Offers of Help, Requests for Help
- (13) Open Thread: Math
- (12) How to Learn a Language Quickly
- (12) True Answers for Every Philosophical Question
- (10) The "Reductionism" Sequence in One Lesson
- (10) The "Map and Territory" Sequence in One Lesson
- (10) The "Mysterious Answers to Mysterious Questions" Sequence in One Lesson
- (10) Lecture Notes on Personal Rationality
- (10) The "Joy in the Merely Real" Sequence in One Lesson
I thought I just explained it in the same paragraph and in the parenthetical. Did you read those? If so, which claim do you find implausible or irrelevant to the issue?
The purpose of my remarks following the part you quoted was to clarify what I meant, so I'm not sure what to do when you cut that explanation off and plead incomprehension.
I'll say it one more time in a different way: you make certain assumptions, both in the background and in your language, when you claim that "100 angels can dance on the head of a pin". As those assumptions turn out to be false, they lose importance, and you are forced to ask a different question with different assumptions, until you're no longer answering anything like "Do humans have free will?" or any question about angels. Both your terms and your criteria for deciding when you have an acceptable answer have changed so as to render the original question irrelevant and meaningless.
(Edit: So once you've learned enough, you no longer care if "Do humans have free will?" is "true", or even what such a thing means. You know why you asked about the phenomenon you had in mind with the question, thus "unasking" the question.)
I looked at the list of theories of truth you linked, and they don't seem to address (or be robust against) the kind of situation we're talking about here, in which the very assumptions behind claims are undergoing rapid change and necessitating changes to the language in which those claims are expressed. The pragmatic theory (#2) sounds closest to the standard by which I'm judging answers to philosophical questions, though.
Thanks, that's actually much clearer to me.
But can't that knowledge be expressed as a truth in some language, even if not the one I used when I first asked the question? To put it another way, if I'm to be given confusion-extinguishing answers, I still want them to be true answers, because surely there are false answers that would also extinguish my confusion (since I'm human and flawed).