Is there a LessWrong Index?
Pardon me, please, if this is not the way to go about asking such questions (it's all I know). Is this more for LessWrong itself, or for LessWrong Discussion?
Is there some kind of comprehensive organization by subject of LessWrong posts?
I know there are the sequences, but also a lot of other useful posts.
If I want to learn about learning, about lifespan extension, about charity work, about happiness, etc., is there a place I can go to view all relevant posts in each respective area?
Thanks much
Index of Yvain's (Excellent) Articles
Yvain is one of Less Wrong's best and most prolific writers, yet I suspect many Less Wrongers haven't read his posts. Here's an index of Yvain's articles (not including meta posts), ranked by upvotes, as in my post index:
- Generalizing from One Example (176)
- Diseased Thinking: Dissolving Questions about Disease (154)
- Eight Short Studies on Excuses (137)
- Confidence Levels Inside and Outside an Argument (97)
- The Least Convenient Possible World (96)
- Intellectual Hipsters and Metacontrarianism (94)
- The Apologist and the Revolutionary (92)
- Are Wireheads Happy? (89)
- Doing Your Good Deed for the Day (85)
- Efficient Charity: Do Unto Others... (78)
- Defense Against the Dark Arts: Case Study #1 (72)
- How to Not Lose an Argument (71)
- A Parable of Obsolete Ideologies (71)
- Extreme Rationality: It's Not That Great (70)
- Shut Up and Guess (64)
- Simultaneously Right and Wrong (62)
- Beware Trivial Inconveniences (61)
- The Trouble with 'Good' (60)
- That Other Kind of Status (57)
- Never Leave Your Room (54)
- Guilt: Another Gift Nobody Wants (52)
- Missing the Trees for the Forest (50)
- Techniques for Probability Estimates (50)
- The Mystery of the Haunted Rationalist (50)
- Offense Versus Harm Minimization (49)
- The Power of Positivist Thinking (46)
- When Truth Isn't Enough (44)
- Book Review: The Root of Thought (41)
- Conflicts Between Mental Subagents: Expanding Wei Dai's Master-Slave Model (40)
- Diplomacy as a Game Theory Laboratory (39)
- Typical Mind and Politics (39)
- The Skeptic's Trilemma (37)
- The 'Spot the Fakes' Test (36)
- Blue- and Yellow-Tinted Choices (36)
- Applied Picoeconomics (36)
- Antagonizing Opioid Receptors for (Prevention of) Fun and Profit (34)
- What I Tell You Three Times is True (33)
- Where's Your Sense of Mystery? (30)
- Would Your Real Preferences Please Stand Up? (30)
- Crowley on Religious Experience (29)
- Solutions to Political Problems as Counterfactuals (28)
- Why Real Men Wear Pink (28)
- Rationalist Poetry Fans, Unite! (27)
- Why Support the Underdog? (27)
- The Zombie Preacher of Somerset (24)
- Are You a Solar Deity? (24)
- The Implicit Association Test (23)
- More Thought on Assertions (22)
- Fight Biases, or Route Around Them? (22)
- Bogus Pipeline, Bona Fide Pipeline (19)
- Help, Help, I'm Being Oppressed! (19)
- It's Not Like Anything to Be a Bat (12)
- Framing Effects in Anthropology (1)
Is Atheism a failure to distinguish Near and Far?
The terms Near and Far are to be taken in the context of Robin Hanson's Near/Far articles.
I was reading a fairly convincing article, linked from a comment here, arguing that theistic beliefs are so scantly supported, when not outright contradictory, that it's doubtful whether anyone truly holds them at all. Of course there is a whole battery of explanations around the self-deception, signalling, and belief-in-belief cluster, but the question that got into my head was about the kinds of people who can or cannot profess to hold these beliefs.
A common thread in many a 'deconversion' story is that some inconsistency in a person's worldview comes to their attention, and they can't let go until they have undone the whole fabric of their belief system. But given that most people live happy, productive lives while nominally carrying around massively conflicted worldviews, what makes certain individuals incapable of this fairly common human feat?
So the hypothesis I'm considering is that the people who came to atheism this way are those who demand detailed consistency of their Far ideals. Alternatively, they could be those for whom what is normally considered Far is actually Near, in other words those with an unusually high Buxton Index. Combining the two, perhaps for people with a high Buxton Index, Far simply evaporates, as it comes under the scope of things that are relevant to a person's planning. (Edsger W. Dijkstra, when introducing the Buxton Index, says that "true Christians" have a Buxton Index of infinity. I think that couldn't be more wrong. Perhaps it is the case for singularitarians, though.)
The obvious reason to be suspicious of this idea is that it's very flattering to those who fall into this category, myself included. Rather than dithering about it, I'd rather expose it to the community and see if it seems to have legs in the eyes of others.