Comment author: Yvain 23 March 2009 04:13:02PM *  7 points [-]

From a poster's perspective: it is very hard to tell which ideas your audience considers beginner-level and which they consider advanced-level. Especially when the audience is as diverse and self-selected as at LW. I've posted a few times asking "Hey, does everyone here already know X or not?" and I've rarely gotten the answer I expected.

Responses to my post last night ranged from "this is obvious" to "this is wrong" to "this acronym could be useful" to "this was one of my favorite posts yet". I don't quite know what to do with that. Right now I am erring on the side of caution; I'd rather write something obvious to everyone than skip an inferential distance somewhere.

Upvoting ought to be the main feedback mechanism here, but right now I worry that a well-written true (but obvious) article will get voted up just because it's well-written and true, and everyone figures it will probably help someone else. Maybe make a rule that you should not upvote a post unless it teaches you something? Or maybe end a post whose difficulty level you're not sure of with "Please rate this as too obvious, okay, or too hard"?

EDIT: It's also hard to remember if something has already been covered on Overcoming Bias (see: source confusion). There's not any nice list of Robin or the other writers' posts like there is of Eliezer's, is there?

Comment author: NQbass7 23 March 2009 08:04:48PM 5 points [-]

Right now I am erring on the side of caution; I'd rather write something obvious to everyone than skip an inferential distance somewhere.

That seems like the best policy to me, especially for a site like LW. Perhaps on OB that could be a concern, but here where it's so easy to avoid the posts you don't want to read or which aren't upvoted much, having redundant information doesn't seem like it would be too much of a problem.

Comment author: roland 21 March 2009 10:19:59PM 9 points [-]

My two bad decisions regarding motor vehicles, for example, could not have easily been outsourced to a group rationality mechanism[3].

Cars kill a LOT of people every month. One rational thing to do would be to simply restrict their use as much as possible and instead implement an efficient mass transit system (buses, trains, etc...). You seem to advocate the other route of making drivers more rational, but I think this approach is inherently flawed and limited. Consider probability: one million car drivers on the streets are going to have many more accidents than correspondingly fewer buses and trains.

Comment author: NQbass7 23 March 2009 05:57:41PM 2 points [-]

There are other concerns as well, such as individual freedom. If you randomly chose half the population and stuck them in padded rooms, you'd also reduce the number of car accidents. There's value in allowing people to make stupid decisions. What the OP is advocating is a way to prevent yourself from making stupid decisions in situations where you're allowed to make them.

Then again, maybe that's what this debate is about... whether we should help people individually be rational, or give incentives at a group level for being rational. But it seems to me that restricting the use of cars doesn't make people rational, it just takes away the freedom to make stupid choices.

Comment author: Cameron_Taylor 23 March 2009 12:31:17PM 7 points [-]

One of the things that I found most useful when I came across Overcoming Bias was the frequent linking through most of the posts. In fact, in the beginning I spent a couple of hours on each new post, clicking through the links and absorbing the required background knowledge.

Especially now that posting has become somewhat more eclectic, I can see the value of providing a reference to some of the key topics that we assume for so much of our discussions.

In the meantime, some may find this list of Eliezer's OB posts useful. The dependency graphs there are handy too.

Comment author: NQbass7 23 March 2009 01:45:25PM 6 points [-]

I would agree that heavy linking to background material is extremely helpful, but perhaps it would also be good to have a "Welcome to Rationality" page with a basic primer not just on what the site is for, but where you should start post-wise. Directing to the Most Frequently Useful Things and the Most Important Thing would be a good start, I would think.

Comment author: Daniel_Burfoot 12 March 2009 09:19:45AM *  8 points [-]

There are a couple of large gorillas in this room.

First, the examples of great scientists who were also religious show that you don't have to be an atheist to make great discoveries. I think the example of Isaac Newton is especially instructive: not only did Newton's faith not interfere with his ability to understand reality, it constituted the core of his motivation to do so (he believed that by understanding Nature he would come to a greater understanding of God). Faraday's example is also significant: his faith motivated him to refuse to work on chemical weapons for the British government.

Second, evidence shows that religious people are happier. Now, this happiness research is of course murky, and we should hesitate to make any grand conclusions on the basis of it. But if it is true, it is deeply problematic for the kind of rationality you are advocating. If rationalists should "just win", and we equate winning with happiness, and the faithful are happier than atheists, then we should all stop reading this blog and start going to church on Sundays.

There are subtleties here that await discovery. Note for example Taleb's hypothesis that the ancients specifically promoted religion as a way of preventing people from going to doctors, who killed more people than they saved until the 19th century. Robin made a similar point about the cost effectiveness of faith healing.

Comment author: NQbass7 12 March 2009 01:34:38PM *  1 point [-]

... shows that you don't have to be an atheist to make great discoveries.

I don't think anyone is making the argument that you do. Plenty of people get through life without basic rationality, and some even do interesting and amazing things. That's not an argument for being religious though - at best, it shows that in certain cases religion doesn't completely cripple your rationality. It's still a risk, however.

As for religious people being happier than atheists, ... In my experience, the average atheist is not at the basic level talked about in this post. Slightly more rational than the average theist I've met (and I tend to spend more time around the smarter, more rationalizing type of theist), but still not even close to this.

Obviously that's just anecdotal, so I wouldn't bet much on it at all, but it's enough for me to question the validity of applying murky evidence about happiness and religion to a discussion about teaching basic rationality. If anything, I would say that evidence seems to more strongly indicate that we should teach basic rationality, because leaving religion without it might make it more difficult to be happy.

Comment author: MBlume 28 February 2009 09:00:19PM 7 points [-]

Eliezer, I suspect you might find the answers to these questions less useful than you expect. The most useful things we've learned from you are probably going to be those things that we've already forgotten you wrote, because they've become a part of us -- because they've become background in how we live, how we think, and thus are completely invisible to us at any given time.

Comment author: NQbass7 01 March 2009 09:57:14PM 2 points [-]

Having particular names which may not be in common usage makes it easier for me to identify the things that I've picked up from OB that are now a part of me. Cached Thoughts, Inferential Distance, Mind-Projection Fallacy - those are all terms I use now when referring to things that are a part of me, but not many other people use those terms often.

Comment author: Ziphead 28 February 2009 09:13:09PM 15 points [-]

Expecting Short Inferential Distances

One of many posts that gave me a distinct concept for something I previously had been only vaguely aware of, and this one kept coming back to me all the time. By now, I don’t think it’s an extreme exaggeration to say that I make use of this insight every time I communicate with someone, and of all the insights I picked up from OB, this might be the one I most frequently try to explain to others. It doesn’t seem like the most important thing, but for some reason, it immediately struck me as the most frequently useful one.

Comment author: NQbass7 01 March 2009 09:53:22PM 2 points [-]

For me it's between inferential distance and cached thoughts, at least for ones I explain to other people. For ones I use myself, Line of Retreat is probably the one I actively pursue most frequently.

Though I end up using Absence of Evidence pretty often as well.

Comment author: NQbass7 27 February 2009 01:37:56PM 6 points [-]

I don't remember a time when I wasn't in some sense interested in rationality... but I can remember one time being at a bookstore and seeing Bertrand Russell's "Why I Am Not a Christian" (this being back when I was one) and thinking "Maybe I should read that and see what the other side says." I came home with it, and my mom saw it and asked why I would want to read that when it might make me doubt. I clearly remember thinking about it and responding with something along the lines of "If you don't know both sides, how could you possibly know which one is right? Wouldn't you rather be right than keep the same wrong beliefs?" I don't know that it was a turning point for me, but it was the first time I had really said that thought out loud (and it was probably the start of my deconversion and subsequent start down the road of rationality).

View more: Prev