
Comment author: WalterL 01 December 2014 08:30:37PM 42 points [-]

The race is not always to the swift, nor the battle to the strong, but that's the way to bet.

-Damon Runyon

Comment author: hhadzimu 03 December 2014 11:13:25PM 16 points [-]

Damon Runyon clearly has not considered point spreads.
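Concretely, with invented numbers: the bookmaker sets the spread to handicap the favorite until backing the strong is no longer an edge. A minimal sketch (the probabilities and the -110 payout here are illustrative, not from any actual book):

```python
# Toy numbers (invented): a strong favorite, and a spread bet at
# standard -110 odds. The bookmaker sets the line so that the
# favorite only covers about half the time.

p_wins_outright = 0.80    # the strong team usually wins the game...
p_covers_spread = 0.50    # ...but covers the spread only ~50% of the time
payout_per_dollar = 0.91  # -110 odds: risk $1.00 to win $0.91

# Betting the favorite straight-up at even odds would be lucrative:
ev_even_odds = p_wins_outright * 1.00 - (1 - p_wins_outright) * 1.00

# Betting the favorite against the spread is not:
ev_vs_spread = p_covers_spread * payout_per_dollar - (1 - p_covers_spread) * 1.00

print(f"EV per $1, favorite at even odds:   {ev_even_odds:+.3f}")  # +0.600
print(f"EV per $1, favorite vs the spread:  {ev_vs_spread:+.3f}")  # -0.045
```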

Comment author: Jack 06 June 2009 12:02:10AM 1 point [-]

So it might be the case that Bayesian updating has some quirky memetic mutation that could lead it to destroy itself if it stopped working. Maybe so-called 'rationalism' is especially bad at absorbing internal contradictions. But this would be a feature of the belief itself-- not a feature of it being a belief about procedure. Many beliefs about procedure are exactly the opposite-- take believing that truth can be taken from the Bible. That procedure is self-justifying and there is no way to dispute it from within the assumptions of the procedure.

Mostly, I just don't think the distinction you are trying to make between "procedural" and "substantive" beliefs holds water. Beliefs about political theory and economics, for example, are almost all procedural beliefs (i.e. the right procedure for making a law or stimulating the economy). What about them would make them immune to labeling problems?

Comment author: hhadzimu 06 June 2009 12:21:18AM 0 points [-]

"Many beliefs about procedure are exactly the opposite-- take believing that truth can be taken from the Bible. That procedure is self-justifying and there is no way to dispute it from within the assumptions of the procedure."

That's my point about rationality - the way I think about it, it would catch its own contradictions. In essence, a rationalist would recognize it if rationalists didn't "win." As a result, committing yourself to rationality doesn't actually commit you to an outcome, as perhaps following a scripture would.

The bigger problem, I believe, is that most professed commitment to a procedure is superficial, and that instead most people simply bend the procedure to a preferred outcome. "The Devil may cite scripture for his purpose." The key, of course, is following the procedure accurately, and this is the community that'll keep you in line if you try to bend procedure to your preferred conclusion.

Comment author: Jack 05 June 2009 11:20:54PM 3 points [-]

Why should we think of beliefs about proper procedure as less prone to reifying identity formation than beliefs about things other than procedures? How are beliefs about the best procedure for reasoning or predicting not beliefs about the state of the world? Specifically, are such beliefs not beliefs about the human brain and how it functions? Aren't we all pretty committed to the view that updating priors is a better way of getting things right than praying for the answer? I don't see why beliefs about procedure aren't just as liable to slip by as unchallenged assumptions as beliefs about political systems are.

Besides, we'd be kidding ourselves if we said that the Less Wrong community has no shared beliefs other than those about procedure. Yeah, a rationalist doesn't have to be an atheist... but there aren't a lot of outspoken evangelicals around these parts. It remains very possible that some or most of us could come to associate other beliefs with the rationalist label, even if the label doesn't explicitly include them right now.

There are lots of reasons to call ourselves rationalists-- but let's try not to dupe ourselves into thinking we're so special that none of the problems with labeling will apply to us.

Comment author: hhadzimu 05 June 2009 11:45:47PM 0 points [-]

I'm inclined to agree on your latter point: looking at the results of the survey, it seems like it would be easy to go from 'rationalist' as a procedural label to 'rationalist' as shorthand for 'atheist male computer programmer using Bayesian rules.' Of course, that's a common bias, and I think this community is as ready as any to fight it.

As for the former, I tried to address that by pointing out that rationalism means that we've already decided that updating priors is more effective than prayer. That said, I have a perhaps idealistic view of rationality, in that I think it's flexible enough to destroy itself, if necessary. I'd like to think that if we learned that our way of reasoning is inferior, we'd readily abandon it. A little too idealistic, perhaps.

That said, I find purely procedural labels less dangerous than substantive ones. You've alluded to the danger of conflating the rationalist label with substantive ones like atheism, but that's a separate danger worth looking out for.

Comment author: hhadzimu 05 June 2009 09:04:08PM 2 points [-]

I think the danger here is far smaller than people are making it out to be. There is a major difference between the label "rationalist" and most other identities as Paul Graham refers to them. The difference is that "rationalist" is a procedural label; most identities are at least partially substantive, using procedural/substantive in the sense that the legal system does.

"Rationalist," which I agree is an inevitable shorthand that emerges when the topic of overcoming bias is discussed frequently, is exclusively a procedural label: such a person is expected to make decisions and seek truth using a certain process. This process includes Bayesian updating of priors based on evidence, etc. However, such a label doesn't commit the rationalist to any particular conclusion ex ante: the rationalist doesn't have to be atheist or theist, or accept any other fact as true and virtually unassailable. He's merely committed to the process of arriving at conclusions.

Other identities are largely substantive. They commit the bearer to certain conclusions about the state of the world. A Christian believes in a god with certain attributes and a certain history of the world. A Communist believes that a certain government system is better than all others. These identities are dangerous: once they commit you to a conclusion, you're unlikely to challenge it with evidence to ensure it is in fact the best one. That's the kind of identity Paul Graham is warning against.

Of course, these labels have procedural components: a Christian would solve a moral dilemma using the Bible; a Communist would solve an economic problem using communist theory. Similarly, rationalism substantively means you've arrived at the conclusion that you're biased and can't trust your gut or your brain the way most people do, but that's the extent of your substantive assumptions.

Since rationalism is a procedural identity rather than a substantive one, I see few of the dangers of using the term "rationalist" freely here.

Comment author: steven0461 16 April 2009 08:06:44PM 4 points [-]

Karma-based explanations don't explain why we saw the same gender imbalance on OB.

Comment author: hhadzimu 16 April 2009 08:39:51PM 3 points [-]

Exposing yourself to any judgments, period, is risky. The OB crowd has perhaps the best commenting culture I've come across: people read previous comments and engage the arguments made there. How many other bloggers are like Robin Hanson and consistently read and reply to comments? As a result, any comment is bound to be read and often responded to by others. There may not have been a point value attached, but judgments were made.

Comment author: Vladimir_Nesov 10 April 2009 09:16:39PM 5 points [-]

What you describe is not a factual mistake, nor does it strike me as a moral error; it is merely an aesthetic judgment. Though it might be a mistake if you are wasting too much attention on it and thus depriving yourself of a superior experience.

Comment author: hhadzimu 10 April 2009 10:04:11PM -1 points [-]

Agreed. I have trouble accepting this as true irrationality; it strikes me as merely a preference. You lose time you could spend listening to song A because of your desire to have the same play count for song B, but that's because you prefer a world where play counts are equal to one where they're unequal but you hear a specific song more. Is that really an irrational preference?

I also agree with VN's disclaimer: the time spent [wasted?] on equalizing play counts could probably be used for something else. But at what point does a preference for a certain aesthetic outcome become irrational? What about someone who prefers a blue shirt to a red one? What about someone who can't enjoy a television show because the living room is messy? Someone who can't enjoy a party because there's an odd number of people attending? Someone who insists on eating the same lunch every day? Some of these are probably indicators of OCD, but those are really just extreme points on a spectrum of aesthetic and similar preferences. At what point do preferences become irrational?

In response to Winning is Hard
Comment author: orthonormal 03 April 2009 06:45:24PM 6 points [-]

It seems that all you're saying is that we need information in order to make good decisions. I really don't think that's a controversial point, or one that particularly needed making here.

In situations where there is not enough information to work from (even given perfect Bayesian updating), of course rationalists can make the wrong decision. But across the spectrum of similar possible decisions (where fugu is replaced with some other dish you've never tried), making rational use of what info you have should result in a positive expectation.

Comment author: hhadzimu 03 April 2009 09:15:06PM 1 point [-]

I have to echo orthonormal: information, if processed without bias [availability bias, for example], should improve our decisions, and getting information is not always easy. I don't see how this raises any questions about the rational process, or, as you put it, a "principled fashion."

"But by what principled fashion should you choose not to eat the fugu?"

This seems like a situation where the simplest expected value calculation would give you the 'right' answer. In this case, the expected value of eating the oysters is 1, while the expected value of eating the fugu is the expected value of eating an unknown dish, which you'd base on your prior experience with unknown dishes offered in restaurants of that type. [I assume you'd expect lower utility in some places than others.] Here, eating the fugu kills you, but that is not a failure of rationality.
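A minimal sketch of that calculation; the probabilities and utilities are invented, since the example doesn't pin them down, and the diner's prior pointedly assigns no weight to "the dish kills you":

```python
# Sketch of the expected-value comparison described above. All
# numbers are invented for illustration.

def expected_value(outcomes):
    """outcomes: iterable of (probability, utility) pairs."""
    return sum(p * u for p, u in outcomes)

ev_oysters = 1.0  # the known option, as stipulated above

# The diner's prior over unknown dishes at restaurants of this type:
ev_fugu = expected_value([
    (0.6, 1.5),  # better than the safe choice
    (0.4, 0.5),  # worse, but edible
])

print(f"oysters: {ev_oysters:.2f}, fugu: {ev_fugu:.2f}")
# fugu: 1.10 > oysters: 1.00, so the diner rationally orders the
# fugu-- and, in this unlucky world, dies. Sound procedure, bad outcome.
```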

In a situation without the constraints of the example, research on fugu would obviously provide the information you need: a web-enabled phone and Google would tell you everything you need to know to make the right call.

Humans actually solve this type of problem all the time, though the stakes are usually smaller. A driver on a road trip may settle for low-quality food [a fast food chain, representing the oysters] in exchange for greater certainty about what he's getting [convenience, uniform quality]. It's simply the best use of the available information.

In response to comment by ciphergoth on Where are we?
Comment author: SoullessAutomaton 02 April 2009 10:51:33PM *  0 points [-]

Post in this thread if you live in the midwestern USA or nearby areas of Canada, ideally roughly within a day's drive of Chicago.

EDIT: For anyone in this area, Penguicon may be a good location for a meetup. It's a mixed sci-fi/open-source/general-geekery convention in the Detroit area, and just might possibly have at least one guest that LW readers would be interested to meet. I probably won't be there this year, though.

Comment author: hhadzimu 02 April 2009 11:37:16PM 1 point [-]

Chicago, IL.

Comment author: ciphergoth 26 March 2009 10:03:13AM *  6 points [-]

If I'm saying why I shouldn't join, either of "I haven't the time" or silence is fine. If I want to say why you shouldn't join, we should set the bar high, so that if I use joining as a cheap shot against you I look bad. "You joined a website with a stupid font" is what people fear, and so that might be what we need to act against.

Incidentally, what timezone are you in and when do you sleep? I'm always a bit surprised to get responses from you in the morning...

Comment author: hhadzimu 26 March 2009 06:44:58PM 5 points [-]

Eliezer Yudkowsky does not sleep. He waits.

Comment author: AnneC 09 March 2009 04:55:08PM 10 points [-]

Another thing to consider is the fact that being tired, or "half-asleep", or in that twilight state one might manage to maintain when getting up to use the restroom or fetch water at night, is different from being in a state of normal waking consciousness.

Even if the skeptics attempting to spend the night in the "haunted" house don't plan on actually sleeping, unless they're already night-shift workers or have otherwise pre-configured their body clocks so that they'll be awake all night, they are likely to at some point during the night start suffering impaired judgment. I suspect that the nearer a person gets physiologically to a state where their brain might initiate dreaming, the more difficult it becomes to maintain rationality.

I've noticed that when I get up at night after having been asleep, even in my own familiar apartment, I have a type and level of wariness that is not present during my normal waking conscious state. E.g., I find myself reluctant to stare into mirrors or peer behind the shower curtain, or look out the window, and I also find myself practically running back to bed after getting up because of an unnerving feeling that something might "get" me and that somehow being under the covers will make me "safe".

Having done a fair bit of brainhacking in my life, I am now at the point where, when I'm in this state, I can recognize it as "that nighttime thing" and not take it too seriously, but it nevertheless continues to affect my behavior a little. I am now curious whether it would be worth getting up in the middle of the night and forcing myself to do all the things that make me jumpy, and whether that is the sort of pre-exercise that could help someone stay in a "haunted" house overnight (presuming for the moment that we are not talking about a house that harbors escaped murderers or rabid squirrels).

Comment author: hhadzimu 26 March 2009 01:32:09AM 5 points [-]

You're probably right, but this works around the question rather than answering it. In law school, they'd accuse you of fighting the hypothetical. You're in the least convenient possible world here: you're wide awake, 100%, for the entire relevant duration.

http://lesswrong.com/lw/2k/the_least_convenient_possible_world/
