In reply to Eliezer's Contrarian Status Catch-22 & Sufficiently Advanced Sanity: I accuse Eliezer of encountering a piece of Advanced Wisdom.

Unreason is something we should fight against. Witch burnings, creationism & homeopathy are all things that must rightly be fought if society is to advance. But, more subtly, I think reason is in some ways also a dangerous phenomenon that should be guarded against. I am arguing not against the specific process of reasoning itself, but against the attitude that instinctively reaches for reason as the first tool of choice when confronting a problem. Scott Aaronson called this approach "bullet swallowing" when he tried to explain why he was so uncomfortable with it. Jane Galt also rails against reason when explaining why she does not support gay marriage.

The most recent financial crisis is another example of what happens when reason is allowed to dominate. Cutting away all the foggy noise & conspiracy theories, the root cause of the financial crisis was that men who believed a little too much in reason drifted away from reality, and reality quickly reminded them of it. The problem was not that the models were wrong; it was that what they were trying to model was unmodelable. Nassim Nicholas Taleb's The Black Swan explains this far better than I can.

A clear-eyed look back on history reveals many other similar events in which the banner of reason was used to champion disastrous causes: the French Revolution, Communism, Free Love Communes, Social Darwinism & Libertarianism (I am not talking about affective death spirals here). Now, one might argue at this point that those were all specific examples of bad reasoning, and that it is possible to carefully strip away the downsides of reason with diligent practice. But I don't believe it. Fundamentally, I believe that the world is unreasonable. It is, at its core, not amenable to reason. Better technique may push the failure point back just a bit further, but it will never get rid of it.

So what should replace reason? I nominate accepting the primacy of evidence over reason. Reason is still used, but only reluctantly, to form the minimum span possible between evidence & beliefs. From my CS perspective, the analogy is the shift from algorithm-centered computation to data-centered computation. Rather than trying to create elaborate models that purport to reflect reality, strive to be as model-free as possible: shut up and gather data.
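To make the analogy concrete, here is a minimal toy sketch of the contrast; the data, numbers, and function names are made up purely for illustration and are not drawn from anything in particular:

```python
# A toy contrast between a model-centered predictor and a data-centered one.

def fit_line(xs, ys):
    # Model-centered: commit to a parametric story (a straight line) and
    # compress all the evidence into two fitted parameters.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return lambda x: my + slope * (x - mx)

def nearest_neighbor(xs, ys):
    # Data-centered: keep the raw observations and answer a query by looking
    # up the closest piece of evidence, with no fitted model at all.
    pairs = list(zip(xs, ys))
    return lambda x: min(pairs, key=lambda p: abs(p[0] - x))[1]

xs = [1.0, 2.0, 3.0, 10.0]
ys = [1.1, 1.9, 3.2, 2.0]             # the last point breaks the tidy linear story

print(fit_line(xs, ys)(9.0))          # answers from its two fitted parameters
print(nearest_neighbor(xs, ys)(9.0))  # just returns the nearest evidence (2.0)
```

The model-centered version summarizes the evidence and keeps answering confidently from that summary even where the data contradict it; the data-centered version refuses to say more than its nearest piece of evidence supports.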

If the reason-based approach is a towering skyscraper, an evidence-based approach would be an adobe hut. Reasoning is sexy and new, but also powerful: it can do a lot of things and do them a lot better. The evidence-based approach, on the other hand, does just enough to matter and very little more. The evidence-based approach is not truth-seeking in the way the reason-based approach is. A declaration of truth built on a large pile of premises is worse than a statement of ignorance. This, I think, is what Scott Aaronson was referring to when he said "What you've forced me to realize, Eliezer, and I thank you for this: What I'm uncomfortable with is not the many-worlds interpretation itself, it's the air of satisfaction that often comes with it."

42 comments

If reason told you to jump off a cliff, would you do it?

Yup, because there are probably some pretty damned good reasons -- no pun intended -- for jumping off the cliff, like that I have a parachute strapped to my back and an impending explosion will kill all people still on the cliff.

If there weren't good reasons that were similarly persuasive, then it wouldn't be reasonable.

Yup, because there are probably some pretty damned good reasons -- no pun intended -- for jumping off the cliff, like that I have a parachute strapped to my back and an impending explosion will kill all people still on the cliff.

That's a good example. I was going for "hang glider, chased by a guy wielding a claymore".

Would "bungee cords firmly attached to body, cutie of the interesting sex looking on anxiously" count as reason or as misguided social signaling?

You missed the rather important "cliff has substantial overhang".

Depends on the utility you get from bungee jumping and the probability of the cords breaking.

Also, on your relative social position to the cutie, your general charm and good looks, etc .. - in other words, your perceived odds of making it with her if you don't jump off the cliff.
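Spelling out the expected-utility point a couple of comments up, here is a toy sketch; the break probability and the utilities are entirely made-up numbers, chosen only to show the form of the calculation:

```python
# Toy expected-utility calculation for the bungee decision (illustrative numbers only).
def expected_utility(p_break, u_thrill, u_death, u_stay):
    jump = p_break * u_death + (1 - p_break) * u_thrill
    return jump, u_stay

jump, stay = expected_utility(p_break=0.001, u_thrill=10, u_death=-10_000, u_stay=0)
print("jump" if jump > stay else "stay", jump, stay)
# With these numbers the expected utility of jumping is about -0.01, so even a
# tiny chance of the cords breaking dominates the decision.
```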

You're employing the classical division of reason versus empiricism. While I have criticized this forum's emphasis on reason before, your critique seems to be off the mark. This site is devoted to rationality, not pure reason as such, and rationality uses whatever mix of reason and empiricism achieves the best results. You're making the assumption that Eliezer relies only on reason and not enough on evidence, but you (ironically, given your position) aren't providing any evidence to back up the claim.

Suppose there is an optimal mix of reason and empiricism for a specific domain. If you suspect somebody's mix of the two seems to be wrong, you need to first show that the optimal mix for that domain is what you think it to be, and then that your opponent is leaning too much in one direction. You can't just say "my opponent is wrong" and then parade a long list of failures which you believe were caused by too much reason. That's a fully general counterargument.

"Reason" does not appear to be the right term here. "Cleverness" comes to mind as a better substitute, though I suspect there are better terms. The banking crisis occurred because people thought they were too clever. The various problematic causes you mention all appear to overestimate their own cleverness. It's also unclear to me what it would mean for them to rely on reason less, and how this would cause their worldview to better match reality.

I think this might be best phrased as an objection to an overreliance on clever theories and a tendency to eschew evidence in favor of cleverness. Insofar as that is your point, it is an excellent if not novel one. But the way this is phrased is a bit more antagonistic than I think is merited, and seems to attack a type of thought rather than a specific error that a type of thought is prone to.

If my semantic distinction does not make sense, let me just explain my connotations. When I hear "reason," I tend to think of it much like "rational": one definitionally cannot make a mistake through being too rational, in that rationality is precisely the thing that, the more of it you have, the fewer mistakes you make. "Cleverness," on the other hand, suggests the same intellectual sleight-of-hand without any connotation of accuracy. The sentence "Bob lost all his money because he was too reasonable" does not really make sense, whereas "Bob lost all of his money because he was too clever" does. A good example of being too clever would be the demise of Vizzini from The Princess Bride.

Yes.

The things Shalmanese is labeling "reason" and "evidence" seem to closely correspond to what have previously been called the inside view and the outside view, respectively (both of which are modes of reasoning, under the more common definition).

Yes, that was going to be my comment. The outside view also uses "reason" but with wider and shallower chains of reasoning. The inside view is more fragile, requiring more assumptions and longer chains of reasoning.

Thanks for that comment, it made something click for me.

"cleverness" comes to mind as a better substitute

Or "Hubris". In the examples, the people go wrong not because they are using reason and they should not use reason, but because they falsely imagine they are capable of using reason sufficiently to deal with the particular issue.

Voted up, because it so well voices the thoughts I had when I read the OP in my RSS reader.

I'd add that "reason" is here taken as something like "abstract deduction."

Far from reason and evidence being in conflict on occasion, the former strictly requires the latter.

Indeed, it is precisely when we start thinking of reason as disconnected from evidence that it turns into cleverness. Reason is attached to no particular ritual of cognition but rather to Winning.

Jane Galt also rails against reason when explaining why she does not support gay marriage.

Seems like a fairly blatant category error to even mention that in this context.

I don't know about that, but while "[McArdle] does not support gay marriage" is technically true, it's a bit misleading. McArdle "does not support one side or the other," but only mentioning that she doesn't support one side sort of implies that she supports the other.

Why? I think that the OP meant that people blindly support gay marriage when thinking purely in terms of abstract logic, instead of looking at the empirical world, where there are marginal cases where gay marriage may make things worse, as Jane Galt discusses.

You're confusing "reason" with inappropriate confidence in models and formalism.

The most recent financial crisis is another example of what happens when reason is allowed to dominate. Cutting away all the foggy noise & conspiracy theories, the root cause of the financial crisis was that men who believed a little too much in reason drifted away from reality, and reality quickly reminded them of it.

How can you say reason failed them, when they made so much money?

Made and continue to make. One wonders if they might not actually be rather prime examples of Winning in a practical though perhaps corrupted sense.

In reply to Eliezer's Contrarian Status Catch-22 & Sufficiently Advanced Sanity: I accuse Eliezer of encountering a piece of Advanced Wisdom.

This post is indistinguishable from 'Advanced Wisdom'. I accuse you of not really understanding that of which you speak. Even if you are, in fact, advanced and wise, you will need to do some background reading in order to understand the posts here, so that you may enlighten their authors.

By way of answer to your rhetorical question: Yes, I would jump off a cliff if reason told me to.

The problem was not that the models were wrong; it was that what they were trying to model was unmodelable. Nassim Nicholas Taleb's The Black Swan explains this far better than I can.

My reading of Taleb's book was the opposite: he says quite explicitly that the finance models behind the crash were wrong, being based on Gaussian distributions, which did not fit the phenomena being modelled. He contrasts the Gaussian with power-law distributions, although I admit I don't know if he has done any mathematical modelling of the finance markets using those.
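To make the Gaussian-versus-power-law contrast concrete, here is a rough back-of-envelope sketch; it is my own illustration, not Taleb's numbers, and the tail exponent is arbitrary rather than an estimate for any real market:

```python
# How quickly tail probabilities fall off under a Gaussian vs a power law.
import math

def gaussian_tail(x):
    # P(Z > x) for a standard normal variable
    return 0.5 * math.erfc(x / math.sqrt(2))

def pareto_tail(x, alpha=2.0):
    # P(X > x) for a Pareto variable with minimum 1 and tail exponent alpha
    return x ** (-alpha) if x >= 1 else 1.0

for sigmas in (3, 5, 10, 20):
    print(sigmas, gaussian_tail(sigmas), pareto_tail(sigmas))
# The Gaussian tail collapses toward zero (a "20-sigma" day is effectively
# impossible under that model), while the power-law tail still assigns such
# an event real probability.
```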

I don't know what exactly he has done, modeling-wise, but he seems to have made a hell of a lot of money during the financial collapse. Whether this is long-term better modeling than the kind he criticizes or largely due to sampling coincidences, remains to be seen.

In many situations there is a safe "no action" default, which you can take as a matter of epistemic hygiene. Then overconfidence is dangerous and doubt is a virtue. In other cases, this heuristic is not as commendable.

I think you really meant to argue that (outside of mathematics) each link in a chain of inference has a small chance of going wrong, so longer chains are exponentially less likely to be correct than shorter ones. This is well known around here, though hubris does get the better of us sometimes!
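As a minimal arithmetic illustration of that point, assuming a made-up per-step reliability and independent errors:

```python
# If each link in a chain of inference is right with probability p, an
# n-link chain is right with probability p**n (assuming independent errors).
p = 0.95
for n in (1, 5, 10, 20):
    print(n, round(p ** n, 3))
# 1 0.95, 5 0.774, 10 0.599, 20 0.358 -- long chains decay fast
```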

Read through the sequences, dammit.

Indeed. Maybe start with Rationality is Systematized Winning. Giving a few examples of people failing at rationality is not an effective criticism of rationality.

Taleb's books are interesting and he makes quite a few good points but the man has a lot of flaws. Eric Falkenstein sums up the case against him quite well.

I maintain that, of all the paradigm-altering and life-improving concepts I've gained from the sequences, the one derived from that post was the most important.

Literally nothing else we do here makes sense without it.

And I'm voting this up, for the same reasons.

I'm voting this down because you so often respond to people with something like "Read the sequences" without deigning to mention any specific problems. That's high-handed and cult-leader-like. It's also unhelpful to someone even if they take your advice and read the sequences, because they're not likely to figure out the connections between them and what they said in their post.

At the very least, recommending one specific post to read would be a great improvement.

Downvoted the parent (I have no doubt someone will rectify the temporary negative total). I hate high-handed behaviour, and so crying wolf over cult-like high-handedness is an instrumental detriment to me.

It's also unhelpful to someone even if they take your advice and read the sequences, because they're not terribly likely to figure out the connections between them and what they said in their post.

I would have upvoted if the expressed reasoning was "Downvoted for not providing the necessary reference to the sequence that would make the confusion obvious. It would not be hard for you to find the link matthew found and leaving it off is presumptive and makes you look kinda like a tool."

(ETA: Good change. Vote now neutral.)

I upvoted EY's comment, but very much agree with your comment. I am still plugging through the sequences and appreciate the "This was answered, dummy" response, but I prefer "This was answered over here, dummy."

In terms of actually increasing rationality in other people, getting short with them is probably counter-productive. But I get the frustration of dealing with noobs like me. I prefer the harshness because the feedback is clearer but I imagine that not everyone is like me.

The problem was not that the models were wrong; it was that what they were trying to model was unmodelable.

[...]

Fundamentally, I believe that the world is unreasonable. It is, at its core, not amenable to reason. Better technique may push the failure point back just a bit further, but it will never get rid of it.

Since you mentioned Scott Aaronson, you might want to have a look at this post.

Maybe someone could clear this up a bit for me, but...

I had always thought that reason was a method of formalized thinking that weighed evidence and logic in coming to the most likely or most preferable conclusion.

Whereas...

Rationality is reasoning within a formal system, and following the consequences of the beliefs of that formal system.

I realize that my attempt to define these two terms is probably flawed to a degree, but I am still new at much of the terminology.

Edit: is there some form of coding by which I might use italics, bold, underlining, quoting, links, etc.? I see that others have used these, yet the norms for these styles don't seem to work here.

Click the "help" link that appears when you compose a comment to get a style guide. It'll tell you how to do bold, italics, links, etc.

Click the "help" link that appears when you compose a comment to get a style guide. It'll tell you how to do bold, italics, links, etc.

Like this?

Thanks much. That will help a lot, as I tend to be a stickler for style. (I also appreciate the heads-up on my rather embarrassing reliance upon the word-selection software that caused a few gaffes earlier.)

Thank you (much).

I do have to say that, for one who claims to have a hard time writing, you are as prolific as one who claims to have no problem writing (though what you mean by this may come down to how you define the goals of the writing).

The differences between Epistemic and Instrumental Rationality helped a great deal.

In my discussions with Steve Omohundro about Rationality, I get the feeling that he tends to think of rationality in the Instrumental sense. So, I have had a tendency to lean more toward this definition, even though I have often felt that I should be more inclusive of the epistemic type.

This leaves the question of defining Reason. From what I have read, reason is just a toolbox (with many tools inside) that allows us to make rational decisions.

Is that correct? (note: I have not yet read all of the "Map and the Territory". That's a lotta words in that link, and it will take me some time to get through them all)

If reason told me to jump off a cliff would I?

Interesting question. I'm tempted to say yes. If by jumping off a cliff, I can save the lives of 20 other people, then yes. I certainly should. I'm not going to say that I would; I'm as much a victim of akrasia as anyone.

If by jumping off a cliff, I can save the lives of 20 other people, then yes. I certainly should.

Really? Well, so long as you don't try to tell me I should jump off a cliff to save 20 other people then by all means go ahead. The moment you tried to extend that norm beyond yourself I would reject your attempt.

It's quite hard to evaluate this post with any precision, but I find it reasonable to assign 23.432774% probability that it is a genuine piece of Advanced Wisdom. Accordingly I choose +1, with an estimated expected benefit of 235.74637 utilons plus 54.747746 warm fuzzies.


This problem-solving by brute-force data acquisition might be the new direction or perspective that science is waiting to be ready for. From what I've witnessed here on Less Wrong (this, for example), despite intelligent discussion, reasoning on complex issues rarely gets one closer to being correct; either one knew and knows, or they didn't and don't -- presumably this knowledge comes from having or not having a set of experiences and the right data (evidence).

On the other hand ... I'm still an advocate of reasoning to, just as you said, bridge that last gap between what we know and what we need to know. Regarding the necessary use of reasoning, I recall this post about using reasoning only when necessary, and then minimally and optimally.

Later edit: Here I am using the term 'reasoning' in a narrower sense, as 'making up theories': the traditional/idealistic scientific approach of building hypotheses around the data instead of just letting the data speak for itself. I inferred this use from the original post.