Less Wrong is a community blog devoted to refining the art of human rationality.

In response to Rational Home Buying
Comment author: Shalmanese 27 August 2011 09:32:55AM 12 points [-]

Another failure of rationality is not understanding the difference between investment goods and consumption goods. A $745,000 house may cost more to buy than a $710,000 house, but you're also likely to be able to sell it for more. The "true" cost is not $35K; it's a complex calculation involving marginal mortgage payments, expected rises or falls in housing prices, and cash flow considerations.
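The point can be sketched with a toy calculation. All figures below are hypothetical (a 5% mortgage rate, 10 years of ownership, 2% annual appreciation), and it ignores fees, taxes, amortization, and discounting:

```python
def marginal_cost(extra_price, rate, years, appreciation):
    """Rough marginal cost of the pricier house: extra interest paid on
    the additional principal, minus the extra resale value recovered."""
    extra_interest = extra_price * rate * years
    extra_resale = extra_price * (1 + appreciation) ** years - extra_price
    return extra_interest - extra_resale

# $35K sticker difference, 5% rate, 10 years, 2% appreciation
cost = marginal_cost(35_000, 0.05, 10, 0.02)
print(round(cost))  # substantially less than the $35K sticker difference
```

Under these assumptions the true marginal cost comes out well under $35K, since part of the extra purchase price is recovered at sale.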

Comment author: NancyLebovitz 08 September 2010 06:36:15PM 2 points [-]

Selection pressure might often be even weaker than a 3% fitness advantage having a 6% chance of becoming universal in the gene pool, or at least it's more complicated-- a lot of changes don't offer a stable advantage over long periods.
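The 3%-advantage/6%-fixation figure matches Haldane's classic approximation: a single new beneficial allele with selection coefficient s fixes with probability roughly 2s. A quick sketch, using Kimura's diffusion approximation (the population size N here is a hypothetical parameter):

```python
import math

def fixation_prob(s, N):
    """Kimura's diffusion approximation for the fixation probability of a
    single new mutant in a population of N diploids (initial freq 1/(2N))."""
    if s == 0:
        return 1 / (2 * N)  # neutral case: pure drift
    return (1 - math.exp(-2 * s)) / (1 - math.exp(-4 * N * s))

s = 0.03
print(2 * s)                               # Haldane's approximation: 0.06
print(round(fixation_prob(s, 10_000), 4))  # close to 0.06 for large N
```

So even a 3% advantage is lost to drift about 94% of the time, which is the sense in which selection is weak.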


I think natural selection and human intelligence at this point can't really be compared for strength. Each is doing things that the other can't-- afaik, we don't know how to deliberately create organisms which can outcompete their wild conspecifics. (Or is it just that there's no reason to try and/or we have too much sense to do the experiments?)

And we certainly don't know how to deliberately design a creature which could thrive in the wild, though some animals which have been selectively bred for human purposes do well as ferals.

This point may be a nitpick since it doesn't address how far human intelligence can go.


Another example of attribution error: Why would Gimli think that Galadriel is beautiful?


Eliezer made a very interesting claim-- that current hardware is sufficient for AI. Details?

Comment author: Shalmanese 09 September 2010 04:14:02AM 1 point [-]

"Another example of attribution error: Why would Gimli think that Galadriel is beautiful?"

A waist:hip:thigh ratio between 0.6 & 0.8 and a highly symmetric face.

Comment author: Shalmanese 02 February 2010 09:47:05AM 21 points [-]

"In the matter of reforming things, as distinct from deforming them, there is one plain and simple principle; a principle which will probably be called a paradox. There exists in such a case a certain institution or law; let us say, for the sake of simplicity, a fence or gate erected across a road. The more modern type of reformer goes gaily up to it and says, "I don't see the use of this; let us clear it away." To which the more intelligent type of reformer will do well to answer: "If you don't see the use of it, I certainly won't let you clear it away. Go away and think. Then, when you can come back and tell me that you do see the use of it, I may allow you to destroy it." GK Chesterton

In response to Two Truths and a Lie
Comment author: Shalmanese 23 December 2009 04:06:39PM 1 point [-]

OK, so my favorite man-with-a-hammer du jour is the "everyone does everything for selfish reasons" view of the world. If you give money to charity, you do it for the fuzzy feeling, not because you are altruistic.

What would you propose as the three factual claims to test this? I'm having a hard time figuring any that would be a useful discriminant.

Thinking about this a bit, it seems most useful to assert negative factual claims, i.e. "X never happens".

If reason told you to jump off a cliff, would you do it?

-12 Shalmanese 21 December 2009 03:54AM

In reply to Eliezer's Contrarian Status Catch 22 & Sufficiently Advanced Sanity. I accuse Eliezer of encountering a piece of Advanced Wisdom.

Unreason is something that we should fight against. Witch burnings, creationism & homeopathy are all things which should rightly be defended against for society to advance. But, more subtly, I think reason is also, in some ways, a dangerous phenomenon that should be guarded against. I am arguing not against the specific process of reasoning itself, but against the attitude which instinctively reaches for reason as the first tool of choice when confronting a problem. Scott Aaronson called this approach bullet swallowing when he tried to explain why he was so uncomfortable with it. Jane Galt also rails against reason when explaining why she does not support gay marriage.

Comment author: LauraABJ 20 December 2009 05:38:52PM *  2 points [-]

The advanced wisdom you describe is basically just experiential knowledge that has not been well described or quantified. It seems like bullshit because: (1) it's not described specifically enough to be applicable (what exactly does 'be yourself' mean?), and (2) scope insensitivity of anecdotes (there may well be a vast sea of evidence for how 'being yourself' is useful, but there is certainly some evidence as to how it can be damaging, and the two are not quantified for comparison). In general we should give such experiential anecdotes some weight, especially when many people describe similar experiences, but we should not assume such reports are representative of anything more than the singular experiences of individuals.

Comment author: Shalmanese 21 December 2009 03:25:49AM 1 point [-]

No, it's not (only) experiential knowledge. It's about the basic framework through which you view the world. More experience isn't going to help if you keep on fitting it within the same, inaccurate model.

Comment author: AngryParsley 20 December 2009 11:05:21AM *  0 points [-]

I think that statement is true only for time-constrained arguments. It takes time to research and understand the prerequisites to any "advanced wisdom," so to speak. Likewise, it takes time to understand the flaws in untrue things, and to notice your own biases. Finally, even if you understand the evidence and arguments leading up to some great insight, it takes time to fully understand the ramifications of the idea. If you're time-constrained like in your example, your past self simply can't process everything fast enough and the absurdity heuristic wins.

The one positive thing this law has led me to is a much higher tolerance for bullshit. I'm no longer so quick to dismiss ideas which, to me, seem like obvious bullshit.

This is where I have to disagree with you. There are plenty of ways to quickly and accurately rule out most incorrect beliefs without accidentally ruling out correct beliefs. Many of them are mentioned on this site.

Comment author: Shalmanese 21 December 2009 03:25:01AM 2 points [-]

If you think Christians are Christians (to pick an arbitrary example) because of time constraints, then you're in for a rude shock.

Comment author: loqi 20 December 2009 08:11:33PM 4 points [-]

I would be extremely suspicious of any new "wisdom" I've acquired in the past 10 years that I found myself unable to explain to my past self in the course of six hours. Any sufficiently incommunicable wisdom is, in fact, bullshit.

Comment author: Shalmanese 20 December 2009 08:39:30PM 3 points [-]

I'd be extremely suspicious that I'd stopped maturing if my self of 10 years from now could get along perfectly with my self of today. Take an informal poll of the people around you; I'll bet the vast majority of them would regard their past selves as frustratingly irritating because of all the missing advanced wisdom.

Comment author: Shalmanese 20 December 2009 06:46:46PM 2 points [-]

I think the difference here is that science is still operating under the same conceptual framework as it was 100 years ago. As a result, scientists from different eras can put themselves into each other's heads and come to mutual agreement.

Sufficiently advanced wisdom to me has always been a challenging of the very framing of the problem itself.

Comment author: Shalmanese 20 December 2009 02:15:15PM (in reply to AngryParsley, above)

Note: The converse is not true. Not all bullshit looks like advanced wisdom.
