Comment author: nazgulnarsil 20 March 2009 10:51:53AM -2 points [-]

I don't see how individualism can beat out collectivism as long as groups = more power. For individualism to work, each person would have to wield power equal to that of any group.

Comment author: Nick_Novitski 20 March 2009 04:40:30PM 1 point [-]

One view doesn't need to "beat out" the other; for each societal state, there's a corresponding equilibrium between individual- and group-think (or rather, group-think for groups of varying sizes), as each person weighs the costs and benefits of adherence. In a world of individuals, an organized and specialized group of any size "= more power." Witness sedentary farmers displacing hunter-gatherers. On the other hand, in a world of groups, a rogue individualistic prisoner's-dilemma defector is king. Witness sociopaths in corporate structures, or the plots of far too many Star Trek episodes.

The balance of power can shift as individualism becomes a better choice, its risks lessening and its rewards increasing, whether due to culture, technology, or extensive debates on websites.
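The "rogue defector is king" claim can be sketched with a toy simulation. The payoff numbers below are the standard textbook prisoner's-dilemma values, not anything from the comment; they just illustrate why a lone defector prospers in a world of cooperators:

```python
# Standard prisoner's-dilemma payoffs (illustrative numbers, T > R > P > S):
T, R, P, S = 5, 3, 1, 0  # Temptation, Reward, Punishment, Sucker

def payoff(me, other):
    """Payoff to `me` for one interaction."""
    if me == "C" and other == "C":
        return R
    if me == "C" and other == "D":
        return S
    if me == "D" and other == "C":
        return T
    return P

# In a world of cooperators, the lone defector comes out ahead:
world = ["C"] * 99
cooperator_score = sum(payoff("C", x) for x in world)
defector_score = sum(payoff("D", x) for x in world)
print(cooperator_score, defector_score)  # 297 495
```

The equilibrium point in the comment is where neither switching to defection nor switching to cooperation improves an individual's expected score.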

Comment author: Annoyance 20 March 2009 02:23:15PM -1 points [-]

"Except that we are free to adopt any version of rationality that wins."

There's only one kind of rationality.

Comment author: Nick_Novitski 20 March 2009 04:21:38PM 4 points [-]

I agree, but that one kind is able to determine an optimal response in any universe except one where no observable event can ever be reliably statistically linked to any other, which seems like a small subset, and not one we're likely to encounter.

Certainly, there are any number of world-states or day-to-day situations where a fully rigorous, sceptical, rational, and therefore lengthy investigation would be a sub-optimal response. Instinct works quickly, and if it works well enough, then it's the best response. But obviously, instinct cannot analyze itself and determine whether and in what cases it works "well enough," what factors contribute to its so working, etc., etc.

Passing the problem of a jammed gun to the Rationality-Function might return the response: "If the gun doesn't fire, 90% of the time pulling the lever action will solve the problem. The other 10% of the time, the gun will blow up in your hand, leading to death. However, determining with reasonable certainty which type of problem you're experiencing, in the middle of a firefight, will lead to death 90% of the time. Therefore, train your Instinct-Function to pull the lever action 100% of the time, and rely on it rather than me when seconds count."
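The expected-survival arithmetic behind that recommendation can be made explicit. The probabilities below are the hypothetical ones from the example, and "diagnosing succeeds" is assumed to mean you then clear the jam safely:

```python
# Hypothetical numbers from the gun-jam example above.
p_simple = 0.9           # jam is the kind the lever action fixes
p_explode = 0.1          # jam is the kind where the gun blows up
p_die_diagnosing = 0.9   # stopping to diagnose mid-firefight is fatal

# Strategy A: always rack the lever immediately.
survive_always_rack = p_simple * 1.0 + p_explode * 0.0

# Strategy B: stop and diagnose first (assume a correct diagnosis
# lets you handle either jam safely).
survive_diagnose = (1 - p_die_diagnosing) * 1.0

print(survive_always_rack, survive_diagnose)
```

Under these numbers, blindly racking the lever yields a 90% survival rate versus roughly 10% for diagnosing, which is why the Rationality-Function delegates to instinct here.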

Does this sound like what you mean by a "beneficial irrationality"?

Also: I propose that what seems truly beneficial, seems both true and beneficial, and what seems beneficial to the highest degree, seems right. To me, these assertions appear uncontroversial, but you seem to disagree. What about them bothers you, and when will we get to see your article?

Comment author: Annoyance 20 March 2009 02:17:15PM 6 points [-]

As the old joke says: What do you mean 'we', white man?

The real reason ostensibly smart people can't seem to cooperate is that most of them have no experience with reaching actual conclusions. We train people to make whatever position they espouse look good, not to choose positions well.

Comment author: Nick_Novitski 20 March 2009 03:20:29PM 2 points [-]

What makes a position well-chosen, or more likely to assist in reaching actual conclusions?

Comment author: bentarm 13 March 2009 03:48:46AM 2 points [-]

As a previous poster has said, the absurdity heuristic works very well indeed - if something seems absurd to me, I need a lot of evidence before I'll believe it. As Hume said:

"no testimony is sufficient to establish a miracle, unless the testimony be of such a kind, that its falsehood would be more miraculous than the fact which it endeavors to establish."

If someone claims that a talking snake is the reason for every bad thing that anyone has ever done, they're going to have to provide some evidence that this is the case. If they claim that people are related to monkeys (which seems inherently less absurd to me, but I'm probably biased by the fact that it's true), then they're also going to have to provide some evidence, and whatever the extraordinary claim, the evidence is going to have to be enough to shift my belief from "that's absurd" to "oh, I guess that's true then".
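That shift from "that's absurd" to "oh, I guess that's true then" can be sketched in Bayesian terms. The prior and likelihood ratio below are made-up illustrative numbers, not anything from the comment:

```python
# A claim I find absurd starts with a very low prior probability.
prior = 0.001

# Suppose each independent piece of testimony is 20x likelier
# to be observed if the claim is true than if it's false.
likelihood_ratio = 20.0

# Count how many such pieces it takes to push belief past 50%.
posterior_odds = prior / (1 - prior)
pieces = 0
while posterior_odds / (1 + posterior_odds) < 0.5:
    posterior_odds *= likelihood_ratio
    pieces += 1

print(pieces)  # 3
```

The lower the prior (the more absurd the claim), the more such evidence is required, which is the absurdity heuristic working as intended.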

On the other hand, having written that, I guess it's more likely that the Absurdity Heuristic is something more specific: the tendency to stick our fingers in our ears, say "that's absurd, la la la," and refuse to listen to any evidence to the contrary. I suppose this is a heuristic that people might use, and it might be useful (in ruling out hypotheses which aren't worth spending time falsifying), but as you say, it does have its dangers. It's not clear how to get the benefits of ruling out hypotheses which are simply absurd while avoiding the danger of ruling out a hypothesis which is absurd but true, but the heuristic still has its uses (perhaps some threshold of absurdity, some "suspension of absurdity" for certain types of proposition?).

Comment author: Nick_Novitski 13 March 2009 06:02:38PM 2 points [-]

I think that my bias towards our being related to monkeys is due to the meanings I invest in "monkey" and "human" as not being greatly dissimilar.

On the other hand, if I had already accepted the existence and human-exclusiveness of a soul, and/or a supernatural account of the world's origin that afforded special primacy to humans as distinct from animals, then clearly I would think relations that crossed these distinct boundaries of type were too absurd to consider.

Also, another limitation on the heuristic might be, as you suggest, weighing the value of the time that it would take to investigate the proposition being examined; I'm more likely to pause and engage in a discussion of my beliefs when I'm relaxing in my leather armchair with a snifter of brandy than while I'm changing trains on the way to work.

Comment author: billswift 13 March 2009 04:37:45AM *  2 points [-]

There is literally no way anyone is going to be able to do this within the limitations of a blog - if you can get through Michael Martin's "Atheism: A Philosophical Justification" you'll know the answer. Dawkins's "The God Belief" is shorter and easier, because he uses evidence to shorten the strictly logical disproofs philosophers use. An older and less complete, but more readable, philosophical disproof is George H. Smith's "Atheism: The Case Against God". There are others, but of the ones I've read these two are the best and the most accessible, respectively.

Comment author: Nick_Novitski 13 March 2009 05:48:06PM 2 points [-]

"The God Belief"? Is that a freudian slip?

Comment author: [deleted] 04 March 2009 08:33:33PM 4 points [-]

To my way of thinking, "rationalist" has a certain stink to it, it has connotations of people sitting around arguing about arguing, writing pages of tedious probability math using "prior probabilities" they pulled out of their asses.

In one sense, being a rationalist just means that you try to be rational. But it seems like a stupid thing to wear on your sleeve, because everybody tries to be rational.

There's a sense in which objectivism is just the belief that reality is mind-independent. But I don't go around calling myself an objectivist either.

In response to comment by [deleted] on Tell Your Rationalist Origin Story
Comment author: Nick_Novitski 13 March 2009 05:27:38PM 2 points [-]

Not everyone tries to be rational. Some people despise rationality because of the same stink you attribute to it, or because of others. To them it might connote atheism, or linking themselves to low-status entities like "the man" or "the sheeple."

A rational person is someone who applies rationality. A rationalist is someone who advocates the application of rationality, just as a racist is someone who argues the fundamental importance of racial status and history, or a "homosexualist" is someone who (purportedly) wants to make homosexuality part of all our lives.

There's a dangerous potential to be confused between (for example) "objectivity" (the belief you mention) and "objectivism" (membership in the low-status group you mention).

Comment author: Nick_Novitski 14 February 2009 04:17:55PM 1 point [-]

Could we argue that forced "combat" with the opposite gender is good training for negotiating cooperation with hostile parties towards futures higher in our preference ranking?

Err, and that such training is valuable.
