The dangers of decompartmentalising toxic waste have been covered here before: Phil Goetz's classic Reason as memetic immune disorder. Vladimir Nesov hypothesises that this is why humans compartmentalise.
In the skepticsphere, decompartmentalising stupidity is considered the best hypothesis to explain the Salem hypothesis: that if a creationist touts scientific expertise in a supporter, said supporter is likely an engineer. But engineers in general are notorious for this sort of thing.
I wonder whether the Salem hypothesis -- more precisely, the fact that the Salem hypothesis is interesting -- is largely a base rate fallacy. If there are a lot more engineers than, e.g., physicists (which I think there are), and if creationists will claim "scientific expertise" for anyone doing anything even vaguely sciencey, then even if there's no interaction at all between domain of expertise and susceptibility to creationism most "scientific experts" who are creationists will be engineers, just because most "scientific experts" will be engineers. (My impression is that a better version of the Salem hypothesis would say "If a creationist touts scientific expertise in a supporter, said supporter is likely an engineer, computer person, or medic" -- and now take a look at the graph at http://www.intuitor.com/physics/ScienceCareers.php .)
Is there more going on than this? Maybe. It's possible, e.g., that the cleverest scientific-ish people gravitate to fields other than those and are less likely to be creationists. Or that something about the kind of problem-solving engineers have to do fits somehow with creationism. (I've heard a similar explanation proposed for an alleged prevalence of engineers among terrorists.) Or something. But I'm not at all sure that it's not just a matter of base rates.
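To make the base-rate point concrete, here's a minimal sketch in Python. All the headcounts and the creationism rate are invented purely for illustration; only the shape of the argument matters.

```python
# Toy model of the base-rate point: assume susceptibility to creationism is
# IDENTICAL across fields (no interaction at all). The headcounts below are
# invented for illustration, not real statistics.
experts = {"engineers": 2_000_000, "physicists": 200_000, "biologists": 300_000}
p_creationist = 0.10  # same rate in every field

creationists = {field: n * p_creationist for field, n in experts.items()}
total = sum(creationists.values())

for field, n in creationists.items():
    print(f"P({field} | creationist 'scientific expert') = {n / total:.2f}")
# engineers: 0.80, physicists: 0.08, biologists: 0.12 -- engineers dominate
# simply because they dominate the pool of "scientific experts".
```

Even with zero correlation between field and creationism, four out of five creationist "experts" in this toy population come out as engineers.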
I'm not actually sure how well the Salem hypothesis holds; I'm wondering if it's just creationists having no idea what "science" is. See the Creation Ministries International list of scientists alive today who accept the biblical account of creation. They've pulled in veterinarians and plastic surgeons as "scientists".
Yes, being willing to swallow the bullet does not mean you are not, in fact, being very stupid indeed. Extrapolating beyond one's knowledge in this manner is using one's own ignorance as data. (Not that any agent can avoid judgement under uncertainty, but it's why this sort of extreme extrapolation can lead to crazy results.) Consistency is useful, but not a terminal value.
Because I can't write a comment there, I will write it here:
Comment #19 by IceBogan:
Interesting comparison. But you can be Libertarian "in the neighborhood of x_0" without accepting all the reductio ad absurdum arguments -- you can vote for a little less government, a little lower taxes, a little more personal responsibility. You can't be a little bit Many Worlds.
You can be "a little bit Many Worlds", and actually this is probably the most popular position -- that the microscopic particles have many possible histories, with complex amplitudes that sometimes cancel each other out, but as soon as you have too many particles (such as: enough to build a cat), it's no longer true.
"Many Little Worlds" would probably be a better name. Many little worlds are acceptable to many bullet-dodgers, assuming that they later transform (collapse) into One Big World.
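For what it's worth, the "amplitudes that sometimes cancel each other out" part is just addition of complex numbers; here is a minimal sketch, with phases chosen arbitrarily to show the two extremes:

```python
import cmath

# Two possible histories for a particle, each carrying a complex amplitude.
# The phases are arbitrary, picked to show destructive vs constructive interference.
path_a = cmath.exp(1j * 0)         # amplitude 1+0j
path_b = cmath.exp(1j * cmath.pi)  # amplitude -1+0j (opposite phase)

print(abs(path_a + path_b) ** 2)   # ~0.0 -- the histories cancel
print(abs(path_a + path_a) ** 2)   # 4.0 -- same phase, the histories reinforce
```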
I think another manifestation of this phenomenon is the way geeks tend to come up with elaborate justifications for plot holes in their favorite science fiction/fantasy works, e.g., all the discussion about the science of Star Trek and Star Wars.
Great catch. I wonder if teenagers sat around in ancient Greece arguing about which god would win in a fight.
Within nerdspace libertarianism (deontological flavors) seems to be a pretty big offender here.
Good post; it's useful to discuss biases that people who frequent this site are especially susceptible to. This happens in US extremist religious groups too, for example see this article about the subset of people who predicted the apocalypse last year:
It’s been noted by scholars who study apocalyptic groups that believers tend to have analytical mindsets. They’re often good at math. I met several engineers, along with a mathematics major and two financial planners. These are people adept at identifying patterns in sets of data, and the methods they used to identify patterns in the Bible were frequently impressive, even brilliant. Finding unexpected connections between verses, what believers call comparing scripture with scripture, was a way to become known in the group. The essays they wrote explaining these links could be stunningly intricate.
That intricacy was part of the appeal. The arguments were so complex that they were impossible to summarize and therefore very challenging to refute. As one longtime believer, an accountant, told me: “Based on everything we know, and when you look at the timelines, you look at the evidence—these aren’t the kind of things that just happen. They correlate too strongly for it not to be important.” The puzzle was too perfect. It couldn’t be wrong.
This suggests a possible alternative explanation: that analytical types tend/learn to enjoy systematizing, especially on topics that will be important to others. As Cosma Shalizi says,
Now, I relish the schadenfreude-laden flavors of a mega-disaster scenario as much as the next misanthropic, science-fiction-loving geek, especially when it's paired with some "The fools! Can't they follow simple math?" on the side.
Too long, lacking a decent summary up front, and a misleading, if catchy, title. Presumably the point is that technical people overestimate the reliability of wetware.
These are relevant criticisms. Trying to address them, I have changed the intro to this:
Related to: Reason as memetic immune disorder, Commentary on compartmentalization
On the old gnxp site, Razib Khan wrote an interesting piece on a failure mode of nerds. This is, I think, something very important to keep in mind, because for better or worse LessWrong is nerdspace. It deals with how systematizing tendencies coupled with a lack of common sense can lead to troublesome failure modes, and identifies some religious fundamentalism as symptomatic of such minds. At the end of both the original article and the text I quote here is a quick list summary of the contents; if you aren't sure about the VOI, consider reading that point-by-point summary first to help you judge it. The introduction provides interesting information very useful in context but isn't absolutely necessary.
Link to original article.
I have also broken it up with subtitles into three parts, making the end-of-article summary easy to spot, and added two important related links in a prominent spot.
Is this an improvement?
Is this an improvement?
Well, yeah, but I still cannot tell if my one-line summary is correct or not.
Maybe? For example, in terrorist groups, engineers are of obvious usefulness and are directly recruited, interfering with inferences about radicalism.
What we really need here is a controlled experiment :P
This explanation doesn't quite work, since many engineers have become suicide bombers. If engineers were being recruited largely for their technical skill, one would expect them to be the very last people used in that fashion.
A literal reading of the Bible leads to ludicrous conclusions, but if one perceives that the game is all or nothing, then perhaps one must assert the truth value of Genesis as if it were a scientific treatise.
I think the "all or nothing" thing is a very important insight. I've often found myself profoundly disturbed by my inability to solve some bizarre moral dilemma (something really weird, like "If you had the power to perfectly control what sort of personality types people were born with, what is the optimum mixture of personalities to create?") and felt like my inability to solve these problems somehow puts all of ethics in doubt. It's like I feel that not knowing the proper way to behave in a bizarre science-fictional moral dilemma means that there is no reason to help people, save lives, and do other obviously good things. Even though I know it's irrational, it sometimes makes me physically sick. I have to keep reminding myself that my beliefs should be more robust than that, that a belief system so fragile that it can shatter with one tiny inconsistency is not one worth having.
I imagine that this is how fundamentalists must feel when they spot an inconsistency in one of their sacred texts.
It's like I feel that not knowing the proper way to behave in a bizarre science-fictional moral dilemma means that there is no reason to help people, save lives, and do other obviously good things.
An analogy would be feeling that if you can't prove Fermat's Last Theorem, then there is no reason to believe that 2+2=4.
A completely reasonable answer to "if you had the power to do X, what exactly would you do" is "I would start doing research on the consequences of X, and only after having reliable results would I decide". And if the other person says "well, I want you to answer now", just say "if you want me to answer without having critical information, you are not expecting a perfect solution, right?".
If you had the power to perfectly control what sort of personality types people were born with, what is the optimum mixture of personalities to create?
I don't know the answer to this question but that's because I don't have a superhuman understanding of psychology. I don't see how it poses any moral problems.
With my very limited current layperson's knowledge, in terms of the Big Five I would probably increase Openness and Conscientiousness, leave Agreeableness and Extraversion at current rates, and decrease Neuroticism.
The question I'm trying to frame is, if you have the power to choose what preferences people would have, what would you choose? Obviously you'd increase Conscientiousness and decrease Neuroticism, because they generally determine how good you are at fulfilling your preferences, not what your preferences are. Increasing Openness would probably also be good because it would help prevent people from being jerks to those who are different, and I think that we would desire anyone we create to want to behave morally.
But what kind of preferences would you give people? Would you give them a diverse variety, or make them homogeneous? Would you keep the current personality distribution the same or would you, for instance, make more nerds and fewer jocks? Would you pick one ideal person and make all the future generations of the world clones of them? You might say you need a variety of people for society to function, but would that mean that if we had an FAI to do all the work for us, we should make people more homogeneous? If you created incredibly unambitious people who only preferred to exist and nothing else, would you have created a utopia of 100% preference satisfaction?
I think I've figured out that we should create people whose preferences are at least as ambitious as a normal human's; boredom is good, after all. And obviously we should create moral beings. And at the very least a good portion of the creatures we create should have fairly human-like emotions. We should definitely not create any sociopaths. I suspect that creating a wide variety of personality types is good, if only because novelty is good. But what is the proper mix of personalities? How many go-getters, how many artists, how many dreamers, how many down-to-earth types, how many nerds, how many jocks, etc.?
I suspect there is probably more than one right answer. And for now I'm trying to be content with that, because I'm beginning to think the anxiety it causes me might be symptomatic of some sort of serious mental health problem. I don't think most of the people on this site get heartburn from thinking about this sort of stuff.
See? Looks like I haven't been talking gibberish after all! Or, at least, someone wise shares some of my paranoid delusions. He even points to the two most infamous technocratic states specifically.
A pity that he hasn't mentioned another important thing: that being convinced of one's total freedom from dogma (and founding your philosophy on this "difference" between you and the brainwashed masses) is the most dangerous dogma of all, and nerds are very likely to be convinced of just that.
(It's easy to glimpse some scary moments of that dogma on the blog of a certain locally famous software engineer... although, as I said, he's far from the worst of it.)
the most dangerous dogma of all
Presuming it's not entirely rhetorical, that sounds more than a little overblown. I'd buy "foolish" or "dangerous", but this seems pretty ubiquitous and generally doesn't lead to more than the usual amount of disaster. In particular, I hardly think this is unique to nerds or uniquely horrible in their hands; best I can tell, pretty much everyone is under the impression that they're substantially free of ideological bias, whether they wear a blue collar or a pocket protector, and their attitude toward ideological foes is very likely to be informed by that.
With regard to the OP, I think I broadly accept the theory that technically minded folks are less inclined than average to tolerate fuzziness or internal contradiction in systems, and that this tends to attract them to totalizing systems in the absence of suitable countervailing influences: a set which, unfortunately, includes quite a lot of fundamentalist nastiness.
best I can tell, pretty much everyone is under the impression that they're substantially free of ideological bias, whether they wear a blue collar or a pocket protector
In far mode most people think in terms of good and evil first, correct and incorrect second. They might think that their enemies are evil mutants, but most sense, underneath it all, that their enemies still have their own unique truth (evil mutant truth). This leads to hatred and aggression, but it's less bad than an impersonal, clinical, mechanistic approach.
The people I'm so afraid of are the ones who look for some "objective position" first and feel simply that they're technically correct in the Engineering Challenge of Life, while others are "making mistakes". Thinking that you're fixing others' mistakes all day (like mistakenly allowing Jews to "contaminate" a nation) promotes a much more simplified picture of the world than thinking you're opposing dread and cunning evil - like Catholics do.
In far mode most people think in terms of good and evil first, correct and incorrect second. They might think that their enemies are evil mutants, but most sense that their enemies still have their own unique truth (evil mutant truth). This leads to hatred and aggression, but it's less bad than an impersonal, clinical, mechanistic approach.
I agree with the first sentence, but not with the second. Good and evil, for most people, implies correct and incorrect -- ideological enemies are both wrong and evil, and they're wrong because they're evil. Also evil because they're wrong, if you back them into a corner on that one. Christian conceptions of sin are tied pretty closely to correctness, for example -- the etymology implies "missing the mark".
I'm honestly not sure unemotional, subjectively-objective hatred exists in neurotypical folks, human psychology being what it is. I've gotten pretty angry at software bugs before.
Might be mind projection on my part, true. However, it genuinely looks to me that many people do feel like this, for example, in the trolley problem: the math might say it's more "correct" to end up with +4 saved lives, yet it's still an "evil" act to them - they'd say that a solution can be the only technically correct one and still be less moral than the alternatives.
A pity that he hasn't mentioned another important thing: that being convinced of one's total freedom from dogma (and founding your philosophy on this "difference" between you and the brainwashed masses) is the most dangerous dogma of all,
I doubt it. A more dangerous dogma probably involves something to do with killing.
Um, how to put it... it leads to stunning intolerance for other kinds of "dogma", including wholesome, psychologically healthy ideology or religion. Religious fanatics might hate infidels, but at least they can understand & admit vital human feelings like faith; intolerance for "blindness" or "delusion", the insistence that there's one calculable right way to run things, is culturally destructive, throwing the baby out with the bathwater in literally all cases - even if it might spare individuals, it encroaches upon the complex, often beautiful patterns of their culture.
I hope you wouldn't deny that the "rationality" of RAND, RAF Marshal Harris, Kissinger or their Soviet/Chinese counterparts - the "rationality" of Dr. Strangelove - has been like a grey, soulless plague upon civilization. They all would've said that it produced slightly less misery than the alternatives they'd considered, but I maintain that the indirect damage to humanity has been off the scale, and needn't have happened if our cleverness hadn't outstripped our sanity.
Go read Orwell's or someone else's notes about how we lost a gentler, less callous way of thought in the early 20th century, one that was so entwined with Christianity as to rot away and leave a gaping hole with the advance of aggressive materialism.
Go read Orwell's or someone else's notes
No. I think you are failing to understand the difference between the meanings of the phrases "I express disapproval of" and "the most dangerous of all".
I'm telling you, IMO it's an enormous memetic threat to human civilization as a whole, and not just to the well-being of individual lives.
What sort of fanatics do you mean? Most fanatics that I'm familiar with think that the equivalent virtue in service of a different ideology is not analogous simply because it is in service of the opposing ideology.
Crusaders didn't tend to say that jihadists were like them, only Muslim. Only we who use the outside view can see the parallel.
Which notes of Orwell's are you referring to? Orwell had seen tyranny and cruelty since boarding school. I really can't see him succumbing to wistful nostalgia.
That's for a start. I already linked to that essay in the quotes thread. Also, one more in the same vein.
The young Communist who died heroically in the International Brigade was public school to the core. He had changed his allegiance but not his emotions. What does that prove? Merely the possibility of building a Socialist on the bones of a Blimp, the power of one kind of loyalty to transmute itself into another, the spiritual need for patriotism and the military virtues, for which, however little the boiled rabbits of the Left may like them, no substitute has yet been found.
There are other such bits of left-conservative, anti-pragmatist sentiment sprinkled throughout his essays. Hell, it's not a stretch to call him a National Socialist. I suggest that you take a fresh look, without the conventional view of Orwell - a petit-bourgeois view, I'd say - coloring your perception.
Also!
I really can't see him succumbing to wistful nostalgia.
Oh, but he did. Read Coming Up for Air.
Do you have any more mainstream examples than your software engineer? I really don't know what you mean by "dogma." In the 19th century the word was not used so pejoratively but lately I can't think of anyone who would describe their package of beliefs as a dogma.
Again, the RAND Corporation. There's plenty written about its mindset and practices, including in connection with the whole Vietnam deal - Agent Orange and all that. "Forced-draft urbanization", ain't that a brilliant fucking idea? Hell, thinking of that, the CIA analysts probably also qualify as slaves not only to bureaucracy, but to the Cult of Reason as well.
Samuel Huntington certainly had a bloodless way of writing. I wonder if he would have characterized himself as dogma-free.
Again, the RAND Corporation
Where else have you written about the RAND Corporation?
Where else have you written about the RAND Corporation?
Nowhere, just mentioned it in this thread twice. You can start with Soldiers of Reason by Alex Abella, though - it's really rather biased against RAND, but has plenty of info.
There's also an interesting-sounding title in the Wikipedia links, Rationalizing Capitalist Democracy: The Cold War Origins of Rational Choice Liberalism, but I haven't read that one yet. Looks like it'll be more helpful for my argument, judging by the name and the summary.
In Rationalizing Capitalist Democracy, S. M. Amadae tells the remarkable story of how rational choice theory rose from obscurity to become the intellectual bulwark of capitalist democracy. Amadae roots Rationalizing Capitalist Democracy in the turbulent post-World War II era, showing how rational choice theory grew out of the RAND Corporation's efforts to develop a "science" of military and policy decisionmaking. But while the first generation of rational choice theorists—William Riker, Kenneth Arrow, and James Buchanan—were committed to constructing a "scientific" approach to social science research, they were also deeply committed to defending American democracy from its Marxist critics. Amadae reveals not only how the ideological battles of the Cold War shaped their ideas but also how those ideas may today be undermining the very notion of individual liberty they were created to defend.
Oh, looks like its first 180 pages are on Google Books.
Ah yes, the danger of thinking you can think for yourself.
The danger is that it avoids regression to the mean. For that reason, yes, it is the most dangerous dogma, but it also has a lot of potential. I'd trust someone like this more than I'd trust your average "agreeable" neurotypical, who can at any moment be convinced by a charismatic enough charlatan cult leader to do just about anything if the neurotypical is down on their luck. Yes, some people like this have dangerous beliefs and a dangerous tendency to act on them, but at least you can usually see them coming.
Also, what if they are free from dogma? What if they just think better than you or I? Depending on how free they are from dogma, the danger may just be that they are excellent rationalisers. If someone is mostly a person who thinks for themselves -- they view every claim critically and insist on rederiving every conclusion before they believe it -- and they tell me they are totally free from dogma and the masses are brainwashed idiots, they're probably wrong about the "totally". But, more or less, they are right. The only danger here is that you can't talk them out of things if they think you are one of the brainwashed masses, and they might be angry about most people being brainwashed.
If they are a typically dogmatic thinker, then they are really good at believing things which aren't true, which presents a whole different kind of danger. Also, they probably think of people who disagree with them as evil mutants and of themselves as noble saints.
It's not dangerous for someone who is better at thinking undogmatically than people in general to found their philosophy on this difference, or even the overestimation of it that you propose.
Can you link the scary moment of dogma from the blog of a certain locally famous software engineer? Is it Paul Graham?
In a comment below you say "intolerance for 'blindness' or 'delusion', the insistence that there's one calculable right way to run things is culturally destructive." You sound like you are talking about something completely different. I suspect thinking they are free from dogma is simply something that people who think there's one calculable right way to run things happen to tend to do, and you are throwing out the baby (okay, maybe a crocodile) with the bathwater. Thinking that way demonstrates blindness to the facts. Thinking that one's preferences are objective pronouncements on how the world should be, in some fuzzy non-value-dependent way, demonstrates that you mistake your feelings for facts. Believing you don't do this does massively intensify the danger such people pose, but it isn't the source of the danger. And people who don't do this, or don't do it very much, or who are just not abnormally vindictive or aggressive or callous enough to come up with a right way to run things that hurts people, or who accept that their right way to run things will not be implemented, are not a danger.
First, the title is quite vague. This post is very long, and it's rather difficult to see what the actual point of it is until the very end, especially because there's no "this is why you should take the time to read this" blurb up front.
It's a worthwhile point to make, and something that's worth discussing, but more people would probably discuss it if it were presented differently.
Related to: Reason as memetic immune disorder, Commentary on compartmentalization
On the old gnxp site, Razib Khan wrote an interesting piece on a failure mode of nerds. This is, I think, something very important to keep in mind, because for better or worse LessWrong is nerdspace. It deals with how systematizing tendencies coupled with a lack of common sense can lead to troublesome failure modes, and identifies some religious fundamentalism as symptomatic of such minds. At the end of both the original article and the text I quote here is a quick list summary of the contents; if you aren't sure about the VOI, consider reading that point-by-point summary first to help you judge it. The introduction provides interesting information very useful in context but isn't absolutely necessary.
Link to original article.
Introduction
Nerd Failure Mode
This section is the part most relevant to LessWrong:
In sum:
I bolded the note on mass literacy and participation because of the interesting historical conclusion that in the United States mass participation in democracy inevitably made the influence of religion on policy greater. It goes against a deep assumption shared by most educated people that "democratic elections" necessarily produce "liberal" or "secular" results. That assumption was particularly evident among pundits, and particularly easy to see as foolish with the recent upheavals in the Middle East.
This last, rather minor-seeming note is perhaps the most relevant part of the article for aspiring rationalists. Not only is it particularly salient for those of us inclined to question the usefulness of the category "religion" in certain contexts, it also matters because nearly all of us are not religious. Our bad axioms seem unlikely to originate directly from something like a religious text, though obviously it is plausible that many of our axioms ultimately originate from such sources. Not many of us are Communists either, but we are attracted to highly consistent ideologies. We seem likely to be particularly vulnerable to bad axioms in a way most minds aren't.
So if after some thought and examination you notice that a widely respected and universally endorsed axiom in your society has clear and hard to deny implications that are in practice ignored or even denounced by most people, you should be more willing to dump such axioms than is comfortable.