The Bayesian line is that we are too certain of our own flawed reasoning faculties, and should defer more to the average opinion. After all, if everyone has similar but noisy reasoning abilities, then lots of people coming to a conclusion is a much better indicator than a single person coming to another conclusion – and there is no reason to privilege the latter just because that brain happens to be your own.
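The post doesn't work out the statistics, but the claim is essentially the Condorcet jury theorem. A minimal simulation sketch (mine, not the author's; the 60% per-person accuracy and the voter counts are arbitrary, purely illustrative choices):

```python
# A hedged sketch of the aggregation argument: if each person is an independent,
# noisy judge who is right 60% of the time, a majority vote of many such people
# is far more reliable than any single one of them.
import random

def majority_is_right(n_voters: int, p_correct: float, trials: int = 20_000) -> float:
    """Fraction of trials in which a majority of independent voters gets it right."""
    wins = 0
    for _ in range(trials):
        correct_votes = sum(random.random() < p_correct for _ in range(n_voters))
        if correct_votes > n_voters / 2:
            wins += 1
    return wins / trials

if __name__ == "__main__":
    for n in (1, 11, 101):
        print(n, "voters:", round(majority_is_right(n, p_correct=0.6), 3))
    # Typical output: 1 voter ~0.60, 11 voters ~0.75, 101 voters ~0.98
```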
On the margin, this is probably true - most people err on the overconfident side. But you have to adjust for the fact that on many issues people don't actually apply their full attention and come up with independent conclusions. In some areas they do: markets encourage rationality by punishing those who over- or under-estimate the value of a product, and in many areas – "how do you make friends?" or "how do you become a great boxer?" – most people with an answer have had to do their own research, reflect on their experience, and come up with independent answers. But as Razib showed, on many issues, including the most contentious ones, each person with an opinion hasn't actually done much research or critical thinking to justify it; most people simply subscribe to the opinions held by people they respect.
Doing serious, first-principles research is hard and time-consuming. Gary Taubes' investigation of diet science required a decade of effort and a mastery of biochemistry and physiology. Yes, he was fortunate to turn up enough interesting material to make a book out of it, but for most of us it's neither realistic nor worthwhile to put in the effort required to audit each of our beliefs, especially the mostly symbolic ones we use to demonstrate tribal affiliation.
Which is fine, but in such areas we need to lower the importance attached to the consensus view, since it represents fewer units of independent cognition. For the rookie boxer, deferring to the consensus is probably the best choice. But for many issues, when you read "everyone believes X" it's reasonable to substitute "a small, highly incestuous group of interested parties and political spinners believes X, and everyone else follows along." In such fields, the truth-value of consensus is relatively low.
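To see why copied opinions mean "fewer units of independent cognition", here is a second sketch (again my own illustration, with made-up parameters): a crowd in which only a handful think independently and everyone else adopts a random thinker's view is barely more reliable than that handful alone.

```python
# A hedged sketch: the majority verdict of a mostly-copying crowd collapses to
# roughly the reliability of its small independent core.
import random

def crowd_is_right(n_thinkers: int, n_copiers: int, p_correct: float,
                   trials: int = 20_000) -> float:
    wins = 0
    for _ in range(trials):
        thinker_votes = [random.random() < p_correct for _ in range(n_thinkers)]
        copier_votes = [random.choice(thinker_votes) for _ in range(n_copiers)]
        votes = thinker_votes + copier_votes
        if sum(votes) > len(votes) / 2:
            wins += 1
    return wins / trials

if __name__ == "__main__":
    # 101 independent thinkers vs. 5 thinkers plus 96 copiers, each right 60% of the time.
    print("independent crowd:", round(crowd_is_right(101, 0, 0.6), 3))  # ~0.98
    print("copying crowd:    ", round(crowd_is_right(5, 96, 0.6), 3))   # ~0.68
```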
But you have to adjust for the fact that on many issues people don’t actually apply their full attention and come up with independent conclusions.
It's not just that they didn't give it their full attention, but that part of their attention is focused on correlations with the opinions of others.
Also, note that "reasoning from first principles" is decidedly not equivalent to "reasoning". The blog post's casual confounding of the two just annoyed the crap out of me.
In the comment section he also made some minor and, to my eyes at least, accurate commentary on the LessWrong community.
I suppose a question to ask, then, is why any people are nonconformists at all.
probably differs for a lot of people. for me it is probably a combination of lower than average social awareness and raw egotism. i don't understand too well the need for validation from the herd, but it's pretty common. my own experience with the less wrong crowd is that very low to no social intelligence does eliminate the worst problems of group conformity. unfortunately it still doesn't abolish the issue whereby smarter people can trick or convince less intelligent people through the fluency of their argumentation.
He seems to assume that Agreeableness is inversely correlated with social intelligence. Is that in fact the case? I certainly observe quite a few socially awkward pushovers.
unfortunately it still doesn’t abolish the issue whereby smarter people can trick or convince less intelligent people through the fluency of their argumentation.
The danger here is with people who are smart, but sufficiently less intelligent than the person making the argument. If the person you want to convince is too dumb to understand a sophisticated argument, then all that intelligence is useless (with respect to pure argumentation), provided the person is not already inclined to believe you.
I mean, some superintelligent AI in a box could output sophisticated proofs of why I am supposed to let it out of the box. But since I am unable to read and understand sophisticated proofs and I am not inclined to believe such an AI, any amount of mathematical sophistication will be useless, or pretty much hit diminishing returns after the level that I (the gatekeeper) can grasp.
I came to a very similar conclusion some time ago.
Try playing a popular board game with someone of average or below-average intelligence. Try proposing a set of rule changes before the game. The response will be "no, let's keep the default rules to keep it fair", even if the game is broken or imbalanced, and even if you can explain to them why this is so. Is this an irrational position for them to take?
If you observed a game in a parallel world where everyone is smarter by one or two standard deviations, I think their response would basically be the same. But if you just raise your and their IQ in this universe I think their response may well differ. Can anyone see why I think this is so?
I also noted that intelligent people may be more vulnerable to this because they are unused to mistrusting their own wits when evaluating the arguments of clever others. People probably have all sorts of bad associations with this kind of advice, but pause to consider whether you would wish your five-year-old to respond to reasonable arguments from random strangers wanting entry into your house or asking him to follow them. Also note that the few cases when this might be a good idea (policeman, fireman, ...) can be explicitly discussed and implemented in the little tyke's brain.
An unsupervised five-year-old will let a stranger indoors, take candy from a stranger, play with a gun he finds etc. no matter what you drill into him.
Anecdotally, there was a particular incident when I was 6 and specifically didn't open the door for a stranger because my parents told me not to. It turns out it was the postman.
But regardless, surely telling a 5 year old not to take candy from a stranger no matter the stranger's reasoning is better than not doing so.
When I was 6, my mother and I accepted a ride home from the park. I was very reluctant. I told my mother that she might not know, but I had learned in school that one should not accept rides from strangers.
Another anecdote of very young children being able to internalize and comprehend rules.
You did go along for the ride, right? Because you trusted your mother's judgement. All else being equal, I would say you were better off with your default position being not taking rides from strangers and not listening to their arguments.
Remember, I was only talking about superior minds one doesn't fully trust. In that case you fully trusted your mother, and it was best that you outsourced the judgement call to her.
I guess so. I only mean that I think children don't so closely resemble stupid adults. Even bad-tempered children are pretty docile.
An unsupervised five-year-old will let a stranger indoors, take candy from a stranger, play with a gun he finds etc. no matter what you drill into him.
I don't believe I would have done any of those things - and my parents would agree. Comprehension and implementation of rules were well within the realm of my capabilities, and I was only rebellious against or defiant of rules I thought were stupid. Not those that could be explained ("that is dangerous and kills people") or those that I was already indifferent to (I would only have answered the door out of a sense of obligation or kindness anyway; I certainly wouldn't have wanted to.)
"No matter what you drill into him" sets a fairly high bar of stupidity.
To do that somewhat naturally, lift weights to increase testosterone. Also gets you laid. Not sure what the corresponding hack for women is.
To do that somewhat naturally, lift weights to increase testosterone.
I haven't found either my stupidity or my resistance to (desirable) persuasion to be increased by either lifting weights or alterations to my testosterone - natural or otherwise. If anything the improvement to my health (mental and physical) and general emotional/social security makes the net effect the opposite to what you describe.
Also gets you laid.
That part I noticed.
That can happen sometimes but it can also go the other way. Testosterone has its downsides. Its upsides might still outweigh them, obviously.
I was born with high prenatal testosterone, so I can attract people now even without testosterone. I generally go without testosterone, 'cuz I think it negatively affects my rationality. Hypothetically, this restricts me to cardio, and keeps me from lifting. In practice, though, it doesn't really come up.
Testosterone has its downsides.
Messes with cholesterol levels something shocking. Makes you stinky. Causes acne. Is converted to DHT, leading to male pattern baldness.
You actively avoid lifting weights so as to reduce your testosterone levels? That's not something I've heard before.
I don't lift. I used to. If I were to start lifting again it'd be a relevant consideration. Most people could probably use more testosterone, but I think my optimal testosterone level is low. Years ago someone told me intelligence was correlated with high testosterone at birth but low testosterone as an adult. Makes sense to me.
This is just cracking a dark artsy joke. I still like it since reversed stupidity (or is it intelligence in this case?) truly isn't intelligence (Konkvistador's brain starts to ache).
No, the better approach is simply to take into account whether any important conflict of interests exists between you and a very clever party you don't fully trust when evaluating their arguments. Yes, yes, ad hominem, I know, yet it does sound like good tactical advice, no?
Darkartsy of you.
Are you telling me that I apply this as a dark arts tactic to avoid being persuaded? That is, are you calling me stupid and arrogant? I insist that one of those does not apply!
EDIT: Oh, wait, you could be suggesting that I'm trying to portray XiXiDu as stupid and arrogant? I deny that charge. It doesn't apply in this instance and when I do say things that are insulting my track record indicates that I say them rather directly. In fact, I point out that it is XiXiDu that calls himself stupid and on more than one occasion I have flat out denied and contradicted his claim.
EDIT: Never mind. Parent changed. New reply:
This is just cracking a dark artsy joke.
Huh? No it isn't. It's an agreement with XiXiDu's point. It is a phenomenon that applies and, I suggest, one that is implemented to a certain context-sensitive degree by humans.
Oh, sorry, I thought you were being sarcastic and that you were disagreeing with XiXiDu, and perhaps even setting up a straw man. Maybe the reason I misidentified this is your use of "arrogant". When I think of protective stupidity, I don't associate that word with it.
How do you find out whether a conflict of interests exists? That's one of the things someone who's trying to manipulate you will try to conceal, and if they're a lot smarter, they're more likely to succeed at it.
Sure. But I think at least some conflicts of interests are very hard to conceal. At the very least if someone finds this argument compelling the other party can't prompt them to denounce this check on principle.
Most strategies that could help one avoid malicious advice stemming from hard-to-detect conflicts of interests seem to have a (to me) unacceptably high false positive rate. Not so much in the context of a scenario where you are dealing with a boxed AI, but more, say, when one is interacting with very intelligent people in a business environment or in personal life. It seems to me that such strategies would carry high opportunity costs.
I mostly agree; that's all well and good... until it comes to moral choices, especially big ones. Here, even if people are very biased, don't know their own preferences, just plain don't care about others, etc... shallow conformism is still a worse option in many situations. If everyone just looked to their current group's authorities in deciding how or whether to do the right thing - and those authorities looked to the past - ... wouldn't we have, for example, 0% of Germans resisting the Holocaust instead of 2%? Wouldn't slavery be a respected institution to this day, lazily "justified" by things like genetic differences? Wouldn't, say, husbands be allowed by law and public opinion to beat, rape and essentially own their wives?
No, no, "conservative"/"traditionalist" ethics are a path to nowhere without a complex semi-conscious system, varying from individual to individual and acting on both rational and emotional levels, that would allow one to relate one's personality and preferences with their group's tradition and accumulated knowledge/heuristics, and which would be given priority during judgment-making by an appeal to a higher, ideal authority - in short, without an essentially religious worldview.[1]
Unfortunately, not everyone has it in them to be Oskar Schindler or Sophie Scholl, but many people only had to be "good Christians" when the moment of truth came - to follow the output of that deep and broad system, which had been known as "Christianity", "Western values", "common decency", but which ultimately drew upon similar sources, and had the ethical advice of centuries encapsulated within it. Alas, it was the 20th century, and things like that - old, complicated, below-the-surface systems - were just falling apart everywhere. But we shouldn't just sit back and allow our own system to follow this course.
This is why I'm against any "rational" tampering with today's mainstream Western worldview, even where I'm to the left or to the right of its political aspects. Any attack on the "Liberal hypocrisy" that has indeed taken root in the last 50 years and largely replaced Christianity is short-sighted, simply because this system is likely the only thing really holding our civilization together. If anything, perhaps we should move towards giving it more religious trappings - official commandments, saints, etc. - without necessarily adding any supernatural element, but certainly without naively preaching that e.g. "Human Rights" don't make much sense.
Today, a thinking conservative should be focused on improving and stabilizing the prevailing liberal dogma, not trying to return to the failed Protestant/Catholic one or make a "dogma-free" system. In short, I'm for free individual search through the collected conscious and subconscious ideas of your culture - its narrative. And where you've got a narrative, you've got humans' natural ability to work with stories; abstract ideas are counter-intuitive, but picking out, combining and adapting stories is, IMO, how we can best handle social thinking.
(Sorry for such a rambling comment, I was just prompted to unload some under-construction ideas by seeing a post that's related to them. Paragraphs here can be read separately.)
[1] I'm not talking about any kind of "faith" here, a belief in the supernatural and so forth, but about the style of thinking that organized religion or advanced ideology seems to foster in developed, all-around intelligent people - like Chesterton or Orwell. My argument is that the average human also benefits from such a system, and this would be more noticeable with better systems. (Compare the Socialism/Communism of the students and professors who were behind the dismantling of segregation in the U.S. - mostly good people, for all their flaws and possible delusions - with e.g. the primitive, simplified worldview of the early Bolsheviks. Both are clearly religions, but one does its adherents more good than the other.)
This is why I'm against any "rational" tampering with today's mainstream Western worldview, even where I'm to the left or to the right of its political aspects. Any attack on the "Liberal hypocrisy" that has indeed taken root in the last 50 years and largely replaced Christianity is short-sighted, simply because this system is likely the only thing really holding our civilization together.
Would you have taken the same stance when dealing with say 18th and 19th century Anti-Christian thinkers in their own time? If not, why?
I very likely would; hell, the effects of Christianity's all-around decay in the late 19th century - which was hastened on purpose by those people - are probably among the best evidence for the case I'm making. For starters, I'd quote this at them - and Dostoevsky too, and similar stuff by other thinkers who bemoaned the course that materialist civilization was taking.
You biting this bullet in spite of it violating tribal attire has made me take your argument much more seriously.
I'll spend some time thinking about it in the next few days.
I'm taking my chances on mind-killing here, but I'm (ethnically) Jewish, and my reflexive take on Christianity is "unreliable" rather than "holds civilization together".
In the interests of avoiding the obvious... WWI was committed by Christian nations.
I believe (without having done a bunch of research) that a lot of the "tear it down" in current culture ultimately comes from seeing that the hope of civilization's steady improvement was inaccurate, after the governments of Europe couldn't find a way to avoid killing millions of each other's citizens.
This is why I'm against any "rational" tampering with today's mainstream Western worldview, even where I'm to the left or to the right of its political aspects.
This is an interesting sentence especially in a comment that started out discussing how bad conformism on moral issues is.
I'm basically speaking against "shallow" (bad) conformism and for "religious" (good) conformism in this comment. Only emulating the here-and-now surface patterns of your group = bad. Taking care to choose among your culture's traditions carefully, taking a sprout and nurturing it if there's no grown branch (like the more successful attempts at democracy in Africa, which clearly did NOT come from a mere copy-paste of the Western model, but partly drew on the colonial or tribal past), perhaps promoting one branch (say, American Protestant radicalism) at the expense of another (say, Southern slavery and its mode of life) but not cutting any memories and ideas off = good.
Were you aware that even the Bolsheviks in Russia were following an established tradition of "nihilism" and radical upheaval? Their fault was not that they steered the nation in a direction they wanted, but that they (nearly) pruned all the other branches of possibility inherent in Russian culture, from monarchism to tribal/feudal democracy. Today in the US, slavery might be gone, but the positive image of the "Southern Gentleman", with its associated aristocratic values, lives on in vestigial form (and has plenty of fans), while the memory of the Russian aristocracy is sadly gone.
That's roughly the difference I'm talking about, between treating the culture as a unique living thing vs. as a generic simple machine.
(And yet still somehow I don't call myself a conservative. Don't ask.)
(like the more successful attempts at democracy in Africa, which clearly did NOT come from a mere copy-paste of the Western model, but partly drew on the colonial or tribal past)
Perhaps, but the linked articles don't go into enough detail to support this assertion.
perhaps promoting one branch (say, American Protestant radicalism) at the expense of another (say, Southern slavery and its mode of life) but not cutting any memories and ideas off = good.
I would like to point out that decentralized systems, e.g., libertarianism, are better at this than centralized systems, e.g., socialism.
Yeah, I suppose if you believe Christianity is/was the only thing holding civilization together, then adopting "Liberal hypocrisy" to fill the same role might make sense. Many people would disagree with the premise, though, by pointing to the Dark Ages and such. I don't really know what to think about this.
Religion was very important to people at least around the time of the Roman Empire's collapse (4th century onwards). Ordinary people around the Mediterranean used to argue about the nature of God and the doctrine of the Trinity while standing in line at a shop!
The ecumenical councils of the day, for example, were massively important political events - the Council of Nicaea certainly held no less significance for the people under its jurisdiction than, say, the Nuremberg trials did for contemporary Europeans; it issued a controversial yet generally accepted verdict on what's OK to believe and what's vile heresy. I'm making that comparison because Christianity used to occupy, among other spaces, the niche in public consciousness that nationalism took in the 19th century.
Yeah, I suppose if you believe Christianity is/was the only thing holding civilization together, then adopting "Liberal hypocrisy" to fill the same role might make sense. Many people would disagree with the premise, though, by pointing to the Dark Ages and such. I don't really know what to think about this.
The Christian religion was adopted much earlier and by a much larger proportion of the population in the Eastern Roman Empire than in the Western Roman Empire. The former outlived the latter by over 1,000 years. Obviously, the degree of Christianization was not the only difference between Rome and Constantinople, but it is an important fact to keep in mind when reasoning about these sorts of things.
This applies to political trends—and scientific and metaphysical trends are rather mixed up with political trends—but it also applies on a higher level, and about more important things. The phenomenon of just going along with something is deep and subtle.
The guy seems to have segued from behavior with evolutionary origins to consciously justified behavior.
On the likely evolutionary origins and advantages of believing the other guy, or going along with the tribe, I agree. My own theory is that there are different motivations for the acceptance of a belief, many of which have nothing to do with its predictive value, and many people really just aren't that motivated by predictive value. When I say "not motivated" I don't mean they have a conscious list in their head that they evaluate by. No, they have different pattern-matching algorithms in their heads, one of which is "belief predicts well", and others are "the bossman says so", "my tribe says so", "mommy says so", "this is easy to understand", "this feels good to believe", etc. Different people have different weights associated with the different pattern-matching algorithms, giving them different valuations of the beliefs.
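Read literally, that's a weighted-sum model of belief acceptance. A toy formalisation (entirely my own; the criteria names and numbers are made up) just to make the "different weights" idea concrete:

```python
# A toy sketch: each person scores a claim as a weighted sum over acceptance
# criteria; only the weights differ from person to person.
from typing import Dict

def belief_score(criteria_scores: Dict[str, float], weights: Dict[str, float]) -> float:
    """Weighted sum of how well a claim satisfies each acceptance criterion."""
    return sum(weights.get(name, 0.0) * value for name, value in criteria_scores.items())

# How well some hypothetical claim satisfies each criterion, on a 0-1 scale.
claim = {"predicts_well": 0.9, "tribe_says_so": 0.1, "feels_good": 0.2}

empiricist = {"predicts_well": 0.8, "tribe_says_so": 0.1, "feels_good": 0.1}
conformist = {"predicts_well": 0.1, "tribe_says_so": 0.7, "feels_good": 0.2}

print("empiricist:", round(belief_score(claim, empiricist), 2))  # 0.75 -> likely accepts
print("conformist:", round(belief_score(claim, conformist), 2))  # 0.2  -> likely rejects
```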
An interesting blog post by Razib Khan, whom many here probably know from his Gene Expression blog, the old gnxp site, or perhaps from his BHTV debate with Eliezer.
I recommend following the link and reading the rest of it there: not only does the interestingness continue, but the comment section is usually worth reading, since he moderates it vigorously.