Try this: Choose a book that you expect to disagree with and read it from start to finish over several weeks. See what impact it has on you. I tried this and felt my beliefs changing despite none of the arguments being convincing. The effect seemed to peter out a few weeks after I finished the book. I hypothesize that in an extended experiment we could actually brainwash ourselves to the point of holding some radically different views.
The study described in the link only exposed the subject to a single article. The effect might be different for different amounts of exposure.
In my own experience this seems to be the case. When I briefly read politically opposing blogs, I find them so obviously stupid that I'm amazed anyone could take the other side seriously. But when I spend a long while doing it, I find my views moderating and sometimes even crossing over, despite not being convinced by any of their actual arguments; I even begin to be embarrassed by figures I normally admire, even though most of what I find directed against them is mere pejoratives. Then afterward the effect wears off. I could be unusually easily led, but I've heard of enough other similar experiences that I doubt it.
"Isn't it lucky that we happened to be born into the one true faith?"
In my case, I got a lot of my atheism from my father; I haven't had the kind of "crisis of faith" that many other aspiring rationalists said they had. (My mom is a non-practicing Catholic.) I was practically born into this culture, so I get worried about this on occasion.
It would be interesting to know how many people in the US who are raised as atheists adopt a practicing faith in their adolescence or adulthood that they maintain for at least ten years; versus people raised as practicing theists who become atheists and remain so for at least ten years.
Do you also get annoyed by people who don't believe in ghosts who criticize people who do without being aware of their own irrationality?
"Surely the truth about knowledge and justification isn't correlated with which school you went to..."
It seems pretty likely that there is some correlation. (Suppose, without loss of generality, that some kind of epistemic externalism is true. Then some schools -- the ones where externalism is popular -- correlate with the production of true beliefs about epistemology.) The problem is just that we don't know in advance which schools actually have it right.
Perhaps what you mean to suggest is that going to school X isn't a sensitive method of belief formation. Even if what it taught was false (like school Y), you would still end up believing it (just as the students of Y do).
Then again, one could say much the same thing about being born into a religious cult (or a different historical epoch). I do consider myself lucky to have avoided such an epistemically stunted upbringing. But this kind of luck does not in any way undermine my present beliefs.
In Meta-Coherence vs. Humble Convictions, I suggest that what matters is how you assess the alternative epistemic position. If you really think it is just as well-informed and well-supported as your own, then this should undermine your pr...
Yes, the key issue is not so much whether on a first analysis you came to think those other folks are not as well informed as you, but whether you would have thought that if you had been taught by them. The issue is how to overcome the numerous easy habits of assuming that what you were taught must have been better. Once you see that on a simple first analysis you would each think the other less informed, you must recognize that the problem is harder than you had realized, and you need to re-evaluate your reasons for so easily thinking they are wrong and you are right. Until you can find a style of analysis that would have convinced you, had you grown up among them, to convert to this side, it is hard to believe you've overcome this bias.
This reminds me of Where Recursive Justification Hits Bottom, where Eliezer explains that an agent that uses Occamian priors and Bayes' rule, when evaluating whether this is effective, would assign an Occamian prior to the theory that it is effective and update using Bayes' rule to account for its success and failures.
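To make the self-referential structure concrete, here is a minimal sketch of that kind of reflective update; it is my own illustration rather than anything from Eliezer's post, and the Beta-Bernoulli model and the invented track record are assumptions chosen purely for brevity:

```python
# Minimal sketch (illustration only): an agent evaluating the reliability of its
# own method by applying that same method. Its confidence that "Bayesian
# updating works" is itself a probability it maintains by Bayesian updating.
# Assumption: "effectiveness" is modeled as an unknown success rate, with a
# Beta(1, 1) prior standing in for the Occamian prior.

def update_effectiveness(alpha, beta, outcomes):
    """Update a Beta(alpha, beta) belief in 'my method works' from a track record."""
    successes = sum(outcomes)
    failures = len(outcomes) - successes
    return alpha + successes, beta + failures

alpha, beta = 1, 1                          # prior over the method's effectiveness
outcomes = [True, True, False, True, True]  # hypothetical successes and failures
alpha, beta = update_effectiveness(alpha, beta, outcomes)
print("posterior mean effectiveness:", alpha / (alpha + beta))  # 5/7, about 0.71
```

The circularity is the point: the agent's confidence in the rule only moves because the rule itself says it should, which is much like the student evaluating one school's epistemology with the tools that school taught.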
If you learn and adopt a mode of epistemology at one school, that is what you will use to evaluate a competing mode advocated by another school.
Signaling is a factor, but there is another issue: the choice of what to work on. Academics and academic departments, no less than individuals, have to choose where to focus their time and resources; no one has unlimited time and attention. So one person, or school, may focus on cognitive biases and skimp on memory biases, or the other way around. This choice of what counts as important is itself very strongly affected by signaling issues - much more so than signaling affects particular beliefs, I think.
I see some problems here, but the situation doesn't seem quite as intractable as Alicorn suggests.
If your beliefs are highly correlated with those of your teachers then you need to immerse yourself in the best arguments of the opposing side. If you notice that you are not changing your mind very often then you have a deeper problem.
To give a few related examples: one of the things that gives me confidence in my major belief structure is that I am an Atheist Capitalist. But, as a child, I was raised and immersed in Atheist Communism. I rejected the communism but not th...
I've also noticed "liberals" making more sense, but I attribute this to smart people abandoning conservative groups and jumping ship to liberal ones. This may mean that "conservative" policies are being under-argued.
Having been sincerely religious as a young adult has its drawbacks for my present self (e.g. lost time and effort), but one positive effect is that I'm not as worried about this, because I've felt what it's like to find that my entire worldview fails on its own terms. (Basically, I eventually came to realize that it was completely out of character for the deity I knew— or any deity derived from the scriptures and theologies I professed— to create a universe and humanity completely un-optimized along any recognizable moral axis. But I digress.)
I was lucky...
Can you think of another behavior pattern that is more accurate than this?
Assuming that someone isn't going to hold a belief they know to be false, they are teaching you perceived truth. Why wouldn't you adopt those beliefs? If you, the student, possessed the ability to denounce those beliefs in a manner fitting rational discussion, it seems likely that the master would have been able to do so as well.
This isn't to say all masters are all right all the time, but what else do we have to go on until we go into the territory ourselves? These people went ahead ...
These have a decided enough effect that I've heard "X was a student of Y" used to mean "X holds views basically like Y's".
It still does. The question is open to what extent this is because students tend to choose advisers whose views they already agree with.
Hmm... This might actually be a major breakthrough of sorts. When I have time, I'll delve into the literature to see what folks have to say about it. On the surface, this seems related to the notion that we just need a good-enough theory to make progress towards a better theory.
It might be the case that folks in different departments are climbing the same mountain from different sides; if people in department A are advocating Big Theory p, and people in department B are advocating Big Theory q, and p and q are both approximately correct, then we shouldn't be surprised that they both have proponents.
Are you sure you have the causality of it right? I always thought of graduate schools as selectors/filters for certain kinds of intelligence and points of view, rather than as causes.
"But dang, that argument my teacher explained to me sure was sound-looking! I must just be lucky - those poor saps with other teachers have it wrong!"
This is actually something I've been wondering about regarding the disproportionate overlap between libertarianism and anthropogenic global warming skepticism. I'd like to think that this disproportionate overlap is because both views stem from a rational and objective assessment of the available data, but at the same time, I can't deny that anthropogenic global warming would throw a monkey wrench into libertarian philosophy if it were real, so being skeptical of it saves us from doing an awful lot of mental gymnastics...
Learning the arguments of opposing viewpoints is helpful, but if you're being truly rational, it shouldn't be necessary in order to remove bias.
I figure that if you know there are other groups with different beliefs, they are likely to have heard arguments you haven't. You should assume there are arguments, and use them to adjust your beliefs. If you later hear such arguments, and they make more sense than you thought, then that should make you believe them more. If they make less sense, even if they're still good arguments and would only help convince you if you h...
Thinking out loud: So if you're French, trying to figure out what the true English way to say something would be, you don't want to... you want to specifically avoid the kinds of mistakes French speakers (you) tend to make because they are French. What a native English speaker might call Frenglish. You have access to native Spanish and Japanese speakers; you know they are making mistakes, and you know that some of their mistakes are made because of their native language.
One source of information is more fluent French speakers; they have overcome some obstacles ...
Widening the spread of your mentors should reduce this bias, as long as you didn't choose mentors that agree with each other. Obviously, there isn't really enough time to be taught from a wide enough sample of perspectives to properly eliminate it.
It's almost like the opposite of belief in belief - disbelief in belief.
More like belief in disbelief.
During one of my epistemology classes, my professor admitted (I can't recall the context) that his opinions on the topic would probably be different had he attended a different graduate school.
I read this as an admission that modern academic philosophy has nothing whatever to do with the search for truth and everything to do with status-seeking, signalling, and affiliation games. But at least he was being sort of honest.
Concern with status can actually foster truth-seeking, and not just interfere with it. You can be the star who discovered something new, you can expose someone else's status as undeserved, and simply maintaining your own status (in a truth-seeking community) can motivate you to take extra care with what you say and do. The social emotions are not inevitably a source of noise and error.
(And by the way, your own comment is an intemperate denunciation. I hope that a little sober reflection would lead you to conclude that maybe modern academic philosophers do have a professional interest in truth after all, and that even if they are collectively doing it badly, your particular diagnosis is factually wrong.)
This is good. It seems (to me) to mean that the LessWrong community is starting to "get the hang" of the importance of explanation...
By that I mean that a person who found themselves in the state of being "very" "intelligent" might, at the exact same time that they realized their state of intelligence had been the result of what we call "insights"--a working out of the problem in a way independent of the presuppositions inherent in the overwhelming bias of the problem as stated...
that that agent would also c...
Who we learn from and with can profoundly influence our beliefs. There's no obvious way to compensate. Is it time to panic?
During one of my epistemology classes, my professor admitted (I can't recall the context) that his opinions on the topic would probably be different had he attended a different graduate school.
What a peculiar thing for an epistemologist to admit!
Of course, on the one hand, he's almost certainly right. Schools have their cultures, their traditional views, their favorite literature providers, their set of available teachers. These have a decided enough effect that I've heard "X was a student of Y" used to mean "X holds views basically like Y's". And everybody knows this. And people still show a distinct trend of agreeing with their teachers' views, even the most controversial - not an unbroken trend, but still an obvious one. So it's not at all unlikely that, yes, had the professor gone to a different graduate school, he'd believe something else about his subject, and he's not making a mistake in so acknowledging...
But on the other hand... but... but...
But how can he say that, and look so undubiously at the views he picked up this way? Surely the truth about knowledge and justification isn't correlated with which school you went to - even a little bit! Surely he knows that!
And he does - and so do I, and it doesn't stop it from happening. I even identified a quale associated with the inexorable slide towards a consensus position, which made for some interesting introspection, but averted no change of mind. Because what are you supposed to do - resolutely hold to whatever intuitions you walked in with, never mind the coaxing and arguing and ever-so-reasonable persuasions of the environment in which you are steeped? That won't do, and not only because it obviates the education. The truth isn't anticorrelated with the school you go to, either!
Even if everyone collectively attempted this stubbornness only to the exact degree needed to remove the statistical connection between teachers' views and their students', it's still not truth-tracking. An analogy: suppose you give a standardized English language test, determine that Hispanics are doing disproportionately well on it, figure out that this is because many speak Romance languages and do well with Latinate words, and deflate Hispanic scores to even out the demographics of the test results. This might give you a racially balanced outcome, but on an individual level, it will unfairly hurt some monolingual Anglophone Hispanics, and help some Francophone test-takers - it will not do as much as you'd hope to improve the skill-tracking ability of the test. Similarly, flattening the impact of teaching on student views won't salvage truth-tracking of student views as though this trend never existed; it'll just yield the same high-level statistics you'd get if that bias weren't operating.
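To see why the group-level correction falls short, here is a rough simulation sketch of the analogy; every number and proportion in it is invented for illustration, not taken from the post:

```python
# Rough illustration of the test analogy (all parameters invented). The test is
# supposed to track "skill"; speaking a Romance language gives an unrelated
# bonus on Latinate vocabulary; we then deflate Hispanic scores until the group
# means even out, as described above.
import random

random.seed(0)

def mean(xs):
    return sum(xs) / len(xs)

people = []
for _ in range(20000):
    hispanic = random.random() < 0.3
    # Romance-language fluency is correlated with, but not identical to, ethnicity.
    romance = random.random() < (0.8 if hispanic else 0.05)
    skill = random.gauss(50, 10)
    score = skill + random.gauss(0, 5) + (5.0 if romance else 0.0)
    people.append((hispanic, romance, skill, score))

gap = (mean([s for h, _, _, s in people if h])
       - mean([s for h, _, _, s in people if not h]))

# Group-level fix: deflate every Hispanic score by the observed gap.
adjusted = [(h, r, sk, (s - gap) if h else s) for h, r, sk, s in people]

# The group means are now balanced...
print("group gap after adjustment:",
      mean([s for h, _, _, s in adjusted if h])
      - mean([s for h, _, _, s in adjusted if not h]))

# ...but the correction lands on the wrong individuals: monolingual Anglophone
# Hispanics are penalized despite having had no vocabulary advantage, while
# Francophone test-takers keep theirs.
def tracking_error(rows):
    return mean([abs(s - sk) for _, _, sk, s in rows])

print("|score - skill|, monolingual Hispanics:",
      tracking_error([p for p in adjusted if p[0] and not p[1]]))
print("|score - skill|, everyone:", tracking_error(adjusted))
```

The balanced group statistics come at the cost of individual accuracy, which is the sense in which flattening the teacher-student correlation would balance the books without restoring truth-tracking.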
Lots of biases still live in your head doing their thing even when you know about them. This one, though, puts you in an awfully weird epistemic situation. It's almost like the opposite of belief in belief - disbelief in belief. "This is true, but my situation made me more prone than I should have been to believe it and my belief is therefore suspect. But dang, that argument my teacher explained to me sure was sound-looking! I must just be lucky - those poor saps with other teachers have it wrong! But of course I would think that..."
It is possible, to an extent, to reduce the risk here - you can surround yourself with cognitively diverse peers and teachers, even if only in unofficial capacities. But even then, who you spend the most time with, whom you get along with best, whose style of thought "clicks" most with yours, and - due to competing biases - whoever agrees with you already will have more of an effect than the others. In practice, you can't sit yourself in a controlled environment and expose yourself to pure and perfect argument and evidence (without allowing accidental leanings to creep in via the order in which you read it, either).
I'm not even sure if it's right to assign a higher confidence to beliefs that you happen to have maintained - absent special effort - in contravention of the general agreement. It seems to me that people have trains of thought that just seem more natural to them than others. (Was I the only one disconcerted by Eliezer announcing high confidence in Bayesianism in the same post as a statement that he was probably "born that way"?) This isn't even a highly reliable way for you to learn things about yourself, let alone the rest of the world: unless there's a special reason your intuitions - and not those of people who think differently - should be truth-tracking, these beliefs are likely to represent where your brain just happens to clamp down really hard on something and resist group pressure and that inexorable slide.