Again, how does the 'my choice of school' node here differ from the 'my not being born into a cult' node? The latter doesn't cause philosophical truths either.
Right, it doesn't. But they're still going to be inferentially connected (d-connected, in Judea Pearl's terminology), because both (a) your beliefs (if formed through a reliable process) and (b) philosophical truths will be caused by the same source.
And just a terminology issue: I was being a bit sloppy here, I admit. "X causes Y", in the sense I was using it, means "the state of X is a cause of the state of Y". So it would be technically correct but confusing to say, "Eating unhealthy foods causes long life", because in this sense it just means "Whether you eat unhealthy foods is a causal factor in whether you have a long life".
(Strictly speaking nothing does: only contingent things have causes, and philosophical truths aren't contingent on how things turn out. But let's put that aside for now.)
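The d-connection point above can be sketched with a quick simulation of a fork structure (the setup and numbers are my own, purely illustrative): a common cause makes two downstream variables correlate even though neither causes the other, and conditioning on that cause screens them off again.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Fork structure:  X <- C -> Y
# X and Y are d-connected: neither causes the other, yet they correlate
# because both are driven by the common source C.
c = rng.normal(size=n)
x = c + rng.normal(size=n)  # stand-in for "beliefs shaped by the source"
y = c + rng.normal(size=n)  # stand-in for "truths fixed by the same source"

marginal_corr = np.corrcoef(x, y)[0, 1]  # substantially positive (~0.5)

# Conditioning on C blocks the path: regress C out of both variables and
# the residuals are (nearly) uncorrelated -- X and Y are d-separated given C.
rx = x - c * (np.cov(x, c)[0, 1] / np.var(c))
ry = y - c * (np.cov(y, c)[0, 1] / np.var(c))
conditional_corr = np.corrcoef(rx, ry)[0, 1]  # approximately zero

print(f"corr(X, Y)     = {marginal_corr:.3f}")
print(f"corr(X, Y | C) = {conditional_corr:.3f}")
```

So even though "my not being born into a cult" causes no philosophical truths, the two nodes can still carry information about each other through the shared parent.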
Yes, I assumed that's how philosophers define the terms, but a) I don't find such a category useful, because b) of all the instances where philosophers have had to revise their first-principles derivations based on subtle assumptions about how the universe works.
What [my education in philosophy] does is provide me with habits of thought that do a better job of producing true beliefs than the mental habits I would have acquired if born into a cult. But then different schools of philosophy teach different habits of thought too (that's why they reach different conclusions). The flaws in the other schools of thought are much less obvious than the flaws found in cults, but that's just a difference in degree...
I actually agree. Still, to the extent that they do converge on reliable truth-finding mechanisms, they should converge on the same truth-finding mechanisms. And one's admission that one's own truth-finding mechanism is so heavily school-dependent would indeed be quite worrisome, as it indicates insufficient critical analysis of what one was taught.
Of course, merely being critical is insufficient (someone who said so in this discussion was rightfully modded down for such a simplistic solution). I would say that you additionally have to check that the things you learn are multiply and deeply connected to the rest of your model of the world, and not just some "dangling node", immune to the onslaught of evidence from other fields.
I don't find such a category [the 'non-contingent'] useful because b) of all the instances where philosophers had to revise their first-principles derivations based on subtle assumptions about how the universe works.
This sounds like a metaphysics-epistemology confusion (or 'territory-map confusion', as folks around here might call it). It's true that empirical information can cause us to revise our 'a priori' beliefs. (Most obviously, looking at reality can be a useful corrective for failures of imagination.) But it doesn't follow that the propositions themselves are contingent on how things turn out.
Who we learn from and with can profoundly influence our beliefs. There's no obvious way to compensate. Is it time to panic?
During one of my epistemology classes, my professor admitted (I can't recall the context) that his opinions on the topic would probably be different had he attended a different graduate school.
What a peculiar thing for an epistemologist to admit!
Of course, on the one hand, he's almost certainly right. Schools have their cultures, their traditional views, their favorite literature providers, their set of available teachers. These have a decided enough effect that I've heard "X was a student of Y" used to mean "X holds views basically like Y's". And everybody knows this. And people still show a distinct trend of agreeing with their teachers' views, even the most controversial - not an unbroken trend, but still an obvious one. So it's not at all unlikely that, yes, had the professor gone to a different graduate school, he'd believe something else about his subject, and he's not making a mistake in so acknowledging...
But on the other hand... but... but...
But how can he say that, and look so undubiously at the views he picked up this way? Surely the truth about knowledge and justification isn't correlated with which school you went to - even a little bit! Surely he knows that!
And he does - and so do I, and it doesn't stop it from happening. I even identified a quale associated with the inexorable slide towards a consensus position, which made for some interesting introspection, but averted no change of mind. Because what are you supposed to do - resolutely hold to whatever intuitions you walked in with, never mind the coaxing and arguing and ever-so-reasonable persuasions of the environment in which you are steeped? That won't do, and not only because it obviates the education. The truth isn't anticorrelated with the school you go to, either!
Even if everyone collectively attempted this stubbornness only to the exact degree needed to remove the statistical connection between teachers' views and their students', it's still not truth-tracking. An analogy: suppose you give a standardized English language test, determine that Hispanics are doing disproportionately well on it, figure out that this is because many speak Romance languages and do well with Latinate words, and deflate Hispanic scores to even out the demographics of the test results. This might give you a racially balanced outcome, but on an individual level, it will unfairly hurt some monolingual Anglophone Hispanics, and help some Francophone test-takers - it will not do as much as you'd hope to improve the skill-tracking ability of the test. Similarly, flattening the impact of teaching on student views won't salvage truth-tracking of student views as though this trend never existed; it'll just yield the same high-level statistics you'd get if that bias weren't operating.
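The test analogy can be made concrete with a toy simulation (every number and proportion here is invented for illustration): deflating one group's scores does equalize the group averages, but individual scores still mis-track skill, because the adjustment is applied to the wrong individuals.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Invented toy model of the standardized-test analogy.
skill = rng.normal(size=n)                    # true English skill
hispanic = rng.random(n) < 0.2                # group label
romance = hispanic & (rng.random(n) < 0.7)    # Romance-speaking Hispanics
franco = ~hispanic & (rng.random(n) < 0.05)   # Francophone non-Hispanics
bonus = 1.0 * (romance | franco)              # Latinate-vocabulary boost

score = skill + bonus + rng.normal(scale=0.3, size=n)

# Group-level deflation: shift all Hispanic scores until group means match.
gap = score[hispanic].mean() - score[~hispanic].mean()
adjusted = score - gap * hispanic

# The demographics now look balanced...
mean_diff = adjusted[hispanic].mean() - adjusted[~hispanic].mean()

# ...but individuals are mis-scored: monolingual Anglophone Hispanics
# (no bonus) get pushed below their skill, and Francophones keep their boost.
mono_hisp = hispanic & ~romance
mono_err = (adjusted[mono_hisp] - skill[mono_hisp]).mean()  # negative
franco_err = (adjusted[franco] - skill[franco]).mean()      # near +1.0
```

The high-level statistics come out even, but the per-individual errors show the bias hasn't been removed, only smeared around, which is the point of the analogy: flattening the teacher-student correlation wouldn't make any individual's beliefs more truth-tracking.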
Lots of biases still live in your head doing their thing even when you know about them. This one, though, puts you in an awfully weird epistemic situation. It's almost like the opposite of belief in belief - disbelief in belief. "This is true, but my situation made me more prone than I should have been to believe it and my belief is therefore suspect. But dang, that argument my teacher explained to me sure was sound-looking! I must just be lucky - those poor saps with other teachers have it wrong! But of course I would think that..."
It is possible, to an extent, to reduce the risk here - you can surround yourself with cognitively diverse peers and teachers, even if only in unofficial capacities. But even then, who you spend the most time with, whom you get along with best, whose style of thought "clicks" most with yours, and - due to competing biases - whoever agrees with you already will have more of an effect than the others. In practice, you can't sit yourself in a controlled environment and expose yourself to pure and perfect argument and evidence (without allowing accidental leanings to creep in via the order in which you read it, either).
I'm not even sure if it's right to assign a higher confidence to beliefs that you happen to have maintained - absent special effort - in contravention of the general agreement. It seems to me that people have trains of thought that just seem more natural to them than others. (Was I the only one disconcerted by Eliezer announcing high confidence in Bayesianism in the same post as a statement that he was probably "born that way"?) This isn't even a highly reliable way for you to learn things about yourself, let alone the rest of the world: unless there's a special reason your intuitions - and not those of people who think differently - should be truth-tracking, these beliefs are likely to represent where your brain just happens to clamp down really hard on something and resist group pressure and that inexorable slide.