FrameBenignly comments on Open thread, Dec. 1 - Dec. 7, 2014 - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
I may write a full discussion thread on this at some point, but I've been thinking a lot about undergraduate core curriculum lately. What should it include? I have no idea why history has persisted in virtually every curriculum I know of for so long. Do many college professors still believe history has transfer of learning value in terms of critical thinking skills? Why? The transfer of learning thread touches on this issue somewhat, but I feel like most people on there are overvaluing their own field, hence computational science is overrepresented and social science, humanities, and business are underrepresented. Any thoughts?
The first question is what goals should undergraduate education have.
There is a wide spectrum of possible answers ranging from "make someone employable" to "create a smart, well-rounded, decent human being".
There is also the "provide four years of cruise-ship fun experience" version, too...
Check out page 40 of this survey.
In order of importance: To be able to get a better job 86% / To learn more about things that interest me 82% / To get training for a specific career 77% / To be able to make more money 73% / To gain a general education and appreciation of ideas 70% / To prepare myself for graduate or professional school 61% / To make me a more cultured person 46%
First, undergrad freshmen are probably not the right source for wisdom about what a college should be.
Second, I notice a disturbing lack of such goals as "go to awesome parties" and "get laid a lot" which, empirically speaking, are quite important to a lot of 18-year-olds.
In systems like the US, where undergraduate freshmen are basically customers paying a fee, I expect their input on what they want and expect from the product they're purchasing should be extremely relevant.
Yes, that is the "provide four years of cruise-ship fun experience" version mentioned. The idea that it's freshmen who are purchasing college education also needs a LOT of caveats.
Indeed, customers are usually expected to be informed about what they're buying. But in the case of education, where what the "customer" is buying is precisely knowledge, a freshman's opinion on what education should contain may be less well informed than, for example, a grad student's opinion.
Exactly which courses do you imagine do the most to help students go to the most awesome parties and get laid a lot?
Ones with very little homework and a good gender ratio.
The point is not that they need courses to help them with that. The point is that if you are accepting freshman desires as your basis for shaping college education, you need to recognize that surveys like the one you linked to present a very incomplete picture of what freshmen want.
If the desires you named are irrelevant to the discussion at hand, then can you please name the desires that you think are relevant which are not encapsulated by the survey and explain how they are relevant to what classes students are taking? Also, who is the right source of wisdom about what a college should be?
For the bit of mental doodling that this thread is, the right source is you -- your values, your preferences, your prejudices, your ideals.
Nerds tend to undervalue anything that is not math-heavy or easily quantifiable.
History illuminates the present. A lot of people care about it, a lot of feuds stem from it, and a lot of situations echo it. You can't understand the Ukrainian adventures Putin is going on without a) knowing about the collapse of the Soviet Union to understand why the Russians want it, b) knowing about the Holodomor to understand why the Ukrainians aren't such big fans of Russian domination, and arguably c) knowing about the mistakes the west made with Hitler, to get a sense of what we should do about it.
History gives you a chance to learn from mistakes without needing to make them yourself.
History is basically a collection of the coolest stories in human history. How can you not love that?
How useful is knowing about Ukraine to the average person? What percentage of History class will cover things which are relevant? Which useful mistakes to avoid does a typical History class teach you about?
1) Depends how political you are. I'm of the opinion that education should at least give people the tools to be active in democracy, even if they don't use them, so I consider at least a broad context for the big issues to be important.
2) Hard to say - I'm a history buff, so most of my knowledge is self-taught. I'd have to go back and look at notes.
3) Depends on the class. I tend to prefer the big-picture stuff, which is actually shockingly relevant to my life (not because I'm a national leader, but because I'm a strategy gamer), but there are more than enough historians who are happy to teach you about cultural dynamics and popular movements. You think popular music history might help someone who's fiddling with a bass guitar?
Given how hard it is to establish causality, history, where you don't have much of the relevant information and there's a lot of motivated reasoning going on, is often a bad source for learning.
Which is better - weak evidence, or none?
An interesting question. Let me offer a different angle.
You don't have weak evidence. You have data. The difference is that "evidence" implies a particular hypothesis that the data is evidence for or against.
One problem with being in love with Bayes is that the very important step of generating hypotheses is underappreciated. Notably, if you don't have the right hypothesis in the set of hypotheses that you are considering, all the data and/or evidence in the world is not going to help you.
To give a medical example, if you are trying to figure out what causes ulcers and you are looking at whether the evidence points at diet, stress, or genetic predisposition, well, you are likely to find lots of weak evidence (and people actually did). Unfortunately, ulcers turned out to be a bacterial disease, and all that evidence actually meant nothing.
Another problem with weak evidence is that "weak" can be defined as evidence that doesn't move you away from your prior. And if you don't move away from your prior, well, nothing much has changed, has it?
"Weak" means that it doesn't change your beliefs very much - if the prior probability is 50%, and the posterior probability is 51%, calling it weak evidence seems pretty natural. But it still helps improve your estimates.
Only if it's actually good evidence and you interpret it correctly. Another plausible interpretation of "weak" is "uncertain".
Consider a situation where you unknowingly decided to treat some noise as evidence. It's weak and it only changed your 50% prior to a 51% posterior, but it did not improve your estimate.
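The 50%-to-51% arithmetic in this exchange can be made concrete with the odds form of Bayes' theorem. A minimal sketch (not from the thread; the function name and numbers are illustrative): an update that barely moves a 50% prior corresponds to a likelihood ratio barely above 1, and pure noise, with a likelihood ratio of exactly 1, moves nothing at all.

```python
def update(prior, likelihood_ratio):
    """Posterior probability of belief B after seeing evidence E,
    given the likelihood ratio P(E|B) / P(E|not B)."""
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# A likelihood ratio of only ~1.04 takes a 50% prior to a 51% posterior:
print(update(0.50, 51 / 49))   # 0.51
# Noise carries a likelihood ratio of 1 and leaves the prior unchanged:
print(update(0.50, 1.0))       # 0.5
```

The point of the sketch is that "weak" evidence is just evidence whose likelihood ratio sits near 1; if what you treated as evidence was actually noise, the correct ratio was exactly 1, and any movement away from the prior was an error, not an update.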
Often none.
For example, if a piece of evidence E is such that:
- I ought to, in response to it, update my confidence in some belief B by some amount A, but
- I in fact update my confidence in B by A2,
and updating by A2 gets me further from justified confidence than I started out, then to the extent that I value justified confidence in propositions I was better off without E.
Incidentally, this is what I understood RowanE to be referring to as well.
But it's only bad because you made the mistake of updating by A2. I often notice a different problem: people always arguing A=0 and then presenting an alternative belief C with no evidence. On some issues we can't get a great A, but if the best evidence available points to B, we should still assume it's B.
Agreed.
Agreed.
Yes, I notice that too, and I agree both that it's a problem, and that it's a different problem.
Overconfidence is a huge problem. Knowing that you don't understand how the world works is important. To the extent that people believe they can learn significant things from history, "weak evidence" can often produce problems.
If you look at the West's Ukraine policy, it didn't include a treaty accepting the Russian annexation of Crimea in return for stability in the rest of Ukraine. That might have prevented the mess we have at the moment.
In general political decisions in cases like this should be made by doing scenario planning.
It's one thing to say that Britain and France should have declared war on Germany earlier. It's quite another thing to argue that the West should take military action against Russia.
Might have, but my money isn't on it. You think Putin cares about treaties? He's a raw-power sort of guy.
And yes, the scenarios are not identical - if nothing else, Russia has many more ICBMs than Hitler did. Still, there's ways to take action that are likely to de-escalate the situation - security guarantees, repositioning military assets, joint exercises, and other ways of drawing a clear line in the sand. We can't kick him out, but we can tell him where the limits are.
(Agreed on your broader point, though - we should ensure we don't draw too many conclusions).
Putin does care about the fact that Ukraine might join NATO or the EU free trade zone. He probably did feel threatened by what he perceived as a color revolution with a resulting pro-Western Ukrainian government.
At the end of the day, Putin doesn't want the crisis to drag on indefinitely, so sooner or later it's in Russia's interest to have a settlement. Russia relies on selling its gas to Europe.
Having Crimea under embargo is quite bad for Russia. It means that it's costly to keep up Crimea's economy so that its population doesn't think Crimea has decayed under Russian rule, leading to unrest.
On the other hand, it's not quite clear that US foreign policy has a problem with dragging out the crisis. It keeps NATO together even though Europeans are annoyed at being spied on by the US. It makes it defensible to have foreign military bases inside Germany that spy on Germans.
Do you really think joint exercises contribute to deescalation?
As far as repositioning military assets goes, placing NATO assets inside Ukraine is the opposite of deescalation.
The only real way to de-escalate is a diplomatic solution, and there probably isn't one without affirming Crimea as part of Russia.
There's a certain type of leader, over-represented among strongmen, that will push as far as they think they can and stop when they can't any more. They don't care about diplomacy or treaties, they care about what they can get away with. I think Putin is one of those - weak in most meaningful ways, but strong in will and very willing to exploit our weakness in same. The way to stop someone like that is with strength. Russia simply can't throw down, so if we tell them that they'd have to do so to get anywhere, they'd back off.
Of course, we need to be sure we don't push too far - they can still destroy the world, after all - but Putin is sane, and doesn't have any desire to do anything nearly so dramatic.
Putin gains domestic political strength from the conflict.
That assumes that you can simply change from being weak to being strong. In poker you can do this by bluffing. In chess you can't; you actually have to calculate your moves.
Holding joint military exercises isn't strength if you aren't willing to use the military to fight.
Bailing out European countries is expensive enough. There isn't really the money to additionally prop up Ukraine.
Only as long as he's winning.
NATO is, far and away, the strongest military alliance that has ever existed. They have the ability to be strong. When the missing element is willpower, "Man up, already!" is perfectly viable strategic advice.
Accept an annexation in return for promises of stability? Hmm, reminds me of something...
That's partly the point, we didn't go that route and now have the mess we have at the moment.
And what happened the last time we DID go that route?
Making decisions based on a single data point is not good policy.
Also, the alternative to the Munich agreements would have been to start WWII earlier. That might have had advantages, but it would still have been very messy.
Sometimes none, if the source of the evidence is biased and you're a mere human.
There are unbiased sources of evidence now?
Some sources of evidence are less biased than others. Some sources of evidence will contain biases which are more problematic than others for the problem at hand.
Of course. But Rowan seemed to be arguing a much stronger claim.
That question doesn't have anything to do with the claim that you can make someone less informed by giving them biased evidence.
Undergraduate core curriculum where, for whom, and for what purposes?
If I were designing a core curriculum off the top of my head, it might look something like this:
First year: Statistics, pure math if necessary, foundational biology, literature and history of a time and place far removed from your native culture. Classics is the traditional solution to the latter and I think it's still a pretty good one, but now that we can't assume knowledge of Greek or Latin, any other culture at a comparable remove would probably work as well. The point of this year is to lay foundations, to expose students to some things they probably haven't seen before, and to put some cognitive distance between the student and their K-12 education. Skill at reading and writing should be built through the history curriculum.
Second year: Data science, more math if necessary, evolutionary biology (perhaps with an emphasis on hominid evolution), basic philosophy (focusing on general theory rather than specific viewpoints), more literature and history. We're building on the subjects introduced in the first year, but still staying mostly theoretical.
Third year: Economics, cognitive science, philosophy (at this level, students start reading primary sources), more literature and history. At this point you'd start learning the literature and history of your native language. You're starting to specialize, and to lay the groundwork for engaging with contemporary culture on an educated level.
Fourth year: More economics, political science, recent history, cultural studies (e.g. film, contemporary literature, religion).
Fifth year: spent unemployed and depressed because of all the student debt and no marketable skills.
This is a curriculum for future philosopher-kings who never have to worry about such mundane things as money.
"Core curriculum" generally means "what you do that isn't your major". Marketable skills go there, not here; it does no one any good to produce a crop of students all of whom have taken two classes each in physics, comp sci, business, etc.
If you count the courses you suggest, there isn't much room left for a major.
I think a fruitful avenue of thought here would be to consider higher (note the word) education in its historical context. Universities are very traditional places and historically they provided the education for the elite. Until historically recently education did not involve any marketable skills at all -- its point was, as you said, "engaging with contemporary culture on an educated level".
Four to six classes a year, out of about twelve in total? That doesn't sound too bad to me. I took about that many non-major classes when I was in school, although they didn't build on each other like the curriculum I proposed.
It may amuse you to note that I was basically designing that as a modernized liberal arts curriculum, with more emphasis on stats and econ and with some stuff (languages, music) stripped out to accommodate major courses. Obviously there's some tension between the vocational and the liberal aims here, but I know enough people who e.g. got jobs at Google with philosophy degrees that I think there's enough room for some of the latter.
I studied at two state universities. At both of them, classes were measured in "credit hours" corresponding to an hour of lecture per week. A regular class was three credit hours and semester loads at both universities were capped at eighteen credits, corresponding to six regular classes per semester and twelve regular classes per year (excluding summers). Few students took this maximal load, however. The minimum semester load for full-time students was twelve credit hours and sample degree plans tended to assume semester loads of fifteen credit hours, both of which were far more typical.
Sure, but that's evidence that they are unusually smart people. That's not evidence that four years of college were useful for them.
As you probably know, there is a school of thought that treats college education as mostly signaling. Companies are willing to hire people from, say, the Ivies, because these people proved that they are sufficiently smart (by getting into an Ivy) and sufficiently conscientious (by graduating). What they learned during these four years is largely irrelevant.
Is four years of a "modernized liberal arts curriculum" the best use of four years of one's life and a couple of hundred thousand dollars?
What counts as a 'marketable skill', or even what would be the baseline assumption of skill for becoming a fully and generally competent adult in twenty-first century society, might be very different from what was considered skill and competence in society 50 years ago. Rather than merely updating a liberal education as conceived in the Post-War era, might it make sense to redesign the liberal education from scratch? Like, does a Liberal Education 2.0 make sense?
What skills or competencies aren't taught much in universities yet, but are ones everyone should learn?
Perhaps we need to re-think what jobs and employment look like in the 21st century and build from there?
That seems like a decent starting point. I don't know my U.S. history too well, as I'm a young Canadian. However, a cursory glance at the Wikipedia page for the G.I. Bill in the U.S. reveals that it, among other benefits, effectively lowered the cost of college not only for veterans after World War II, but also for their dependents. The G.I. Bill was still used through 1973, by Vietnam War veterans, so that's millions more than I expected. As attending post-secondary school became normalized, it shifted toward the status quo for getting better jobs. In favor of equality, people of color and women also demanded equal opportunity to such education by having discriminatory acceptance policies and whatnot scrapped. This was successful to the extent that several million more Americans attended university.
So, a liberal education that was originally intended for upper(-middle) class individuals was seen as a rite of passage, for status, and then to stay competitive, for the 'average American'. This trend has extrapolated until the present. It doesn't seem to me the typical baccalaureate is optimized for what the economy needed for the 20th century, nor for what would maximize the chances of employment success for individuals. I don't believe this is true for some STEM degrees, of course. Nonetheless, if there are jobs for the 21st century that don't yet exist, we're not well-equipped for those either, because we're not even equipped for the education needed for the jobs of the present.
I hope the history overview wasn't redundant, but I wanted an awareness of design flaws of the current education system before thinking about a new one. Not that we're designing anything for real here, but it's interesting to spitball ideas.
If not already in high school, universities might mandate a course on coding, or at least on how to navigate information and data better, the same way almost all degrees mandate a course in English or communications in the first year. It seems ludicrous this isn't already standard, and careers will involve only more understanding of computing in the future. There needs to be a way to make the basics of information science intelligible for everyone, like literacy and pre-calculus.
There's an unsettled debate about whether studying the humanities increases critical thinking skills or not. Maybe the debate is settled, but I can't tell the signal from the noise in that regard. To be cautious, rather than removing the humanities entirely, maybe a class can be designed that gets students thinking rhetorically and analytically with words, but is broader or more topical than the goings-on of Ancient Greece.
These are obvious and weak suggestions I've made. I don't believe I can predict the future well, because I don't know where to start researching what the careers and jobs of the 21st century will be like.
Persuasive writing and speaking. Alternatively, interesting writing and speaking.
That was basically my education (I took 5 years of Latin, 2 of ancient greek, philosophy, literature, art) and the only reason I didn't end up homeless camping out in Lumifer's yard was because I learned how to do marketing and branding. I think having practical skills is a good idea. Trade and Technical schools are a great idea.
1st year: 5 / 2nd year: 7 / 3rd year: 5 / 4th year: 4. That's over half their classes. I also count 14 of those 21 classes in the social sciences or humanities, which seems rather strange after you denigrated the fields. Now the big question: how much weight do you put on the accuracy of this first draft?
It's pretty simple. I think the subjects are important; I'm just not too thrilled about how they're taught right now. Since there's no chance of this ever being influential in any way, I may as well go with the fields I wish I had rather than the ones I have.
As to accuracy: not much.
What do you mean with those terms?
Understanding the principle of evolution is useful but I don't see why it needs a whole semester.
Um, the reason for studying Greek and Latin is not just because they're a far-removed culture. It's also because they're the cultures which are the memetic ancestors of the memes that we consider the highest achievements of our culture, e.g., science, modern political forms.
Also this suffers from the problem of attempting to go from theoretical to practical, which is the opposite of how humans actually learn. Humans learn from examples, not from abstract theories.
Scott Alexander from Slate Star Codex has the idea that if the humanities are going to be taught as part of a core curriculum, it might be better to teach the history of them backwards.
When I was in high school, I discussed this very idea with my Philosophy teacher. She said that (at least here in Italy) curricula for humanities are still caught in the Hegelian idea that history unfolds in logical structures, so that it's easier to understand them in chronological order.
I reasoned instead that contemporary subjects are more relevant, more interesting and we have much more data about them, so they would appeal much better to first year students.
Here is Eliezer's post on the subject.
I just want to point out for the record that if we're discussing a core curriculum for undergraduate education, I figure it would be even better to get such a core curriculum into the regular, secondary schooling system that almost everyone goes through. Of course, in practice, implementing such would require an overhaul of the secondary schooling system, which seems much more difficult than changing post-secondary education. The reason for this would probably be that changing the curriculum for post-secondary education, or at least one post-secondary institution, is easier: there is less bureaucratic deadweight, a greater variety of choice, and nimbler mechanisms in place for instigating change. So, I understand where you're coming from in your original comment above.
I think history and the softer social sciences / humanities can, if taught well, definitely improve your ability to understand and analyze present-day media and politics. This can improve your qualitative appreciation of works of art, help you understand journalistic works on their own terms and in context instead of taking them at face value, and help you read and write better.
They can also provide specific cultural literacy, which is useful for your own qualitative appreciation as well as some status things.
I had a pretty shallow understanding of a lot of political ideas until I took a hybrid history/philosophy course that was really excellently taught. It allowed me to read a lot of political articles more deeply and understand their motivations and context and the core academic ideas they were built around.
That last part, seeing theses implicitly referenced in popular works, is pretty neat.
I think this is true... but also that "taught well" is a difficult and ideologically fraught criterion. The humanities and most (but not all; linguistics, for example, is a major exception) of the social sciences are not generally taught in a value-neutral way, and subjective quality judgments often have as much to do with finding a curriculum amenable to your values as with the actual quality of the curriculum.
Unfortunately, the fields most relevant to present-day media and politics are also the most value-loaded.
Well, the impossibility of neutrality when talking about history or the humanities, except in the most mundane recitation of events, is a pretty vital lesson to understand. The best way to approach this is to present viewpoints and then counterpoints, a thesis and then a criticism.
I have had one non-core course that was pretty much purely one perspective (left-radical tradition), but this is still a tradition opposed to and critical of even mainstream-leftist history and politics. What I mean to say is I don't think it was a great class, but I still learned plenty when I thought critically about it on my own time.
If you have a certain amount of foundation (which I got through a much more responsibly-taught class pretty much following the traditional western philosophical canon), in other words, you should still learn plenty from a curriculum that is not amenable to your values, if you put in an effort.
But I think most core history and philosophy courses at liberal arts colleges stick to a pretty mainstream view and present a decent range of criticisms, achieving the ends I talked about. If you really want far-left or right-wing or classical liberal views, there are certainly colleges built around those.
The thing that bothers me is that (at least at my university, which was to be fair a school that leaned pretty far to the left) neutrality seems to have been thrown out not only as a practical objective but also as an optimization objective. You're never going to manage to produce a perfectly unbiased narrative of events; we're not wired that way. But narratives are grounded in something; some renditions are more biased than others; and that's a fact that was not emphasized.
In a good class (though I didn't take many good classes) you'll be exposed to more than one perspective, yes. But the classes I took, even the good ones, were rather poor at grounding these views in anything outside themselves or at providing value-neutral tools for discriminating between them. (Emphasis on "value-neutral": we were certainly taught critical tools, but the ones we were taught tended to have ideology baked into them. If you asked one of my professors they'd likely tell you that this is true of all critical tools, but I don't really buy that.)
Of course bias can vary, but I think most of the professors you ask would say they are being unbiased, or they are calibrating their bias to counteract their typical student's previous educational bias. After all, you were taught history through high school, but in a state-approved curriculum taught by overworked teachers.
As far as critical tools, which ones are you thinking of? Are you thinking of traditionally-leftist tools like investigations into power relationships? What do you think of as a value-neutral critical tool?
You seem to have an idea of what differentiated the good classes from the bad. I'm not disagreeing that some classes are bad, I'm focusing on the value the good ones can give. A bad engineering class, by analogy, teaches about a subject of little practical interest AND teaches it at a slow pace. Bad classes happen across disciplines.
And I admit I am probably speaking from a lot of hindsight. I took a couple good classes in college, and since then have read a ton of academic's blogs and semi-popular articles, and it has taken a while for things to "click" and for me to be able to say I can clearly analyze/criticize an editorial about history at a direct and meta-level the way I'm saying this education helps one do.
You're right, for instance, that in college you probably won't get an aggressive defense of imperialism to contrast with its criticisms, even though that might be useful for understanding it. But that's because an overwhelming majority of academics consider it such a clearly wretched, even evil, thing that they see no value in teaching it. It's just like how we rarely see a serious analysis of abolition vs. slavery, because come on, right?
On slavery, academia and the mainstream are clearly in sync. On Imperialism? Maybe not as much, especially given the blurry question of "what is modern imperialism?" (is it the IMF; is it NAFTA; is it Iraq?). But many academics are striving to make their classes the antidote to a naive narrative of American history that goes: "Columbus discovered America, immigrants came and civilized the Indians, won the southwest in glorious battle against corrupt Mexico, then their nation reluctantly accepted the role of world peacekeeper ushering in our golden age, and triumphed over communism".
I mentioned critical theory elsewhere in these comments. There's also gender theory, Marxian theory, postcolonial theory... basically, if it comes out of the social sciences and has "theory" in its name, it's probably value-loaded.
These are frameworks rather than critical tools per se, but that's really what I was getting at: in the social sciences, you generally don't get the tools outside an ideological framework, and academics of a given camp generally stick to their own camp's tools and expect you to do the same in the work you submit to them. Pointing to value-neutral critical tools is harder for the same reason, but like I said earlier I think linguistics does an outstanding job with its methodology, so that could be a good place to start looking. Data science in general could be one, but in the social sciences it tends to get used in a supporting rather than a foundational role. Ditto cognitive science, except that that hardly ever gets used at all.
This in itself is a problem. If you start with a group of students that have been exposed to a biased perspective, you don't make them less biased by exposing them to a perspective that's equally biased but in another direction. We've all read the cog-sci paper measuring strength of identification through that sort of situation, but I expect this sort of thing is especially acute for your average college freshman: that's an age when distrust of authority and the fear of being bullshitted is particularly strong.
(The naive narrative wasn't taught in my high school, incidentally, but I'm Californian. I expect a Texan would say something different.)
But these frameworks/theories are pretty damn established, as far as academics are concerned. Postcolonial theory and gender theory make a hell of a lot of sense. They're crowning accomplishments of their fields, or define fields. They're worth having a class about. Most academics would also say that they consider distinctly right-wing theories intellectually weak, or simply invalid; they'd no more teach them than a bio professor would teach creationism.
If you strongly feel all of mainstream academia is biased, then pick a school known for being right-wing. Academia's culture is an issue worthy of discussion, but well outside the scope of "should history be in core curriculums".
Maybe things like game-theoretic explanations of power dynamics, or something like discussion of the sociology of in-groups and out-groups when discussing nationalism, or something similar, are neglected in these classes. If you think that, I wouldn't disagree. I guess most professors would probably say "leave the sociology to the sociologists; my class on the industrial revolution doesn't have room to teach about thermodynamics of steam engines either".
I don't know much about linguistics, except that Chomsky is a linguist and that some people like him and some people don't.
I do know it is on the harder end of the social sciences. The softer social sciences and humanities simply won't be able to use a lot of nice, rigorous tools.
I think good teachers, even ones with a strong perspective, approach things so that the student will feel engaged in a dialogue. They will make the student feel challenged, not defensive. More of my teachers achieved this than otherwise. Bad teachers and teaching practices that fail to do this should be pushed against, but I don't think the academic frameworks are the main culprit.
Though I suspect I have a rather dimmer view of the social sciences' "crowning achievements" than you do, I'm not objecting directly to their political content there. I was mainly trying to point to their structure: each one consists of a set of critical tools and attached narrative and ideology that's relatively self-contained and internally consistent relative to those tools. Soft academia's culture, to me, seems highly concerned with crafting and extending those narratives and distinctly unconcerned with grounding or verifying them; an anthropologist friend of mine, for example, has told me outright that her field's about telling stories as opposed to doing research in the sense that I'm familiar with, STEMlord that I am. The subtext is that anything goes as long as it doesn't vindicate what you've called the naive view of culture.
That's a broader topic than "should history be in core curriculums?", but the relevance should be obvious. The precise form it takes, and the preferred models, do vary by school, but picking a right-wing school would simply replace one narrative with another. (I'd probably also like the students less.)
They don't. That doesn't mean they can't. There's plenty of rigorous analysis of issues involved in social science out there; it's just that most of it doesn't come from social scientists. Some of the best sociology I've ever seen was done by statisticians.
(Chomsky, incidentally, was a brilliant linguist -- if not always one vindicated by later research -- but he's now so well known for his [mostly unrelated] radical politics that focusing on him is likely to give the wrong impression of the field.)
I think this is a problem, BUT it wouldn't be a problem if we had more people willing to pick up the ball and take these narratives as hypotheses and test/ground them. I think there IS a broad but slow movement towards this. I think these narrative-building cultures are fantastic at generating hypotheses, and I am also sympathetic in that it is pretty hard to test many of these hypotheses concretely. That said, constant criticism and analysis is a (sub-optimal) form of testing.
Historians tend to be as concrete as they can, even if non-quantitatively. If an art historian says one artist influenced another, they will demonstrate stylistic similarities and a possible or verified method of contact between the two artists. That's pretty concrete. It can rely on more abstract theories about what counts as a "stylistic similarity", but that's inevitable.
I also think that the broadest and best theories are the ones you see taught at an undergrad level. The problems you point out are all more pernicious at the higher levels.
Surely true. But I think (from personal discussions with academics) there is a big movement towards quantitative and empirical in social sciences (particularly political science and history), and the qualitative style is still great for hypothesis generation.
I also think our discussion is getting a bit unclear because we've lumped the humanities and social sciences together. That's literally millions of researchers using a vast array of methodologies. Some departments are incredibly focused on being quantitative, some are allergic to numbers.
If left-wing academia is low quality that in no way implies that right-wing academia is high quality. Seeing everything as left vs. right might even be part of the deeper problem plaguing the subject.
On the other hand, if (in someone's opinion) academia as a whole is of low quality on account of a leftward political bias then it seems reasonable for that person to take a look at more right-leaning academic institutions.
I would call that "damning with faint praise" :-D
It's praise sincerely intended. What strikes you as inadequate about, say, feminist theory and related ideas?
I can see your point about social sciences, but I would think this doesn't apply to most of the humanities. How is a creative writing, theatre, or communications course fraught with ideological criteria?
In a word: theory. I didn't take as many of those classes in college as I did social science, so I'm speaking with a little less authority here, but the impression I got is that the framework underpinning creative writing etc. draws heavily on critical theory, which is about as value-loaded as it gets in academia.
The implementation part, of course, isn't nearly as much so.
How do you know that you understand motivations of political articles better? Are you able to predict anything politically relevant that you couldn't have predicted beforehand?
Concretely, I can often tell if the article writer is coming from a particular school of thought or referencing a specific thesis, then interpret jargon, fill in unstated assumptions, see where they're deviating or conforming to that overarching school of thought. This directly enhances my ability to extrapolate to what other political views they might have and understand what they are attempting to write, and who their intended audience is.
As far as predicting the real world, that's tough. These frameworks of thought are in constant competition with one another. They are more about making normative judgments than predictive ones. The political theories that I believe have the most concrete usefulness are probably those that analyze world affairs in terms of neocolonialism, in part because those theories directly influence a ton of intellectuals but also in part because they provide a coherent explanation of how the US has managed its global influence in the past and (I predict) how it will do so in the future.
I can also do things like more fully analyze the factors behind US police and African-American relations, or how a film will influence a young girl.
That reminds me of the Marxist who can explain everything with the struggle of the workers against the capitalists.
That sentence makes it look like your study did damage. You shouldn't come out of learning about politics believing that you can fully understand the factors behind anything political.
I think the difference I highlighted is an important one.
I am referring to the normative parts of frameworks. For instance, feminism makes many normative statements. It is a project dedicated to changing certain policies and cultural attitudes. The eventual influence of these frameworks is based largely on their acceptance.
People make statements. Abstract intellectual labels don't. People have all sorts of personal goals. If one sees everything as the battle of certain frameworks then a lot dealing with individual people is lost.
Additionally you can also miss when new thoughts come along that don't fit into your existing scheme. A lot of people coming from the humanities for example have very little understanding of the discourse of geeks.
The political effects of getting people to meditate and be in touch with their bodies are also unknown unknowns for a lot of people trained in the standard political ways of thinking.
I don't have much to comment on this except that many academics in the humanities level charges of dehumanization and ignoring individual agency against a lot of works in economics or quantitative sociology and political science (ex. they might criticize an economics paper that attributes civil unrest to food shortages without discussing how it might originate in individual dissatisfaction with oppression and corruption). So it's ironic if I've done the same disservice to those academics.
I don't really know what you're referring to. But if you're talking LW-style memes, I think that it is generally true that futurism isn't of much interest to many in the humanities. And to a great degree it is orthogonal to what they do. A scenario like the singularity may not be, in that it's not orthogonal to anyone or anything, but I haven't had many conversations about it with those in the humanities.
What are you thinking of?
But I am sure there are academics who can readily discuss the effects of the fall of physically demanding labor, the effect of physical rigors on those in the military, or the interaction of all flavors of Buddhism with politics.
Dissatisfaction with oppression and corruption in itself doesn't have much to do with individual people being actors. Standard feminist theory suggests that social groups are oppressed.
As far as LW ideas go, prediction markets do have political implications. X-risk prevention does have political implications.
CFAR mission also mentions that they want to change how we decide which legislation to pass.
A bunch of geeks are working on getting liquid democracy to work.
Wikileaks and its actions do have political effects.
Sweden recently changed its Freedom of Press laws to make clear that having a server in Sweden is not enough to benefit from Swedish press protections, after Julian Assange's Wikileaks tried to use Swedish press protection to threaten people who try to uncover Wikileaks' sources.
In Germany a professor of sociology recently wrote a book arguing that Quantified Self is driven by the belief that it's possible to know everything. It isn't. The kind of New Atheist geeks who want everything to be evidence-based and who believe they can know everything generally reject QS for not doing blinded and controlled trials. He simply treated all geeks the same way and therefore missed the heart of the issue.
How much have political scientists written about the Crypto Wars and, in Cory Doctorow's words, the recent war on general-purpose computing?
Estonia had to be defended against cyberwar by a loose collection of actors, in which the stronger players likely weren't government-associated. It's also quite likely that we live in a time when a nongovernmental force is strong enough to start such a war.
The NSA is geeky enough that its chief, Gen. Keith Alexander, modeled his office after the Star Trek bridge.
Jeff Bezos bought the Washington Post. Pierre Omidyar, who made his money with eBay, sponsored First Look Media. Those are signs that more and more political power is going to geeks.
I'm just pointing to a political idea to which you probably aren't exposed.
Military training is not supposed to build empathy but the opposite. Soldiers are trained to ignore bodily feelings.
How much of this course was history? How similar was it to other history courses you've taken? A course syllabus might be useful, but I understand if you'd prefer privacy. I could see this happening with a course on general scientific principles that used history to develop practice problems, but then what you learned would be scientific principles; not really history. Was it just the professor was better as his/her job than other history professors or is there something that's readily replicable to other history courses?
It was history of philosophy, focused on reading major works chronologically with a good dose of historical context and background for each (e.g. biblical authorship theories, prevailing attitudes that works were responding to, historical events like wars that would have influenced the authors, etc.). Work included twice-weekly journal entries on our readings, occasional quizzes, and essays tying several works together. A partial list of the curriculum, which we read in this (chronological) order, was:
The other hybrid philosophy/history course, the radical one, did have a couple excellent, very historically-oriented, readings. One was Black Jacobins about the Haitian revolution, others were about the French Revolution, the Paris Commune, and a left-radical rebellion against the Bolsheviks in the early USSR (which I have unfortunately forgotten the name of, but it does demonstrate the pluralism of pre-Bolshevik socialism).
Detailed historical explorations were the stronger part of that course, and served to show how clear investigation into the facts could dispel or nuance a caricatured view of history.
History seems to me a subject whose teaching aims to produce critical thinking in a sense different from what LessWrong typically tries to optimize for. I figure LessWrong optimizes for the critical thinking of the individual, which benefits from an education in logic, computer science, and mathematics, along with a general knowledge of the natural sciences. I'm not sure how much history would contribute to that sort of skill, and others in this thread seem skeptical of its value.
However, learning history seems like it improves how critically groups and societies can think together, across a few domains key to society. A general education in history as part of the core curriculum could be a heuristic for circumventing group irrationality, and mob rule, in a way that critical thinking skills designed only for the individual might not. Understanding the history of one's own nation in a democracy gives the electorate knowledge of what's worked in the past, what's different about the nation now compared to then, and the context in which policy platforms and cultural and political divides were forged. This extends to the less grand history of the geographical location in which one resides, or was raised, within one's own nation. An understanding of the history of other nations, and the world, gives one the context in which international relations have formed over centuries.
Here's an example of how knowledge of world history and international relations might be useful. If the executive branch of the United States federal government wants to declare war, intervening against a predator country on behalf of a victimized one, it makes sense to understand the context of that conflict. If the history of those faraway regions is known, then the electorate can check the narrative the government puts forward against what they learned in schooling. Even very recent history could be useful knowledge in this regard. If the electorate of the United States had been aware of the hundreds of years of colonial or ideological conflict, and how intractably stupid the whole thing is and has been, they might have been warier of condoning invasions of Iraq, Vietnam, the former Yugoslavia, etc. Knowing the background of such regions in the future, by having better access to options for learning about these regions in undergraduate education, might make whole generations less likely to vote for parties or presidents who will sink the United States into costly and drawn-out wars that are negative-sum games for all sides.
Groupthink and other pitfalls of group psychology that aren't circumvented by merely knowing science might be debunked by everyone knowing more history. In writing this, I'm realizing that the value of history would be in having enough information as a baseline to not make mistakes of ignorance, the same way that knowing of biology or psychology might. This decreases the chances that a society at large will make mistakes, like supporting a stupid war, or rallying behind an anti-vaccination movement. However, it doesn't seem to fall into the more valuable category of subjects which (presumably) directly improve reasoning ability for individuals, such as maths, and computer science.
My above illustration is a hypothesis or thought experiment for how an education in history might be valuable for critical thinking skills. If it's mostly valuable for having a better democracy with better politics, then perhaps the question can't be divorced from what other education makes for a better democratic polity. That leads us to opening the Pandora's Box of producing better thinking on politics, which is its own behemoth of a problem.
tl;dr: having a set of courses for everyone to take is probably a bad idea. People are different and any given course is going to, at best, waste the time of some class of people.
A while ago, I decided that it would be a good thing for gender equality to have everyone take a class on bondage that consisted of opposite-gender pairs tying each other up. Done right, it would train students "it's okay for the opposite gender to have power, nothing bad will happen!" and "don't abuse the power you have over people." In my social circle, which is disproportionately interested in BDSM, this kinda makes sense. It may even help (although my experience is that by the time anyone's ready to do BDSM maturely, they've pretty much mastered not treating people poorly based on gender.) It would also be a miraculously bad idea to implement.
In general, I think it's a mistake to have a "core curriculum" for everyone. Given 5 people I know, I could go through the course catalog of, say, MIT, and for any given course find one of them for whom taking it would benefit nobody. (This is easier than it seems at first; me taking social science or literature courses makes nobody better off. The last social science course I took made me start questioning whether freedom of religion was a good thing. I still think it's a very good thing, but presenting me with a highly compressed history of every inconvenience it's produced in America's history doesn't convince my system 1. Similarly, there exist a bunch of math/science courses that I would benefit greatly from taking, but that would just make the social science or literature people sad. Also, I know a lot of musicians, for whom there's no benefit from academic classes; they just need to practice a lot.)
Having a typical LWer take a token literature class generally means they're going to spend ~200 hours learning stuff they'll forget exponentially. (This could be remedied by Anki, but there's a better-than-even chance the deck gets deleted the moment the final's over.) Going the other way, forcing writers to take calculus probably won't produce any tangible benefits, but it will make them pissed off and write things with science-is-bad plotlines. (Yes, most of us probably wish writers would get scientifically literate, but until we can figure out a way to make that happen, forcing them to take math and science courses is just going to have predictable effects on what they write. And do you really think it helps to have a group of people who substantially influence culture hate math and science?)
For the typical LWer, I'd go heavy on the math and CS with enough science (physics through psych) to counteract Dunning-Kruger, and some specialization. The idea is that math and CS are tools that let you take something you already know and find out something you didn't know for free, the sciences are there to reduce inferential gaps and eliminate illusory competence, and the specialization gets you a job. This would be very good for people-who-are-central-examples-of-LWers (although I'm sure there are many people here for whom it would be very bad), but I have trouble imagining it would work for more than a few percent of the population. In fact, for everyone going into a field that doesn't need a lot of technical knowledge, I'd look for the most efficient way to measure intelligence and conscientiousness (preferably separately), which looks very little like an undergraduate curriculum.
If a field doesn't require a lot of technical knowledge, why bother with college in the first place? I'm not so sure how useful your examples are since most creative writers and musicians will eventually fail and be forced to switch to a different career path. Even related fields like journalism or band manager require some technical skills.
Obligatory SMBC comic. :)
Signalling, AKA why my friend majoring in liberal arts at Harvard can get a high-paying job even though college has taught him almost no relevant job skills.
As a writer, I agree with you. I am horrible at math. In my life 2x3=5 most of the time. If I had to suffer and fail at Calculus when I can't multiply some days I would certainly start writing books about evil scientists abusing a village for its resources and then have the village revolt against its scientific masters with pitchforks. Throw in a great protagonist and a love interest and I have a bestseller with possible international movie rights.
I think the idea of a core curriculum that contains things such as history is awful. Diversity is pretty useful.
Business in general is useful, but little of the relevant skills are well learned via lectures. Being able to negotiate is a useful business skill.
Diversity courses strike me as an odd combination of sociology, anthropology, and history, but since you specifically criticized history courses, I'm a bit confused as to why you like diversity courses. Are culturally focused history courses such as history of hip-hop, Latin American culture, or women in American history better than standard history courses? Is there a certain category of business courses that does a better job than others? Are there any skills that can be easily taught in a lecture format? I have a friend who felt communications courses were very good at teaching negotiation strategies.
I think you have misinterpreted "Diversity is pretty useful" as "Diversity courses are pretty useful". My reading of ChristianKI's comment is that he meant "having different people take different courses is useful" and I would be rather surprised if he thought diversity courses as such were much use.
I like diversity in course offerings. That's not the same thing as liking courses that supposedly teach diversity.
I don't want a world in which every college student learns the same thing. As such I reject the idea of a core curriculum.
Probably courses that don't use textbooks but that do exercises with strong emotional engagement.
I was at personal development seminars where at the end of the day some people lie on the floor because of emotional exhaustion. I think doing a lot of deep inner work brings higher returns than learning intellectual theory.
How did he come to that conclusion? Has the amount that person pays for the average thing he buys gone down because he has become much better at negotiating?
I only took one class in communications so I don't understand the field too well. The class itself seemed useful, but there was no mention of negotiation strategies. It would seem more likely that better negotiation leads to more offers than that better negotiation leads to a better offer. A smart businessman is going to know how to value the deal, and it's going to be hard to significantly change his price.
What practical effect did it have that make you consider it to be useful?
If you buy a car, in many cases a person with good negotiating skills can achieve a better price.