We're all familiar with false popular memes that spread faster than they can be stomped out:  You only use 10% of your brain.  Al Gore said he invented the internet.  Perhaps it doesn't surprise you that some memes in popular culture can't be killed.  But does the same thing happen in science?

Most of you have probably heard of Broca's aphasia and Wernicke's aphasia.  Every textbook and every college course on language and the brain describes the connection between damage to these areas and the speech deficits named after them.

And both are probably wrong.  Each area was mistakenly associated with its aphasia because it is near, or surrounded by, other areas which, when damaged, cause the aphasia.  Yet our schools continue teaching the traditional, erroneous story, including a lecture in 9.14 at MIT given in 2005.  Both the Wikipedia entry on Wernicke's aphasia and the Wikipedia entry on Broca's aphasia are still in error; the Wikipedia entry on Wernicke's area has it straight.

Is it because this information is considered unimportant?  Hardly; it's probably the only functional association you will find in every course and every book on the brain.

Is it because the information is too new to have penetrated the field?  No; see the dates on the references below.

In spite of this failure in education, are the experts thoroughly familiar with this information?  Possibly not; this 2006 paper on Broca's area by a renowned expert does not mention it.  (In its defense, it references many other studies in which damage to Broca's area is associated with language deficits.)

So:

  • Am I wrong, and the evidence still implicates Broca's area and Wernicke's area in their aphasias above other areas?
  • If I'm right, why can't the new understanding displace the old understanding?
  • Is this a general failure in the way we do science?  Can you think of other examples where an important discovery can't penetrate its field?

 

References

Bogen JE, Bogen GM (1976). Wernicke's region—where is it? Annals of the New York Academy of Sciences 280: 834–843.

Dronkers NF, Shapiro JK, Redfern B, Knight RT (1992). The role of Broca's area in Broca's aphasia. Journal of Clinical and Experimental Neuropsychology 14: 52–53.

Dronkers NF, Redfern BB, Knight RT (2000). The neural architecture of language disorders. In Bizzi E, Gazzaniga MS (eds.), The New Cognitive Neurosciences (2nd ed.). Cambridge, MA: MIT Press, pp. 949–958.

Dronkers NF, et al. (2004). Lesion analysis of the brain areas involved in language comprehension. Cognition 92: 145–177.

Mohr JP (1976). Broca's area and Broca's aphasia. In Whitaker H (ed.), Studies in Neurolinguistics. New York: Academic Press.

 

Comments
XiXiDu

I asked a neuroscientist about this who in turn asked a friend who studies language disorders of the brain at Cambridge. Here is what he had to say:

"This fellow sounds like he must be an undergrad, or a non-psychologist. No-one believes that the area traditionally defined as Broca's area is the only frontal area involved in language processing or even in Broca's aphasia, and even less for Wernicke's. Those anatomical labels are only used as heuristics; it's much more accurate (and common) when writing on the subject to use anatomical names like 'left inferior frontal gyrus pars triangularis / pars orbitalis / pars opercularis'. I think the general understanding is that you rarely get straightforward, unambiguous cases of patients who definitely only show sypmtoms of one specific type of aphasia anyway. More to the point, it's not the case that all researchers have the same point of view on these things, and it's certainly not unusual to see papers that claim to have a new, different explanation for something. They're ten-a-penny. From a systems neuroscience perspective it's much more meaningful to think about things in terms of disconnection syndromes and hodological principles - i.e. the effects of certain types and locations of damage on the properties of the system/network as whole, rather than be strictly localizationist about the causes of aphasias."

PhilGoetz
Thank you for that cogent ad hominem argument! Which is why I never said that anyone believed that. It is common for damage to a number of different, often nearby, structures to cause similar symptoms. It's common for it to be difficult to distinguish symptoms caused by different underlying damage. In the case of Broca's area and Wernicke's area, however, it's more than that. There is strong evidence that damage to Broca's area does not cause Broca's aphasia (while damage to numerous other areas does), and damage to Wernicke's area does not cause Wernicke's aphasia (though damage to numerous other nearby areas does).

The theory that different areas of the tongue taste different things - the Tongue Map - has been pretty thoroughly debunked, but it lived for about a century. This seems like something fairly easily testable.

Bullet lead analysis gained scientific acceptance and stuck around for forty years; it still is viewed as good science by many, although its probative value may have been overstated.

Bruise aging was accepted for a shorter period of time, but appears almost worthless. This was another testable hypothesis that lasted longer than it should have.

I don't have particularly smart things to say about why these errors lasted while others were destroyed by truth. Perhaps someone else does.

wnoise
AIUI, different areas of the tongue do have different concentrations of the various sensors. The tongue map taken literally as "you taste salt only here, sweet only here, bitter only there" is wrong, but the different areas are indeed more sensitive to the tastes the map assigns them.
Morendil
Thanks for the tip-off about the Tongue Map. I'm afraid it's still being presented as serious knowledge. I'll go to bed slightly wiser tonight. Nitpick: I would appreciate those things even more with a link. (Upvoted nonetheless.) By bruise aging do you mean this? Where would one go to verify it has been discredited?
byrnema
When I was in elementary school -- must have been a year when I was attending a good one -- we did an experiment where we tasted things with different parts of our tongue. It was experimentally verified by us that the way things tasted depended on the region. And then we ate an apple while smelling an onion, and compared the sensations of immersing our hands in hot water or ice water separately and simultaneously.
JRMayne
Interesting. Was this experiment done before you learned the tongue map? Have you tried it again? Suggestibility affects taste significantly; see the wine tasting experiment, and the well-known visual component of eating. Very fine chefs sometimes cannot identify ingredients just by tasting them (see: Top Chef Masters). Further, this sounds like yet another data point supporting the need for double-blind studies. Still, it's nice to see kids involved in experiments of some sort.
byrnema
I did a mini-experiment before posting the comment. I only had chips on hand, so this was meant to be an experiment with detecting saltiness, while holding my nose. The experiment was difficult to interpret due to uneven salt on the chips, but I decided that while the salt was detected by all parts of my tongue, the taste sensation felt different -- from tangy to itchy, depending on the region. I decided that tasting was "complicated" -- that was my only conclusion.
JRMayne
Yeah, links would have been better. Let's see if I get the format right. Try this for bruising. Tongue map and bullet lead analysis are on Wikipedia, and both (as of yesterday) looked like reasoned articles to me, though the statistical confusion in bullet lead analysis is not well laid out.
MrHen
Yeah, I thought the Tongue Map was true as well. I found a short article talking about it at Scientific American. Are they reputable?
RobinZ
Scientific American? Yes. It's been a standard pop-sci magazine for decades, although in the last twenty years or so it has become much less rigorous. The reporting should be accurate, though.
Eliezer Yudkowsky
Not anymore.
RobinZ
Are you suggesting that its science reporting is now at the New York Times Magazine level, or something more severe?
Eliezer Yudkowsky
Probably not that low, but not very good, either.
Clippy

You're just saying that because of their position on grober crimate change.

In fairness, I think the issue is way overplayed. How exactly would it interfere with wire metal forming methods or helping people adapt to Office? I don't get it.

Well, if it gets too warm, all the world's paperclips could melt into undifferentiable masses.

Clippy

Okay, what the hell is up with the moderators here? I wasn't calling "User:Alicorn" ridiculous for suggesting that paper clips can melt. I mean, come on, give me a little credit here. Not to brag, but I think I know a little about this kind of thing...

Clipper, C. "On the Influence of High-Temperature Environments on Failure Modes in Self-Locking Removable Fasteners", Journal of Non-Destructive Fastening, Vol. 3, Issue 2

Ahem. Anyway, what I was saying is, yes, paperclips can melt, but you need a LOT more than grober crimate change to melt them all into an undifferentiable mass, okay? Like, even if you set every coal vein on fire, AND filtered out the particulate matter to prevent cooling effects, you STILL wouldn't make the planet hot enough to melt all paper clips together for over a hundred years.

That is what is ridiculous.

Clippy
Don't be ridiculous.
RobinZ
I think we are approximately in agreement on this point.
Douglas_Knight
The Wikipedia article makes it look like it's basically true and that a straw man absolute version was "debunked."
wnoise
But the strawman absolute version is what was (is?) taught.

I am reminded of a paper by Simkin and Roychowdhury in which they argued, on the basis of an analysis of misprints in scientific paper citations, that most scientists don't actually read the papers they cite, but instead just copy the citations from other papers. From this they show that the fact that some papers are widely cited in the literature can be explained by random chance alone.

Their evidence is not without flaws - the scientists might have just copied the citations for convenience, despite having actually read the papers. Still, we can easily imagin...

Paul Crowley
I copy citations from other papers. When I can, I copy and paste BibTeX stanzas I find on the Web.
byrnema
How so? Could you clarify your reasoning?

  • Scientists cite papers with conclusions that are convenient to cite (either because they corroborate their views or define a position to pivot from or argue with), whether or not those papers have been read. Papers with easily debunked conclusions might equivalently be unread (and thus unexamined) or read (and simply trusted).

  • I think that the real test for whether cited publications are read is the following: if a publication is consistently cited for a conclusion it does not actually present, then this is evidence that no one has actually read it.

I recall in my research that it was very convenient in the literature to cite one particular publication for a minor but foundational tenet in the field. However, when I finally got a hard copy of the paper, I couldn't find this idea explicitly written anywhere. The thing is -- contradicting what I say above, unfortunately -- I think the paper was well-read; people just don't double-check citations if the citation seems reasonable.
Yoreth
My thinking is: Given that a scientist has read (or looked at) a paper, they're more likely to cite it if it's correct and useful than if it's incorrect. (I'm assuming that affirmative citations are more common than "X & Y said Z but they're wrong because..." citations.) If that were all that happened, then the number of citations a paper gets would be strongly correlated with its correctness, and we would expect it to be rare for a bad paper to get a lot of citations.

However, if we take into account the fact that citations are also used by other scientists as a reading list, then a paper that has already been cited a lot will be read by a lot of people, of whom some will cite it. So when a paper is published, there are two forces affecting the number of citations it gets. First, the "badness effect" ("This paper sounds iffy, so I won't cite it") pushes down the number of citations. Second, the "popularity effect" (a lot of people have read the paper, so a lot of people are potential citers) pushes up the number of citations. The magnitude of the popularity effect depends mostly on what happens soon after publication, when readership is small and thus more subject to random variation.

Of course, for blatantly erroneous papers the badness effect will still predominate, but in marginal cases (like the aphasia example) the popularity effect can swamp the badness effect. Hence we would expect to see more bad papers getting widely cited; the more obviously bad they are, the stronger this suggests the popularity effect is. I suppose one could create a computer simulation if one were interested (see the sketch below); I would predict results similar to Simkin & Roychowdhury's.
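Yoreth's two-force model is easy to turn into a toy simulation. The sketch below (in Python) uses my own illustrative model and parameters, not Simkin & Roychowdhury's actual analysis: each new paper finds citation candidates either by copying a random existing citation (the popularity effect) or by sampling the prior literature uniformly, and it keeps a candidate with probability equal to that paper's "quality" (the badness effect).

    import random

    random.seed(0)

    N_PAPERS = 20_000  # papers, published in order
    REFS_EACH = 5      # citations made by each new paper
    P_COPY = 0.9       # chance a candidate is found by copying an existing citation

    quality = []    # per-paper "quality" in [0, 1]; the badness effect filters on this
    citations = []  # running citation counts
    ref_pool = []   # every citation ever made; copying samples from this pool

    for i in range(N_PAPERS):
        quality.append(random.random())
        citations.append(0)
        if i == 0:
            continue
        for _ in range(REFS_EACH):
            while True:
                if ref_pool and random.random() < P_COPY:
                    cand = random.choice(ref_pool)   # popularity effect: copy a citation
                else:
                    cand = random.randrange(i)       # independent discovery: read at random
                if random.random() < quality[cand]:  # badness effect: filter on quality
                    break
            citations[cand] += 1
            ref_pool.append(cand)

    top = sorted(range(N_PAPERS), key=lambda k: -citations[k])[:10]
    for k in top:
        print(f"paper {k:5d}: {citations[k]:5d} citations, quality {quality[k]:.2f}")

Under these assumptions, the citation distribution comes out heavily skewed toward a handful of early papers, and papers of middling quality that got early attention can stay among the most cited - the popularity effect swamping the badness effect, as Yoreth predicts.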
byrnema
I see: in the case that a paper is read, deciding a paper sounds iffy and deciding not to cite it would correlate strongly with deciding not to cite a paper with wrong conclusions. I was considering that scientists rarely check the conclusions of the papers they cite by reading them, but just decide based on writing and other signals whether the source is credible. So a well-written paper with a wrong conclusion could get continued citations. But indeed, if the paper is written carefully and the methodology convincing, it would be less likely that the conclusion is wrong.
PhilGoetz
That's great! I've wondered why so many mathematical papers (in non-math subject areas) contain misprints and omissions that make their equations uninterpretable. I'm wondering if even the referees and editors read them. And I have a confession. I didn't read all of the papers I referenced!
thomblake
Indeed this is commonplace for all academic fields, though I don't see the problem with it, so long as the effect doesn't squash new work.
lakeswimmer
A question: if referencing is not based on knowledge, or perhaps even relevance, what does this imply for Google's algorithm? Doesn't it organize search results according to page links?

Hmm, I was thinking about writing something similar to your article here.

A pet peeve of mine is the Observer Effect in QM - not once, on any skeptical radio show, in any internet animation, or in any high school class (and I am wagering that it is no different at university) have I ever heard it specifically pointed out that the observer effect does not mean a conscious observer - not even when it is a PhD talking to a lay audience.

And thus we have movies such as What the * do we know, and the myths keep spinning.

DonGeddis
There are (bad) interpretations of QM where they do mean "conscious" observer. This objection is very close to saying that MWI (many worlds) is "right", and the others are "wrong". That may be the case, but it is far from universally acknowledged among practicing physicists. So, it's a bit unfair to suggest this "error", given that many prominent (but wrong) physicists would not agree that it is an error.
Aurini
Let me see if I have this straight: I devise a double-slit experiment where my electronowhazzit collapses the waveform for an hour and a half before shutting off, thus resulting in no diffraction pattern during the first portion of the experiment, and a propagated waveform during the second. I set it up to begin the experiment at midnight and stop at 3AM; a computer automatically records all the data, which I then store on a CD for 1 year's time, without looking at it. At the end of the year, I present this data to a group of these physicists. They declare that it's my conscious observation, going backwards in time, that creates the results; or that it's my conscious intent in setting up the apparatus, or something like that? I wish I were feeling incredulous right now, but to be honest I'm just kind of depressed.
Eliezer Yudkowsky
People are crazy, the world is mad. Actually, I suspect, many physicists who believe in consciousness-causes-collapse might deny your logical consequence there. Being confused in the first place, why would you expect them to admit the logical consequences of what they say they believe? At least some would probably admit it, though.
Aurini
"The Universe if funadmentally Newtonian-Relativisitc. All that Quantum mechanical stuff is just math - you plug it into the equations, but it doesn't really mean anything. It doesn't have any macro effects. What's that you're saying about computer-processors needing QM callibration dure to fluctuations at that level? Sorry, I'm not an Engineer. QM bomb detonator? Nah, just a toy, don't worry about it. "We'll figure out that grand-unified-theory once we find the God Particle and plug in a simple equation, probably involving e (because e is a pretty cool guy), but that's all." [I'm pretty sure my ignorance is shining through here, but at least I understand the implications of things.]
Mitchell_Porter
Very very few physicists have ever believed anything like that. Those who do are likely either New Agers, parapsychologists, or people whose private philosophical ruminations have landed them in an odd place. Almost universally on this site, I see the presumption that the wavefunction exists. People should understand that for a significant fraction of physicists, wavefunctions are just like probability functions - they are regarded as calculational devices only.
wnoise
I agree that that view is common (I am a grad student in quantum computation). But quantum mechanics admits nothing else besides the wave-function (or density operator). If there is anything "real", it pretty much has to be the wave-function.
Paul Crowley
My suspicion is that they would argue that in practice, the information would leak out before then and minutely affect your conscious perception (like, even if you don't look at the disk, the light reflected from it or the light from the laser writing it will somehow affect your perception). Obviously this is just avoiding the problem. I want to know how the idea ties in with evolution, though.
Aurini
It doesn't tie in directly to evolution, but the misconceptions are related. I received my fair share of nonsense 'facts' growing up, and I had a hell of a time autodidacting myself as an adult - mostly because some of the consequences of our scientific knowledge base are never explicitly stated. Here's a brief list of high-probability truths I've had to infer:

  • The 'observer effect' is a quantum-level mechanism, not a conscious entity.

  • Schrödinger's cat would not be a superimposed waveform; the waveform would have collapsed immediately (likely leading to a many-worlds split, but in our universe the cat is definitely something). As far as it's a thought experiment, it's been solved.

  • Macro- and micro-evolution are not actually separate sciences; all evolution is micro-evolution, and it's a false distinction invented by creationists.

  • c is a fundamental calibrant of the universe; it's not a 'speed' per se, but rather an aspect of space/time.

  • Gravity propagates at c.

I'm about 98% certain that those points are correct; they mesh nicely with everything else I know (what I'd call common knowledge, though it probably ain't that common). If any of them are incorrect, then all the other puzzle pieces get thrown into disarray - and yet once you know these things, you can start predicting where the puzzle pieces will go with good accuracy. But I've never seen any of them stated explicitly (until I started following LW and OB, anyway).

It's my contention that if these things were pounded into kids' heads, then science education would be a lot easier. Instead they're taught that entropy is "like when your room gets messy over time." Barf.
Paul Crowley
What I mean is, if there's some special feature of humans that collapses wave functions, do all living things have this feature, or was there an animal that had it whose mother didn't have it?
Psy-Kosh
Greg Egan wrote a book, Quarantine, specifically playing with that idea. (That is, the premise for the story is that specific features of the human brain cause collapse... and those features can be artificially disabled.)
Aurini
Ooooooh! Larry Niven mentioned something similar to this, regarding his... um... Future History books (the ones with Beowulf in them - and the three-legged centaur aliens). At one point in the series he postulated that the centaurs had been breeding the humans for luck - we'd become the luckiest species in the Galaxy. He later on said that, if luck was an inheritable trait then it would be the best inheritable trait. Everyone would have it already. Presumably the same thing would go for QM waveform non-collapsure; it must be useful somehow. Not that it makes any sense. Back to our regularly scheduled reasoning...

If you really want to see a failure to incorporate new information, look at nutrition science. Things may be changing, but the FDA and similar bodies still seem to be saying fat and dietary cholesterol are bad, despite a rather significant absence of evidence and a rather strong presence of contrary evidence. They're getting around to addressing added sugars and high-fructose corn syrup, but it took maybe half a century.

This is pretty easy to understand, though. There are a lot of commercial interests that restrict what the government can say without stepping on ...

Jack

Is it because this information is considered unimportant? Hardly; it's probably the only functional association you will find in every course and every book on the brain.

This isn't my experience at all, unless I'm misunderstanding your meaning. In my cog sci class we studied the associated anatomy for psychopathology, ADHD, dyslexia and probably a couple others I can't think of, but I don't remember anything about Wernicke's or Broca's areas.

Possibly not; this 2006 paper on Broca's area by a renowned expert does not mention it. (In its defense, it references many other studies in which damage to Broca's area is associated with language deficits.)
PhilGoetz
Oh, dear. Providing specific examples is a risky business. I still haven't figured out a safe way to discuss a field without making myself less welcome to some of its members.
Jack
I was just adding a data point to the pile of evidence. Don't infer motivations or allegiances. I think your general point is insightful and don't think of myself as a member of the field.

I think it has to do with the fact that we are all cognitive misers. Just think about the comments that this post has generated. Do you really think that every commenter has carefully thought about the subject before articulating an answer? Everyone here is probably forwarding his own pet theory, myself included.

We all probably carry a lot of inaccurate beliefs with us and never bother to check them. Reevaluating our beliefs costs time and effort; keeping the status quo in our mind is effortless. So we only reevaluate when there are strong reasons to do so...

I had a university physics textbook that claimed that those in the Middle Ages believed that the Earth was flat for "religious reasons". This myth has been going strong since the 19th century. What could possibly explain the persistence of these kinds of beliefs (from textbook authors no less!)?

Seth_Goldin
Yes, this sounds more like a problem with textbooks than with science itself. Textbooks are often censored for political reasons, such as Japanese textbooks' treatment of Nanjing, or American textbooks' treatment of the Japanese internment camps. This is hard science though, so this won't suffice as an explanation. I fear that people are attached to superstitions about how the brain works. Maybe people like an inaccurately simplified explanation of the brain that claims that specific, local parts of the brain perform specific functions. We know that fMRI research is pretty sketchy, but even smart people like Sam Harris seem to rely on it too much.
thomblake
Well, a fact about what people in the Middle Ages believed isn't properly in the realm of Physics. I'd expect the physicists to be mostly right about the physics they put in their textbooks, and any other stuff in there should be considered highly suspect. Consider it yet another exercise in skepticism.
PhilGoetz
Perhaps most people believed the Earth was flat, while most educated people did not.
Cyan

Some people even "know" that a kilometer is longer than a mile.

SilasBarta
In fairness, the teacher may have understood that the student was right about kilometers, but was distracted and put off by how he made his point, which was pretty brutal. You know, pretty much how people regard me all the time :-P
RobinZ
Except for "brutal" substitute "breezy". d-:
jastreich
Yes, the child was pretty brutal. That said, so was Galileo when he stood on the table and dropped fruit while having a polite dinner with a clergyman who "knew" that heavier objects fell faster than light objects -- if the story is true.
byrnema
This letter is not meant to be taken at face value; the writing is clearly ironic. Perhaps something from the Onion.
Cyan
I don't know about the Onion specifically, but I'd only accept even odds that this is a joke.
knb

I'm not sure if anyone has posted this yet, but the whole area of forensic "science" is actually on very shaky foundations.

http://www.popularmechanics.com/technology/military_law/4325774.html

Freudian psychoanalysis is (sadly) still the dominant version of applied psychology in many countries (Argentina is one).

Freud and Lacan are still very big influences in postmodern philosophy (and by association, much of the Humanities), in spite of the fact that their theories are either demonstrably false or unfalsifiable.

I was stunned to read Freudian ...

JRMayne
I think the criticism of "forensic science" generally in the linked Popular Mechanics article is overblown. Much forensic science is very good.

Fingerprints can be matched by computer. The only real dispute there is partial print matches. There was a scandal regarding very poor fingerprinting techniques; there have also been a couple of incidents of outright fraud. But if the prints match, dude, it's you. And there are many competent fingerprint examiners. I've never seen a computer mismatch out of thousands of examples.

I have some expertise in collision reconstruction. It's certainly true that some techniques used are not as good as others; expressing solid confidence in pedestrian throw is probably a bad idea. But collision reconstruction based on critical speed scuff marks and various other methods is solid physics.

Forensic accounting is valid science. Forensic chemists test for drugs and alcohol with very high accuracy. Properly done ballistics testing is good science. Hair sample comparisons are good science, if not oversold. DNA is good science, but not if you screw up your Bayesian analysis (see the sketch below).

Some people testify to silly things. Some people make mistakes. Some people are willing to say things they know aren't true. Some scientists are underqualified. Some fields - like forensic odontology - lack the rigor of others, and should not be allowed in court barring a prior showing that the person can do what they say they can do. But the idea that forensic science was "mostly created by cops who were guided by little more than common sense" seems quite misguided to me.
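A concrete version of "screw up your Bayesian analysis" is the prosecutor's fallacy: confusing the random-match probability with the probability of innocence. Here is a minimal sketch with made-up illustrative numbers (mine, not JRMayne's or the article's):

    def posterior_guilt(prior_guilt: float, match_prob_innocent: float,
                        match_prob_guilty: float = 1.0) -> float:
        """P(guilty | DNA match), by Bayes' theorem."""
        num = match_prob_guilty * prior_guilt
        den = num + match_prob_innocent * (1.0 - prior_guilt)
        return num / den

    rmp = 1e-6  # assumed random-match probability: 1 in a million

    # Case 1: other evidence already points at the suspect (prior 1 in 10).
    print(posterior_guilt(0.1, rmp))   # ~0.99999 -- the match is decisive

    # Case 2: the suspect was found by trawling a database of 10 million people
    # (prior 1 in 10 million); roughly 10 innocent people would match too.
    print(posterior_guilt(1e-7, rmp))  # ~0.09 -- the match alone proves little

The same one-in-a-million statistic is decisive in one case and nearly worthless in the other; what changes is the prior.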
Jack
The sentence doesn't seem misguided so much as its use as a complaint does. Try: "Biology was mostly created by nature lovers who were guided by little more than common sense." "Computer science was mostly created by math geeks who were guided by little more than common sense." "History was mostly created by story tellers who were guided by little more than common sense."
CronoDAS
I've heard that, until relatively recently, forensic arson investigation was actually complete nonsense.
MatthewB
I have had the unfortunate experience of watching, not once but twice, the misuse of forensics to convict someone - in direct opposition not to one, two, or three witnesses, but to four or five people who testified that the person could not have committed the crime... Yet the CSI Effect was in full play, and it was not until the arrest of the actual criminal in the first case, and DNA exoneration in the second, that the people involved were acquitted (and in one case released - thankfully after a very short stay in county jail, before being moved to an actual prison). My family also has a larger number of lawyers than normal, and this was something that was driven into us at an early age: "Forensics are a bunch of BS for the most part." Now, that lesson was also tempered with another side: "Forensics are a bunch of BS, unless they help out your case."
Blueberry
Eyewitnesses are notoriously unreliable. I might be more likely to trust reliable forensic evidence than eyewitness testimony.
MatthewB
I agree, but I think our entire system of justice is broken. It doesn't rely nearly enough on the right kind of evidence, and the term "peers" (as in "a jury of your peers") is all but meaningless. The fact that it is supposed to deal in evidence is, however, the right place to begin.
Jack
Why was crystal healing brought up? What context? This is fascinating.
knb
It was in a sidebar article about how modern scientific medicine is male-centric, and female holistic/alternative healing practices are marginalized and treated as hokum in our society. But in other cultures, female holistic healers are valued members of society. Then it talked about different New Age healing rituals. The only one I really remember was crystal healing, which they said was an ancient Japanese ritual.
Paul Crowley
When was this? Do you remember what the book was called?
knb
I took the class in 2007 I think.
Jack
Bizarre. Did the textbook present this as fact, or was it a point raised for consideration and debate? Depressing, because a good gender studies class would be such a great thing for schools to offer.
MatthewB
I think that Gender Studies classes are hard to find decent instructors for. I am having to file a complaint of discrimination against mine. Rather than raising things as points for consideration, she raised all manner of things as fact (without room for discussion), and when I began to call her on these "facts" (a simple Wikipedia entry usually sufficed to show that they were completely bogus), she forbade me to fact-check her work in class (on my laptop). When I later spoke to one of the UCSC gender studies instructors, she said that this was a problem in gender studies: that often the instructors are militant feminists with bones to pick... So sad.
thomblake
Interdisciplinary fields are always a bit wooly anyway. There's no reason why a smart, motivated person couldn't do sociology with an emphasis on gender, or philosophy with an emphasis on gender, or so on. And if you don't have an established field for your line of inquiry, you're not going to have rigorous standards for what constitutes good work. So gender studies ends up with standards hovering somewhere between sociology and postmodernist critical theory.
MatthewB
I completely agree. A UCSC professor named Donna Haraway, who wrote "A Cyborg Manifesto: Science, Technology, and Socialist-Feminism in the Late Twentieth Century", is an excellent example of a professor who is capable of putting a gender emphasis upon the issues of sexual roles in society, sociology, and history. It was she to whom I went to discuss the issue after discovering she was at UC Santa Cruz (I had already read it a few years prior to the class with the crazy teacher). I had taken a women's studies class because my ex-wife died from being sexually exploited while strung out on crack cocaine (typical crack whore story), and I figured that I might have something to learn from it. Dr. Haraway informed me that I was expecting too much, as most women's studies teachers are incredibly biased and emotionally driven and don't take to facts too well. I don't agree with much of Dr. Haraway's politics, but at least she has sound arguments for her position, rather than appeals to emotion or ignorance. Now, some of the premises of her arguments I would question, but that is the whole point, isn't it: that we argue the premises and from those attempt to form a sound argument, rather than throwing together an argument that consists of "It would be horrible if it were any other way!"
Blueberry
You write these brief comments that are incredibly intriguing. Please post more about your life.
MatthewB
Maybe someday, after I have made the in-person acquaintance of more people on the list.
Jack
I think you didn't finish a sentence. What happened when you started correcting her?
MatthewB
Duh! She forbade me to use my computer to fact-check in class... And she got really, really pissed off at anything I said that contradicted her rather bizarre worldview (I was now arranging my facts before class by listening to what she was harping on about in the class prior to mine). I later discovered, from the dept. chair, that she had a paranoid episode right after she had been granted tenure. She's been under pretty intense pressure to retire since then... I've never received a grade below a B in English or Composition classes since the 6th grade, yet she gave me a D, simply because I objected to her irrational worldview in which we needed to give up all technology and return to nature. She was very much one of those "We must honor the Noble Savage" types.
Nanani
I've found this sort of attitude common in any class with "Studies" in the name. My worst experience was the communist teacher of East Asian studies (not himself East Asian) who knew nothing of Asia besides Communist China and spent most of the course on propaganda. This was 2006. The professor took to blatantly ignoring any student with a comment or question after a single questioning word about Communism. The world is indeed full of insane people.
Multiheaded
All such stories of academic delirium I've heard so far took place in the US. Indeed, while all of today's nations produce their share of bogus pseudoscience in the soft fields, Americans shouldn't despair so much; their academia appears to be in a uniquely bad situation here.
Jack
How did the rest of the class react to you?
mattnewport
Why would you be surprised to find abject nonsense in a Women's Studies textbook?
knb
Well, I'm surprised to find that kind of weapons grade nonsense anywhere, still more so in a university coursebook. But I was especially surprised that they would publish nonsense from such a misogynistic author as Freud.

Four top-level posts in a little over a day: too much on LW?

I'm not sure how a reliable updating process would work. You can't keep checking everything you thought you knew in case expert opinion has changed on some of it.

Updates have to use the same channels as new information, and are competing with it.

I'd settle for a reliable method for suspecting that people have just made things up (that maximum heart rate is a simple calculation, that people generally should drink a lot more water than they want, that people only use 10% of their brains, that Eskimos have a huge number of words for snow, that Victorian women had ribs removed to make more extreme corsets possible), but I'm not sure that even that exists.

CaptainOblivious2
Common sense works surprisingly well in some cases: even as a child I didn't believe the "10% of your brain" thing... think about it: the only way they could know this is if someone had 90% of their brain removed and wasn't affected... and that doesn't seem nearly as likely as people wanting to believe that everyone has vast untapped potential. And let's not even get started on how/why evolution would provide us with 10x the brainpower we need... is there any precedent for that in evolution? Can cheetahs run 10x faster than they normally do just by trying a little harder? Can seals hold their breath 10x longer than normal?

Yeah, it happens from time to time. Sometimes these mistakes cause errors in basic things to be taught. I think schools especially are quite slow on the uptake of new ideas -- they have syllabuses and courses and textbooks, and it's quite a lot of trouble to change them, so they usually only get updated every couple years at best.

For example, schools often teach that light is both a particle and a wave, or that it's sometimes a particle and sometimes a wave, or that it has 'particle-wave duality'. They do this because in order to explain that light is just...

This permeates history as well, specifically high-school textbooks. Lies My Teacher Told Me is an excellent read and myth-pops a few common history goofs.

EStokes
Love that book. Recommended.
Mike Bishop
There are some useful corrections there but I disrecommend it because Loewen's history is ignorant of economics.
MrHen
Are you saying it is wrong, or that it could be better? I ask because I felt I learned lots of interesting things, and I need to know whether I should unlearn those things, mark them as incomplete, treat them as true with caveats, ...
Mike Bishop
It's been quite a while since I read it, so I can't point to specific facts Loewen gets wrong. I think the main thing that bothered me was that he fails to appreciate the benefits of capitalist competition in raising our living standards. Economic change creates winners and losers in the short run, and I'm happy to try to make things better for the losers, but we need to think about ways to do that which don't block innovation and empower special interests.

The last Society for Neuroscience conference had 35,000 people attend. There must be at least 100 research papers coming out per week. Neuroscience is full of "known things" that most neuroscientists don't know about unless it's their particular area.

djcb

Then an obvious way to improve the situation would be for somebody knowledgeable to update the Wikipedia entries, no? It would actually be interesting to see how good Wikipedia is as a vector for spreading updated knowledge.

Nanani
You haven't spent much time editing Wikipedia, I take it. Great idea in theory; in practice, someone will revert the entry back to the common-knowledge version in short order. Wikipedia's ban on original research, as well as its fairness-of-viewpoint and notability restrictions, make it hard to use as such a vector. If the updated knowledge becomes widespread, THEN Wikipedia will be useful as a place to quickly check up on it. It will be a better indicator of update spread than anything else.
Paul Crowley
An expert wishing to fix Wikipedia might do better to provide an external source that Wikipedia can then cite. A blog post can suffice for this.
djcb
Well, in this particular case, the 'new' knowledge is the new mainstream viewpoint, and this new mainstream viewpoint is already present in Wikipedia (the post mentions the article on Wernicke's area). So while it may sometimes be hard to change things in Wikipedia, the difficulties you mention do not seem to apply in this case. I would make the change myself if I were more knowledgeable in this area.

While we're talking about Al Gore, the meme that global warming has a serious chance of destroying the world won't end.

Morendil
The topic introduced by this post is "things we know that ain't so". It is a good place to discuss widely-held beliefs disseminated despite solid evidence that falsifies the underlying hypothesis. AGW seems to belong in a different category altogether: "things claimed by a majority of scientists that a vocal minority disputes". It is an ongoing controversy. To turn this post into a soapbox would lessen its value. While the AGW controversy may have its place on LW, let me suggest that it should be the object of an appropriately researched top-level post, not opportunistic potshots.
roland
Now I'm surprised by this. AFAIK it is a real possibility that we end up like Venus: a self-reinforcing heating of the planet until life as we know it today is no longer possible. If a certain threshold is reached, the process is basically irreversible. The hotter it gets, the more the permafrost regions melt, and the frozen biomaterial there generates more CO2 to accelerate the process. Additionally, there are gigantic frozen methane reserves in the oceans that will also start to bubble up (in fact, they are already bubbling up) and increase the warming even further, and methane is a much stronger greenhouse gas than CO2. So why do you think that this cannot happen on Earth?
mattnewport
It's probably not impossible that the Earth could end up like Venus, but there is fairly well accepted evidence that CO2 concentrations were much higher in the past (10 to 20 times higher than today) without Venus-like temperatures. (Levels today are around 380 ppm.)
roland
OK, and what was life like at those times? There are parts of Australia today that are so hot that it is impossible for any human to survive there, yet you will find plants and insects and maybe some animals there. AFAIK temperatures above 90°F (32°C) may already be deadly for humans in poor health.
mattnewport
There was abundant plant and animal life under those conditions: 200 to 150 Ma is pretty much the Jurassic period, and 400 to 600 Ma would have encompassed the Cambrian explosion. According to Wikipedia, mean temperatures during the Jurassic period were about 3°C, and during the Cambrian about 7°C, above today's. Certainly nothing like Venus, where surface temperatures are around 480°C. Almost no known life could survive those temperatures, except perhaps some of the lifeforms found around black smokers.
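For scale, here is a back-of-the-envelope sketch. It is my own arithmetic, not mattnewport's, and it leans on two standard simplifications: warming responds roughly logarithmically to CO2 concentration, and equilibrium sensitivity is assumed to be about 3°C per doubling.

    import math

    def warming_from_co2(ratio: float, degrees_per_doubling: float = 3.0) -> float:
        """Rough equilibrium warming (deg C) for a given CO2 concentration ratio,
        assuming a logarithmic temperature response."""
        return degrees_per_doubling * math.log2(ratio)

    for ratio in (2, 10, 20):
        print(f"{ratio:2d}x CO2 -> ~{warming_from_co2(ratio):.0f} deg C of warming")
    # Even 20x today's CO2 works out to roughly 13 deg C under these assumptions --
    # an enormous change, but nowhere near the ~450 deg C gap between Earth and Venus.

A Venus-style runaway requires qualitatively different feedbacks (such as boiling off the oceans), not just more CO2, which is roughly why very high ancient CO2 levels did not produce Venus-like temperatures.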
cousin_it
It's not obvious to me that global warming will shrink the total human-habitable area of the globe. As someone living in Moscow, I'd like some warming right now! Sick of those six-month-long winters already. Also check out Siberia, Canada, etc.
Jack
I read an argument to the effect of: More people die annually of cold than of heat, global warming good. On the other hand, there is that bit about ice melting and destroying the jet stream leading to massive global cooling. I don't know how good that science is though.
roland
You can't be serious. Sure, I get your point in regard to the cold winters, and I know others who argue the same way, but the thing is, no one has an exact idea of what will change in the world once the temperature rises a few degrees. More melting of ice will change the salinity of the sea and cause a change in sea currents, which might precipitate a new ice age; this is one possibility in which Moscow would become much colder. There have already been extreme heat waves in Europe during summer that killed thousands of people. Our biosphere is a complex system, and how does the saying go? "Never change a running system."
DonGeddis
It's true that climate is too complex to predict well. Still, I haven't heard many global warming worriers warn about the threat of a new ice age. It's all about the world actually becoming warmer. Given that, the real problem seems to be the speed. If it took 1000 years to warm 5 degrees, that might not be so bad. If it's 50 years, the necessary adjustments (for both humans and non-humans) might only happen with massive die-offs. But leaving aside the speed, it's not insane to notice that there is vastly more biodiversity in the tropics than in the arctic. If you were designing a planet for humans to live on, a little warmer is a whole lot better than a little colder. This doesn't mean that global warming is "good". But you shouldn't dismiss the positive changes out of hand when evaluating the future pros and cons.
roland
You are operating under the assumption that warmer implies more tropics. I categorize this as wishful thinking.
Jack
Then you aren't listening enough, I'm afraid. This is a routine concern.
roland
Right. And the problem is not only the concerns we have but those we don't have. The unknown unknowns. The climate system is so complex that no one can predict the outcome of further warming. What we know is scary enough to be worried.
cousin_it
Huh? The Earth once was significantly warmer than now with no runaway consequences.
taw
The truth value of the statement "global warming has a serious chance of destroying the world" is entirely unverifiable to an average person. The media are saying it's true, quoting many leading scientists and politicians. By what mechanism do you suggest people reach an alternative conclusion? There's the outside view, but it's not accepted even here, as many have the same world-destroying beliefs about AI, and countless other subjects.
toto
1. I can't remember anybody stating that "global warming has a serious chance of destroying the world". The world is a pretty big ball of iron. I doubt even a 10K warming would have much of an impact on it, and I don't think anybody said it would - not even Al Gore.

2. I can remember many people saying that "man-made global warming has a serious chance of causing large disruption and suffering to extant human societies", or something to that effect.

3. If I try to apply "reference class forecasting" to this subject, my suggested reference class is "quantitative predictions consistently supported by a large majority of scientists, disputed by a handful of specialists and a sizeable number of non-specialists/non-scientists".

4. More generally, reference class forecasting doesn't seem to help much in stomping out bias, since biases affect the choice and delineation of which reference classes we use anyway.
knb
Well, I do recall a scientist using explicit "save the world"/"destroy the world" rhetoric. Of course this was rhetoric, not a scientific claim. A lot of non-scientist environmentalists do seem to think that global warming threatens the whole biosphere, though that seems very implausible based on what I know.
Kevin
I think mass video advertising would probably work. A 30 second summary of the most recent IPCC report. If anyone reading this has $5 million, we can convince the US public that global warming will not destroy the world. Maybe we can even hint at some more realistic existential threats.
Alex Flint
Please define "destroy the world". It could mean:

  • rip the earth into chunks and disperse them separately through space

  • eliminate all life on earth

  • eliminate all humans on earth

  • ruin the fabric of our current civilization

Some of these are more plausible than others w.r.t. climate change.
Kevin
Destroy the world means somewhere between #3 and #4 depending on the person with the mistaken belief.
eirenicon
I think when they say "the world" they mean "our world", as in "the world we are able to live in", and on that front, we're probably already screwed.

Since it's science, I think a discussion of Kuhn's paradigm shifts would be relevant here.

People, maybe especially scientists, don't like accepting new information until so much new data builds up that a new paradigm is required.

Another possible explanation is that there is so much (good and especially bad) scientific data being produced and published, it's almost better to wait a few years before believing anything new you read, and entering it into the textbooks.

PhilGoetz
Kuhn's paradigm shifts deal with the problem of diffusing new scientific theories. But I think it's a different issue. Kuhn claims that paradigm shift is difficult because people on different sides of the shift speak in incommensurate terms, and so can't understand each other. In the case of Broca's/Wernicke's areas, there's no problem communicating between different paradigms or even vocabularies.
MichaelVassar
The terminological problem is a BIG problem, especially if we are aspiring to Aumann Agreement, but the term paradigm shift is also used to describe changes that don't have that property.
PhilGoetz
I don't think this involves a paradigm shift of any kind. It's a matter of looking at a lot of brains and seeing where they are damaged.
MichaelVassar
Agreed. I meant more generally.
Zachary_Kurtz
Kuhn may be wrong here. If two people are speaking with the same terms but using them to mean different things, it might be more difficult (not less) to accept new scientific data, especially if such terms are highly associated with outdated scientific 'facts.'
Jack
Whether or not a particular brain region is related to language function isn't close to the kind of thing that would be ignored because of the existing paradigm.

The idea of Wikipedia is that people put new knowledge into it. Unfortunately, Wikipedia has quite a few social problems that discourage academics from participating.

Scholarpedia.org and Citizendium.org are projects that try to get it right.

knb
To what problems are you referring? Academics can edit Wikipedia like anyone else.
Paul Crowley
This is exactly the problem :-)
ChristianKl
Winning a Wikipedia discussion when you disagree with what the textbook says - and are right, but the average Wikipedia editor thinks you aren't - is a painful process. A high school student with a textbook on his side and enough time has a good chance of winning a discussion against a good academic who knows something about the topic.
Paul Crowley
Unfortunately, that's as it has to be; you can't win unless there's a source that trumps the textbook, no matter what your expertise. As I said elsewhere, an expert can sometimes help more by providing that source than by direct editing.
pdf23ds
Here's a fairly representative Crooked Timber thread. (Crooked Timber is a blog run more or less exclusively by academics.)

When you ask yourself whether Al Gore invented the internet, it isn't as straightforward, because you have to ask what you mean by the term. Inventing the internet wasn't only a technical problem; it also had to do with getting the regulation right. Rereading GURPS Cyberpunk always reminds me that it's quite a success that you don't have to pay more money to send a megabyte to another continent than to send it to your neighbor, while you do have to pay more to make an intercontinental phone call. Al Gore claims credit for producing a regulatory environment that allowed the internet as we know it today.

Paul Crowley
The question is whether he said he invented the Internet. He didn't.
Thomas
I remember him saying that. It was on CNN live. Pat Buchanan was also there and laughed. I don't remember whether it was The McLaughlin Group or something else.
Paul Crowley
It's pretty well documented that he didn't say that - check the link. It's likely you're remembering something similar but different.
Jack
Actually it wouldn't be that surprising if a lot of people thought they remembered hearing it.
RobinZ

Many magicians have specifically stated that people have reported seeing things in their act that they didn't do. James (The Amazing) Randi tells one such story:

Many years back, I appeared on the NBC-TV Today Show doing a "survival" stunt in which I was sealed into a metal coffin in a swimming pool in an admitted and since-regretted attempt to out-do Harry Houdini, who had performed the stunt at the same location back in 1926. Yes, I beat his time, but I was much younger than he had been when he performed it. I had listened to an agent who assured me that this was the way to go. I have been ever since trying to forget my disrespect for Houdini. But this digression is not pertinent to the main story here.

A week following that TV show, I was standing out on Fifth Avenue in a pouring rain, supervising through a window the arrangement of a display of handcuffs and other paraphernalia that I'd loaned to a bank for an eye-catching advertisement. My raincoat collar was up about my ears, and I could thus not be easily recognized. I was astonished when an NBC director, Paul Cunningham, who had been in charge of my swimming-pool appearance, happened by. We were adjacent to the NB

...
whpearson
Ulric Neisser is probably the person to start with on this type of false/malleable memory problem. I have only found interesting things on Google Book Search, which do not lend themselves to quoting. Also, I couldn't find many more recent studies in this field.
Thomas
http://www.youtube.com/watch?v=BnFJ8cHAlco This was repeated at least once.
Paul Crowley
The video backs up the account set out in the link I keep asking you to read.
RobinZ
Did you watch the video? He says "creating", as it says in the Snopes link. It's not a brilliant wording, but it's a reasonable misstatement.
randallsquared
Also, I think it's clear that most people use "create" and "invent" as synonyms most of the time, especially when the object being invented or created is one-of-a-kind (which is believed to be true for the internet by most of the public, though it isn't strictly true). So, when someone says, "Al Gore says he invented the internet" and you say, "No, he said 'took the initiative in creating the internet' ", that someone is likely to say, "Why say 'no' if you're going to agree?"
Paul Crowley
Of course, if "created" sounded as silly as "invented", then we'd see both in accounts making fun of him; but the meme didn't take off until "invented" was substituted.
Cyan
This is either a deliberate troll or an individual who, for whatever reason, cannot assimilate new information.
Paul Crowley
Don't think it's a deliberate troll - check the account, a history of what I think are sincere but sadly rather low-quality comments.
Zack_M_Davis
Is it a success? If sending a megabyte to another continent costs more, then it's plausible that things would work better if the sender actually pays that cost. We may not know what we're missing.
pdf23ds
This doesn't directly address your comment; it just came to mind. Girl Arrested for Charging Cell Phone (link Not Safe for Work)