All of ChrisG's Comments + Replies

ChrisG00

By guessing 100, one is being dishonest.

No, it is not, especially since cousin_it has told us upfront what he values. You are assuming that everyone who submits has a utility function that highly values winning this game, which, given the comments around here, seems not to be true (or is at least widely believed not to be true).

Don't confuse 'has different values than I do' with 'irrational'.

ChrisG40

I have also seen instances where nearly an entire field is making some elementary error, which people outside that field can see more clearly, but which they can't communicate to people in that field because they would have to spend years learning enough about the field to write a paper, probably with half a year's worth of experimental work, and not get rejected, even if their insight is something that could be communicated in a single sentence.

I for one would be interested in hearing these sentences, and also which fields you feel are being held back by simple errors of logic. The margins here are quite large ;).

7PhilGoetz
Some examples off the top of my head:

Rodney Brooks and others published many papers in the 1980s on reactive robotics. (Yes, reactive robotics are useful for some tasks; but the claims being made around 1990 were that non-symbolic, non-representational AI was better than representational AI at just about everything and could now replace it.) Psychologists and linguists could immediately see that the reactive-behavior literature was chock-full of the same mistakes that had been pointed out in behavioral psychology in the decade after 1956 (see, e.g., Noam Chomsky's article on Skinner's Verbal Behavior).

To be fair, I'll give an example involving Chomsky on the receiving end: Chomsky prominently and repeatedly claims that children are not exposed to enough language to get enough information to learn a grammar. This claim is the basis of an entire school of linguistic thought that says there must be a universal human grammar built into the human brain at birth. It is trivial to demonstrate that it is wrong: take a large grammar, such as one used by any NLP program (and, yes, they can handle most of the grammar of a six-year-old), and compute the amount of information needed to specify that grammar; then compute the amount of information present in, say, a book. Even before you adjust your estimate of the information needed to specify a grammar by dividing by the number of adequate, nearly-equivalent grammars (which reduces the information needed by orders of magnitude), you find you only need a few books' worth of information. But linguists don't know information theory very well.

Chomsky also claims that, based on the number of words children learn per day, they must be able to learn a word on a single exposure to it. This assumes that a child can work on only one word at a time, and cannot remember anything about any other words it hears until it learns that word. As far as I know, no linguist has yet noticed this assumption. In the field of sciencology?,
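The grammar-vs-book comparison can be sketched as a back-of-envelope calculation. Every number below (rule count, bits per rule, book length) is an illustrative assumption, not a measured value; the ~1 bit per character figure is Shannon's well-known estimate of the entropy of English text. The point is only the shape of the argument: even generous estimates for the grammar come out on the order of a couple of books.

```python
# Back-of-envelope version of the argument above.
# All constants are assumptions chosen for illustration.

RULES_IN_GRAMMAR = 5_000   # assumed rule count for a broad-coverage NLP grammar
BITS_PER_RULE = 200        # assumed bits to encode one rule (symbols, features)
grammar_bits = RULES_IN_GRAMMAR * BITS_PER_RULE

CHARS_PER_BOOK = 500_000   # a modest-length book, in characters
BITS_PER_CHAR = 1.0        # Shannon's estimate of English entropy, ~1 bit/char
book_bits = CHARS_PER_BOOK * BITS_PER_CHAR

# Books of text needed, before dividing by the number of
# adequate, nearly-equivalent grammars (which only shrinks this).
books_needed = grammar_bits / book_bits

print(f"grammar: {grammar_bits:,.0f} bits")
print(f"one book: {book_bits:,.0f} bits")
print(f"books needed: {books_needed:.1f}")
```

Under these (deliberately rough) assumptions the upper bound is two books; adjusting any constant by a factor of a few still leaves the required input well within what a child hears.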