Kaj_Sotala comments on A Rational Education - Less Wrong

12 Post author: wedrifid 23 June 2010 05:48AM

Comments (149)

You are viewing a single comment's thread. Show more comments above.

Comment author: Kaj_Sotala 23 June 2010 09:05:31PM 0 points

But note that being a good researcher does not automatically translate to also being a good teacher. I'd put less emphasis on how many citations they have and more on how good they are at actually teaching.

To find out how good someone is at teaching, you can use a resource like http://www.ratemyprofessors.com/ (if you live in the right country, which I don't) or simply ask around.

Comment author: Douglas_Knight 24 June 2010 12:32:46AM 2 points

This paper is widely reported as saying that student evaluations anti-correlate with performance in later classes. I haven't read the paper, but I suspect that may oversimplify its claim.

You might expect this result if popular teachers are easy and don't push their students, but that's definitely not what's happening at this military academy, which has a uniform curriculum. And if what's popularly perceived as the dominant force in (American) evaluations has been eliminated, it's not clear how much this tells us about other American schools.

Comment author: magfrump 24 June 2010 02:09:47PM 0 points

A casual glance at the abstract leads me to read the paper's conclusion more as "teachers who have easy classes and teach to the test provide worse foundations and get better evaluations." This seems like a fairly likely hypothesis that would explain some of the correlation, and some evidence for it could be gathered from ratemyprofessors.

I'll read it further when I have time to check for things like linear regression.

ETA: that study looks really good. I am curious how the data would be affected if students consciously rated easiness separately.

Comment author: Kazuo_Thow 23 June 2010 09:54:38PM 2 points

I've gotten into the habit of pointing out, whenever other students at my university make reference to ratemyprofessors.com, that the selection bias on that site is huge. It's not uncommon to see professors with dozens of extremely positive reviews, dozens more highly negative reviews, and very few (if any) neutral reviews. Naturally, the negative reviews appear most frequently because "grr, I feel like this professor graded too harshly" provides the strongest motivation for posting a disgruntled comment.
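The mechanism described above can be sketched as a toy simulation (a hypothetical model I'm supplying for illustration, not actual ratemyprofessors data): if moderate experiences are the most common but extreme experiences are the most likely to motivate a posted review, the posted ratings end up with a dip at neutral even though the underlying population peaks there.

```python
import random

def simulate_ratings(n_students=100_000, seed=0):
    """Toy model of selection bias in voluntary ratings.

    Each student has a true satisfaction score from 1 to 5, with
    moderate scores most common (triangular, peaked at 3). The chance
    of actually posting a review grows with how extreme the experience
    was, so posted reviews thin out at neutral even though the
    underlying population clusters there.
    """
    rng = random.Random(seed)
    true_scores, posted = [], []
    for _ in range(n_students):
        # Underlying satisfaction: triangular distribution on [1, 5],
        # mode at the neutral score 3, rounded to a whole-star rating.
        score = round(rng.triangular(1, 5, 3))
        true_scores.append(score)
        # Assumed posting probability: 5% for a neutral experience,
        # rising by 20 points per star of distance from neutral.
        p_post = 0.05 + 0.20 * abs(score - 3)
        if rng.random() < p_post:
            posted.append(score)
    return true_scores, posted

def counts(scores):
    """Tally how many ratings landed on each star value 1..5."""
    return {s: scores.count(s) for s in range(1, 6)}

true_scores, posted = simulate_ratings()
print("population:", counts(true_scores))
print("posted:    ", counts(posted))
```

Under these assumed numbers, the population distribution peaks at 3, while the posted reviews peak on either side of neutral, which is the bimodal pattern described above.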

I don't know of any other place that does this, but the University of Washington maintains a course evaluation system (with data made available to all students) to gather quarterly feedback on the performance of professors and TAs, in such a way that at most ~5% of students fail to fill out the questionnaires.

Comment author: magfrump 24 June 2010 02:00:31PM 1 point

CSUs and UCs do this (or at least the ones I've attended do); while these evals might be less biased, they are more than proportionately less accessible.

Also, ratemyprofessors.com has separate ratings for "easiness," "enthusiasm," etc., so rather than looking only at the "highest rated" professors, reading the actual reviews would be a bit more informative.

Comment author: Kazuo_Thow 24 June 2010 05:52:02PM 0 points

while these evals might be less biased they are more than proportionately less accessible.

How so?

Comment author: magfrump 24 June 2010 09:41:00PM 0 points

Compared with ratemyprofessors, which is available to everyone online, I don't think the evaluations written by students (at least in California) are publicly available at all. I could be wrong, but I don't know of anyone who has ever seen one (other than the person being evaluated).