Comment author: AlexSchell 27 May 2014 04:49:42PM *  6 points

The SAT is highly g-loaded, so it is susceptible to the same practice effect as IQ tests in general. If you look at SAT/IQ tables, the 20-40 point increase Tyler cites corresponds to about 1.5-3 IQ points, which is consistent with the typical magnitude (< 5 points) of practice effects on IQ test scores. Your "hundreds of points" is wildly inconsistent with this. The only way I could see that happening is if a large part of the SAT tested skills that can be practiced but don't correlate with g. Not very likely.
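For concreteness, the 20-40 SAT points ≈ 1.5-3 IQ points conversion falls out of a simple rescaling between the two score distributions. This is a back-of-the-envelope sketch, not the comment's actual SAT/IQ tables: it assumes the combined SAT has a standard deviation of roughly 200 points and IQ is scaled to SD 15, and that the mapping is purely proportional.

```python
# Back-of-the-envelope conversion of an SAT score gain into IQ points.
# Assumptions (mine, not from the cited tables): combined SAT SD ~ 200,
# IQ SD = 15, and a purely proportional mapping between the two scales.

SAT_SD = 200  # assumed SD of combined SAT scores
IQ_SD = 15    # IQ scale is defined with SD 15

def sat_gain_to_iq(sat_points):
    """Convert an SAT score gain to the equivalent gain in IQ points."""
    return sat_points / SAT_SD * IQ_SD

print(sat_gain_to_iq(20))  # 1.5
print(sat_gain_to_iq(40))  # 3.0
```

By the same proportionality, a "several hundred point" gain (say 300 points) would correspond to over 20 IQ points, which is why it sits so far outside the usual practice-effect range.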

The way to reconcile your experience with the evidence is to note that the score on a low-stakes practice test is simply not comparable to the score on the real thing (with or without test prep). It's not implausible that, say, 10% of people (more than enough to account for your anecdotes) score at least 100 points lower on an early practice test than they could on the real thing at the same level of preparation. It's hard to trick your brain into treating something as high-stakes when it isn't.

ETA: On reflection, the low-stakes hypothesis probably doesn't account for too much of the puzzle. In particular, it doesn't explain any gain between consecutive low-stakes practice tests. I think James Miller's explanation takes the cake. The SAT-g correlation is likely a lot lower for a population not proficient in English.

Comment author: Dirac_Delta 28 May 2014 08:43:19AM 0 points

The way to reconcile your experience with the evidence is to note that the score on a low-stakes practice test is just not comparable to the score on the real thing (with or without test prep).

This is one possibility. Thanks for bringing it up.

Comment author: Dirac_Delta 27 May 2014 12:45:28PM *  21 points

Hello. I've been a lurker here for quite some time now, but this is the first time I am making an appearance. I would like to consult everyone here regarding what I perceive to be irrationality on my part. I hope that you will be patient towards me and refrain from downvoting out of irritation, as I would prefer not to have my comment hidden, since that would greatly reduce my chances of getting feedback.

The issue is this: while I am fully aware that anecdotes do not constitute data, I have a very difficult time believing that test preparation has only a modest positive effect (if any) on SAT scores, even though several studies have found exactly that. Such a finding is completely incongruent with my experience growing up in an East Asian country where most students, regardless of socioeconomic status, attend cram schools or hire after-school private tutors -- many of these students perform much better than they otherwise would have, thanks to the extra lessons and revision. (They usually test themselves on several old SAT papers before sitting for the actual exam, and the gains are often very significant -- sometimes as high as several hundred points.)

So, my request to my fellow LW readers is this: please help me reconcile this jarring discrepancy between the research findings and my own personal experience. There may be some information I am missing that is contributing to my inability to believe the findings. Has anyone here done extensive reading on this topic?
