Problems in Education

31 ThinkOfTheChildren 29 April 2013 09:00PM

Alright guys. The main complaint of the discussion article was simply "hoax", yelled as loudly or as quietly as the user felt about it. Hopefully this won't get the same treatment.

We have been evaluating grant-funded educational programs for 20 years. Throughout these years, we have witnessed a slow change in how students are selected for academic services. Traditionally, students were targeted for academic services and opportunities based on demographic characteristics—usually race and, until recently, family income status (based on free or reduced-price lunch). Wealthier white students are given challenging lessons and tracked into advanced courses, while their non-white and poorer peers are tracked low and given remediation services. The latter students are often referred to as “at-risk,” though we are finding more and more that the greatest risk these students face is being placed into inappropriate remedial courses that eventually bar them from access to advanced courses. After students have been labeled “at-risk,” tracked inappropriately, and provided unnecessary (and often harmful) remediation, their downward trajectory continues throughout their education. The demographic gap this creates continues to expand, despite the lip service and the extensive tax and grant funds spent to eliminate—or at least lessen—this very gap. This “at-risk” model of assigning services is slowly being replaced by a “pro-equity” model. The driving force behind this change is the availability and use of data.

The literature is full of documentation that certain demographic groups have traditionally had less access to advanced math and science courses than equally scoring students belonging to demographic groups thought to be “not at risk.” Some examples from research follow.
•    Sixth grade course placement is the main predictor of eighth grade course placement, and social factors—mainly race—are key predictors of sixth grade course placement (O’Connor, Lewis, & Mueller, 2007).
•    Among low-income students, little is done to assess which are high achievers. Few programs are aimed at them, and their numbers are lumped in with “adequate” achievers in No Child Left Behind reporting. As a result, little is known about effective practices for low-income students (Wyner, Bridgeland, & DiIulio  Jr., 2007).
•    In a California school district, researchers found that of students who demonstrated the ability to be admitted to algebra, 100% of the Asians, 88% of the whites, 51% of the Blacks, and 42% of the Latinos were admitted (Stone & Turba, 1999).
•    Tracking has been described as “a backdoor device for sorting students by race and class.” Many researchers agree (Abu El-Haj & Rubin, 2009).
•    When course grades are used to determine placement, studies show that some students’ grades “matter” more than others. Perceptions of race and social class are often used to determine placement (Mayer, 2008).
•    Studies show that when schools allow students the freedom to choose which track they’ll take, teachers and counselors discourage many previously lower tracked students from choosing the higher track (Yonezawa, Wells, & Serna, 2002).
•    The sequence of math students take in middle school essentially determines their math track for high school. In North Carolina, this is true because of math prerequisites for higher level math (North Carolina Department of Public Instruction, 2009).

We are seeing a move toward using objective data for placement into gateway courses, such as 8th grade algebra. Many school districts are beginning to use the Education Value-Added Assessment System (EVAAS) and other data-system scores that predict success in 8th grade algebra as enrollment criteria. This pro-equity model is replacing the traditional at-risk model that relied on professional judgment. One example is Wake County, North Carolina, where Superintendent Tony Tata attributed a 44% increase in the number of students enrolled in algebra to the use of the predictive software EVAAS to identify students likely to be successful. The success rate in the course increased with the addition of these students (KeungHu, 2012).

Although the pro-equity model of using objective data to assign students to more rigorous courses has proven successful, many people resist it, clinging to the at-risk model and dismissing the objective data as inconclusive. According to old-school staff, the overlooked students who were predicted to succeed yet were placed in lower tracks (disproportionately minorities) are “weaker,” and allowing these students into the gateway 8th-grade algebra course would be a disservice to them. (Not allowing them into this course ensures their bleak academic future.) Review of the data has shown that strong students were being overlooked, and this objective use of data helps identify them (Sanders, Rivers, Enck, Leandro, & White, 2009).

The changes in education began with concern for aligning academic services with academic need; aligning opportunities for rigor and enrichment is only just beginning. In the past, a large proportion of federal grant funds went toward raising proficiency rates. In the at-risk model, grant funds provided services to minority and poor demographic groups with the goal of raising academic proficiency rates. When we first started evaluating grant-funded programs, most federal grants operated entirely in the at-risk model: students were targeted for services based on demographic characteristics, and the goal was simply to deliver the services to this group. Staff development was often designed to help staff understand children in poverty and what their lives are like, rather than to help them learn how to deliver an effective reading or math intervention. The accountability reports we were hired to write consisted of documentation that the correct demographic group was served, the program was delivered, and staff received their professional development. Proficiency rates were rarely a concern.

In 2004, the federal government developed the Program Assessment Rating Tool (PART) to provide accountability for grant-funded programs by rating their effectiveness. The PART system assigned scores to programs based on whether services were related to goals, whether the goals were appropriate for the individuals served, and whether student success was measured against quality standards and assessments. Programs that could not demonstrate whether they had been effective, because of a lack of data or clear performance goals, received the rating “Results Not Demonstrated” (U.S. Office of Management and Budget and Federal Agencies, n.d., “The Program Assessment Rating Tool”). In 2009, nearly half (47%) of the U.S. Department of Education grant programs rated by the government were given this rating, illustrating the difficulty of the transition to outcome-based accountability (U.S. Office of Management and Budget and Federal Agencies, n.d., “Department of Education programs”). The earliest changes were in accountability, not in program services or how students were targeted. Accountability reports began asking for pre- and post-comparisons of academic scores. For example, if funds were for raising proficiency rates in reading, then evaluation reports were required to compare pre- and post-reading scores. This was a confusing period: programs still targeted students based on demographic information and provided services that often had no research basis linking them to academic achievement, and professional development often remained focused on empathizing with children in poverty, although the goals and objectives would now be written in terms of participants raising their academic achievement to proficiency. We evaluators were often called in at the conclusion of programs to compare pre- and post-academic scores and determine whether participants had improved their scores to grade-level proficiency.
We often saw the results: capable students treated like low achievers, assumed to have no self-esteem, and given remedial work. Such treatment damaged participants who had scored at or above proficient prior to services.

A typical narrative of an evaluation might read:

The goal of the program was to raise the percentage of students scoring proficient in reading. The program targeted and served low-income and minority students. Staff received professional development on understanding poor children. Services offered to students included remedial tutorials and esteem-building activities. When the program ended, pre-service reading scores were obtained and compared with post-service scores to measure progress toward the program objective. At that time, it was discovered that a large percentage of participants had been proficient prior to receiving services.

Rather than cite our own evaluations, we present examples from school districts reporting on themselves.

Accelerated Learning Program

The following is a direct quote from a school system in North Carolina:

. . . Although ALP [Accelerated Learning Program] was designed primarily to help students reach proficiency as measured by End-of-Grade (EOG) tests, only 41.1% of those served showed below-grade-level scores on standard tests before service in literacy. In mathematics, 73.3% of students served had below-grade-level scores. ALP served about 40% of students who scored below grade level within literacy and within mathematics, with other services supporting many others. . . . Compared to those not served, results for Level I-II students were similar, but results for Level III-IV students were less positive. One third of non-proficient ALP mathematics students reached proficiency in 2008, compared to 42.1% of other students. (Lougee & Baenen, 2009).

Foundations of Algebra

This program was designed for students who fit specific criteria, yet it served many who did not. Students who were below proficient or almost proficient were to be placed in courses that would eventually prepare them for Algebra I. When criteria for placement are not met, determining program effectiveness is difficult, if not impossible. Students were likely entered into the program based on teacher recommendations, which were in turn based on demographic factors such as race: teachers “mistook” these students for below-proficient students when they were not. Had objective data, such as actual proficiency scores, been consulted, the proper students could have been served. The report indicates a success, as a higher percentage of served students than of similar unserved students enrolled in Algebra I. However, it is not known whether the comparison group includes only students who actually met the criteria, or a heterogeneous mix of students of varying abilities. Missing data also makes evaluating program effectiveness difficult (Paeplow, 2010).

Partnership for Educational Success (PES)

This program was purportedly for “at-risk” students, defined as students who scored below grade level on EOG tests (below proficiency) and were “identified by the PES team as having family issues that interfere with school success.” What is meant by “family issues” is unclear. The majority of students served were Economically Disadvantaged (ED) (91.3%) and Black (71.5%). More than half of the students served, according to the evaluation, were at or above grade level on their EOGs when they began the program, making program effectiveness difficult to judge. The family component is an integral part of the program: outside agencies visit families, and many community organizations are involved. But if the staff could miss so easy a datum as EOG scores for so many students, one has to wonder about so subjective a criterion as “family issues.” The program appears to have targeted ED students with little regard to prior performance data. Data for many students (43.5%) were missing. Teachers indicate that parents of the targeted families have become more involved in the school, but little else has changed (Harlow & Baenen, 2004).

Helping Hands

Helping Hands was initiated based on data indicating that Black males lag behind other groups in academic achievement. The program is supposed to serve Black males, and most of the participants fit those criteria. The program is also designed to improve academics and to curtail absenteeism and suspensions. Although the percentage of selected participants who needed improvement in these areas was higher than in the overall student population, not all students served demonstrated a need for intervention. Many were at grade level, were not chronically absent, and had not been suspended. Yet they were served because they were Black and male (Paeplow, 2009).

Supplemental Educational Services at Hodge Road Elementary

At Hodge Road Elementary School, students were tutored with remedial work in an after-school program. The only criterion students had to meet to be admitted to the program was the inability to pay full price for lunch; their academic performance was irrelevant. (To be fair, these criteria were instituted by No Child Left Behind, not by the school system.) Most students were already reading and doing math at or above grade level (the two subjects for which tutoring was provided). The evaluation shows that giving remedial coursework to students who are at or above grade level, as if they were below grade level, can actually harm them. In the final statistics, 11.1% of Level III-IV 3rd through 5th graders scored below grade level after being served, compared with only 2% of a comparable group who were not served. An astonishing 23% of served kindergarten through 2nd grade students who were at or above grade level prior to the tutoring scored below grade level afterward, compared with 8% of comparable students who were not served (Paeplow & Baenen, 2006).
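Taking the reported percentages at face value, the relative rates are easy to compute. This is arithmetic on the published figures only, not a reanalysis of the underlying data:

```python
# Rates of previously-proficient students falling below grade level after
# the program, served vs. not served (Paeplow & Baenen, 2006), in percent.
served_3_5, unserved_3_5 = 11.1, 2.0   # grades 3-5
served_k_2, unserved_k_2 = 23.0, 8.0   # grades K-2

# Served students fell below grade level several times as often:
print(f"{served_3_5 / unserved_3_5:.2f}x")  # 5.55x
print(f"{served_k_2 / unserved_k_2:.2f}x")  # 2.88x
```

In other words, among students who started at or above grade level, the served group was roughly three to five times as likely to end up below grade level as the comparable unserved group.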

AVID

AVID is a program designed for students who may be the first in their families to attend college and who are average academic performers; participants must have a GPA of 2.0-3.5. The program, developed in the 1980s, maintains that by providing support while holding students to high academic standards, the achievement gap will narrow as students succeed academically and go on to complete higher education. Fidelity of implementation is often violated, which, as proponents admit on AVID’s own website (www.AVID.org), may compromise the entire program. We were asked to evaluate the Wake County Public School System’s AVID program. Many students chosen for the program, however, did not fit the criteria (Lougee & Baenen, 2008). Because AVID requirements were not met, a meaningful evaluation was not possible.

This AVID program was implemented with the goal of increasing the number of under-represented students in 8th grade algebra. This was at a time when no criteria for enrollment in 8th grade algebra existed (i.e., there was no target to help the students reach), and high-scoring students in this very group were not being referred for enrollment in algebra. Under these conditions, the program makes no sense. In summary: the goal of the program was to enroll more low-income students, minority students, and students whose parents didn’t go to college in 8th grade algebra. Only students recommended by teachers could enroll in 8th grade algebra, and the data showed that very high-scoring, low-income and minority students were not being recommended. Why do we think that students whose parents didn’t go to college can’t enroll in 8th grade algebra without being in an intervention program first? (How it is determined that a student’s parents did not attend college is also not addressed.) The program is for low-average students; it served high-average students, and then still didn’t recommend them for 8th grade algebra. The program is very expensive. We have evaluated it in many school districts and typically find the same results as this report.

During this era, interventions typically were not linked to the desired outcomes by research. For example, self-esteem-building activities were often provided to increase the odds of passing a math class or to improve reading scores. Sometimes programs were academic, but claims of success were not research-based, nor was the relationship between the activities and the desired outcomes. Although many interventions were at least related to the academic subject area the program was trying to impact, it was not unheard of to see relaxation courses alone offered to increase math test scores, or makeovers and glamour shots offered to raise self-esteem, which in turn would allegedly raise reading scores.

During the last decade, education has slowly moved toward accountability in the form of pre- and post-score comparisons. We saw this cause confusion and fear rather than clarity. More than once, when we reported to school districts that they had served significant numbers of students who were already at or above proficiency, they thought we were saying they had served high-income students instead of their target population of low-income students. We have seen many school systems assess their own programs, write evaluation reports like the examples above, and then continue to implement the programs without any changes. We have also worked with educators whose eyes were opened to the misalignment of services and needs; they learned to use data, to identify appropriate interventions, and to keep records that make accountability possible. We’ve seen these innovators close their achievement gaps while raising achievement at the top. But those around them didn’t see this as replicable.

Race to the Top will affect the rate of change from the at-risk to the pro-equity model. Teacher and principal evaluations are going to include measures of growth in student learning (White House Office of the Press Secretary, 2009). EVAAS will be used to compare predicted scores with observed scores. If high-achieving students who are predicted to succeed in 8th grade algebra are instead tracked into the less rigorous 9th grade algebra, they are not likely to make their predicted growth.

We are moving out of this era, and the pace of change toward identifying student needs using appropriate data is picking up. North Carolina’s newly legislated program, Read to Achieve, mandates that reading interventions for students in K-3 be aligned to the literacy skills the students struggle with, and that data be used to determine whether students are struggling with literacy skills. Schools must also keep records for accountability. Although this approach seems logical, it is quite innovative compared with the past reading interventions that targeted the wrong students (North Carolina State Board of Education; Department of Public Instruction, n.d.).

Education grant programs now require applicants to specify what data they will use to identify their target population and how the intervention relates to helping participants achieve the program goals. Staff development must relate to delivering the services well, and accountability must show that all of these things happened correctly, while documenting progress toward the program objectives. It is a new era. We are not there yet, but it is coming.

References
Harlow, K., & Baenen, N. (2004). E & R Report No. 04.09: Partnership for Educational Success 2002-03: Implementation and outcomes. Raleigh, NC: Wake County Public School System. Retrieved from http://www.wcpss.net/evaluation-research/reports/2004/0409partnership_edu.pdf
KeungHu. (2012). Wake County Superintendent Tony Tata on gains in Algebra I enrollment and proficiency. Retrieved from http://blogs.newsobserver.com/wakeed/wake-county-superintendent-tony-tata-on-gains-in-algebra-i-enrollment-and-proficiency
Lougee, A., & Baenen, N. (2008). E & R Report No. 08.07: Advancement Via Individual Determination (AVID): WCPSS Program Evaluation. Retrieved from http://www.wcpss.net/evaluation-research/reports/2008/0807avid.pdf
Lougee, A., & Baenen, N. (2009). E&R Report No. 09.27: Accelerated Learning Program (ALP) grades 3-5: Evaluation 2007-08. Retrieved from http://www.wcpss.net/evaluation-research/reports/2009/0927alp3-5_2008.pdf
Mayer, A. (2008). Understanding how U.S. secondary schools sort students for instructional purposes: Are all students being served equally? American Secondary Education, 36(2), 7–25.
North Carolina Department of Public Instruction. (2009). Course and credit requirements. Retrieved from http://www.ncpublicschools.org/curriculum/graduation
North Carolina State Board of Education; Department of Public Instruction. (n.d.). North Carolina Read to Achieve: A guide to implementing House Bill 950/S.L. 2012-142 Section 7A. Retrieved from https://eboard.eboardsolutions.com/Meetings/Attachment.aspx?S=10399&AID=11774&MID=783
O’Connor, C., Lewis, A., & Mueller, J. (2007). Researching “Black” educational experiences and outcomes: Theoretical and methodological considerations. Educational Researcher. Retrieved from http://www.sociology.emory.edu/downloads/O%5c’Connor_Lewis_Mueller_2007_Researching_black_educational_experiences_and_outcomes_theoretical_and_methodological_considerations.pdf
Paeplow, C. (2009). E & R Report No. 09.30: Intervention months grades 6-8: Elective results 2008-09. Raleigh, NC: Wake County Public School System. Retrieved from http://www.wcpss.net/evaluation-research/reports/2009/0930imonths6-8.pdf
Paeplow, C. (2010). E & R Report No. 10.28: Foundations of Algebra: 2009-10. Raleigh, NC: Wake County Public School System. Retrieved from http://assignment.wcpss.net/results/reports/2011/1028foa2010.pdf
Paeplow, C., & Baenen, N. (2006). E & R Report No. 06.09: Evaluation of Supplemental Educational Services at Hodge Road Elementary School 2005-06. Raleigh. Retrieved from http://www.wcpss.net/evaluation-research/reports/2006/0609ses_hodge.pdf
Sanders, W. L., Rivers, J. C., Enck, S., Leandro, J. G., & White, J. (2009). Educational Policy Brief: SAS® Response to the “WCPSS E & R Comparison of SAS © EVAAS © Results and WCPSS Effectiveness Index Results,” Research Watch, E&R Report No. 09.11, March 2009. Cary, NC: SAS. Retrieved from http://content.news14.com/pdf/sas_report.pdf
Stone, C. B., & Turba, R. (1999). School counselors using technology for advocacy. Journal of Technology in Counseling. Retrieved from http://jtc.colstate.edu/vol1_1/advocacy.htm
U.S. Office of Management and Budget and Federal Agencies. (n.d.). The Program Assessment Rating Tool (PART). Retrieved from http://www.whitehouse.gov/omb/expectmore/part.html
U.S. Office of Management and Budget and Federal Agencies. (n.d.). Department of Education programs. Retrieved from http://www.whitehouse.gov/omb/expectmore/agency/018.html
White House Office of the Press Secretary. (2009). Fact Sheet: The Race to the Top. Washington D.C. Retrieved from http://www.whitehouse.gov/the-press-office/fact-sheet-race-top
Wyner, J. S., Bridgeland, J. M., & DiIulio Jr., J. J. (2007). Achievement trap: How America is failing millions of high-achieving students from low-income families. Jack Kent Cooke Foundation, Civic Enterprises, LLC. Retrieved from www.jkcf.org/assets/files/0000/0084/Achievement_Trap.pdf
Yonezawa, S., Wells, A. S., & Serna, I. (2002). Choosing tracks: “Freedom of choice” in detracking schools. American Educational Research Journal, 39(1), 37–67.

Comment author: atucker 09 April 2013 01:29:25AM *  1 point [-]

Has anyone published data on the effectiveness of Bayesian prediction models as an educational intervention? It seems like that would be very helpful in terms of being able to convince school districts to give them a shot.

Comment author: ThinkOfTheChildren 25 April 2013 08:08:12AM 0 points [-]

http://www.sas.com/govedu/edu/k12/evaas/papers.html

Quite a few things there. SAS's EVAAS is generally considered the gold standard of Bayesian prediction models as educational interventions; unfortunately, as SAS is based in North Carolina, it has yet to spread far outside that particular state. Some states have similar systems being produced by similar companies.

Particularly, if I were you I would read: http://www.sas.com/success/wf_rolesville.html

Comment author: James_Miller 09 April 2013 05:11:35AM *  11 points [-]

If you could prove this stuff you could become a hero to a lot of people.

Edit: I now think this post is probably a hoax. As EY writes "Your strength as a rationalist is your ability to be more confused by fiction than by reality."

Comment author: ThinkOfTheChildren 25 April 2013 08:05:30AM 0 points [-]

Please look through the comments where I have replied to criticisms; I have tried to find relevant citations.

Comment author: mwengler 09 April 2013 05:23:44PM 19 points [-]

My top candidates for what is up here are: 1) fabrication as part of a social experiment on how credulous we are 2) fabrication by a sociopath with a very odd idea of self-entertainment 3) incredibly erroneous interpretation of what is going on by a crank

But it is SO full of red flags that I would be surprised if it is not intentional. Call it 66% chance it is intentional hoax.

And it is so far from the mark of a true post that I would be very surprised if it had more than a glancing connection to the truth, call it 95% that it is barely connected to actual facts.

I have kids in California public schools. I have read, over the years, many critiques of public schools and public funding generally. As bad as things are, they are quite obviously nowhere near as bad as this article suggests in the schools my kids have gone to and are now going to. Further, I am quite good friends with a longtime teacher, administrator, and union officer in NYC. I by no means share her respect for the union and DO believe documented horror stories of "turkey farms" where truly impossibly bad teachers are stored while being paid rather than following the more expensive process of firing them. I do believe other horror stories. But I can tell you for sure, while things are not amazingly wonderful in California public schools, they are simply not even vaguely close to that bad in many real California schools I am exposed to.

So at bare minimum, if there is any truth to the allegations in the original post, the idea that these things are universal, or at least pervasive in American public schools is wrong.

Next argument: many of us reading this board, and even being taken in by this post, went through the American public school system ourselves, and by my standards, (I'm 55) many of you went through quite recently. Many of us, I dare say, were in advanced classes. Does the OP fit even vaguely with what you saw with your own eyes? It is miles from my 40 year old experiences.

Next, there is a thriving critique of public schools in this country. With the amount of negative attention public education has drawn, is it really plausible that NONE of this critique has discovered the depths of waste and stupidity described as routine by this post? It is not plausible to me.

Next, public spending and public education tends to be a pretty open process. If these are Government grants, there is a crap load of information that has to be public.

Finally, to make such extreme claims with absolutely NO linkage to any source other than the post itself, would require remarkable naivete about how an intelligent audience should perceive claims like this, an innocence which is belied by the beautiful craftsmanship of the post itself. Really, EVERY program discussed needs to be obfuscated? No agency involved can be mentioned?

I googled "black men ipad education grant" and hit nothing similar to the program the OP claimed.

The real question is how long before the trap is sprung and we are told we were naive to believe this at all and we are really no better than birthers and creationists when the story fits our fears. I think it is better than 50% we will get such a message, but we'll see.

Comment author: ThinkOfTheChildren 25 April 2013 08:02:10AM *  1 point [-]

I have kids in California public schools.

I have never worked in California, nor New York, and cannot speak for your experience.

Next argument: many of us reading this board, and even being taken in by this post, went through the American public school system ourselves, and by my standards, (I'm 55) many of you went through quite recently. Many of us, I dare say, were in advanced classes. Does the OP fit even vaguely with what you saw with your own eyes? It is miles from my 40 year old experiences.

Really? I myself went through the advanced classes. In my "Calculus AB" class, there were 28 whites, 1 hispanic, and 2 blacks. My school was probably around 30% black, 20% hispanic, 50% white. There are two possibilities. Either whites are (28/2) ÷ (50/30) ≈ 8.4 times more likely than blacks to be prepared for Calculus, or there is some kind of institutionalized racism going on.
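That arithmetic can be checked with a short script. The class counts and school shares are the ones given above; the total enrollment `N` is an arbitrary assumption that cancels out of the ratio:

```python
# How much more likely is a white student than a black student to be in
# this Calculus AB class, given the school's demographic makeup?
class_counts = {"white": 28, "hispanic": 1, "black": 2}
school_share = {"white": 0.50, "hispanic": 0.20, "black": 0.30}
N = 1000  # assumed total school enrollment; cancels out of the ratio

def enrollment_rate(group):
    # fraction of this group's students who are in the class
    return class_counts[group] / (school_share[group] * N)

ratio = enrollment_rate("white") / enrollment_rate("black")
print(f"{ratio:.1f}")  # 8.4
```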

Nobody issues grants to help the academically gifted kids who are already doing well. Most grants come as "dropout prevention grants," or are otherwise targeted at students unlikely to end up on LessWrong. So I would ask you: in your advanced math classes, were minorities represented in proportion to the school's population? Or was the ratio of the percentage of minorities in your school's population to the percentage of minorities in your advanced classes higher than 1? Perhaps higher than 2? For me it was about 5.

The real question is how long before the trap is sprung and we are told we were naive to believe this at all and we are really no better than birthers and creationists when the story fits our fears. I think it is better than 50% we will get such a message, but we'll see.

Eh. I wish citations were easier to find; it's kind of ridiculous, honestly. Just trying to find math placement criteria for any given school system on the internet is impossible, much less for a random assortment of school systems chosen so that my location stays anonymous.

Comment author: educationrealist 09 April 2013 01:46:28PM 21 points [-]

Man, I registered just so I could vote and then it turns out there's something called karma.

This post is almost entirely nonsense. I give it "almost" simply because in certain all-URM school districts the corruption level is high. It's within the realm of possibility that "fake grants" to "fake grant programs" exist that are nothing more than chump change doled out by large employers who can wave the program in front of Jesse Jackson and his ilk (look! We're providing gravy!), so I won't call it an outright lie. But it's certainly not the norm. Did you notice that this guy acts like the education world is composed solely of blacks and whites? If any element of his story is true, it's because he lives or works in an all-black school district that is, indeed, corrupt. Detroit, New Jersey somewhere, or the like. And that's a generous interpretation.

The second half of his post is so risible I'm amazed anyone takes it seriously. We live in a world where, as I write this, federal settlements are forced on schools that suspend or expel minorities at a higher rate, never mind the details, and anyone believes that schools assign classes by race? It's not just wrong. It's an outright LIE. Even in very rich schools that have low income URM students (and I can think of five within 20 miles of my home), the pressure to integrate classes when the kids are unprepared is huge. Principals are at risk for losing AP classes if they don't put enough URMs in them. They face lawsuits if they do use tests to assign kids to advanced classes, much less if they assigned by race. As for the idea that black students do well if the teachers like them there? Please. Teachers have next to no say as to their assignments---it's one area in which principals have a great deal of control.

Every word beginning with "unfortunately" is such a lie I'm astonished anyone would credit it.

Comment author: ThinkOfTheChildren 25 April 2013 07:40:17AM *  1 point [-]

I apologize for the late response.

As for the idea that black students do well if the teachers like them there? Please. Teachers have next to no say as to their assignments---it's one area in which principals have a great deal of control.

I do not know where you come from, but I have personally reviewed the math placement criteria of hundreds of middle schools and high schools. Teacher recommendations are always on the list, whereas I have never seen a school that used "principal recommendations." Wake County, NC's placement criteria: http://www.wcpss.net/policy-files/series/policies/5611-bp.html; Alamance County's placement criteria: http://tinyurl.com/d35dtfy. I will find more if you'd like me to, but teacher recommendations are plainly listed. In my experience, principals generally back their math teachers when it comes to which students get placed where.

We live in a world where, as I write this, federal settlements are forced on schools that suspend or expel minorities at a higher rate, never mind the details, and anyone believes that schools assign classes by race? It's not just wrong. It's an outright LIE.

The schools do not outright assign math placement based on race; it is slightly more subtle than that. Wake County, North Carolina is an example. Wake County used a model called the "effectiveness index," which predicts a student's score from: 1) his previous test scores, 2) his income level (three tiers: free lunch, reduced-price lunch, or neither), and 3) his race. If two students with exactly equal grades and test scores were run through the effectiveness index, one a poor black student and the other a middle-class white student, the former would be given a lower predicted score, and therefore would be less likely to be placed into an advanced class. These predictions were also used to judge how well a school is teaching. If the poor black student scored as well as the white student, the gap between his actual score and his predicted score (his residual) would be larger than the white student's, and so the school would be rewarded for overcoming the "risk factors" of being poor and black and managing to instruct him anyway. Wake County is currently doing away with the effectiveness index, replacing it with EVAAS, a system which takes nothing but test scores into account. Source: http://content.news14.com/pdf/sas_report.pdf
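Concretely, a residual model of this kind might look like the following sketch. To be clear: the coefficients, category labels, and function names here are entirely my own invention for illustration, not Wake County's actual formula.

```python
# Hypothetical sketch of a residual-style "effectiveness index".
# All numbers and labels below are invented for illustration;
# they are not the actual model's values.

def predicted_score(prior_score, lunch, race):
    """Predict a current test score from the prior score plus demographic adjustments."""
    lunch_adjustment = {"free": -5.0, "reduced": -2.5, "none": 0.0}
    race_adjustment = {"black": -4.0, "white": 0.0}
    return 0.9 * prior_score + lunch_adjustment[lunch] + race_adjustment[race]

def residual(actual_score, prior_score, lunch, race):
    """Actual minus predicted; the school is credited for positive residuals."""
    return actual_score - predicted_score(prior_score, lunch, race)

# Two students with identical prior scores (80) and identical actual scores (85):
poor_black = residual(85, 80, "free", "black")    # predicted 63.0, residual 22.0
middle_white = residual(85, 80, "none", "white")  # predicted 72.0, residual 13.0

# The poor black student's predicted score is lower, hurting his placement,
# while his larger residual earns the school more credit for "overcoming risk".
assert poor_black > middle_white
```

The same prediction drives both effects I described: a lower predicted score keeps the student out of advanced classes, and a larger residual rewards the school for his performance.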

Can you point me to a federal settlement forced on a school that suspends or expels minorities at a higher rate? I ask because in all of the school districts I have worked with, the schools did suspend minorities at a higher rate, and I have yet to see any consequences for this.

Principals are at risk for losing AP classes if they don't put enough URMs in them.

This, as well, I would like to see a citation for.

This post is almost entirely nonsense. I give it "almost" simply because in certain all-URM school districts the corruption level is high. It's within the realm of possibility that these "fake grants" to "fake grant programs" exist as nothing more than chump change doled out by large employers who can wave the program in front of Jesse Jackson and his ilk (look! We're providing gravy!), so I won't call it an outright lie. But it's certainly not the norm. Did you notice that this guy acts like the education world is composed solely of blacks and whites? If any element of his story is true, it's because he lives or works in an all-black school district that is, indeed, corrupt. Detroit, New Jersey somewhere, or the like. And that's a generous interpretation.

The school districts I have worked with have varied from 90% black to 3% black. You are right that I should have said "minority" rather than "black," since Hispanics, Native Americans, and other minorities are at a similar disadvantage. However, I've seen enough districts in enough states that I, at least, believe the traits I ascribed to the education system are nearly universal.

Comment author: roystgnr 09 April 2013 05:01:57PM 3 points [-]

If I hadn't recently seen that "students fighting segregated prom" story from credible news sources, I'd have considered this part of the story to be nearly conclusive evidence of trolling. I should be more charitable than that.

It's still evidence, though. Who could fail to anticipate the devastatingly bad PR from "iPods vs Makeover/mani/pedis"? For that matter, why didn't the devastatingly bad PR occur? Surely the students and their parents weren't under NDA too.

Yet a Google search for 'ipod makeover school -"chic school girls"' doesn't seem to find anything relevant, with or without outraged commentary attached. This random lesswrong page comes up for me in the first couple dozen hits, even on a browser with no Google login or cookies that might trigger personalized rankings.

Nobody ever felt it was worth blogging about how their kids were being given these prizes at school?

Comment author: ThinkOfTheChildren 09 April 2013 06:58:09PM 3 points [-]

The only people aware that the project happened, as far as I know, are myself, my boss, the man in charge, and the 56 students (who were in 6-8th grade at the time, and all from poor black families). The issuer of the grant was the local government, and they issue so many grants that I seriously doubt there's anyone looking at all of them.

Comment author: John_Maxwell_IV 09 April 2013 02:48:43AM *  15 points [-]

So what criteria are necessary to apply for these grants? I have a feeling there are a lot of smart people working on startups in the ed tech space. If you could get in contact with them, you might have more competent grant applicants, and those startups would find more revenue to pursue their (potentially workable) ideas for improving education.

Here are some ed tech incubators I found on Google. If you get in contact with the people behind the incubators, they'll probably tell all of their startups about the ease of getting funding this way. Their startups will have to work on one of the problems that there exists a grant for, but there should be a decent number that find this workable.

Comment author: ThinkOfTheChildren 09 April 2013 06:48:02PM 7 points [-]

You might have seen some of those sketchy "Get Grant Money Here!" advertisements, similar to the "Google will Pay YOU!!! To Work From Home!" ads. At least, I associate those two kinds of ads with each other.

In any case, the process of finding grants to apply for is very simple. The Department of Education grants are all on http://www.grants.gov/. Pretty much every university's Research and Evaluation Department gives out grants to the local community; check out your local Uni's website. Sometimes large corporations give out grants, sometimes individual people. In general, get in touch with the education department of your county government to find out which grants are being offered nearby and how to apply for them.

Now that I think of it, this is the main request I should have of lesswrongers. I bet anyone on this website could write a damn good proposal for any grant they come across, and I bet their project would be better than the shit I evaluate.

Comment author: NancyLebovitz 09 April 2013 06:07:32PM 6 points [-]

I'm torn between thinking that if this is a hoax, the hoaxer should be banned with extreme prejudice, and hoping that there will be another hoax designed to appeal to right-wingers.

Comment author: ThinkOfTheChildren 09 April 2013 06:40:28PM 0 points [-]

That's interesting.

If this were a hoax, it would certainly appeal to right-wingers. The way the school board is debating this issue, the Democrats are in favor of teacher recommendations and "helping the poor black kids," whereas the Republicans (though on the school board they're all Tea Partiers) are the ones running with the "Data Driven Decisions D^3" slogan.

Problems in Education

65 ThinkOfTheChildren 08 April 2013 09:29PM

Post will be returning in Main, after a rewrite by the company's writing staff. Citations Galore.

Comment author: ThinkOfTheChildren 08 April 2013 07:35:31PM 6 points [-]

Hey Lesswrong.

This is a sockpuppet account I made for the purpose of making a post to Discussion and possibly Main, while obscuring my identity, which is important due to some NDAs I've signed with regards to the content of the post.

I am explicitly asking for +2 karma so that I can make the post.
