"A child's learning is the function more of the characteristics of his classmates than those of the teacher." James Coleman, 1972

Saturday, March 02, 2013

WaPo Editorializes on KIPP Study and Forgets to Report the Findings


When Mathematica Policy Research, Inc. presented the final piece of a study commissioned by KIPP, Inc. the other day ("KIPP Middle Schools: Impacts on Achievement and Other Outcomes"), WaPo's Editorial Board was quick to announce "KIPP Doubters Proven Wrong."

OFFICIALS OF KIPP (Knowledge Is Power Program) have become accustomed to the doubters who think the success of the fast-growing charter-school network is too good to be true. KIPP’s positive outcomes are the result not of its unique learning approach but rather, so the familiar critique goes, of its ability to attract the best students with highly motivated parents. Now comes rigorous research that should put an end to those suspicions and hopefully prompt discussion of what other schools might take away from KIPP’s experience in working with low-income students.
A study conducted by the independent firm Mathematica Policy Research, which analyzed data from 43 KIPP middle schools, found that students in these charter schools showed significantly greater learning gains in math, reading, science and social studies than did their peers in traditional public schools. The cumulative effects three to four years after entering KIPP translated, researchers found, into middle-schoolers gaining 11 months of additional learning growth in math and social studies, eight months in reading and 14 months in science. KIPP, which operates in 20 states and the District, is portrayed as among the highest-performing school networks in the country.
Debunking claims that KIPP’s success is rooted in “creaming” the best students, researchers found that students entering KIPP schools are very similar to other students in their neighborhoods: low-achieving, low-income and nonwhite.
Yes, the characteristics that WaPo mentions are very similar.  What the Editorial Board ignores from the study are other characteristics that show real demographic differences that influence test scores.  For instance, significantly fewer KIPPsters in the study were limited English proficient or special education students (see p. xiv).
How facts have a way of intruding!  But let us continue.
A typical student enrolling in a KIPP school scored at the 45th percentile in his or her school district in reading and math, lower than both the average in the school they attended and the school district as a whole.
As to whether KIPP finds ways to shed its low-performing students, the study determined that 37 percent of KIPP students depart through attrition, in line with school district attrition rates.
This gloss on the final installment of the Mathematica study paints a misleading picture at best.
Yes, this final piece of the study shows an attrition rate very similar to the school district attrition rates, which on the face of it would appear to stem criticism that KIPP dumps its low performers and keeps only its high performers.  Yet an earlier part of this study, presented at AERA in New Orleans in 2011, reported findings that are not discussed in this final report.  For instance, when attrition rates were compared between middle school KIPPsters and public middle school kids from the same feeder elementary schools (rather than comparing to the entire district), the researchers found something quite different: attrition rates at KIPP were significantly higher than at comparison schools following 5th grade (16% compared to 11%), not significantly different for 6th grade, and significantly lower at KIPP than at comparison schools after 7th grade (9% compared to 13%).

Who were the 16 percent of students leaving KIPP schools after 5th grade?  Were they predominantly low performers?  Mathematica doesn’t break this out by grade level, but the researchers do tell us that, over the entire middle school span, attrition for low performers was higher at KIPP (38%) than it was at the comparison schools (36%). I think it is fair to assume that, since most of the KIPP attriters left after 5th grade, significant numbers were low performers.
 
To complicate the achievement comparison further, we find that the public comparison schools were getting large numbers of new students in grades 7 and 8, while KIPP had stable populations.  And just as the facts above are not discussed in Mathematica’s final report, neither is this one.  These numbers come from the AERA presentation delivered by the same Mathematica researchers in 2011.
And what were the achievement characteristics of the late arrivals at KIPP schools and at the public comparison schools?  Glad you asked.  The quote below (my emphasis) is from the paper on which the AERA presentation was based, Student Selection, Attrition, and Replacement in KIPP Middle Schools:
KIPP schools differ from district comparison group middle schools in how late arrivals compare with on-time enrollees. Students who enroll late at KIPP tend to be higher achieving than those who enroll on time, as measured by their grade 4 test scores, whereas the reverse is true at district comparison group schools (see Table III.2). At KIPP schools, on average, late arrivals scored 0.16 and 0.15 standard deviations above the mean for the local district in math and reading, respectively, at baseline (or the 56th percentile)…. Conversely, late arrivals at district schools had significantly lower average baseline test scores than on-time enrollees. In district comparison schools, late arrivals scored 0.29 standard deviations below the mean in both subjects (or the 39th percentile); on-time entrants scored 0.03 and 0.01 above the mean in math and reading, respectively (the 51st and the 50th percentile). All of these differences are statistically significant.
In short, late arrivals at KIPP were, on average, much better students than the average public school student, while the large influx of late arrivals at the public comparison schools was markedly weaker.
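For readers who want to see how the standard-deviation figures in that passage line up with the percentiles it quotes, here is a quick back-of-the-envelope check in Python. It assumes (my assumption, not something the report states) that Mathematica converts effect sizes to district percentiles with a standard normal approximation; under that assumption, the printed values reproduce the 56th, 39th, 51st, and 50th percentiles cited above.

# Convert the reported baseline effect sizes (in standard deviations)
# to district percentiles using a standard normal approximation.
from statistics import NormalDist

z_scores = {
    "KIPP late arrivals, math": 0.16,        # quoted as ~56th percentile
    "KIPP late arrivals, reading": 0.15,     # quoted as ~56th percentile
    "District late arrivals, both subjects": -0.29,  # quoted as 39th percentile
    "District on-time enrollees, math": 0.03,        # quoted as 51st percentile
    "District on-time enrollees, reading": 0.01,     # quoted as 50th percentile
}

for label, z in z_scores.items():
    percentile = NormalDist().cdf(z) * 100
    print(f"{label}: z = {z:+.2f} -> {percentile:.0f}th percentile")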
We may begin to wonder, then, how much better KIPP test scores would be if philanthro-capitalists and the federal government had not poured hundreds of millions into KIPP’s 125 schools, if KIPP had the same kinds of students as the public schools, if KIPP did not have a 9 to 10 hour school day, if KIPP did not operate like a total compliance test factory, and if KIPP were not able to buy the editorial support of a once-respectable national newspaper.

Finally, the Editorial Board did not mention these troubling findings from the Mathematica study, which have to do with the behavior of children who have been locked down in a pressure-cooker school where “grit” is valued over honesty:

On the negative side, the findings suggest that enrollment in a KIPP school leads to an increase in the likelihood that students report engaging in undesirable behavior such as lying to or arguing with parents (p. xii).


2 comments:

  1. Thanks Jim. I'm not surprised that the Washington Post came out with this. Of course they didn't mention that they have a conflict of interest here, or that Mathematica itself had a conflict of interest in that KIPP paid for the study.

    I didn't see this addressed in the study or the editorial. What happened to the kids that left KIPP, particularly the middle school kids? Did they go to the public schools? If so, and if they are low performing students, then the scores of public schools go down, and scores from KIPP schools go up. On the other hand, I doubt many kids were leaving public schools in middle school to transfer to KIPP schools.

  2. "...enrollment in a KIPP school leads to an increase in the likelihood that students report engaging in undesirable behavior such as lying to or arguing with parents (p. xii)."
    Sounds like a curriculum standard for future KIPP-profiteers.
