"A child's learning is the function more of the characteristics of his classmates than those of the teacher." James Coleman, 1972

Monday, January 16, 2017

Understanding KIPP Model Charter Schools: The Media and Research

Marketplace, which has a time slot on most public radio stations, had a piece last week on the replacement of a high-performing community public elementary school in Baltimore by a "no excuses" KIPP school. Most often these wrecking-ball replacements of community schools are justified by low test scores within the targeted school, but the destruction of Langston Hughes Elementary School in Park Heights required other reasons.  Langston Hughes was a high-performing community anchor, where parents knew their children would be taught by professional and caring teachers in small classes within a safe and supportive environment.

So the "under-utilized" excuse was used by the elite efficiency zealots who control public schools in Baltimore.  And even though the community had worked effectively to improve enrollment, the school choice had been made for the parents, children, and parents of Langston Hughes.  Their school would be shut down, regardless of their choice to stay open, and buses would be provided to ferry children to a KIPP school a mile away, where 26 children would be taught in a single classroom by a teacher schooled in corporate paternalism and most assuredly lacking in experience, cultural understanding, and empathy.  No Excuses.

This "school choice" story, where corporate interests push in with charter replacements and call it "choice," is never told in the media.  The sympathetic story cited above is the rare exception to the corporate charter cheerleading that has been the position of the New York Times and Washington Post for years.  See Part 17 below from Work Hard, Be Hard . . . .

The Media and KIPP Research
In 2008, Columbia University professor Jeffrey Henig (2008) examined seven previous KIPP studies, and based on his analysis of their findings, he offered the following recommendations:
• Policy makers at all levels of government should pay attention to KIPP and consider it a possible source of information and guidance for their decisions.
• Although KIPP may yield useful information, policymakers and others should temper their interest in the operation with wariness and realistic expectations. There are significant unanswered questions about how expansion might affect outcomes, especially in relation to the difficulty of sustaining gains dependent upon KIPP’s heavy demands on teachers and school leaders. Moreover, it is not realistic to think that the KIPP model is a panacea for distressed systems. It is possible that only a small proportion of students and families will be able to meet the demands KIPP imposes on them; even those enthused when they begin the KIPP regimen tend to leave in high numbers.
• Policymakers, accordingly, should treat KIPP schools as potential tools that may contribute to—but not substitute for—systemic improvement.
• Policymakers should be aware that KIPP has prompted some district interest in longer school days, weeks, and years. However, an extended schedule sometimes brings parental objections as well as potential taxpayer objections to the additional expense. With no strong evidence yet linking extended scheduling to KIPP success, policymakers might best encourage it as a school-level (rather than district-wide) option while concurrently promoting a combination of experimentation and careful analysis of consequences.
• Researchers should help provide better data on patterns of movement in and between charter schools and traditional public schools, including information on why students leave and how their mobility affects student and school-level performance (p. 22).
The Great Lakes Center for Education Research and Practice published Henig’s paper online on Monday, November 10, 2008.  Three days before the paper was published, however, Jay Mathews (2008) dedicated his education column at The Washington Post to preempting it with his own interpretation, while taking the opportunity to promote the imminent publication of his own Work Hard, Be Nice (Mathews, 2009a).  Mathews included this in his gloss of Henig’s recommendations:
He [Henig] says that ‘policymakers at all levels of government should pay attention to KIPP and consider it a possible source of information and guidance for their decisions’ but ‘should temper their interest in the operation with wariness and realistic expectations.’ He says policymakers ‘should treat KIPP schools as potential tools that may contribute to -- but not substitute for—systemic improvement.’
That makes sense to me and the KIPP officials I have been interviewing the past seven years… (Mathews, 2008, para 7-8).
Mathews did not mention Henig’s other caveats and reservations in his column, and no news outlet, including The Washington Post, carried a news story on the publication of Henig’s research.
The situation was quite different, however, when Mathematica Policy Research, Inc. published the final piece of a study commissioned by KIPP in 2008 and paid for by The Atlantic Philanthropies at a cost of almost $4 million.  Not only did Jay Mathews (2013) devote a lengthy post, “Biggest study ever says KIPP gains substantial,” to the findings, but The Washington Post’s Editorial Board (The Washington Post, 2013) went on the record a few days later to announce “KIPP doubters proven wrong”:
Officials of KIPP (Knowledge Is Power Program) have become accustomed to the doubters who think the success of the fast-growing charter-school network is too good to be true . . . . A study conducted by the independent firm Mathematica Policy Research, which analyzed data from 43 KIPP middle schools, found that students in these charter schools showed significantly greater learning gains in math, reading, science and social studies than did their peers in traditional public schools. The cumulative effects three to four years after entering KIPP translated, researchers found, into middle-schoolers gaining 11 months of additional learning growth in math and social studies, eight months in reading and 14 months in science. . . .Debunking claims that KIPP’s success is rooted in “creaming” the best students, researchers found that students entering KIPP schools are very similar to other students in their neighborhoods: low-achieving, low-income and nonwhite (para 2, 3).
Indeed, both the KIPP students (the study included 43 of KIPP’s 125 schools) and the neighborhood students in this study are similar in terms of family income, achievement levels, and ethnicity.  In their eagerness to make a case for supporting KIPP, however, the Editorial Board remained mum about differences acknowledged in the Mathematica study (Tuttle, et al, 2013) that influence test outcomes.  For instance, the Mathematica researchers note a characteristic difference that is common when comparing charter school and public school demographics: the 43 KIPP schools enrolled significantly fewer male students (49% versus 52% in comparison schools), fewer limited English proficiency students (10% versus 15%), and fewer special education students (9% versus 13%) (p. xiv).
An earlier part of the Mathematica study, which was conducted over five years, was presented at the annual conference of the American Educational Research Association (AERA) in New Orleans in 2011.  There, researchers presented findings related to attrition rates that were not included in the final summary findings.  Researchers (Nichols-Barrer, Gill, Gleason, & Tuttle, 2012) found that when attrition rates were compared between middle school KIPPsters and public middle school students from the same feeder elementary schools (rather than comparing to the entire district), KIPP’s attrition rates were significantly higher than comparison schools for 5th grade (16% compared to 11%), not significantly different for 6th grade, and significantly lower at KIPP than comparison schools for 7th grade (9% compared to 13%).
Researchers found, too, that while KIPP maintained stable populations in grades 7 and 8, the public comparison schools received large numbers of new students in those grades.  The chart below (see Figure 17.1) was part of the 2011 AERA presentation and was not included in Mathematica’s final report.
In effect, KIPP schools replace, or “backfill,” fewer students in grades 6, 7, and 8 than the surrounding public schools, and the late arrivals that KIPP schools do accept generally have scores that are above the mean for the district (Nichols-Barrer, Gill, Gleason, & Tuttle, 2012), whereas the late arrivals at the public schools have scores below the mean:
KIPP schools differ from district comparison group middle schools in how late arrivals compare with on-time enrollees. Students who enroll late at KIPP tend to be higher achieving than those who enroll on time, as measured by their grade 4 test scores, whereas the reverse is true at district comparison group schools (see Table III.2). At KIPP schools, on average, late arrivals scored 0.16 and 0.15 standard deviations above the mean for the local district in math and reading, respectively, at baseline (or the 56th percentile). . . . Conversely, late arrivals at district schools had significantly lower average baseline test scores than on-time enrollees. In district comparison schools, late arrivals scored 0.29 standard deviations below the mean in both subjects (or the 39th percentile); on-time entrants scored 0.03 and 0.01 above the mean in math and reading, respectively (the 51st and the 50th percentile). All of these differences are statistically significant (p. 15).
In short, late arrivals at KIPP are significantly stronger academically than the average district students who arrive late, while the larger influx of late arrivals to public comparison schools in grades 7 and 8 is significantly weaker than the district mean.  The same paper reported that KIPP’s late arrivals were significantly less likely to be black males or in special education, and they were more likely to make the KIPP schools less disadvantaged over time.  The opposite was found to be the case for the late arrivals at district comparison group schools.  All of these important facts escaped the attention of The Washington Post’s Editorial Board and its principal education writer, Jay Mathews.
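The percentile figures quoted in the passage above follow directly from the reported effect sizes if baseline scores are treated as z-scores against a normal distribution of district test takers. A minimal sketch (my own illustration under that normality assumption, not part of the Mathematica report) that reproduces the conversions:

    # Hypothetical illustration: converting the baseline z-scores quoted above
    # into approximate district percentiles, assuming scores are standardized
    # against a normal distribution within each district.
    from scipy.stats import norm

    z_scores = {
        "KIPP late arrivals, math": 0.16,        # ~56th percentile
        "KIPP late arrivals, reading": 0.15,     # ~56th percentile
        "District late arrivals, both": -0.29,   # ~39th percentile
        "District on-time, math": 0.03,          # ~51st percentile
        "District on-time, reading": 0.01,       # ~50th percentile
    }

    for label, z in z_scores.items():
        percentile = norm.cdf(z) * 100  # share of the district distribution below this score
        print(f"{label}: z = {z:+.2f} -> about the {percentile:.0f}th percentile")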
While the Mathematica study (Tuttle, Gill, Gleason, Knechtel, Nichols-Barrer, & Resch, 2013) found significant test score increases among KIPP students (pp. 31-40), questions remain as to how well KIPP schools would perform without known advantages such as 50 to 60 percent more time in school, a test preparation focus, fewer and higher-achieving replacement students, fewer black male students, higher attrition among low performers and problem students, fewer special education and ELL students, and large funding advantages from both public and private sources.
To its credit, The New York Times (Dillon, 2011, March 31) reported in 2011 that Western Michigan University researchers found
. . . the KIPP network received $12,731 in taxpayer money per student, compared with $11,960 at the average traditional public school and $9,579, on average, at charter schools nationwide.
In addition, KIPP generated $5,760 per student from private donors, the study said, based on a review of KIPP’s nonprofit filings with the Internal Revenue Service (para 8-9).
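Taken at face value, the figures quoted above imply a sizable combined resource gap. A back-of-the-envelope sketch (my own arithmetic on the quoted numbers only; it says nothing about whether the comparison groups are strictly equivalent):

    # Back-of-the-envelope arithmetic using only the per-pupil figures quoted above.
    kipp_public = 12_731         # taxpayer money per KIPP student
    kipp_private = 5_760         # private donations per KIPP student
    traditional_public = 11_960  # taxpayer money per student at the average traditional public school

    kipp_total = kipp_public + kipp_private
    gap = kipp_total - traditional_public
    print(f"KIPP total per pupil: ${kipp_total:,}")        # $18,491
    print(f"Gap versus traditional schools: ${gap:,} "
          f"({gap / traditional_public:.0%} more)")        # $6,531, roughly 55% more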
Another study (Baker, Libby, & Wiley, 2012) also found large budgeting advantages at KIPP, as well as at two other KIPP-inspired charter chains, Achievement First and Uncommon Schools:
We find that in New York City, KIPP, Achievement First and Uncommon Schools charter schools spend substantially more ($2,000 to $4,300 per pupil) than similar district schools. Given that the average spending per pupil was around $12,000 to $14,000 citywide, a nearly $4,000 difference in spending amounts to an increase of some 30%. In Ohio, charters across the board spend less than district schools in the same city. And in Texas, some charter chains such as KIPP spend substantially more per pupil than district schools in the same city and serving similar populations, around 30 to 50% more in some cities (and at the middle school level) based on state reported current expenditures, and 50 to 100% more based on IRS filings. Even in New York where we have the highest degree of confidence in the match between our IRS data and Annual Financial Report Data, we remain unconvinced that we are accounting fully for all charter school expenditures (pp. i-ii). 
Mathematica researchers acknowledged, too, the potential positive influence on KIPP scores that results from built-in parental self-selection bias, even though Mathematica (Nichols-Barrer, Gill, Gleason, & Tuttle, 2014) was not asked to investigate this important aspect:
A potentially important limitation of this study is that there could still be unmeasured differences between the students attracted to KIPP and those enrolling in other schools. We analyze the peer environment at KIPP as measured by demographic characteristics and prior achievement, but we do not have direct measures of parent characteristics, prior motivation, or student behavior (para 31).
Finally, the enthused Editorial Board of The Washington Post did not mention the following significant findings from the Mathematica study (Tuttle, Gill, Gleason, Knechtel, Nichols-Barrer, & Resch, 2013), which raise serious questions about KIPP’s inability to increase students’ “good behaviors,” as well as its negative effects on the behavior of children in total-compliance environments where “grit” and zest are valued over honesty and compassion:
KIPP has no statistically significant effect on several measures of student behavior, including self-reported illegal activities, an index of good behavior, and parent reports of behavior problems. However, KIPP has a negative estimated effect on a student-reported measure of undesirable behavior, with KIPP students more likely to report behaviors such as losing their temper, arguing or lying to their parents, or having conflicts with their teachers (p. 68).
References
Alter, J.  (2008, July 11).  Jonathan Alter on Obama and education.  Newsweek.  Retrieved from http://www.newsweek.com/jonathan-alter-obama-and-education-92615
Baker, B. D., Libby, K., & Wiley, K. (2012). Spending by the major charter management organizations: Comparing charter school and local public district financial resources in New York, Ohio, and Texas. Boulder, CO: National Education Policy Center. Retrieved from http://nepc.colorado.edu/publication/spending-major-charter 
Desilver, D.  (2013, December 19).  Global inequality: How the U.S. compares. Pew Research Center.  Retrieved from http://www.Pewresearch.org/fact-tank/2013/12/19/global-inequality-how-the-u-s-compares/
Dillon, S.  (2011, March 31).  Study says charter network has financial advantages over public schools.  The New York Times.  Retrieved from http://www.nytimes.com/2011/03/31/education/31kipp.html?_r=1&
Goodnough, A.  (1999, October 20).  Structure and basics bring South Bronx school acclaim.  The New York Times.  Retrieved from http://www.nytimes.com/1999/10/20/nyregion/structure-and-basics-bring-south-bronx-school-acclaim.html
Grann, D.  (1999, October 4).  Back to basics in the Bronx.  The New Republic.  Retrieved from https://www.cs.unm.edu/~sto/maunders/educate/grann.html
Grannan, C.  (2008, July 13).  Newsweek recommends that Obama do a little teacher-bashing to win fans. Examiner.com.  Retrieved from http://www.examiner.com/article/newsweek-recommends-that-obama-do-a-little-teacher-bashing-to-win-fans
Henig, J.  (2008).  What do we know about the outcomes of KIPP schools?  East Lansing, MI: The Great Lakes Center for Education Research & Practice.  Retrieved from http://greatlakescenter.org/docs/Policy_Briefs/Henig_Kipp.pdf
KIPP Foundation.  (2014).  The promise of college completion: KIPP’s early successes and challenges—Spring 2014 alumni data update.  Retrieved from http://www.kipp.org/files/dmfile/2013AlumniUpdateonCollegeCompletion.pdf
Klein, J.  (2014).  Lessons of hope: How to fix our schools.  New York: Harper.
Mathews, J.  (2013, February 27).  Biggest study ever says KIPP gains substantial.  The Washington Post.  Retrieved from http://www.washingtonpost.com/blogs/class-struggle/post/biggest-study-ever-says-kipp-gains-substantial/2013/02/26/ff149efa-7d50-11e2-9a75-dab0201670da_blog.html
Mathews, J.  (2009a).  Work hard, be nice: How two inspired teachers created the most promising schools in America.  New York: Algonquin Books.
Mathews, J.  (2009b).  Turmoil at two KIPP schools.  [Blog post]. Retrieved from http://voices.washingtonpost.com/class-struggle/2009/03/turmoil_at_two_kipp_schools.html?wprss=rss_blog
Mathews, J.  (2008, November 7).  The most promising schools in America.  The Washington Post.  Retrieved from http://www.washingtonpost.com/wp-dyn/content/article/2008/11/07/AR2008110700861.html
Monahan, R.  (2014, November 11).  Charter schools try to retain teachers with mom-friendly policies.  The Atlantic.  Retrieved from http://www.theatlantic.com/education/archive/2014/11/charter-schools-now-try-to-keep-teachers-with-mom-friendly-policies/382602/
Nichols-Barrer, I., Gill, B., Gleason, P., & Tuttle, C.  (2014).  Does student attrition explain KIPP’s success?  Education Next, 14 (4).  Retrieved from http://educationnext.org/student-attrition-explain-kipps-success/
PBS News Hour.  (2015, January 8).  Can teaching kids to resist the marshmallow help pave the way to success?  [Transcript]. Retrieved from http://www.pbs.org/newshour/bb/can-teaching-kids-resist-marshmallow-pave-road-success/
Rotherham, A.  (2011, April 27).  KIPP schools: A reform triumph, or disappointment? Time.  Retrieved from http://content.time.com/time/nation/article/0,8599,2067941,00.html
Smith, H.  (2005a).  Making schools work.  [Transcript].  Retrieved from http://www.pbs.org/makingschoolswork/atp/transcript.html
Somerby, B.  (1999, September 24).  Our current howler: Critique the children well.  [Blog post].  Retrieved from http://www.dailyhowler.com/h092499_1.shtml
Tuttle, C., Gill, B., Gleason, P., Knechtel, V., Nichols-Barrer, I., & Resch, A.  (2013).  KIPP middle schools: Impacts on achievement and other outcomes.  Washington, DC: Mathematica Policy Research.  Retrieved from http://www.kipp.org/files/dmfile/KIPP_Middle_Schools_Impact_on_Achievement_and_Other_Outcomes1.pdf
The Washington Post.  (2013, March 1).  KIPP doubters proven wrong.  The Washington Post.  Retrieved from http://www.washingtonpost.com/opinions/kipp-doubters-proved-wrong-in-new-study/2013/03/01/f003b95c-81ef-11e2-a350-49866afab584_story.html
Wilgoren, J.  (2000, August 2).  The Republicans: The issues; for 2000, the G.O.P. sees education in a new light.  The New York Times.  Retrieved from https://www.cs.unm.edu/~sto/maunders/educate/grann.html
Woodworth, K. R., David, J. L., Guha, R., Wang, H., & Lopez-Torkos, A. (2008).  San Francisco Bay Area KIPP schools: A study of early implementation and achievement. Final report. Menlo Park, CA: SRI International.

