Friday, October 21, 2005

Standard & Poor's & the Business of School

The verdict is in--this week's release of NAEP data shows no positive effect for turning schools into test-prep factories. One day before the scores were released (sheer coincidence), Standard & Poor's SchoolMatters (no relation, please) issued a press release that offered two new publications to soften the blow for those who were expecting the miracle of NCLB to be reflected in rising NAEP scores.

Insisting on having their cake and eating it as well, one paper, The National Assessment of Educational Progress and State Assessments: What Do Differing Student Proficiency Rates Tell Us?, downplays NAEP as a "no-stakes" test without real performance goals, while the other, Leveling the Playing Field: Examining Comparative State NAEP Performance in Demographic Context, insists that "once the playing field has been leveled by taking student poverty into account, most states actually perform as might be statistically expected."

In an attempted explanation that is Kafkaesque in its hopeless absurdity and absurd hopelessness, and one that hopes to reassure those at ED who have crafted a national policy around impossible expectations, they say this:
Note that “expectation” is used to refer to statistical probability, not educational goals. The correlation between performance and poverty does not mean that students living in poverty cannot learn, or that less achievement should be expected from them as a matter of educational policy.
In other words, don't worry if these kids in these poor schools don't have a chance in hell--keep those demands in the impossible zone! (The Fordham hoods could take some lessons from these guys on how to talk out of both sides of their mouths at the same time).

What does it all mean? Does it mean that S&P would like to use NAEP and their "leveled playing field" model to craft their own impossible performance expectations for the nation's public schools? I suspect so, just as I suspect that they have plans to eventually house the national school and student information database for the corporate welfare school businesses (Whittle, et al) that hope to replace the public schools. With so many underemployed MBAs now aiming at part of that $400 billion that Americans spend on K-12 education every year, their plan is less than assured.

But try they will, and with an insider's ferocity. In fact, their loyalty to the neo-con agenda has been front and center since SchoolMatters was launched in 2001 as a part of S&P's School Evaluation Services (another coincidence in timing). Here is something from the "About Us" page:

Despite a 50 percent increase in per pupil spending over the past two decades, nearly one third of public high school students fail to graduate, and two thirds of all students leave high school unprepared for a four-year college, according to the Manhattan Institute. Given the diminishing economic prospects for Americans without a quality education, the need for reform is clear.

What was once, then, just another website for mobile middle class parents to use for checking, anywhere in America, on a school's test scores and percentage of economically disadvantaged students (and we all know what that means), has now started to evolve into something that is, well . . . choose your own adjective.

Thanks to Judy Rabin for help on this story.

2 comments:

  1. Jim,

    I have similar thoughts about S&P's involvement.

    At first I thought, like you, it must just be a middle class tool for upwardly mobile parents to find a high-scoring school for their kids.

    Then I thought, since when did Standard and Poor's ever do something that wasn't profit-motivated?

  2. Jim

    You are all over the map here. I can't figure out what you really think. Are you saying that if there were no NCLB, student scores would have risen nicely? Are you saying that NCLB can't raise student scores through its mechanisms?

    Where do teachers come in? Are they always unable to raise student scores? Or only some student scores? At what point do teachers decide which students' scores can't be raised?

    As for NAEP and S&P, the report that pinpoints states where scores exceeded statistical expectations--to what do you attribute that? Did it have anything to do with teachers and their work?

    I think you are thrashing about at enemies, and you've started to make yourself and your educational colleagues look less able than they are. Think harder about your line of argument....
