Sunday, December 11, 2011

A Closer Look at Teach for America's Research Page

The following analysis is from Philip Kovacs, posted at Anthony Cody's blog.
Several weeks ago I posted a firsthand report from University of Alabama in Huntsville assistant professor Philip Kovacs, regarding his efforts to get the Huntsville school board to re-examine its decision to spend $1.7 million on bringing Teach For America interns to the public schools there. Huntsville, he pointed out, has laid off 300 teachers over the past two years. Today, Dr. Kovacs takes us on an exploration of the research that TFA offers to justify its aggressive expansion.

Guest post by Philip Kovacs.
Recently I have been exchanging emails with a TFA employee in my city. In my last exchange, I tried to press her to answer at least one of my questions.

"Given the choice, would you see a doctor with 5 weeks of training or a certified doctor? A lawyer? An actuary?"
Answering "yes" would be absurd. Answering "no" would indicate a blatant disrespect for teachers.

Unfortunately, this disrespect is exactly what we have going on in our country at this time: a blame-the-teacher mentality that ignores real-world issues and concerns. There are those who argue that, until we respect teachers as much as they are respected in Finland or South Korea or Singapore, we are going to continue to have a third-rate education system.

To be fair, if we made teaching twice as hard to enter and doubled the pay, the issue would be partially solved. I've said this publicly for years. Let me emphasize partially, as Finland, South Korea, and Singapore (heralded as educational standouts) have issues that we don't have and vice versa. One difference is our child poverty rate of 20%, compared to Finland's 5%. And check out what happens to the "test score gap" when poverty is taken into account (scroll down).


The TFA employee directed me to the organization's "research" page, where TFA claims: "A large and growing body of independent research shows that Teach For America corps members make as much of an impact on student achievement as veteran teachers."

This claim, based on the "studies" supplied by TFA, is misleading at best and demonstrably false at worst. I read all 12 "studies" available on TFA's website, and here is what I found.

[Image: TFAResearch.jpg]
Four of the 12 "studies" are irrelevant to the claim that corps members "make as much of an impact on student achievement as veteran teachers." Of these four:
The first, Creating a Corps of Change Agents, is a fluff piece from Education Next that discusses the high rate of entrepreneurs who come from TFA;

The second is a peer-reviewed piece, The Price of Misassignment: The Role of Teaching Assignments in Teach For America Teachers' Exit from Low Income Schools and the Teaching Profession, which discusses improving TFA retention;

The third, Teacher Characteristics and Student Achievement: Evidence from Teach For America, discusses predicting outcomes at the time of TFA hire. (For the record, this one could have gone under "problematic," as the front page contains the disclaimer "PRELIMINARY AND INCOMPLETE" in all caps.)
The fourth, another peer-reviewed article, Assessing the Effects of Voluntary Youth Service: The Case of Teach for America, presents evidence against TFA's claim that TFAers go on to "pro-social" jobs.

Seven of the 12 "studies" are problematic or mixed.
They have methodological flaws that make their findings problematic. Two of the seven acknowledge such flaws and warn the reader against making judgments based on their data. Another problem is that they show mixed results: TFA recruits are better at math than some teachers in some cases but are not better in other subjects, or they are better than novice teachers but not better than those with experience, and so on. I will address each of these "studies" in my next post, as each warrants a paragraph of its own.

Importantly, all seven of the "studies" that show mixed or problematic results are based on the use of Value Added Measurement (VAM). Here is a link to one peer-reviewed research paper, Teacher Effects and Teacher Effectiveness: A Validity Investigation of the Tennessee Value Added System, which argues that there are "several logical and empirical weaknesses of the system" used to evaluate teachers in Tennessee, a system that found its way onto TFA's "research" page.

VAM is flawed at best, as argued in this report from the Annenberg Institute, an institute that can hardly be called partisan or pro-status quo, though some readers will no doubt level the criticism. Diane Ravitch, discussing the Annenberg report, asks an important question: "[Dr. Corcoran] describes a margin of error so large that a teacher at the 43rd percentile (average) might actually be at the 15th percentile (below average) or the 71st percentile (above average). What is the value of such a measure? Why should it be used at all?"
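To make the scale of that margin of error concrete, here is a minimal simulation. It is my own toy construction, not Dr. Corcoran's model, and the noise level is an assumption I picked so the resulting band roughly matches the 15th-to-71st-percentile range he describes:

```python
# A toy simulation (my construction, not Corcoran's actual model):
# how far can one noisy year of data move a teacher's VAM percentile rank?
import numpy as np

rng = np.random.default_rng(0)

n_teachers = 10_000
true_effects = rng.normal(0, 1, n_teachers)   # "true" value-added, arbitrary units
target = np.quantile(true_effects, 0.43)      # a teacher truly at the 43rd percentile

# Assumption: one-year estimation noise about 60% as large as the spread of
# true effects, chosen to roughly reproduce the band quoted above.
noise_sd = 0.6

ranks = []
for _ in range(1_000):                        # 1,000 simulated school years
    estimates = true_effects + rng.normal(0, noise_sd, n_teachers)
    target_estimate = target + rng.normal(0, noise_sd)
    ranks.append(100 * (estimates < target_estimate).mean())

lo, hi = np.percentile(ranks, [5, 95])
print("true rank: 43rd percentile")
print(f"estimated rank, year to year: roughly {lo:.0f}th to {hi:.0f}th percentile")
```

On noise alone, the same "average" teacher bounces between "below average" and "above average" from year to year, which is exactly Ravitch's question: what is the value of such a measure?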

One of the issues Anthony Cody rightly addresses, however, is that the more we talk about VAM, the more we reify it as an accurate tool for determining teacher effectiveness, which it simply isn't.

I agree with Cody in principle and will set my critique aside after pointing out one more flaw. A teacher raising student scores from the 15th to 25th percentile is going to look, to bean counters, much more effective than a teacher who raises student scores from the 85th to the 90th.

Which teacher is more effective? That's debatable, but settling it from the numbers alone is like judging a football game by staring at the scoreboard for two hours.

Both teachers might be equally effective. The teacher with the smaller gain might be more effective, but to really know, you'd have to know something about the teams and you would have to watch the game.
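One rough way to see why the scoreboard alone can't settle it: if test scores are roughly normal (an assumption for illustration, not something any of these reports establish), identical-looking percentile moves translate into different raw-score gains depending on where students start. A quick sketch using only Python's standard library:

```python
# A back-of-the-envelope sketch, assuming roughly normal test scores:
# the same-sized percentile move implies different raw-score gains
# at different points on the curve.
from statistics import NormalDist

def gain_in_sd(p_start: float, p_end: float) -> float:
    """Raw-score gain, in standard deviations, implied by a percentile move."""
    nd = NormalDist()  # standard normal curve
    return nd.inv_cdf(p_end) - nd.inv_cdf(p_start)

print(f"15th -> 25th percentile: {gain_in_sd(0.15, 0.25):.2f} SD")  # ~0.36 SD
print(f"85th -> 90th percentile: {gain_in_sd(0.85, 0.90):.2f} SD")  # ~0.25 SD
```

On this crude yardstick the low-percentile gain is actually the larger one, but neither number says anything about which students were harder to move. That is the part of the game the scoreboard doesn't show.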

Finally, one "study" is overwhelmingly positive
, but that "study" is actually a one-page summary from a survey of principals. The questions and data are not available, but the one page summary is overwhelmingly positive.

It turns out TFA left some reports off its website.
They aren't very flattering, though, so I understand. See, for example, The Effectiveness of "Teach for America" and Other Under-certified Teachers, by Laczko-Kerr and Berliner, and Does Teacher Preparation Matter? Evidence about Teacher Certification, Teach for America, and Teacher Effectiveness, by Darling-Hammond et al. Note that both of these research papers are from EEPA, one of the two peer-reviewed journals included on TFA's website.

As to the importance of peer review (or the non-importance): scholars and scientists have this mechanism in place to make sure research is sound and people aren't simply making things up and convincing others that they have found the cure for cancer, created a miracle drug like Vioxx, cloned a sheep, or narrowed the achievement gap.

Two of the 12 studies on TFA's website are peer-reviewed. Both are, however, irrelevant to TFA's claim "that Teach For America corps members make as much of an impact on student achievement as veteran teachers." They appear to be included to pad TFA's resume.

What is troublesome here is that we now live in a world where foundations and organizations have millions of dollars to spend lobbying and at the same time can bypass peer-review in order to make a case for whatever they are selling. If you have enough money, science no longer matters. For more on this ask the scientists trying to address global warming.

Here is what I can say with some certainty based on TFA's "reports": in some cases, in some places, and in some grades, TFA might produce better results on math tests than traditionally certified novice teachers.
That is all I am certain about.

The rest is very debatable. The "research" is certainly not worth Huntsville paying an extra $1.7 million for recruits, and it is certainly not good enough for the children who need experienced teachers.

The data showing that experience matters is overwhelming; in fact, one of the reports on TFA's "research" page, the "Portal Report" (a PDF), acknowledges this: "Teachers with 4 years or more experience out perform teachers with 1 year of experience on 9 out of 10 indicators."

By the way, the Tennessee data linked on that "research" page shows that 8% of TFA recruits are still teaching after 4 years, compared to UT Knoxville's 50% (scroll down for TFA's and UT's data).

This makes the giant claim on the right side of TFA's "research" page... interesting. Take a look at what TFA claims its retention rate is. They claim it is a little higher than 8%.

What do you think? If "research" is misleading, cherry-picked, based on flawed instruments, and evasive about the topic at hand, is it research or is it marketing?


Philip Kovacs is a tenure-track assistant professor at UAHuntsville.

Graph provided by Philip Kovacs, used with permission.
