Monday, January 07, 2013

The TFA PR Machine Even Works on the Critics

Walt Gardner has a post on the TFA PR machine, and in it he shows exactly how effective that machine is by himself perpetuating the lie that TFA teachers perform as well as, or better than, professionally prepared teachers.
The protracted recession has caused many graduates from top-tier schools to put their real career plans on hold and take their knowledge and skills to the classroom through TFA. I'll give them the benefit of the doubt and assume they perform as well as, or even slightly better than, traditionally certified teachers. To the extent that such performance helps students who otherwise would be shortchanged, I commend TFA. But let's not forget that there is hubris associated with TFA ever since it was the subject of Wendy Kopp's senior thesis at Princeton.
Come on, Walt, you can do better than that. Just in case your google machine is broken, here is a sidebar from a Barbara Miner piece in Rethinking Schools, just to get you started on the right track:


Are TFA Recruits Better Teachers?

The Mathematica Study
One of the controversies swirling around TFA is the teaching quality of its recruits. To answer this question, Kerci Marcello Stroud, TFA’s communications director, pointed me to a 2004 Mathematica study on Teach for America. She specifically noted that the conservative education policy journal Education Next gave the report an “A” for methodology and that three other studies, including a 2005 study by Linda Darling-Hammond and others from Stanford University, received a “C” or lower.
I went to the Mathematica study and, quite frankly, wondered why TFA was promoting it. I imagined how the Onion might summarize the study: “Teach for America goes up against the worst teachers in the country—they’re both awful!”
The Mathematica study involved 17 schools across the country, 100 classrooms, and nearly 2,000 students, and thus could be considered a representative, one-year snapshot. The study’s executive summary notes that the control group for the TFA teachers consisted of other teachers in the same schools and at the same grades—teachers with “substantially lower rates of certification and formal education training” than a nationally representative sample of teachers. In addition, the study said that many of the control group teachers had no student teaching experience at all and were less prepared than the TFA recruits.
The Mathematica study, using the Iowa Test of Basic Skills, found that there were statistically insignificant differences in reading achievement for students in the TFA and control classrooms. In math, students in the TFA classrooms fared slightly better—equal to one month’s extra teaching.
The Mathematica study also found, however, that TFA teachers “had no substantial impact on the probability that students were retained in grade or assigned to summer school.”
A closer look at the math and reading results shows that neither the TFA group nor the control group was even beginning to close the achievement gap. In math, the TFA teachers bumped their students’ math scores from the 14th to the 17th percentile. The control group stayed flat at the 15th percentile. In reading, both the TFA and control group teachers marginally raised reading scores, from the 13th to the 14th percentile for the control group, and from the 14th to the 15th percentile for the TFA recruits. This, as Center for Teaching Quality head Barnett Berry notes, “is essentially no gain at all. These [TFA] students were still reading more poorly than 85 percent of their peers nationwide, and well below grade level.”

Teach for America boasts about its impact, noting on its webpage: “[O]ur corps members and alumni work relentlessly to increase academic achievement.” Yet in a study touted by TFA, the students of corps teachers remained far below their national peers and made only marginal gains.
Darling-Hammond’s Houston Study
"Does Teacher Preparation Matter?” is a peer-reviewed, scholarly evaluation of the effectiveness of the TFA approach, published by Linda Darling-Hammond and three other Stanford University colleagues in 2005. Reading through the study, one can see why TFA doesn’t like the results.
The study is a longitudinal, six-year look at data from Houston representing more than 132,000 students and 4,400 teachers, on six different math and reading achievement tests. (TFA has sent recruits to Houston since 1991, and this year has more than 450 corps members teaching there.)
“Although some have suggested that perhaps bright college graduates like those who join TFA may not require professional preparation for teaching, we found no instance where uncertified Teach for America teachers performed as well as standard certified teachers of comparable experience levels teaching in similar settings,” the study states.
The study also found, however, that teachers who gained certification, including TFA teachers who became certified by their second or third year of teaching, increased in effectiveness.
At the same time, few of the TFA teachers stayed in the Houston schools for long. Based on district data, the study notes that “generally, rates of attrition for TFA teachers were about twice as high as for non-TFA teachers.” For instance, of those who entered in the 1998 school year, 85 percent had left the Houston public schools after three years, compared to about 55 percent of non-TFA teachers. —B.J.M.
