"A child's learning is the function more of the characteristics of his classmates than those of the teacher." James Coleman, 1972

Sunday, May 13, 2012

Urgent Note to Race to the Top Winners


I know Secretary Duncan forgot to send this to you almost 3 years ago.  Sorry.
However, in building a teacher evaluation system based on value-added test scores, 
YOU HAVE SCREWED UP!

News from the National Academies
Read Full Report

Date:  Oct. 7, 2009
Contacts:  Sara Frueh, Media Relations Officer
Alison Burnette, Media Relations Assistant
Office of News and Public Information
202-334-2138; e-mail

FOR IMMEDIATE RELEASE

EDUCATION INNOVATIONS FUNDED BY 'RACE TO THE TOP' SHOULD BE RIGOROUSLY
EVALUATED; VALUE-ADDED METHODS TO ASSESS TEACHERS NOT READY FOR USE 
IN HIGH-STAKES DECISIONS

WASHINGTON -- The Race to the Top initiative -- a $4.35 billion grant program included in 
the American Recovery and Reinvestment Act to encourage state-level education reforms -- 
should require rigorous evaluations of the reform efforts it funds, says a new report from 
the National Research Council.  The initiative should support research based on data that
 links student test scores with their teachers, but should not prematurely promote the use of 
value-added approaches -- which evaluate teachers based on gains in their students' 
performance -- to reward or punish teachers.  Too little is known about the accuracy of these 
methods to base high-stakes decisions on them right now, the report says.

The U.S. Department of Education is developing regulations that explain how the $4.35 billion
 will be awarded.  The National Research Council's report offers recommendations to help
 the department revise these guidelines.

The report strongly supports rigorous evaluations of programs funded by the Race to the 
Top initiative.  Only with careful evaluations -- which allow effective reforms to be identified 
and perhaps used elsewhere -- can the initiative have a lasting impact.  Without them, any 
benefits of this one-time expenditure on innovation are likely to end when the funding ends, 
the report says. 

Evaluations must be appropriate to the specific program being assessed and will be easier 
to design if grantees provide a "theory of action" for any proposed reform -- a logical chain 
of reasoning explaining how the innovation will lead to improved student learning. Evaluations 
should be designed before programs begin so baseline data can be collected; they should 
also provide short-term feedback to aid midcourse adjustments and long-term data to judge 
the program's impact.  While standardized tests are helpful in measuring a reform's effects, 
evaluations should rely on multiple indicators of what students know and can do, not just a 
single test score, the report adds.

The Department of Education's proposed guidelines encourage states to create systems
 that link data on student achievement to teachers.  The report applauds this step, arguing 
that linking this data is essential to conducting research about the best ways to evaluate teachers. 

One way of evaluating teachers, currently the subject of intense interest and research, is the 
value-added approach, which typically compares a student's scores going into a grade with 
his or her scores coming out of it, in order to assess how much "value" a year with a 
particular teacher added to the student's educational experience.  The report expresses 
concern that the department's proposed regulations place excessive emphasis on value-added 
approaches.  Too little research has been done on these methods' validity to base high-stakes 
decisions about teachers on them.  A student's scores may be affected by many factors other 
than a teacher -- his or her motivation, for example, or the amount of parental support -- and 
value-added techniques have not yet found a good way to account for these other elements.
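
To make concrete what the press release is describing, here is a bare-bones sketch of the gain-score idea behind value-added estimates, using made-up numbers. This is an illustration only, not the models states actually use: real value-added systems rely on multilevel regressions with covariate adjustments, shrinkage, and error estimates, and the report's whole point is that even those are not yet validated for high-stakes use. Every teacher label and test score below is hypothetical.

# Minimal sketch (Python) of a naive gain-score "value-added" estimate.
# All data are hypothetical; real models adjust for many student-level
# factors and still, per the report, are not ready for high-stakes use.
from statistics import mean

# Hypothetical (fall, spring) test-score pairs, grouped by teacher.
scores = {
    "Teacher A": [(310, 335), (290, 300), (305, 320)],
    "Teacher B": [(300, 308), (315, 318), (295, 305)],
}

# Average score gain across all students, used as the comparison point.
all_gains = [post - pre for pairs in scores.values() for pre, post in pairs]
overall_mean_gain = mean(all_gains)

for teacher, pairs in scores.items():
    gains = [post - pre for pre, post in pairs]
    # Naive "value-added": how much this teacher's average student gain
    # exceeds the overall average gain.  It ignores student motivation,
    # parental support, class composition, and measurement error, which
    # is exactly the concern the report raises.
    value_added = mean(gains) - overall_mean_gain
    print(f"{teacher}: mean gain {mean(gains):.1f}, naive value-added {value_added:+.1f}")

Even in this toy version, the "value" attributed to a teacher is just a difference of noisy averages over a handful of students, which is why the report warns against hanging rewards or punishments on it.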

The report also cautions against using the National Assessment of Educational Progress, a 
federal assessment that helps measure overall U.S. progress in education, to evaluate programs 
funded by the Race to the Top initiative.  NAEP surveys the knowledge of students across the 
nation in three grades with respect to a broad range of content and skills and is not aligned 
with the curriculum of any particular state.  Although effective at monitoring broad trends, it is 
not designed to detect the specific effects of targeted interventions like those to be funded by 
Race to the Top.

The study was sponsored by the National Research Council.  The National Academy of 
Sciences, National Academy of Engineering, Institute of Medicine, and National Research 
Council are private, nonprofit institutions that provide science, technology, and health policy 
advice under a congressional charter.  The Research Council is the principal operating agency 
of the National Academy of Sciences and the National Academy of Engineering.  A committee 
roster follows.
                                                                                                                                                                                                                      
Copies of the report on the RACE TO THE TOP FUND are available from the National Academies 
Press; tel. 202-334-3313 or 1-800-624-6242 or on the Internet at HTTP://WWW.NAP.EDU.  Reporters 
may obtain a copy from the Office of News and Public Information (contacts listed above).


#       #       #

[ This news release and report are available at HTTP://NATIONAL-ACADEMIES.ORG ]

NATIONAL RESEARCH COUNCIL
Division of Behavioral and Social Sciences and Education
Board on Testing and Assessment

EDWARD HAERTEL (CHAIR)
Jacks Family Professor of Education, and
Associate Dean for Faculty Affairs
School of Education
Stanford University
Stanford, Calif.

LYLE F. BACHMAN
Professor and Chair
Department of Applied Linguistics and TESOL
University of California 
Los Angeles

STEPHEN B. DUNBAR
Professor of Educational Measurement and Statistics
College of Education
University of Iowa
Iowa City

DAVID J. FRANCIS
Hugh Roy and Lillie Cranz Cullen Distinguished Professor and Director
Department of Psychology
University of Houston
Houston 

ARTHUR S. GOLDBERGER
Professor Emeritus
Department of Economics
University of Wisconsin
Madison

MICHAEL HOUT*
Professor of Sociology
Graduate Group in Sociology and Demography
University of California
Berkeley

MICHAEL KANE
Samuel J. Messick Chair in Test Validity
Research and Development Division
Educational Testing Service
Princeton, N.J.

KEVIN LANG
Professor of Economics and Chair
Department of Economics
Boston University
Boston

MICHAEL NETTLES
Senior Vice President for Policy Evaluation and Research
Policy Evaluation and Research Center
Educational Testing Service
Princeton, N.J.

DIANA PULLIN
Professor
Lynch School of Education
Boston College
Chestnut Hill, Mass.

BRIAN STECHER
Senior Social Scientist
Education Program
RAND
Santa Monica, Calif.

MARK R. WILSON
Professor of Policy, Organization, Measurement, and Evaluation, and of Cognition and Development
Graduate School of Education
University of California
Berkeley

REBECCA ZWICK
Professor of Education, and
Director of Research Methodology
Gevirtz Graduate School of Education
University of California
Santa Barbara

RESEARCH COUNCIL STAFF

STUART ELLIOTT
Study Director

JUDITH ANDERSON KOENIG
Senior Program Officer

                                                                       
* Member, National Academy of Sciences

1 comment:

  1. Anonymous, 10:44 PM

    I am speechless, but not surprised. Just outraged. What a scam all of this has been. Good news but man do we have a battle ahead of us to reverse the insanity!
