Washington DC has long been the poster child for high-stakes tests used to label teachers as successes or failures. Now news comes that errors in the value-added formulas used to measure these apparent successes or failures resulted in 44 teachers being incorrectly labeled, and as a consequence, one teacher was fired.
More than 40 teachers in D.C. public schools received incorrect evaluations for 2012-2013 because of errors in the way the scores were calculated and one was fired as a result.
The president of the Washington Teachers’ Union, Elizabeth A. Davis, has asked for details from D.C. Schools Chancellor Kaya Henderson in a letter (text below) that says that the problems were found by Mathematica Policy Research, a partner of the school system’s. The mistakes were found in the individual “value added” scores for teachers, which are calculated through a complicated formula that includes student standardized test scores.
This “VAM” formula is part of the evaluation system called IMPACT, begun under former chancellor Michelle Rhee in 2009. Henderson, Rhee’s successor, continued with IMPACT, though this year she reduced the amount of weight given to test scores from a mandatory 50 percent to at least 35 percent. (See below for IMPACT chart).
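To make the mechanics concrete, here is a minimal sketch of how a value-added component might feed into a weighted evaluation like IMPACT. The real DCPS formula built by Mathematica is far more complicated and not fully public, so every function, number, and weight below is purely illustrative.

```python
# Hypothetical sketch only: the actual IMPACT/VAM formula is not public.

def value_added(actual_scores, predicted_scores):
    """Simplified value-added: mean difference between students' actual
    test scores and the scores a statistical model predicted for them."""
    diffs = [a - p for a, p in zip(actual_scores, predicted_scores)]
    return sum(diffs) / len(diffs)

def overall_score(vam_component, other_component, vam_weight):
    """Weighted overall evaluation score (both components on a 0-100 scale)."""
    return vam_weight * vam_component + (1 - vam_weight) * other_component

# A hypothetical teacher whose students slightly outperform the model.
vam = value_added([72, 68, 80], [70, 65, 78])  # mean gain of about 2.3 points

# The same components give different results depending on whether test
# scores count for a mandatory 50 percent or at least 35 percent.
old = overall_score(vam_component=60, other_component=80, vam_weight=0.50)
new = overall_score(vam_component=60, other_component=80, vam_weight=0.35)
print(old, new)  # 70.0 73.0
```

The point of the sketch is that a teacher's fate can hinge on a modeled prediction: any error in the predicted scores flows straight through to the final rating, which is exactly what happened to the 44 DC teachers.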
Testing experts have long warned that using test scores to evaluate teachers is a bad idea, and that these formulas are subject to error, but such evaluation has become a central part of modern school reform.
Forty-four teachers may not sound like a lot, but it turns out to be a significant percentage of the teachers evaluated this way.
Those affected are about 1 percent of about 4,000 teachers in the school system. But they comprise nearly 10 percent of the teachers whose work is judged in part on annual city test results for their classrooms.
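The percentages quoted above imply a figure worth making explicit: if 44 teachers are nearly 10 percent of those judged on classroom test results, then only roughly 440 of the district's 4,000 teachers are evaluated this way at all. A back-of-the-envelope check:

```python
# Figures from the article; the group size of ~440 is inferred, not reported.
affected = 44
total_teachers = 4000

pct_of_all = round(100 * affected / total_teachers, 1)
print(pct_of_all)  # 1.1 -- about 1 percent of all teachers

vam_judged = round(affected / 0.10)  # if 44 is ~10% of the VAM-judged group
print(vam_judged)  # roughly 440 teachers judged partly on test scores
```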
When an evaluation system is so poor that nearly 1 in 10 of its results are error-riddled, and a teacher is wrongfully terminated as a consequence, the system needs to be put on hold. Here in Ohio, the results could be even worse.
Some of the confusion may be due to a lack of transparency around the value-added model.
The details of how the scores are calculated aren't public. The Ohio Department of Education will pay a North Carolina-based company, SAS Institute Inc., $2.3 million this year to do value-added calculations for teachers and schools. The company has released some information on its value-added model but declined to release key details about how Ohio teachers' value-added scores are calculated.
The Education Department doesn't have a copy of the full model and data rules either.
The department's top research official, Matt Cohen, acknowledged that he can't explain the details of exactly how Ohio's value-added model works. He said that's not a problem.
"It's not important for me to be able to be the expert," he said. "I rely on the expertise of people who have been involved in the field."
If something similar were to happen in Ohio, no one would be any the wiser, because no one can double-check the work of SAS Institute Inc. Not even ODE can, and remarkably, it doesn't seem to care.
The formula for calculating value-added scores for Ohio's teachers must be open to inspection, so that our teachers are not falsely named, shamed, and fired as teachers in Washington DC have been.