
Budget conference committee takes backward steps

If one had hoped that the budget conference committee would take the Governor, House and Senate education policy plans and blend them into a better product, those hopes were dashed yesterday.

The budget continues to disinvest in Ohio's public education system to the tune of $532.7 million compared to 2010-2011 funding levels. To add further insult to that injury, in order to pass along income tax cuts to Ohio's wealthiest citizens, the GOP-controlled legislature is also eliminating the 12.5% property tax rollback. A homeowner would face paying an additional $4.38 per mill for every $100,000 in taxable property value on new levies - making those levies a tougher sell for struggling schools.
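A minimal sketch of where the $4.38-per-mill figure comes from. We are assuming it rests on Ohio's standard 35% residential assessment ratio; the bill itself doesn't spell out the arithmetic:

```python
# Sketch of the arithmetic behind the $4.38-per-mill figure quoted above.
# Assumes Ohio's 35% assessment ratio: taxable (assessed) value is 35% of
# market value, and one mill is $1 of tax per $1,000 of assessed value.
market_value = 100_000                      # home's market value in dollars
assessed_value = market_value * 0.35        # $35,000 of taxable value
tax_per_mill = assessed_value / 1_000       # $35 owed per mill levied
rollback_share = 0.125                      # the eliminated 12.5% rollback
extra_cost = tax_per_mill * rollback_share  # new out-of-pocket cost per mill
print(f"${extra_cost:.2f} per mill")        # $4.38 per mill
```

With the rollback in place the state paid that 12.5% share on the homeowner's behalf; eliminating it shifts the amount onto the homeowner for every new levy passed.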

In other areas of education policy, the conference committee failed too. The Senate had proposed reducing the weight of value-added in a teacher's evaluation from 50% to 35%. However, the conference committee reversed that policy improvement, leaving the absurd over-reliance on value-added in place at 50%. Furthermore, the Senate had proposed excluding from teachers' evaluations the scores of students with 30 or more days of unexcused absence, down from the current law's 60 days. The conference committee reset that to an objectionable 45 days. For reference, the Ohio Revised Code states that a student is chronically truant after only 15 days of unexcused absence - why any teacher should be evaluated based on chronically truant students can only be explained by the legislature wanting to be punitive towards educators.

According to Gongwer:

Conferees did adopt some last minute tweaks to the school funding that Republicans said would steer some additional money to poorer urban and rural districts.

One amendment would shift some funding from the K-3 literacy fund for all schools to economically disadvantaged districts and charter schools, according to House Republican policy aide Colleen Grady. However, the revision would not significantly alter the bottom line on K-12 spending.

So in order to more adequately fund rural school districts, the legislature decided not to add more money to the pot but to shift money from its own 3rd grade reading guarantee. This isn't education policy, it is madness.

Other notable changes

  • Revise the enrollment count for funding traditional school districts by switching to an annualized process that would be updated three times a year starting in 2015.
  • Remove a funding guarantee for charter schools rated "excellent" for three years consecutively.
  • Subject private school students to state testing requirements if more than 65% of the population uses state vouchers, while allowing pupils not on scholarships to opt out of the exams.
  • Specify that homeschooled children and students moving into Ohio could obtain EdChoice vouchers if they live in an eligible school district.
  • Ensure that students attending a STEM school can participate in extracurricular activities in their resident schools.
  • Create an advisory committee to guide distribution of the Straight A grant program funds and advise the governing board.
  • Cap Straight A fund awards at $5 million for a single grantee and $15 million for a consortium, while allowing the Controlling Board to approve higher amounts.

Value-added: How Ohio is destroying a profession

We ended the week last week with a post titled "The 'fun' begins soon", which took a look at the imminent changes to education policy in Ohio. We planned on detailing each of these issues over the next few weeks.

Little did we know that the 'fun' would begin that weekend. It came in the manner of the Cleveland Plain Dealer and NPR publishing a story on the changing landscape of teacher evaluations titled "Grading the Teachers: How Ohio is Measuring Teacher Quality by the Numbers".

It's a solid, long piece, worth the time taken to read it. It covers some, though not all, of the problems of using value-added measurements to evaluate teachers:

Those ratings are still something of an experiment. Only reading and math teachers in grades four to eight get value-added ratings now. But the state is exploring how to expand value-added to other grades and subjects.

Among some teachers, there’s confusion about how these measures are calculated and what they mean.

“We just know they have to do better than they did last year,” Beachwood fourth-grade teacher Alesha Trudell said.

Some of the confusion may be due to a lack of transparency around the value-added model.

The details of how the scores are calculated aren’t public. The Ohio Education Department will pay a North Carolina-based company, SAS Institute Inc., $2.3 million this year to do value-added calculations for teachers and schools. The company has released some information on its value-added model but declined to release key details about how Ohio teachers’ value-added scores are calculated.

The Education Department doesn’t have a copy of the full model and data rules either.

The department’s top research official, Matt Cohen, acknowledged that he can’t explain the details of exactly how Ohio’s value-added model works. He said that’s not a problem.

Evaluating a teacher on a secret formula isn't a practice that can be sustained, supported or defended. The article further details a common theme we hear over and over again:

But many teachers believe Ohio’s value-added model is essentially unfair. They say it doesn’t account for forces that are out of their control. They also echo a common complaint about standardized tests: that too much is riding on these exams.

“It’s hard for me to think that my evaluation and possibly some day my pay could be in a 13-year-old’s hands who might be falling asleep during the test or might have other things on their mind,” said Zielke, the Columbus middle school teacher.

The article also analyzes several thousand value-added scores, and that analysis demonstrates what we have long reported - that value-added is a poor indicator of teacher quality, with too many external factors affecting the score:

A StateImpact/Plain Dealer analysis of initial state data suggests that teachers with high value-added ratings are more likely to work in schools with fewer poor students: A top-rated teacher is almost twice as likely to work at a school where most students are not from low-income families as in a school where most students are from low-income families.
[…]
Teachers say they’ve seen their value-added scores drop when they’ve had larger classes. Or classes with more students who have special needs. Or more students who are struggling to read.

Teachers who switch from one grade to another are more likely to see their value-added ratings change than teachers who teach the same grade year after year, the StateImpact/Plain Dealer analysis shows. But their ratings went down at about the same rate as teachers who taught the same grade level from one year to the next and saw their ratings change.

What are we measuring here? Surely not teacher quality, but rather socioeconomic factors and budget conditions of the schools and their students.

Teachers are intelligent people, and they are going to adapt to this knowledge in lots of unfortunate ways. It will become progressively harder for districts with poor students to recruit and retain the best teachers. But perhaps the most pernicious effect is captured at the end of the article:

Stephon says the idea of Plecnik being an ineffective teacher is “outrageous.”

But Plecnik is through. She’s quitting her job at the end of this school year to go back to school and train to be a counselor — in the community, not in schools.

Plecnik was already frustrated by the focus on testing, mandatory meetings and piles of paperwork. She developed medical problems from the stress of her job, she said. But receiving the news that despite her hard work and the praise of her students and peers the state thought she was Least Effective pushed her out the door.

“That’s when I said I can’t do it anymore,” she said. “For my own sanity, I had to leave.”

The Cleveland Plain Dealer and NPR then decided to add to this stress by publishing individual teachers' value-added scores - a matter we will address in our next post.

The Three Biggest TFA Lies

When I was a kid, around ten years old I guess, my father told me a joke that began with the question “What are the three biggest lies?” I said I didn’t know and he proceeded to tell me that the first biggest lie is “The check is in the mail,” which as a ten-year-old I really didn’t get. The second biggest lie was, apparently, “Some of my best friends are Black,” which also didn’t make much sense coming from my father, considering that some of his best friends were, in fact, Black. The third, well, was a bit too X-rated for this blog, and definitely for me as a ten-year-old. Not everyone is a perfect parent, I know, and I don’t hold this against him, though I do try to limit his unsupervised time with my own two kids.

As someone who is, I suppose, a big “friendly critic” (an expression TFA coined to describe the growing number of frustrated alumni) of TFA, I think the biggest problem with TFA is all the lying. Though the individual people I’ve known on staff aren’t huge liars themselves, the sum of all the lies adds up to an organization whose lying is pathological. Really, they’ve elevated the art of lying to new heights, much the way Mozart elevated the concerto. Even people like Bernie Madoff who thought they were great liars can’t help but marvel at TFA’s techniques.

[readon2 url="http://garyrubinstein.teachforus.org/2013/04/30/the-three-biggest-tfa-lies/"]Continue reading...[/readon2]

Improving the Budget Bill Part I

HB 59, the Governor's budget bill, can be significantly improved during the legislative process. We're going to detail some of the ways improvements can be made.

Improvements can first start by correcting a major policy flaw inserted into HB 555 at the last minute. HB 555 radically changed the method of calculating evaluations for about 1/3 of Ohio's teachers. If a teacher's schedule is comprised only of courses or subjects for which the value-added progress dimension is applicable, then their value-added score alone must now make up the 50% of the evaluation based on student growth. Gone is the ability to use multiple measures of student growth - i.e., Student Learning Objectives, or SLOs.
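To make the mechanics concrete, here is a minimal, hypothetical sketch of the HB 555 rule as we understand it; the function name and data shapes are our own invention, not anything in the statute:

```python
# Hypothetical sketch of the HB 555 rule described above: when every course
# on a teacher's schedule is covered by the value-added progress dimension,
# the student-growth half of the evaluation must be value-added alone.
def growth_measures(course_is_value_added: list[bool]) -> list[str]:
    if course_is_value_added and all(course_is_value_added):
        return ["value-added"]  # multiple measures (e.g. SLOs) no longer allowed
    # A mixed or non-applicable schedule still permits multiple measures.
    return ["value-added", "SLOs", "other local measures"]

print(growth_measures([True, True]))   # schedule of only VA-applicable courses
print(growth_measures([True, False]))  # mixed schedule keeps multiple measures
```

The point of the sketch is the branch: roughly a third of Ohio's teachers fall into the first case and lose access to every growth measure except value-added.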

Therefore we suggest the legislature correct this wrong-headed policy by repealing this provision of HB555.

Furthermore, greater evaluation fairness could be achieved by lowering the number of absences a student is allowed before their test scores can be excluded from a teacher's value-add score. Currently a student needs to be absent 60 times - or 1/3 of a school year. This is an absurd amount of schooling to miss and still have that student's score count towards the evaluation of his or her teacher. This absence exclusion should be lowered to a more reasonable 15 absences.

Value-added should not be used to punish teachers on evaluations; instead it should be just one component of a multiple-measure framework, and a tool to help teachers improve student learning. HB555 moved us much further away from that goal.

ODE publishes propaganda

prop·a·gan·da
/ˌpräpəˈgandə/
Noun
1. Information, esp. of a biased or misleading nature, used to promote or publicize a particular political cause or point of view.
2. The dissemination of such information as a political strategy.

That aptly describes the latest document published by the Ohio Department of Education, titled "Myths vs. Facts about the Ohio Teacher Evaluation System". The document lists 10 alleged myths about the teacher evaluation system being created. We thought we'd take a closer look at some of these alleged "myths".

1. Myth: The state is telling us what to do in local evaluations.

ODE, under a bulleted list discussing local board flexibility in creating evaluations, states: "The percentages within the given range for student growth measures for the teachers in that district;" This is no longer true for teachers who have value-added scores. These teachers (over 30% of Ohio's teaching corps) will have 50% of their evaluation based on student test scores. On this, local boards have zero flexibility; it's a state mandate. We judge aspects of this myth to actually be true.

2. Myth: This is just a way to fire teachers.

ODE goes to great lengths to discuss how these evaluations will be great for teachers in identifying areas of improvement (though no money has been allocated for professional development). Utterly lacking is any discussion of the provision within HB153 that prohibits giving preference based on seniority in determining the order of layoffs, or in rehiring teachers when positions become available again, except when choosing between teachers with comparable evaluations. It is no secret that corporate education reformers such as Michelle Rhee desperately want to use evaluations as the basis for firing what they purportedly measure to be "ineffective" teachers. After all, this is exactly the process used in Washington, DC, where she came from. It's far too soon to call this a myth; it's more like a corporate education reformer's goal.

3. Myth: One test in the spring will determine my fate.

It's nice that ODE stresses the importance of using multiple measures, but once again it fails to acknowledge that HB555 removed those multiple measures for 30% of Ohio's teachers. For those teachers, their fate will be determined by tests. This myth is therefore true.

5. Myth: The state has not done enough work on this system – there are too many unanswered questions.

How can it be a myth when even this document fails to state that "we're ready"? SLOs have yet to be developed, Common Core is almost upon us but no one knows what the tests will be, the legislature keeps changing the rules of the game, and nowhere near enough evaluator training has taken place to evaluate all of Ohio's teachers. Ohio isn't ready for this, and that's a fact, not a myth.

6. Myth: “Value-Added” is a mysterious formula and is too volatile to be trusted.

This is perhaps one of the most egregious points of all. Study after study after study has demonstrated that value-added is volatile, unreliable, and inappropriate for measuring teacher effectiveness. ODE's explanation conflates the use of value-added as a diagnostic tool with its use in evaluating teachers. Those are two very different use cases indeed.

As for it being mysterious, the formula used in Ohio is secret and proprietary - it doesn't get more mysterious than that! This claim by ODE is simply untrue and ridiculous; they ought to be embarrassed for publishing it. This myth is totally true and real, backed up by all the available scientific evidence.

7. Myth: The current process for evaluating teachers is fine just as it is.

Their explanation: "Last year, 99.7 percent of teachers around the country earned a “satisfactory” evaluation, yet many students didn’t make a year’s worth of progress in reading and are not reading at grade level." Right out of the corporate education reformers' message book. Blame the teacher. Still think this isn't going to end up being about firing teachers? This myth is a straw man; no one argues the current system is ideal, but the proposed OTES is dangerously constructed.

8. Myth: Most principals (or other evaluators) don’t have time to do this type of evaluation, so many will just report that teachers are proficient.

ODE states "Fact: Most principals are true professionals who want the teachers in their buildings to do well." But wait a minute, in Myth #7 these very same principals were handing out "satisfactory" grades like candy to 99.7% of teachers. Which is it? Are they professionals who can fairly evaluate teachers, or aren't they? We wrote about the massive administrative task faced by school administrators almost 2 years ago. Nothing has happened to alleviate those burdens, other than a $2 billion budget cut. This myth is 100% true.

9. Myth: This new evaluation system is like building the plane while we’re flying it.

ODE states: "Fact: Just as the Wright brothers built a plane, tried it by flying it, landed it, and then refined the plane they built, the new evaluation system was built, tried and revised. "

We'll just point out that 110 years have passed since the Wright Brothers first flew and the world has developed better design and project management tools since then.

10. Myth: It will be easy to implement the new teacher evaluation system.

Has anyone, anywhere said this? Or did the ODE brainstorming session run out of bad ideas at 9, and this is all they could come up with? Talk about ending with a straw man, which frankly, given the rest of the document, is probably the most appropriate ending.

ODE ought to withdraw this piece of propaganda from public view.

Catastrophic failure of teacher evaluations in TN

If the ongoing teacher evaluation failure in Tennessee is any guide, Ohio has some rough waters ahead. Tennessee's recently passed system is very similar to Ohio's.

It requires 50 percent of the evaluation to be comprised of student achievement data—35 percent based on student growth as represented by the Tennessee Value-Added Assessment System (TVAAS) or a comparable measure and the other 15 percent based on additional measures of student achievement adopted by the State Board of Education and chosen through mutual agreement by the educator and evaluator. The remaining 50 percent of the evaluation is determined through qualitative measures such as teacher observations, personal conferences and review of prior evaluations and work.
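The weighting just described can be sketched as a simple composite; the component scores below are invented purely for illustration:

```python
# Tennessee's evaluation weights as described in the passage above:
# 35% TVAAS growth, 15% other achievement measures, 50% qualitative measures.
weights = {"tvaas": 0.35, "achievement": 0.15, "qualitative": 0.50}

# Hypothetical component scores on the state's 1-5 scale: a teacher rated
# highly by observers (5) but poorly by the value-added model (2).
scores = {"tvaas": 2.0, "achievement": 3.0, "qualitative": 5.0}

composite = sum(weights[k] * scores[k] for k in weights)
print(f"{composite:.2f}")  # 0.35*2 + 0.15*3 + 0.50*5 = 3.65
```

Even in this toy example the tension the state report describes is visible: the single TVAAS number drags a 5-rated observation score down to a middling composite.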

Tennessee’s new way of evaluating classrooms “systematically failed” to identify bad teachers and provide them more training, according to a state report published Monday.

The Tennessee Department of Education found that instructors who got failing grades when measured by their students’ test scores tended to get much higher marks from principals who watched them in classrooms. State officials expected to see similar scores from both methods.

“Evaluators are telling teachers they exceed expectations in their observation feedback when in fact student outcomes paint a very different picture,” the report states. “This behavior skirts managerial responsibility.”

The education administration in Tennessee is pointing the finger at the in-classroom evaluations, but as one commenter on the article notes,

Perhaps what we are seeing with these disparities is not a need to retrain the evaluators, but rather confirmation of what many know but the Commissioner and other proponents of this hastily conceived evaluation system refuse to see -- the evaluation criteria mistakenly relies too much on TVAAS scores when they do not in fact accurately measure teacher effectiveness.

It has been stated over and over that the use of value-added at the teacher level is not appropriate - it is subject to too much variation, instability, and error. Yet when these oft-warned-about problems continue to manifest, they are ignored, excused, and other factors scapegoated instead.

As if to make matters worse, the report (read it below) suggests that "school-wide value-added scores should be based on a one-year score rather than a three-year score. While it makes sense, where possible, to use three-year averages for individuals because of smaller sample sizes, school-wide scores can and should be based on one-year data."

So how did the value-added scores stack up against observations, with 1 being the lowest grade and 5 the highest?

Are we really supposed to believe that a highly educated and trained workforce such as teachers is failing at a 24.6% rate (grades of 1 or 2)? Not even the most ardent corporate education reformer has claimed that kind of number! It becomes even more absurd when one looks at student achievement. It's hard to argue that a quarter of the workforce is substandard when student achievement is at record highs.

Instead it seems more reasonable that a more modest 2%-3% of teachers are ineffective and that the observations by professional, experienced evaluators are accurately capturing that situation.

Sadly, nowhere in the Tennessee report is there a call for further analysis of its value-added calculations.

Teacher Evaluation in Tennessee: A Report on Year 1 Implementation