According to the Ohio Department of Education, the average teacher salary in 2013 was $56,715. For employees with advanced degrees, that doesn't even crack the top ten fields when looking just at starting pay:
1. Computer science: Average starting pay: $73,700
2. Business administration/management: Average starting pay: $69,200
3. Mechanical engineering: Average starting pay: $66,800
4. Electrical/electronics and communications engineering: Average starting pay: $66,100
5. Finance: Average starting pay: $64,300
6. Nursing: Average starting pay: $63,800
7. Economics (business/managerial): Average starting pay: $63,400
8. Health and related sciences: Average starting pay: $62,900
9. Accounting: Average starting pay: $62,300
Teenage babysitters could earn more. Literally.
That's right. Let's pay teachers $3.00 an hour, and only for the hours they actually work -- none of that silly planning time, or any time spent before or after school. From 7:45 AM to 3:00 PM, with 45 minutes off for lunch and planning, that's 6-1/2 hours a day, or $19.50. Suddenly $56,715 turns out to be quite the deal.
So each parent should pay $19.50 a day for these teachers to baby-sit their children. How many students do they teach in a day? Maybe 30? So that's $19.50 x 30 = $585 a day.
But remember, they only work 180 days a year! We're not going to pay them for any vacations.
That's $585 x 180 = $105,300 per year. (Hold on! My calculator needs new batteries.)
What about special education teachers and the ones with Master's degrees? Well, we could pay them minimum wage ($7.75) and, just to be fair, round it up to $8.00 an hour. That would be $8 x 6-1/2 hours x 30 children x 180 days = $280,800 per year.
Wait a minute -- there's something wrong here!
A recently published study in the Journal of Labor Economics looked at performance incentives for teachers in the New York City school system. The results should give corporate reformers pause:
Providing financial incentives for teachers to increase student performance is an increasingly popular education policy around the world. This paper describes a school-based randomized trial in over two-hundred New York City public schools designed to better understand the impact of teacher incentives on student achievement. I find no evidence that teacher incentives increase student performance, attendance, or graduation, nor do I find any evidence that the incentives change student or teacher behavior. If anything, teacher incentives may decrease student achievement, especially in larger schools. The paper concludes with a speculative discussion of theories that may explain these stark results.
As Margarita Pivovarova, Assistant Professor of Economics at Arizona State University, notes:
The estimates from the experiment imply that if a student attended a middle school with an incentive in place for three years, his/her math test scores would decline by 0.138 of a standard deviation and his/her reading score would drop by 0.09 of a standard deviation.
Not only that, but the incentive program had no effect on teachers' absenteeism or retention in the school or district, nor did it affect teachers' perception of the learning environment in a school. The estimated $75 million invested and spent brought literally zero return!
(Via Larry Ferlazzo)
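To make those standard-deviation declines concrete, here is a quick sketch (my own illustration, not from the study) that converts the reported effect sizes into percentile terms, under the simplifying assumption that test scores are normally distributed:

```python
from statistics import NormalDist

# Declines reported after three years in an incentive-program middle school
math_decline_sd = 0.138      # math scores, in standard deviations
reading_decline_sd = 0.09    # reading scores, in standard deviations

# Assuming roughly normal scores, a student who started at the 50th
# percentile would end up at these percentiles after the decline.
norm = NormalDist()  # standard normal: mean 0, standard deviation 1
math_pct = norm.cdf(-math_decline_sd) * 100
reading_pct = norm.cdf(-reading_decline_sd) * 100

print(f"Math: 50th percentile -> {math_pct:.1f}th percentile")
print(f"Reading: 50th percentile -> {reading_pct:.1f}th percentile")
```

In other words, a median student slips to roughly the 44th-45th percentile in math and the 46th in reading -- a decline, paid for with incentive money.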
It’s not uncommon to hear someone inaccurately state that the teacher has the biggest influence on student achievement — period. Of course, the true statement is that — of the in-school factors — teachers have the biggest influence. On top of that, research has shown that over two-thirds of the factors that influence student achievement occur out of school.
To illustrate this, here's a pie chart:
There are lots of good links demonstrating the evidence behind this pie chart at the link.
If we're primarily focusing on teacher quality to increase student achievement, we're looking in the wrong place. It's why corporate education reforms are doomed to failure.
Susanna Loeb, Professor of Education at Stanford University and Faculty Director of the Center for Education Policy Analysis, has a brief published by the Carnegie Foundation for the Advancement of Teaching:
The question for this brief is whether education leaders can use value-added measures as tools for improving schooling and, if so, how to do this. Districts, states, and schools can, at least in theory, generate gains in educational outcomes for students using value-added measures in three ways: creating information on effective programs, making better decisions about human resources, and establishing incentives for higher performance from teachers. This brief reviews the evidence on each of these mechanisms and describes the drawbacks and benefits of using value-added measures in these and other contexts.
The brief concludes:
- Value-added measures are not a good source of information for helping teachers improve because they provide little information on effective and actionable practices.
- School, district, and state leaders may be able to improve teaching by using value-added to shape decisions about programs, human resources, and incentives.
- Value-added measures of improvement are more precise for groups of teachers than for individual teachers; thus they may provide useful information on improvement associated with practices, programs, or schools.
- Many incentive programs for staff performance that are based on student performance have not shown benefits. Research points to the difficulty of designing these programs well and maintaining them politically.
- Value-added measures for selecting and improving programs, for informing human resource decisions, and for incentives are likely to be more useful when they are combined with other measures.
- We still have only a limited understanding of how best to use value-added measures in combination with other measures as tools for improvement.
The use of value-added measures to evaluate individual teachers, especially when connected to high-stakes personnel decisions, is impossible to defend if one is guided by the research evidence and the disastrous practical applications.
The full brief can be read below.
HOW CAN VALUE-ADDED MEASURES BE USED FOR TEACHER IMPROVEMENT?
Washington, DC has long been the poster child for high-stakes tests used to label teachers as successes or failures. Now news comes that errors in the value-added formulas used to measure these apparent successes or failures resulted in 44 teachers being incorrectly labeled and, as a consequence, one teacher being fired.
More than 40 teachers in D.C. public schools received incorrect evaluations for 2012-2013 because of errors in the way the scores were calculated, and one was fired as a result.
44 teachers may not sound like a lot, but it turns out to be a significant percentage of the teachers evaluated this way:
The president of the Washington Teachers’ Union, Elizabeth A. Davis, has asked for details from D.C. Schools Chancellor Kaya Henderson in a letter (text below) that says that the problems were found by Mathematica Policy Research, a partner of the school system’s. The mistakes were found in the individual “value added” scores for teachers, which are calculated through a complicated formula that includes student standardized test scores.
This “VAM” formula is part of the evaluation system called IMPACT, begun under former chancellor Michelle Rhee in 2009. Henderson, Rhee’s successor, continued with IMPACT, though this year she reduced the amount of weight given to test scores from a mandatory 50 percent to at least 35 percent. (See below for IMPACT chart).
Testing experts have long warned that using test scores to evaluate teachers is a bad idea, and that these formulas are subject to error, but such evaluation has become a central part of modern school reform.
Those affected are about 1 percent of about 4,000 teachers in the school system. But they comprise nearly 10 percent of the teachers whose work is judged in part on annual city test results for their classrooms.
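The quoted percentages are easy to sanity-check. The article doesn't give the exact number of teachers with value-added scores, so that figure is backed out below as an assumption from the "nearly 10 percent" claim:

```python
# Figures from the quoted Washington Post article
mislabeled = 44
total_teachers = 4000  # "about 4,000 teachers in the school system"

# Share of all DC teachers who received incorrect evaluations
share_of_all = mislabeled / total_teachers * 100
print(f"Share of all DC teachers: {share_of_all:.1f}%")  # about 1 percent

# The 44 were "nearly 10 percent" of teachers judged partly on test
# results, which implies roughly this many such teachers (an estimate):
implied_va_teachers = mislabeled / 0.10
print(f"Implied teachers with value-added scores: ~{implied_va_teachers:.0f}")
```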
When an evaluation system is so poor that 1 in 10 of its results are error-riddled and a teacher is wrongfully terminated as a consequence, the system needs to be put on hold. Here in Ohio, the results could be even worse.
Ohio uses a secret, proprietary, for-profit formula.
Some of the confusion may be due to a lack of transparency around the value-added model.
If something similar were to happen in Ohio, which is highly probable, no one would be any the wiser, because no one can double-check the work of SAS Institute Inc. -- not even ODE, which, remarkably, doesn't even seem to care.
The details of how the scores are calculated aren't public. The Ohio Department of Education will pay a North Carolina-based company, SAS Institute Inc., $2.3 million this year to do value-added calculations for teachers and schools. The company has released some information on its value-added model but declined to release key details about how Ohio teachers' value-added scores are calculated.
The Education Department doesn't have a copy of the full model and data rules either.
The department's top research official, Matt Cohen, acknowledged that he can't explain the details of exactly how Ohio's value-added model works. He said that's not a problem.
"It's not important for me to be able to be the expert," he said. "I rely on the expertise of people who have been involved in the field."
The formula for calculating value-added scores for Ohio's teachers must be open to inspection so that our teachers are not falsely named, shamed, and fired as they are being in Washington, DC.
A recent study by the Center for Economic and Policy Research looked into the effects of unionization on women in the workplace. The differences between unionized and non-unionized women, even those with a college degree, are dramatic:
Being in or represented by a union compared well to completing college in terms of wages, especially once tuition costs are factored in. All else equal, being in a union raises a woman's pay as much as a full year of college does. For the average female worker, a four-year college degree boosts wages by over half (51.9 percent) relative to a similar woman who has only a high school degree. In comparison, unionization raises a woman’s pay by 14.7 percent – over one-quarter of the effect of a college degree.
The union impact on the probability that a female worker has health insurance or a retirement plan through her employer was even larger than the impact on wages. At every education level, unionized women are more likely to have employee benefits than their non-union counterparts with similar characteristics. In fact, for a woman worker with a high school degree, being in or represented by a union raises her likelihood of having health insurance or a retirement plan by more than earning a four-year college degree would.
The union advantage is largest when looking at employer-provided retirement plans. Women in or represented by unions are 53.4 percent more likely to have pension coverage than those not in unions, which is also larger than the corresponding effect of a four-year college degree (43.6 percent).
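Putting the report's figures side by side makes the comparison plain (a simple arithmetic check of the quoted numbers, nothing more):

```python
# Wage and benefit effects for women, as quoted from the CEPR report
college_wage_boost = 51.9     # % wage gain from a four-year degree
union_wage_boost = 14.7       # % wage gain from union membership/representation
union_pension_boost = 53.4    # % higher likelihood of pension coverage (union)
college_pension_boost = 43.6  # % higher likelihood of pension coverage (degree)

# The union wage effect as a fraction of the college wage effect
wage_ratio = union_wage_boost / college_wage_boost
print(f"Union wage effect is {wage_ratio:.0%} of the college effect")  # over one-quarter

# For retirement coverage, the union effect actually exceeds the college effect
print(f"Pension coverage: union {union_pension_boost}% vs college {college_pension_boost}%")
```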
You can read the full report here. One thing is clear: so-called "right-to-work" is very bad for everyone, but especially for women.