Agenda-Driven Charter Boosters

The National Alliance for Public Charter Schools, like so many charter school boosters, appears to care more about the quantity of charter schools than about their quality.

This is clearly evident in its state rankings of charter school laws.

Nicole Blalock, PhD, a postdoctoral scholar at Arizona State University, writes:
In four of the states with a statistical difference between charter school students’ NAEP scores and public school students’ NAEP scores, statistical differences were observed for all grade/subject pairs tested. This occurred in the states of Alaska, Maryland, Ohio, and Pennsylvania.

On average, in Alaska, students attending charter schools outperformed students in public schools by approximately 10 points in most grade/subject area tests and by more than 20 points in reading in grade 4. However, the National Alliance for Public Charters that ranked the 42 states with charter schools and the District of Columbia as per their charter school laws, ranked Alaska nearly the lowest (i.e., 41st of 43) for the “best” charter laws (“Measuring up to the model”, 2013). Put differently, the state whose charter school students performed among the best as compared to their public school peers just happened to be one of the worst charter states as externally ranked.

Otherwise, public school students outperformed charter school students in the other three states (i.e., Maryland, Ohio, and Pennsylvania) with consistent and significant score differences across the board. Maryland was one of two states to be ranked lower than Alaska for the “best” charter laws overall (i.e., 42nd of 43), and Ohio and Pennsylvania ranked in the middle of the pack (27th and 19th of 43 respectively). Each of these states demonstrated charter school student performance that lagged behind public school students by an average of 23 points.
Until charter school boosters begin to care more about quality than quantity, Ohio will continue to be saddled with horribly performing charter schools that are not serving our students. The National Alliance for Public Charter Schools' state charter law rankings are absurd; judged on quality, Ohio should be ranked dead last.

If a teacher were simply a babysitter

According to the Ohio Department of Education, the average teacher salary in 2013 was $56,715. For employees with advanced degrees, that doesn't crack the top ten fields when looking just at starting pay:

1. Computer science: Average starting pay: $73,700
2. Business administration/management: Average starting pay: $69,200
3. Mechanical engineering: Average starting pay: $66,800
4. Electrical/electronics and communications engineering: Average starting pay: $66,100
5. Finance: Average starting pay: $64,300
6. Nursing: Average starting pay: $63,800
7. Economics (business/managerial): Average starting pay: $63,400
8. Health and related sciences: Average starting pay: $62,900
9. Accounting: Average starting pay: $62,300

Teenage babysitters could earn more. Literally.

That's right. Let's pay them $3.00 an hour, and only for the hours they actually work; none of that silly planning time, or any time spent before or after school. That would be $19.50 a day (7:45 AM to 3:00 PM, with 45 min. off for lunch and planning -- that equals 6-1/2 hours).
So each parent should pay $19.50 a day for these teachers to baby-sit their children. Now, how many students do they teach in a day? Maybe 30? So that's $19.50 x 30 = $585 a day.
However, remember they only work 180 days a year!!! I am not going to pay them for any vacations.
LET'S SEE....
That's $585 x 180 = $105,300 per year. (Hold on! My calculator needs new batteries.)
What about those special education teachers and the ones with master's degrees? Well, we could pay them minimum wage ($7.75) and, just to be fair, round it off to $8.00 an hour. That would be $8 x 6-1/2 hours x 30 children x 180 days = $280,800 per year.

Wait a minute -- there's something wrong here!

$56,715 turns out to be quite the deal.
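
For anyone who wants to double-check the back-of-the-envelope math, here's a quick Python sketch (a minimal illustration; the hourly rates, hours, class size, and 180-day year are the joke's own assumptions, not real payroll figures):

```python
# Back-of-the-envelope "teachers as babysitters" arithmetic from the joke above.
# All inputs are the joke's own assumptions, not real payroll figures.

HOURS_PER_DAY = 7.25 - 0.75  # 7:45 AM to 3:00 PM, minus 45 min. for lunch/planning
STUDENTS_PER_DAY = 30        # assumed number of children taught per day
DAYS_PER_YEAR = 180          # school days only; no paid vacations

def babysitter_pay(hourly_rate_per_child):
    """Annual pay if every parent paid the hourly babysitting rate per child."""
    daily_rate_per_child = hourly_rate_per_child * HOURS_PER_DAY  # $3.00 -> $19.50
    return daily_rate_per_child * STUDENTS_PER_DAY * DAYS_PER_YEAR

print(babysitter_pay(3.00))  # $3.00/hour rate -> 105300.0, i.e. $105,300 per year
print(babysitter_pay(8.00))  # minimum wage rounded to $8 -> 280800.0, i.e. $280,800
```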

Teacher performance incentives have a negative impact

A recently published study in the Journal of Labor Economics looked at performance incentives for teachers in the NYC school system. The results should give corporate reformers pause:
Providing financial incentives for teachers to increase student performance is an increasingly popular education policy around the world. This paper describes a school-based randomized trial in over two-hundred New York City public schools designed to better understand the impact of teacher incentives on student achievement. I find no evidence that teacher incentives increase student performance, attendance, or graduation, nor do I find any evidence that the incentives change student or teacher behavior. If anything, teacher incentives may decrease student achievement, especially in larger schools. The paper concludes with a speculative discussion of theories that may explain these stark results.
As Margarita Pivovarova, Assistant Professor of Economics at Arizona State University, notes:
The estimates from the experiment imply that if a student attended a middle school with an incentive in place for three years, his/her math test scores would decline by 0.138 of a standard deviation and his/her reading score would drop by 0.09 of a standard deviation.

Not only that, but the incentive program had no effect on teachers’ absenteeism or retention in the school or district, nor did it affect teachers’ perception of the learning environment in a school. Literally, the estimated 75 million dollars invested brought zero return!

How much variance in test scores is due to variance in teachers?

Via Larry Ferlazzo

It’s not uncommon to hear someone inaccurately state that the teacher has the biggest influence on student achievement — period. Of course, the true statement is that — of the in-school factors — teachers have the biggest influence. On top of that, research has shown that over two-thirds of the factors that influence student achievement occur out of school.

To illustrate this, here's a pie chart:

[Pie chart: factors influencing student achievement, with out-of-school factors accounting for over two-thirds]

Lots of good links demonstrating the evidence behind this pie chart can be found at the link.

If we're primarily focusing on teacher quality to increase student achievement, we're looking in the wrong place. That's why corporate education reforms are doomed to failure.

Can Value-Added Measures Be Used for Teacher Improvement?

Susanna Loeb, Professor of Education at Stanford University and Faculty Director of the Center for Education Policy Analysis, has a brief published by the Carnegie Foundation for the Advancement of Teaching:
The question for this brief is whether education leaders can use value-added measures as tools for improving schooling and, if so, how to do this. Districts, states, and schools can, at least in theory, generate gains in educational outcomes for students using value-added measures in three ways: creating information on effective programs, making better decisions about human resources, and establishing incentives for higher performance from teachers. This brief reviews the evidence on each of these mechanisms and describes the drawbacks and benefits of using value-added measures in these and other contexts.

The brief concludes:
  • Value-added measures are not a good source of information for helping teachers improve because they provide little information on effective and actionable practices.
  • School, district, and state leaders may be able to improve teaching by using value-added to shape decisions about programs, human resources, and incentives.
  • Value-added measures of improvement are more precise measures for groups of teachers than they are for individual teachers; thus, they may provide useful information on improvement associated with practices, programs, or schools.
  • Many incentive programs for staff performance that are based on student performance have not shown benefits. Research points to the difficulty of designing these programs well and maintaining them politically.
  • Value-added measures for selecting and improving programs, for informing human resource decisions, and for incentives are likely to be more useful when they are combined with other measures.
  • We still have only a limited understanding of how best to use value-added measures in combination with other measures as tools for improvement.
The use of value-added measures to evaluate individual teachers, especially when connected to high-stakes personnel decisions, is impossible to defend if one is guided by the research evidence and the disastrous practical applications.

The full brief, “How Can Value-Added Measures Be Used for Teacher Improvement?”, can be read below.

Bogus Evaluations Used to Fire Teachers

Washington, DC has long been the poster child for high-stakes tests used to label teachers as successes or failures. Now news comes that errors in the value-added formulas used to measure these apparent successes or failures resulted in 44 teachers being incorrectly rated, and as a consequence, one teacher was fired.
More than 40 teachers in D.C. public schools received incorrect evaluations for 2012-2013 because of errors in the way the scores were calculated and one was fired as a result.

The president of the Washington Teachers’ Union, Elizabeth A. Davis, has asked for details from D.C. Schools Chancellor Kaya Henderson in a letter that says that the problems were found by Mathematica Policy Research, a partner of the school system’s. The mistakes were found in the individual “value added” scores for teachers, which are calculated through a complicated formula that includes student standardized test scores.

This “VAM” formula is part of the evaluation system called IMPACT, begun under former chancellor Michelle Rhee in 2009. Henderson, Rhee’s successor, continued with IMPACT, though this year she reduced the amount of weight given to test scores from a mandatory 50 percent to at least 35 percent.

Testing experts have long warned that using test scores to evaluate teachers is a bad idea, and that these formulas are subject to error, but such evaluation has become a central part of modern school reform.
44 teachers may not sound like a lot, but it turns out to be a significant percentage of the teachers actually evaluated this way:
Those affected are about 1 percent of about 4,000 teachers in the school system. But they comprise nearly 10 percent of the teachers whose work is judged in part on annual city test results for their classrooms.
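
A quick sanity check on those figures (a minimal sketch; the count of teachers rated with value-added scores is an inference from the article's “nearly 10 percent,” not a reported number):

```python
# Sanity check on the percentages quoted above.
total_teachers = 4000  # "about 4,000 teachers in the school system"
affected = 44          # teachers who received incorrect value-added scores

# Share of all DC teachers affected by the errors
print(round(affected / total_teachers * 100, 1))  # 1.1 (percent)

# "Nearly 10 percent" of value-added-rated teachers implies roughly this many
# teachers were judged in part on test scores (an inference, not a reported figure)
print(round(affected / 0.10))  # 440
```
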
When an evaluation system is so poor that 1 in 10 of its results are error-riddled, and a teacher is wrongfully terminated as a consequence, the system needs to be put on hold. Here in Ohio, the results could be even worse.

Ohio uses a secret, proprietary, for-profit formula

Some of the confusion may be due to a lack of transparency around the value-added model.

The details of how the scores are calculated aren't public. The Ohio Department of Education will pay a North Carolina-based company, SAS Institute Inc., $2.3 million this year to do value-added calculations for teachers and schools. The company has released some information on its value-added model but declined to release key details about how Ohio teachers' value-added scores are calculated.

The Education Department doesn't have a copy of the full model and data rules either.

The department's top research official, Matt Cohen, acknowledged that he can't explain the details of exactly how Ohio's value-added model works. He said that's not a problem.

"It's not important for me to be able to be the expert," he said. "I rely on the expertise of people who have been involved in the field."
If something similar were to happen in Ohio, which is highly probable, no one would be any the wiser, because no one can double-check the work of SAS Institute Inc., not even ODE, which remarkably doesn't even seem to care.

The formula for calculating value-added scores for Ohio's teachers must be open to inspection, so that our teachers are not falsely named, shamed, and fired as they are being in Washington, DC.
