
Study: Ohio charter schools are worst in nation

The Center for Research on Education Outcomes (CREDO) has just published its 2013 report, "National Charter School Study". CREDO researchers looked at test data from charter schools in 26 states plus DC, Ohio among them. This study follows up on their 2009 study, which garnered a lot of attention for bringing to light the poor quality of the nation's charter schools.

This new study finds only marginal improvement over the past four years, despite charter schools being able to screen for the best students.

25 percent of charters outperformed traditional public schools in reading while 29 percent of charters delivered stronger results in math. That marked an improvement over a similar 2009 study by the same research team.

But 56 percent of the charters produced no significant difference in reading and 19 percent had worse results than traditional public schools. In math, 40 percent produced no significant difference and 31 percent were significantly worse than regular public schools.

The marginal improvement comes not from improved quality of charter schools in general, but from the closure of more of the poorest-performing charter schools, which lifts the overall average performance.

In Ohio, the charter school experiment is failing miserably. According to the study, Ohio's charter schools got worse over the last 4 years, and now dwell at the bottom of the performance tables. Ohio students who attend charter schools are losing the equivalent of almost 3 weeks of instruction in reading, and an entire grading period in mathematics. That is astonishingly bad news for the 5% of Ohio's students who attend charter schools.

The following table is taken from Table 14 (p. 52) of the study:

State Reading Days Math Days
Rhode Island 86 108
DC 72 101
Tennessee 86 72
Louisiana 50 65
New York 36 79
New Jersey 43 58
Massachusetts 36 65
New York City 0 94
Michigan 43 43
Indiana 36 14
Illinois 14 22
Missouri 14 22
California 22 -7
North Carolina 22 -7
Minnesota 14 -7
Georgia 14 -14
Colorado 7 -7
Florida -7 0
New Mexico 0 -29
Arkansas -22 -22
Utah -7 -43
Arizona -22 -29
Texas -22 -29
Ohio -14 -43
Oregon -22 -50
Pennsylvania -29 -50
Nevada -108 -137
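The "days of learning" figures in the table translate directly into the instructional time described above. A minimal sketch of the arithmetic (Python, assuming a standard five-day instructional week and a roughly nine-week grading period; the state figures are copied from Table 14):

```python
# Days-of-learning figures for selected states, from Table 14 (p. 52)
# of CREDO's 2013 National Charter School Study.
# Negative values mean charter students lose ground versus district peers.
credo_days = {
    "Ohio": {"reading": -14, "math": -43},
    "Pennsylvania": {"reading": -29, "math": -50},
    "Nevada": {"reading": -108, "math": -137},
}

SCHOOL_DAYS_PER_WEEK = 5      # assumption: standard instructional week
GRADING_PERIOD_DAYS = 9 * 5   # assumption: ~9-week grading period = 45 school days

ohio = credo_days["Ohio"]
weeks_lost_reading = abs(ohio["reading"]) / SCHOOL_DAYS_PER_WEEK
print(f"Reading: ~{weeks_lost_reading:.1f} weeks of instruction lost")
print(f"Math: {abs(ohio['math'])} of the ~{GRADING_PERIOD_DAYS} days in a grading period")
```

Under these assumptions, Ohio's -14 reading days come out to about 2.8 weeks ("almost 3 weeks"), and -43 math days is nearly the full 45 school days of a grading period.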

You can see from the following graphs of performance in 2009 vs 2013 that Ohio's charters are getting worse, and in math, much worse.

It is time to reassess Ohio's 15-year, billion-dollar charter experiment in light of these results and put an end to boosting charter schools at the expense of public schools. The experiment has not only failed, it is getting worse.

OEA Response to PD and NPR Teacher shaming

Here's the statement from the Ohio Education Association, which represents over 121,000 educators:

Responding to a series of newspaper, web and radio stories on the value-added scores of individual Ohio teachers, Patricia Frost-Brooks, President of the Ohio Education Association, criticized the fairness of the stories and the wisdom of using value-added scores as such a prominent index of teacher success:

"The Ohio Education Association was not contacted for comment on the Plain Dealer/StateImpact Ohio stories, despite our expertise, which would have provided desperately needed context and perspective. Reporters and editors admitted this value-added data was 'flawed,' but they chose surprise and impact over fairness, balance and accuracy," Frost-Brooks said.

"We are all accountable for student success – teachers, support professionals, parents, students and elected officials. And the Ohio Education Association is committed to fair teacher evaluation systems that include student performance, among other multiple measures. But listing teachers as effective or ineffective based on narrow tests not designed to be used for this purpose is a disservice to everyone.

"Value-added ratings can never paint a complete or objective picture of an individual teacher’s work or performance. Trained educators can use a student’s value-added data, along with other student data, to improve student instruction. But the stories promote a simplistic and inaccurate view of value-added as a valid basis for high-stakes decisions on schools, teachers and students."

It is very questionable that reporters would not contact the state's largest teachers' association when crafting their story.

Charter School Authorization And Growth

If you ask a charter school supporter why charter schools tend to exhibit inconsistency in their measured test-based impact, there’s a good chance they’ll talk about authorizing. That is, they will tell you that the quality of authorization laws and practices — the guidelines by which charters are granted, renewed and revoked — drives much and perhaps even most of the variation in the performance of charters relative to comparable district schools, and that strengthening these laws is the key to improving performance.

Accordingly, a recently-announced campaign by the National Association of Charter School Authorizers aims to step up the rate at which charter authorizers close “low-performing schools” and are more selective in allowing new schools to open. In addition, a recent CREDO study found (among other things) that charter middle and high schools’ performance during their first few years is more predictive of future performance than many people may have thought, thus lending support to the idea of opening and closing schools as an improvement strategy.

Below are a few quick points about the authorization issue, which lead up to a question about the relationship between selectivity and charter sector growth.

The reasonable expectation is that authorization matters, but its impact is moderate. Although there has been some research on authorizer type and related factors, there is, as yet, scant evidence as to the influence of authorization laws/practices on charter performance. In part, this is because such effects are difficult to examine empirically. However, without some kind of evidence, the “authorization theory” may seem a bit tautological: There are bad charters because authorizers allow bad charters to open, and fail to close them.

That said, the criteria and processes by which charters are granted/renewed almost certainly have a meaningful effect on performance, and this is an important area for policy research. On the other hand, it’s a big stretch to believe that these policies can explain a large share of the variation in charter effects. There’s a reasonable middle ground for speculation here: Authorization has an important but moderate impact, and, thus, improving these laws and practices is definitely worthwhile, but seems unlikely to alter radically the comparative performance landscape in the short- and medium-term (more on this below).

Strong authorization policies are a good idea regardless of the evidence. Just to be clear, even if future studies find no connection between improved authorization practices and outcomes, test-based or otherwise, it’s impossible to think of any credible argument against them. If you’re looking to open a new school (or you’re deciding whether or not to renew an existing one), there should be strong, well-defined criteria for being allowed to do so. Anything less serves nobody, regardless of their views on charter schools.

[readon2 url="http://shankerblog.org/?p=8510"]Continue reading...[/readon2]

How "top charters" screen students

It's no secret that the vast majority of Ohio charter schools are rated F, but what of some of the "high performing" schools? It is with those in mind that we read with interest the article "The Dirty Dozen: How Charter Schools Influence Student Enrollment".

This commentary offers a classification of twelve different approaches that charter schools use to structure their student enrollment. These practices impact the likelihood of students enrolling with a given set of characteristics, be it higher (or lower) test scores, students with ‘expensive’ disabilities, English learners, students of color, or students in poverty.
[...]
Yet little attention has been paid to the mechanisms that generate these differences. One exception is an article in February of 2013, written by reporter Stephanie Simon of Reuters, which described a variety of ways that charter schools “get the students they want” (Simon, 2013):
  • Applications that are made available just a few hours a year.
  • Lengthy application forms, often printed only in English, that require student and parent essays, report cards, test scores, disciplinary records, teacher recommendations and medical records.
  • Demands that students present Social Security cards and birth certificates for their applications to be considered, even though such documents cannot be required under federal law.
  • Mandatory family interviews.
  • Assessment exams.
  • Academic prerequisites.
  • Requirements that applicants document any disabilities or special needs. The U.S. Department of Education considers this practice illegal on the college level but has not addressed the issue for K-12 schools.

We thought we would pick one charter school and test this hypothesis. We picked DAYTON EARLY COLLEGE ACADEMY, INC. (DECA), as it was elevated by the Fordham Foundation and recently testified on the budget as part of a "coalition of high performing charters".

Following introductions from Fordham’s Terry Ryan, Dayton Early College Academy’s Superintendent Judy Hennessey began to speak in front of the Subcommittee only to be interrupted by Committee Chair Senator Randy Gardner, “Senator [Peggy] Lehner has just commented you lead one of the best schools in the country.”

Jokingly Judy Hennessey nodded and said, “Now we are striving for world class.”

The application process

Here's DECA's application, which can also be downloaded here.

High School Application 2013-14

The first thing you will note is that the application form is 23 pages long, requiring hundreds of pieces of information to be entered, including report cards, test scores, disciplinary records, teacher recommendations and medical records. These are, in fact, all of the mechanisms the Reuters article identifies as commonly used to screen prospective students. This is a significant barrier that only the most determined parent is likely to scale.

The page where the applications can be downloaded clearly states, in bold, "Incomplete applications will not be considered."

A parent who is likely to complete such a detailed, lengthy application is likely a parent who is going to be engaged in their child's education to a greater degree than a parent who is unlikely to apply.

Furthermore, as is pointed out in the 12 approaches charters use to screen for students, this application is available in English only. No second-language form is available on the application webpage, making applications from English-as-a-second-language families far less likely.

You will also see the following on page 5 of the application:

Documents needed for a complete application
 Student birth certificate
 Student social security card

"Demands that students present Social Security cards and birth certificates for their applications to be considered, even though such documents cannot be required under federal law" is one of the telltale screening mechanisms charters use.

The DECA application form also requests that applicants document any disabilities or special needs, another potential barrier spelled out in the article.

So we can plainly see that while DECA may produce above-average results for a charter school, it can do so because it has a highly selective application process that is likely to screen out lower-performing students.

The performance results

We were expecting a charter school whose leader professed to be aiming for "world class standards" to be rated Excellent with Distinction. DECA is not; indeed, it is not even rated Excellent. Instead, it is rated "Effective" according to the latest data available from ODE.

Building IRN 009283
Building Name Dayton Early College Academy, Inc
District IRN 043844
District Name Dayton City
County Montgomery
Principal Judy Hennessey
Grade Span 7-12
Open/Closed Status (as of 9/18/2012) Open
Designation Effective
Number of standards met 14
Number of standards possible 17
Enrollment 411
Performance Index Score 2011-12 99.1
Performance Index Score 2010-11 100.5
Performance Index Score 2009-10 96.2
2012 Attendance AYP N/A
2012 Graduation AYP Not Met
2012 Reading AYP Met
2012 Math AYP Met
2012 Overall AYP Not Met
Four-Year "On-Time" Graduation Rate Numerator 2010-11 35

These aren't bad results; indeed, compared to the majority of F-rated charter schools they are positively glowing. But given the arduous application screening process, the "Effective" rating is a far cry from world-beating, and a very far cry from the world of traditional public schools, which have to accept every student from the district who walks through the door.

Michelle Rhee and the unproven teacher evaluation

Via the LA Times

The debate -- and that’s putting it nicely -- over the use of standardized test scores in teacher evaluations has always confused me, because the answer seemed so simple. One of the things we ask of teachers -- but just one thing -- is to raise those scores. So they have some place in the evaluation. But how much? Easy. Get some good evidence and base the decisions on that, not on guessing. The quality of education is at stake, as well as people’s livelihoods.

Much to my surprise, at a meeting with the editorial board this week, Michelle Rhee agreed, more or less. As one of the more outspoken voices in the school-reform movement, Rhee is at least as polarizing as the topic of teacher evaluations, and her lobbying organization, Students First, takes the position that the standardized test scores of each teacher’s students should count for no less than 50% of that teacher’s rating on performance evaluations.

But asked where the evidence was to back up that or any other percentage figure, Rhee agreed quite openly that it’s lacking.

[readon2 url="http://www.latimes.com/news/opinion/opinion-la/la-ol-michelle-rhee-teachers-20130416,0,4487460.story"]Continue reading...[/readon2]

Ohio Teacher Evaluation System: Dishonest, Unrealistic, and Not Fully Supported by Academic Research

A great article that appeared on Dailykos a few days ago

I've spent the past three days at an OTES (Ohio Teacher Evaluation System) training. This system is being phased in over the next two years, and will serve as the vehicle by which all teachers in Ohio are evaluated. The workshop culminates with a post-assessment, taken some time after the classes end, resulting in licensure and the ability to evaluate instructional staff. OTES is described by ODE as a system that will
provide educators with a richer and more detailed view of their performance, with a focus on specific strengths and opportunities for improvement.

I talked to a number of administrators and teachers who had already taken the training before attending. Without exception, they were all struck by the rigidity of the rubric. I agree, but there's more here. Any system that wields so much power must be realistic, honest, and rooted in the consensus of academic research. The OTES rubric fails this basic test.

Words Matter
Check out the Ohio Standards for the Teaching Profession (starting on page 16) approved in October of 2005. Now look at the OTES rubric. The first thing you will notice is that the OTES rubric has four levels, and that the Ohio Standards only have three. I think it's fair to say that the Ohio Standards did not include the lowest level. (The document says as much.) The top three levels of the OTES Rubric align with the three levels of the Ohio Standards. The snag? The terminology used in the OTES rubric. Proficient has been replaced by Developing, Accomplished by Proficient, and Distinguished by Accomplished. Each level has been relegated!

One might argue that this doesn't matter. But, it does. Teacher evaluations are public record. School performance, or at least the percentage of teachers that fall into each category, will be published. Newspapers will ask for names of teachers and their ratings. And, as we will see as I unpack the rubric in greater detail, the very best teachers are likely to fall into the Proficient category. What's the one relationship between public education and the word Proficient already burned into the minds of parents? The minimal level of performance required to pass the Ohio Graduation Test. Dishonest.

[readon2 url="http://www.dailykos.com/story/2012/11/15/1161894/-Ohio-Teacher-Evaluation-System-Dishonest-Unrealistic-and-Not-Fully-Supported-by-Academic-Researc"]Continue reading...[/readon2]