The Senate Advisory Committee on Testing, appointed March 4th by Senate President Keith Faber (R-Celina), has completed its recommendations to improve state testing for next school year. The 30-member committee, chaired by State Senator Peggy Lehner (R-Kettering), was created following the rocky rollout of the new state assessments in February. Teachers, parents, school leaders and policymakers serving on the committee were charged with providing advice to the Senate on how to improve state testing.
A written recommendation is currently being prepared. It will include the following components:
The new twice-a-year administration of tests that occurred this winter and spring should be scaled back to once a year, and the tests should be shortened. The testing window should be closer to the end of the school year to provide more time for classroom instruction and less disruption in learning.
Accommodations for children with Individual Education Plans (IEPs) must be improved and more clearly communicated to parents and schools. Training must be provided for intervention specialists and paraprofessionals who assist students with IEPs.
Test results must be returned in a timely manner to benefit student instruction – although the group recognized that results from a writing test may not be able to be returned as quickly as the rest of the results.
Transparency – test questions and answers must be made available within a reasonable timeframe after the administration of the tests to ensure the tests are aligned to Ohio’s learning standards and that questions are developmentally appropriate for grade level.
Online testing is necessary and schools must plan to move in that direction; however, local schools must continue to have the option to administer paper/pencil tests for at least the next two school years. State funding for technology based on need should be considered.
A single technology platform is preferable for next year’s tests. Improvements in technology are needed to ensure smooth administration of the tests.
A “safe harbor” must be in place that allows results from this year’s tests to be reported, but students, teachers and schools should not be penalized for this year’s results, given the transition to a new test and the concern that results may not accurately reflect a student’s achievement level.
A comprehensive communications plan must be developed to provide parents, teachers, school leaders and the general public with clearer information about the tests.
If the current vendors for state tests – PARCC (Partnership for Assessment of Readiness for College & Careers) for the math and English language arts assessments and AIR (American Institutes for Research) for the science and social studies assessments – will not make changes to the tests for next year to address these issues, the Ohio Department of Education must find a test vendor that will.
As we wait to see what the Ohio General Assembly presents as charter school reform, Steve Dyer has taken a three-part look at why Ohio's charter schools just don't work.
You can read part one, part two and part three in full, but here's a brief synopsis of his findings.
...the amount of money going to worse-performing charters is more than $430 million, and if you include charters that perform the same, it's now more than $500 million that goes from the same or higher-performing districts to the same or worse-performing charters.
Charters do worse on the report card than districts with greater challenges. So that means that while charters' poor performance compared with districts overall can perhaps be explained by more challenging populations, districts with greater challenges are doing better. So charters are not, on the whole, doing a better job serving our state's most challenging students than districts with more challenges than the charter faces.
Ohio’s charter schools perform worse overall than all local public school buildings, including those in the Big 8 urban districts (Akron, Canton, Cincinnati, Cleveland, Columbus, Dayton, Toledo and Youngstown) – the areas where charters were supposed to offer better alternatives. Charters register lower percentages of As and Bs while having higher percentages of Ds and Fs than local public schools.
Remember that 45% of charter school students do not come from Ohio's urban core -- one of the myths dealt with in yesterday's post.
It is exceedingly unfair cherry picking for charter schools to take money and children from every district in the state, yet only have their performance compared with the most historically struggling schools in the state.
Dyer looks at a whole range of metrics to compare charter schools to traditional public schools, including subgroups. Charter schools simply do not work in Ohio. A key test for any charter school reform effort will be whether it closes down the bulk of the failing schools and creates an environment where only quality charter schools can develop. We're still highly skeptical.
Here is the Legislative Service Commission's comparison document for HB64 - the budget bill. We have only included the Education section here.
With this document you can compare the Governor's initial budget proposal with what the Ohio House has now proposed.
We have spent hundreds of hours talking to hundreds of educators over the last two years about standardized testing and its associated problems. We're under no illusions: the current testing crisis caused by reformers is complex, layered across the local, state and federal levels, and has stakeholders who hold divergent views.
In a perfect world, testing designed to evaluate an education system would be grade-span and require only a sampling of students, akin to NAEP. It is overkill to test every student in every subject, every year. However, we do not live in a perfect world. Join the Future has therefore identified the following problems, and provides the following recommendations as a way forward.
Problem: Technology
The technology deployment for this year's testing has been a predictable debacle. Perhaps half of Ohio's schools felt so lacking in technological capacity that they administered the tests on paper. Of the remainder, most lacked sufficient technology to perform the testing in a short period of time, instead having to schedule weeks of testing that caused massive classroom disruption.
The testing software itself was a mess. Technology and media specialists in schools spent hundreds of hours making thousands of tech-support calls to testing companies, trying to resolve technical problems with the software. Teachers were provided with inadequate training. Some students had to retake the test up to three times because of technical difficulties. One can only imagine the stress a third grader must have felt.
There was a lack of hardware compatibility between the PARCC and AIR testing platforms. School IT professionals spent countless hours re-imaging computers to switch between testing use and general use. School libraries and computer labs throughout the state have been unusable for most of the time the tests have been taking place.
Technology Recommendations
1. Schools must be provided with the flexibility to offer paper-based testing for at least three years.
2. Any testing solution must be platform agnostic and work on desktops, laptops, tablets and Chromebooks alike, without the need for special software or re-imaging.
3. Educators must be provided with adequate time to train on the platform.
4. Students must be given adequate practice time with the testing platform. They should not be taking two tests in one: one on the intended content and another on how to navigate a complex software product.
5. Schools must be provided with adequate resources to purchase compatible technology capable of testing the entire student population within one week.
6. Schools must be provided with adequate resources to ensure they have bandwidth to perform the testing seamlessly.
Problem: Testing Time
Testing cannot consume weeks and weeks of instruction time, only to be repeated. Schools spent almost the entire month of February in a state of disruption, made worse by inclement weather and snow days, only to have to perform yet more tests in April. If schools had lost the entire month of February to four feet of snow, it would have been viewed as a statewide crisis, yet that is the practical effect testing had on schools this year.
Testing Time Recommendations
1. Tests need to be shorter.
2. Tests must align with a typical classroom schedule; e.g., if a typical class period is 30 minutes, the tests cannot be 40 minutes. That disrupts two periods and the rest of the day's schedule.
3. Schools need enough infrastructure to perform the tests in a single week, just once a year.
Problem: High Stakes
Attaching high stakes to a new and unstable testing regime is unfair and has caused lasting damage to morale and motivation. Stories of students getting physically and mentally sick with anxiety are commonplace. Educators have faced a high-stakes evaluation system under constant change. Furthermore, it is becoming apparent to most students that the tests don't matter. Outside of third grade and high school, the results of standardized testing have no impact on students, but deeply affect educators and their schools. This misalignment of stakes is a huge problem.
High Stakes Recommendations
1. Implement at least a three-year moratorium on high-stakes consequences until the system matures and proves itself.
2. Stakes must be aligned. It is simply not fair or appropriate for educators to face career consequences based on tests in which the students themselves have no stake. We recommend eliminating student test scores as a means of measuring teacher quality at the individual level.
Problem: Inappropriate Test Content
Testing has been widely criticized for being age inappropriate. Many test questions were deployed at reading levels much higher than the target age range of the test takers. Often, especially in math, the root question itself might be appropriate, but couched in language far too complex for the student to comprehend. Not only does this need to cease, it needs to be investigated.
Reports have been widespread of questions spread across multiple pages or screens, forcing students to remember large blocks of complex text while navigating back and forth. Students should be evaluated on content knowledge, not their memories or their ability to navigate complex software products.
Test Content Recommendations
1. Tests must be age appropriate in content and reading level.
2. Questions must be easily navigable, or compact enough that students can concentrate on the answer rather than on a page-flipping ordeal.
3. The State should have a review panel to vet test questions in advance, with the ability to veto inappropriate questions.
4. An element of a testing company's contract should be tied to the appropriateness and accuracy of the tests it provides.
5. Test questions and their answers should be made public within four years.
Problem: Useful Results
Parents, students and schools should not have to wait six months to receive results, especially for tests taken electronically. By the time this year's results are made available, many students will have moved classes, schools, districts, and maybe even out of state.
Results need to provide more than a meaningless score. Educators would benefit from results that identify a student's strengths and weaknesses.
Furthermore, who is grading the tests needs to be closely examined. There have been widespread reports of Craigslist advertisements for test scorers requiring little training or knowledge.
Test Result Recommendations
1. Results should be available before the end of the school year, with a portion of a testing company's contract tied to timely delivery of results.
2. Results should include a description of where a student excelled and performed poorly.
3. All test answers should be publicly available within four years, with a portion of a testing company's contract tied to accuracy.
4. A system of qualifying test scorers and evaluating their accuracy is needed.
Evidently the problems are significant, and the solutions will take time to implement. Policymakers must resist calls from corporate reformers for quick fixes. It is their influence and misguided advice that has navigated us into this crisis. If a test is being performed in a school, the first questions ought to be: How does this benefit the student's education? Is the benefit of the test proportional to its impacts on student learning?
Parents, students and educators alike would like to see a reduction in the total volume of testing, with the testing that remains primarily aimed at improving student learning.
Ohio's charter schools are notorious for poor pay and conditions, often treating educators as temporary help. In recent months, a number of Ohio's charter school teachers have begun to contemplate, organize, or vote to unionize in order to collectively bargain for better conditions and benefits.
It seems the high-priced lobbyists that charter operators hired to ward off any meaningful reforms have been busy with a side project to put a stop to employees being treated like professionals. Buried in the Ohio House's substitute budget bill is this little nugget:
Excludes community school employees from membership in the State Teachers Retirement System and School Employees Retirement System if the employees elect to organize under federal collective bargaining laws and the community school is subject to those laws.
It is unlikely that employees would organize knowing they would lose their pension benefits - one of the few benefits charters are forced to provide. This is an ugly, coercive piece of legislation that serves only to enrich for-profit charter operators at the expense of educators. If you ever needed proof of what Ohio's charter experiment is really about, this is it.