
Is Ohio ready for computer testing?

The Cincinnati Enquirer has a report on how Ohio schools are not going to be ready for the new online PARCC tests that are scheduled to be deployed next year.

Ohio public schools appear to be far short of having enough computers to have all their students take new state-mandated tests within a four-week period beginning in the 2014-15 school year.

“With all the reductions in education funds over the last several years and the downturn in the economy, districts have struggled to be able to bring their (computer technology) up to the level that would be needed for this,” said Barbara Shaner, associate executive director of the Ohio Association of School Business Officials.

Districts could seek state permission to deliver the new tests on paper if they can’t round up enough computers, tablets and gadgets to go around, Jim Wright, director of curriculum and assessment for the Ohio Department of Education, said. A student taking a paper test could be at a disadvantage, though. While the paper tests won’t have substantially different questions, a student taking the test online will have the benefit of audio and visual prompts as well as online tasks that show their work on computer, said Chad Colby, a spokesman for the Partnership for Assessment of Readiness for College and Careers.

The state really does need to step up and help districts fund this costly mandate that has been foisted upon them. On top of that, the computer industry is going through significant changes as more and more people move away from traditional desktops and laptops in favor of simpler, more portable tablets. School districts could find themselves having to make costly investments all over again in the near future if they pick the wrong technologies.

The article notes that paper-based test takers could be at a disadvantage compared with those taking the computer-based tests. There has been a significant amount of research on this question over the years, and the results seem to indicate the opposite effect: computer-based test takers tend to score lower than their paper-based counterparts.

The comparability of test scores based on online versus paper testing has been studied for more than 20 years. Reviews of the comparability literature research were reported by Mazzeo and Harvey (1988), who reported mixed results, and Drasgow (1993), who concluded that there were essentially no differences in examinee scores by mode-of-administration for power tests. Paek (2005) provided a summary of more recent comparability research and concluded that, in general, computer and paper versions of traditional multiple-choice tests are comparable across grades and academic subjects. However, when tests are timed, differential speededness can lead to mode effects. For example, a recent study by Ito and Sykes (2004) reported significantly lower performance on timed web-based norm-referenced tests at grades 4-12 compared with paper versions. These differences seemed to occur because students needed more time on the web-based test than they did on the paper test. Pommerich (2004) reported evidence of mode differences due to differential speededness in tests given at grades 11 and 12, but in her study online performance on questions near the end of several tests was higher than paper performance on these same items. She hypothesized that students who are rushed for time might actually benefit from testing online because the computer makes it easier to respond and move quickly from item to item.

A number of studies have suggested that no mode differences can be expected when individual test items can be presented within a single screen (Poggio, Glasnapp, Yang, & Poggio, 2005; Hetter, Segall & Bloxom, 1997; Bergstrom, 1992; Spray, Ackerman, Reckase, & Carlson, 1989). However, when items are associated with text that requires scrolling, as is typically the case with reading tests, studies have indicated lower performance for students testing online (O’Malley, 2005; Pommerich, 2004; Bridgeman, Lennon, & Jackenthal, 2003; Choi & Tinkler, 2002; Bergstrom, 1992).

A teacher schools the Dispatch

When we read this article in the Dispatch by senior editor Joe Hallett, we were a little taken aback by how fawning it was, and by how much it seemed to suffer from selective amnesia. One Worthington school teacher thought so too, and forwarded us his email to Mr. Hallett.

Joe

I read your column on Sunday and came across this line:

(Kasich) used Senate Bill 5 to take on public-employee unions, whose pay and benefit packages were growing unsustainable for taxpayers, in part because their local government and school officials had forgotten how to say no.

I don't think this is a fair characterization at all. The front page article ("Public, private compensation in the same ballpark.") in the Dispatch on Sunday demonstrated that public employee salaries and benefits are on par with those in the private sector. By your logic, combined with the article on pg 1 Sunday, you could say that private sector salaries have also grown unsustainable since private employees have a slightly richer salary and benefit package than public employees.

No discussion of the impact of public employee salaries on budgets is complete without pointing out that, since 2005, the income tax was cut 21%, the estate tax was eliminated, and the locally collected tangible personal property (TPP) tax was replaced with the state-collected commercial activities tax (CAT). The income tax cut resulted in a sustained decrease in state tax revenue that was evident even before the recession, and even with the addition of new CAT revenues. In 2005, total state revenue was $56.5 billion; by 2011 it had fallen to an estimated $50.5 billion, which is $43.5 billion in 2005 dollars!

On the local level, governments and school districts were literally robbed of tax revenue by the state legislature's elimination of the TPP and estate tax. All of this has dealt a crushing blow to the ability of local governments and school districts to sustain services by starving them of local revenue as well as state aid.

These tax changes were heralded in 2005 as essential to economic growth in Ohio with the promise that the reforms would result in job growth in our state. I think the record on this score shows that tax reform in Ohio has not achieved the promised results and has instead put incredible stress on the state and local governments' ability to provide vital and expected services.

It is disingenuous to say that public salaries and benefits are unsustainable while ignoring the impact tax reform has had on Ohio's ability to fund its government. Furthermore, the front page of the Dispatch on the same day as your column refutes the notion that public salaries and benefits are out of line or unsustainable.

Mark Hill, Worthington school teacher
Columbus OH
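A quick aside from us on the arithmetic in that email: the 2005-dollar figure is what a standard inflation adjustment gives, assuming cumulative inflation of roughly 16 percent between 2005 and 2011 (an assumption we are supplying for illustration, not a figure from the email):

\[
\frac{\$50.5\ \text{billion}}{1.16} \approx \$43.5\ \text{billion in 2005 dollars}
\]

In other words, the nominal drop from $56.5 billion to $50.5 billion understates the real decline in purchasing power.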

We previously wrote about this issue in an article titled "GUTTING EDUCATION FOR A CUP OF CHEAP COFFEE".

Mapping State Proficiency Standards Onto NAEP Scales

The National Assessment of Educational Progress (NAEP) has just published its report "Mapping State Proficiency Standards Onto NAEP Scales: Variation and Change in State Standards for Reading and Mathematics, 2005-2009".

This research looked at the following questions:

How do states’ 2009 standards for proficient performance compare with one another when mapped onto the NAEP scale?

There is wide variation among state proficiency standards, and most states’ proficiency standards are at or below NAEP’s definition of Basic performance.

How do the 2009 NAEP scale equivalents of state standards compare with those estimated for 2007 and 2005?

For those states that made substantive changes in their assessments between 2007 and 2009, most moved toward more rigorous standards as measured by NAEP. For those states that made substantive changes between 2005 and 2009, changes in the rigor of their standards as measured by NAEP were mixed, but showed more decreases than increases.

Does NAEP corroborate a state’s changes in the proportion of students meeting the state’s standard for proficiency from 2007 to 2009? From 2005 to 2009?

Changes in the proportion of students meeting states’ standards for proficiency between 2007 and 2009 are not corroborated by the proportion of students meeting proficiency, as measured by NAEP, in at least half of the states in the comparison sample. The corresponding comparisons for 2005 to 2009 were mixed.

The full report can be found here (PDF). We've pulled out some of the graphs that show Ohio's performance versus the rest of the country for each of the 4th and 8th grade reading and math achievement levels.

4th grade reading

8th grade reading

4th grade math

8th grade math