
'Nation's Report Card' Distracts From Real Concerns For Public Schools

Imagine you’re a parent of a seven-year-old who has just come home from school with her end-of-year report card. But the report card provides marks for only two subjects, and only for children in grade levels different from hers. Furthermore, there's nothing on the report card to indicate how well these children have been progressing throughout the year. There are no teacher comments, like "great participation in class" or "needs to turn in homework on time." And to top it off, the report gives a far harsher assessment of academic performance than the reports you've gotten from other sources.

That's just the sort of "report card" that was handed to America yesterday in the form of the National Assessment of Educational Progress. And while the NAEP is all well and good for what it is -- a biennial, norm-referenced diagnostic assessment of fourth and eighth graders in math and reading -- the results of the NAEP invariably get distorted into all kinds of completely unfounded "conclusions" about the state of America's public education system.

'Nation's Report Card" Is Not A Report Card

First off, let's be clear on what the NAEP results that we got yesterday actually entail. As Diane Ravitch explains, there are two different versions of NAEP: 1) the Main NAEP, which we got yesterday, given every other year in grades 4 and 8 to measure national and state achievement in reading and math based on guidelines that change from time to time; and 2) the Long-Term Trend NAEP, given less frequently at ages 9, 13, and 17 to test reading and math on guidelines that have been in use since the early 1970s. (There are also occasional NAEPs given in other subjects.) In other words, be very wary of anyone claiming to identify "long term trends" based on the Main NAEP. This week's release was not the "long term" assessment.

Second, let's keep in mind the NAEP's limits in measuring "achievement." NAEP reports results in terms of the percent of students attaining Advanced, Proficient, Basic, and Below Basic levels. What's usually reported by the media is the "proficient and above" figure. After all, don't we want all children to be "proficient"? But what does that really mean? Proficiency as defined by NAEP is actually quite high -- in fact, much higher than what most states require and higher than the standards other nations such as Sweden and Singapore follow.

Third, despite its name, NAEP doesn't really show "progress." Because NAEP is a norm-referenced test, its purpose is comparison -- to see how many children fall above or below a "cut score." Repeated administrations of NAEP provide periodic points of comparison of the percentages of students falling above and below that cut score, but does tracking that variation really show "progress"? Statisticians and researchers worth their salt would say no.

Finally, let's remember that NAEP proficiency levels have defined the targets that all states are supposed to aim for under the No Child Left Behind legislation. That policy has now been mostly scrapped, or at least significantly changed, in part because its proficiency goals have been called "unrealistic."

Does this mean that NAEP is useless? Of course not. As a diagnostic tool it certainly has its place. But as the National Center on Fair and Open Testing (FairTest) has concluded, "NAEP is better than many state tests but is still far from the 'gold standard' its proponents claim for it."

[readon2 url="http://ourfuture.org/blog-entry/2011114402/nations-report-card-distracts-real-concerns-public-schools"]Continue reading...[/readon2]

Mapping State Proficiency Standards Onto NAEP Scales

The National Assessment of Educational Progress (NAEP) has just published its report, "Mapping State Proficiency Standards Onto NAEP Scales: Variation and Change in State Standards for Reading and Mathematics, 2005-2009."

This research looked at the following issues:

How do states’ 2009 standards for proficient performance compare with one another when mapped onto the NAEP scale?
- There is wide variation among state proficiency standards.
- Most states’ proficiency standards are at or below NAEP’s definition of Basic performance.

How do the 2009 NAEP scale equivalents of state standards compare with those estimated for 2007 and 2005?
- For those states that made substantive changes in their assessments between 2007 and 2009, most moved toward more rigorous standards as measured by NAEP.
- For those states that made substantive changes in their assessments between 2005 and 2009, changes in the rigor of states’ standards as measured by NAEP were mixed, but showed more decreases than increases in rigor.

Does NAEP corroborate a state’s changes in the proportion of students meeting the state’s standard for proficiency from 2007 to 2009? From 2005 to 2009?
- Changes in the proportion of students meeting states’ standards for proficiency between 2007 and 2009 are not corroborated by the proportion of students meeting proficiency, as measured by NAEP, in at least half of the states in the comparison sample.
- Results of comparisons between changes in the proportion of students meeting states’ standards for proficiency between 2005 and 2009 and the proportion of students meeting proficiency, as measured by NAEP, were mixed.

The full report can be found here (PDF). We've pulled out some of the graphs that show Ohio's performance vs the rest of the country for each of the 4th and 8th grade reading and math achievement levels.

[Graphs: 4th grade reading, 8th grade reading, 4th grade math, 8th grade math]