Questions Every Public School Parent Should Be Asking

A reader emailed us a link to this article, and we agree: parents deserve real answers to the questions it raises.

—> For principals:
1. How many standardized tests does my child have to take this year?
2. Where do these tests originate?
3. What is the specific academic purpose for each one?
4. How will these tests affect my child’s academic future or standing?
5. For each test, does the teacher see individual student results and have a chance to adjust individual instruction to help each student?
6. Who sees the scores, where will they be recorded, and for what purpose?
7. Do the scores become part of my child’s record?
8. Who in the district instructed you to give these tests?
9. How much time does the administration of each test take?
10. What training was provided to staff and how much time did that encompass?
11. Explain how the money spent on each test is being used in the best interest of your students.
12. How many staff are being taken away from teaching or counseling duties to administer each test?

—> For school superintendents:
1. Identify by name and frequency each standardized test your district requires in each grade.
2. Explain where these tests originate and, for each, explain its specific academic purpose and the year it started.

—> For school board attorneys:
1. Explain your district’s policy on opting out of/refusing standardized tests and cite its legal foundation.

—> For school board members:
1. How do you view the academic purposes for standardized testing?
2. Are you familiar with all the standardized tests your district requires, and their academic purposes?
3. Are you willing to initiate a parent/teacher review of the use of testing in your district?
4. Is each test mandated by the state, or is it a district choice?
5. What are the costs associated with each test, per student and to the district per grade level?
6. Will you provide the district’s opt-out policy to all parents in writing?

10,000 educators say "enough is enough" to high-stakes testing

We've been fortunate enough to attend the last three NEA Representative Assemblies, where over 10,000 NEA members from all over the country gather to present and vote on the Association's business, policies, and resolutions, and to elect its leaders.

This year, in Denver, after saying goodbye to outgoing NEA President Dennis Van Roekel, delegates elected an all-minority, all-female officer team to lead the Association.

NEA Vice President Elect Becky Pringle (left), President Elect Lily Eskelsen García (center) and Secretary Treasurer Elect Princess Moss (right).

As historic as this is, delegates also broke with the past on the subject of Arne Duncan, the U.S. Secretary of Education. At previous Representative Assemblies (RAs), delegates had debated, and voted on, calling for his resignation. Those attempts were narrowly voted down.

This year, citing the U.S. Department of Education’s “failed education agenda focused on more high-stakes testing, grading, and pitting public school students against each other based on test scores, and for continuing to promote policies and decisions that undermine public schools and colleges, the teaching education professionals, and education unions,” delegates voted to call for his resignation.

This shift is further evidence of educators' frustration with the growing reliance on testing and its negative impacts on students and teachers alike.

President-elect Eskelsen García told delegates, “For us, one thing is clear. Before anything is going to get better: It’s the Testing, Stupid. Better yet, it’s the stupid testing.” She called the accountability systems that Duncan has pushed “phony” and harmful to students, teachers and the teaching profession.

Delegates went further on the issue of test misuse with their very first piece of business, overwhelmingly approving a "Campaign Against Toxic Testing":

NEA will conduct a comprehensive campaign to end the high stakes use of standardized tests, to sharply reduce the amount of student and instructional time consumed by tests, and to implement more effective and responsible forms of assessment and accountability.

You can read the full text here.

Speaking of efforts by the likes of moviemakers, billionaire brothers, and conservative politicians who continue to pursue policies harmful to students, NEA President-Elect Lily Eskelsen García told the Representative Assembly, “People who don’t know what they’re talking about are talking about increasing the use of commercial standardized tests in high-stakes decisions about students and about educators ... when all the evidence that can be gathered shows that it is corrupting what it means to teach and what it means to learn.”

One thing is clear: educators have had enough, and policymakers should expect to see increasing resistance to high-stakes testing run amok.

Poll - Ohioans Reject Recent Education "Reforms"

Sixteen superintendents, along with community members, joined together to commission a poll gauging the public's perception of a range of education policy topics. The results should give corporate reformers serious pause. You can see the entire survey results here.

Despite politicians' and corporate reformers' focus on teacher quality, citizens continue to see school funding as the biggest problem schools face. Indeed, they view teacher quality as the least problematic area of concern.

Citizens reject other corporate and political reforms too. When asked to name the most important indicator of school quality, they rank the newly revamped report card dead last. Citizens just don't find it relevant.

The blows keep coming: citizens continue to reject the use of high-stakes testing, recognizing that it is not healthy for students and not appropriate for evaluating the quality of teachers.

Profiteers have little to cheer about, as citizens reject the use of tax dollars to support charter schools and private schools.

Finally, it is very clear whom people view as the real problem: politicians.

Ohio charter high schools lowest ranked in state

Via 10th Period: U.S. News & World Report's rankings of the country's best high schools provide another sobering picture of Ohio's 16-year-old charter school experiment.

Not a single one of Ohio's 97 charter high schools outperformed the state average in reading and math. Meanwhile, 167 traditional public high schools did. That's an amazing 167-0 score. Not a single charter high school in Ohio rates in the top 115 high schools in the state, the lowest rank given by U.S. News.

Short-Changed: How Poor-Performing Charters Cost All Ohio Kids

Innovation Ohio has published a new report titled "Short-Changed: How Poor-Performing Charters Cost All Ohio Kids". Its principal findings were these:
  • The flawed way in which charter schools are funded in Ohio will result in traditional school students receiving, on average, 6.6% less state funding this year (around $256 per pupil) than the state itself says they need;
  • A table in the report highlights this issue;
  • Well over half of all state money sent to charters goes to schools that perform worse than traditional public schools on one or both of the state’s two major performance measurements (the Report Card and the Performance Index);
  • The report lists a selection of the traditional public schools that are losing vast amounts of money to lower-performing charter schools;
  • A number of high-performing suburban school districts are now among the biggest losers in per-pupil funding;
  • On average, Ohio charters spend about double what traditional public schools spend on non-instructional administrative costs (23.5% vs. 13%);
  • 53% of children transferring into charter schools are leaving districts that perform better;
  • In 384 out of Ohio’s 612 school districts, every dime “lost” to charters went to schools whose overall performance was worse on the State Report Card.

We encourage you to read and share the entire report, embedded below.

IO Report Short Changed

Same Teachers, Similar Students, Similar Tests, Different Answers

Via Vamboozled

One of my favorite studies to date about VAMs was conducted by John Papay, an economist once at Harvard and now at Brown University. In the study, titled “Different Tests, Different Answers: The Stability of Teacher Value-Added Estimates Across Outcome Measures” and published in 2009 by the American Educational Research Journal (a top-three, highly reputable peer-reviewed journal in the field), Papay presents evidence that different yet similar tests (i.e., similar in content and administered at similar times to similar sets of students) do not provide similar answers about teachers’ value-added performance. This is an issue of validity: if two tests are measuring the same things for the same students at the same times, similar-to-the-same results should be realized. But they are not. Papay instead found only moderate rank correlations, ranging from r = 0.15 to r = 0.58, among the value-added estimates derived from the different tests.
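
To make that kind of comparison concrete, here is a minimal sketch in Python, using entirely synthetic data rather than anything from Papay's study, of how one might check the rank stability of value-added estimates derived from two different but similar tests. The teacher-effect and noise values are illustrative assumptions only: a shared "true" teacher effect plus test-specific noise stands in for the two assessments.

```python
# Minimal, illustrative sketch (synthetic data only; not Papay's actual analysis).
# Question: if two similar tests measured the same teachers, how stable are the
# rankings implied by the value-added estimates each test produces?
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_teachers = 200

true_effect = rng.normal(0.0, 1.0, n_teachers)               # hypothetical "true" teacher effect
va_test_a = true_effect + rng.normal(0.0, 1.5, n_teachers)   # estimate from test A (effect + noise)
va_test_b = true_effect + rng.normal(0.0, 1.5, n_teachers)   # estimate from test B (effect + noise)

# Spearman's rank correlation between the two sets of estimates
rho, p_value = spearmanr(va_test_a, va_test_b)
print(f"Rank correlation across the two tests: r = {rho:.2f}")
```

With the noise level assumed here, the correlation typically comes out around 0.3, squarely in the moderate range these studies report, which is the point: even when the underlying teacher effect is identical, test-specific noise alone keeps the two rankings from agreeing.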

Recently released, yet another study (albeit not yet peer-reviewed) has found similar results, potentially solidifying this finding further in our understanding of VAMs and their issues, particularly in terms of validity (or truth in VAM-based results). This study, “Comparing Estimates of Teacher Value-Added Based on Criterion- and Norm-Referenced Tests,” released by the U.S. Department of Education and conducted by four researchers representing the University of Notre Dame, Basis Policy Research, and the American Institutes for Research, provides evidence, again, that estimates of teacher value-added based on different yet similar tests (in this case a criterion-referenced state assessment and a widely used norm-referenced test, given in the same subject at around the same time) were only moderately correlated.

If we had confidence in the validity of inferences based on value-added measures, these correlations (or, more simply put, "relationships") should be much higher than what the researchers found: a range of 0.44 to 0.65, similar to what Papay found. While the ideal correlation coefficient would be r = +1.0, that is very rarely achieved. But for the purposes for which teacher-level value-added is currently being used, correlations above r = +0.70 or r = +0.80 would (and should) be most desired, and possibly required, before high-stakes decisions about teachers are made on the basis of these data.

In addition, the researchers in this study found that, on average, only 33.3% of teachers landed in the same range of scores (quintiles, i.e., bands 20 percentage points wide) on both sets of value-added estimates in the same school year. This too has implications for validity: if any valid inferences are to be made from value-added estimates, a teacher's estimates should fall in the same range when similar tests are used.
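
For readers who want to see exactly what that 33.3% agreement figure is measuring, here is a second small sketch, again with purely synthetic data and the same illustrative assumptions as above: it sorts each set of estimates into quintiles (20%-wide bands) and reports how often a teacher falls in the same band on both tests.

```python
# Minimal, illustrative sketch (synthetic data only; not the study's analysis).
# How often do two sets of value-added estimates place a teacher in the same quintile?
import numpy as np

rng = np.random.default_rng(0)
n_teachers = 200

true_effect = rng.normal(0.0, 1.0, n_teachers)
va_test_a = true_effect + rng.normal(0.0, 1.5, n_teachers)
va_test_b = true_effect + rng.normal(0.0, 1.5, n_teachers)

def quintile(estimates):
    # Rank the teachers, then cut the ranking into five 20%-wide bands (0 through 4).
    ranks = estimates.argsort().argsort()
    return ranks * 5 // len(estimates)

same_band = quintile(va_test_a) == quintile(va_test_b)
print(f"Teachers placed in the same quintile by both tests: {same_band.mean():.1%}")
```

Even with an identical underlying teacher effect, agreement well below 100% is the expected outcome once test-specific noise enters the picture, which is why agreement as low as the reported 33.3% is treated as a validity problem for high-stakes use.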
