High stakes failure

It should be apparent to any rational observer that high-stakes corporate education policies are failing catastrophically. Where once data and tests were used to inform educators and provide diagnostic feedback, they are increasingly being used to rank, grade, and even punish.

This is leading to the inevitable behaviors that always emerge when such systems are created - whether in the accounting scandals at Enron, Tyco International, Adelphia, Peregrine Systems, and WorldCom, or the more recent scandals involving Lehman Brothers, JPMorgan, and Barclays.

Here's another example, in news from Pennsylvania:

After authorities imposed unprecedented security measures on the 2012 statewide exams, test scores tumbled across Pennsylvania, The Inquirer has learned.

At some schools, Pennsylvania Secretary of Education Ronald Tomalis said, the drops are "noticeable" - 25 percent or more.

In some school systems, investigators have found evidence of outright doctoring of previous years' tests - and systemic fraud that took place across multiple grades and subjects.

In Philadelphia and elsewhere, some educators have already confessed to cheating, and investigators have found violations ranging from "overcoaching" to pausing a test to reteach material covered in the exam, according to people familiar with the investigations.

When trillions of dollars of the world's money is at stake, investing in tight oversight and regulation is imperative, but when it comes to evaluating the progress of a 3rd grader, do we really want to spend valuable education dollars measuring the measurers?

The question becomes even more pertinent when one considers that the efficacy of many of the measures is questionable at best. Article after article, study after study, raises significant questions about value-added measures, and now a new study calls the tests themselves into question:

Now, in studies that threaten to shake the foundation of high-stakes test-based accountability, Mr. Stroup and two other researchers said they believe they have found the reason: a glitch embedded in the DNA of the state exams that, as a result of a statistical method used to assemble them, suggests they are virtually useless at measuring the effects of classroom instruction.

Pearson, which has a five-year, $468 million contract to create the state’s tests through 2015, uses “item response theory” to devise standardized exams, as other testing companies do. Using I.R.T., developers select questions based on a model that correlates students’ ability with the probability that they will get a question right.

That produces a test that Mr. Stroup said is more sensitive to how it ranks students than to measuring what they have learned. That design flaw also explains why Richardson students’ scores on the previous year’s TAKS test were a better predictor of performance on the next year’s TAKS test than the benchmark exams were, he said. The benchmark exams were developed by the district, the TAKS by the testing company.

We have built a high stakes system on questionable tests, measured using questionable statistical models, subject to gaming and cheating, and further goosed by the scrubbing of other student data. We've seen widespread evidence of it in New York, California, Washington DC, Georgia, Tennessee, Pennsylvania, and now Ohio.

Policymakers are either going to have to spend more and more money developing better tests, better models, tighter security and more bureaucratic data handling policies, or return to thinking about the core mission of providing a quality education to all students. Either way, when you have reached the point where the State Superintendent talks of criminalizing the corporate education system, things have obviously gone seriously awry.

State Superintendent Stan Heffner, who leads the department, has launched his own investigation and has said the probe could lead to criminal charges against educators who committed fraud.

Public education - a middle class bargain

The USDA has just released its report (issued annually since 1960), "Expenditures on Children by Families", finding that:

  • A middle-income family with a child born in 2011 can expect to spend about $234,900 ($295,560 if projected inflation costs are factored in*) for food, shelter, and other necessities to raise that child over the next 17 years.
  • For the year 2011, annual child-rearing expenses per child for a middle-income, two-parent family ranged from $12,290 to $14,320, depending on the age of the child.
  • A family earning less than $59,410 per year can expect to spend a total of $169,080 (in 2011 dollars) on a child from birth through high school.
  • Similarly, middle-income parents with an income between $59,410 and $102,870 can expect to spend $234,900.
  • A family earning more than $102,870 can expect to spend $389,670.

For middle-income families, housing costs are the single largest expenditure on a child, averaging $70,560 or 30 percent of the total cost over 17 years. Child care and education (for those incurring these expenses) and food were the next two largest expenses, accounting for 18 and 16 percent of the total cost over 17 years. These estimates do not include costs associated with pregnancy or the cost of a college education or education beyond high school.
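The shares quoted above are easy to check against the USDA totals. Here's a quick sketch using the figures as reported in the bullets above:

```python
# Check the reported budget shares against the USDA middle-income total.
total = 234_900    # total cost of raising a child, 2011 dollars
housing = 70_560   # housing expenditure over 17 years

housing_share = 100 * housing / total
print(f"Housing share: {housing_share:.0f}%")  # matches the reported 30%

# Implied dollar amounts for the other reported shares
child_care_education = 0.18 * total
food = 0.16 * total
print(f"Child care/education: ${child_care_education:,.0f}")
print(f"Food: ${food:,.0f}")
```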

Child care and education expenses consist of day care tuition and supplies; baby-sitting; and elementary and high school tuition, books, fees, and supplies. Books, fees, and supplies may be for private or public schools. However, according to the report, child care and education was the only budgetary component for which about half of all households reported no expenditure.

Without a free public education, the educational expense of raising a child would be by far the number one expense. Consider that in Ohio, the per-student public school cost is ~$10,000. That would cost the typical two-child family $20,000 per year - roughly $260,000 for the entire K-12 education, more than the total expense the USDA reports for raising a single child!
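To make the arithmetic explicit, here's a quick sketch. The ~$10,000 per-pupil figure (the Ohio number cited above), a 13-year K-12 span, and a two-child family are the assumptions:

```python
# Rough estimate of what a typical two-child family would pay
# for K-12 education without free public schools.
PER_PUPIL_COST = 10_000  # approximate Ohio per-student cost per year
YEARS_K12 = 13           # kindergarten through 12th grade
CHILDREN = 2

annual_cost = PER_PUPIL_COST * CHILDREN  # $20,000 per year
total_cost = annual_cost * YEARS_K12     # $260,000 total

usda_total_per_child = 234_900  # USDA middle-income estimate, 2011 dollars

print(f"Annual cost for two children: ${annual_cost:,}")
print(f"Total K-12 cost: ${total_cost:,}")
print(f"Exceeds USDA cost of raising one child: {total_cost > usda_total_per_child}")
```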

It's hard to imagine a greater bargain than that.

Here's a look at how costs have changed since 1960:

Expenditures on Children by Families, 2011

Charters spend more on admin, less on class

We observed that SB316 delayed, by 12 months, the budget requirement to rank schools based upon their classroom spending. As was noted at the time, this was an ill-conceived, ill-considered plan. As that plan now hits the slow track, research emerges showing that charter schools spend less in the classroom than traditional public schools.

One of the most frequent criticisms put to traditional public schools is that they waste money on administrative bloat, instead of channeling more funding where it belongs—the classroom. A much leaner and classroom-centered model, some say, can be found in charter schools, because of their relative freedom from stifling bureaucracy.

A new study, however, concludes that this hypothesis has it exactly wrong.

The study, released by the National Center for the Study of Privatization in Education, at Teachers College, Columbia University, examines school spending in Michigan and concludes that charter schools spend more per-pupil on administration and less on instruction than traditional public schools, even when controlling for enrollment, student populations served, and other factors.

Researchers David Arsen, of Michigan State University, and Yongmei Ni, of the University of Utah, found that charters spend $774 more per pupil on administration, and $1,140 less on instruction, than do traditional publics. To come up with their estimates, the authors analyze the level and source of funding for charters and traditional publics, and how they spend money, breaking it out by function. They then use a statistical method known as regression analysis to control for factors that could skew their comparisons of spending on administration and instruction in various schools.
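To illustrate the kind of method described (not the authors' actual model or data), here is a minimal regression sketch with made-up placeholder numbers, showing how a charter/traditional dummy variable can recover a spending gap while controlling for a factor such as enrollment:

```python
import numpy as np

# Illustrative-only sketch of regression with a control variable.
# All numbers are synthetic placeholders, NOT figures from the study.
rng = np.random.default_rng(0)
n = 200
charter = rng.integers(0, 2, n)         # 1 = charter, 0 = traditional
enrollment = rng.uniform(200, 2000, n)  # control variable

# Synthetic admin spending: baseline + enrollment effect
# + a $774 charter premium + noise
admin_spend = 1500 - 0.2 * enrollment + 774 * charter + rng.normal(0, 50, n)

# Design matrix: intercept, charter dummy, enrollment
X = np.column_stack([np.ones(n), charter, enrollment])
coef, *_ = np.linalg.lstsq(X, admin_spend, rcond=None)

# coef[1] estimates the charter premium, net of enrollment
print(f"Estimated charter admin premium: ${coef[1]:.0f} per pupil")
```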

Of the extra $774 that charters devote to administration, $506 went to general administrative services, such as the costs of charter school boards, or the fees of the organizations managing the school.

While Arsen and Ni don't examine in depth the causes of charters' relatively low instructional spending, they speculate that a couple of factors could be at work. An obvious one is that more than 80 percent of traditional public schools' spending goes to personnel costs, mostly salaries and benefits—which would presumably drive instructional costs up. Charters, on average, pay lower salaries for teachers with similar credentials to those hired by traditional publics, and also employ a less experienced and less costly teaching force, the authors say, which would keep instructional costs down.

Here's the paper in question for your review:

Is Administration Leaner in Charter Schools? Resource Allocation in Charter and Traditional Public Schools

Suspicious test scores

Here at JTF, we've been very quick to point out instances of cheating, whether isolated or systemic, as a quick search of our archives or Twitter feeds will show. As public education is driven ever further into corporate styles of management and measurement, coupled with high stakes tied to test scores, it should surprise no one that corporate types of behavior emerge - think Enron, Arthur Andersen, WorldCom, MF Global Holdings.

It is with that backdrop we turn to an investigative piece by the Dayton Daily News (DDN) in conjunction with the Atlanta Journal Constitution (AJC), titled "Suspect test scores found across Ohio schools".

Steep spikes and drops on standardized test scores, a pattern that has indicated cheating in Atlanta and other cities across the nation, have occurred in hundreds of school districts and charter schools across Ohio in the past seven years, a Dayton Daily News analysis found.

The analysis does not prove cheating has occurred in Ohio. But interviews and documents show that state officials do not employ vigorous statistical analyses to catch possible cheating, discipline only about a dozen teachers a year and direct Ohio’s test vendor to spend just $17,540 on analyzing suspicious scores out of its $39 million annual testing contract.

It's a weak piece that could be seized upon and sensationalized by many, and the paper has come under almost instant withering criticism for its approach.

One of the researchers involved in analyzing data for USA Today's groundbreaking cheating series took a look at the DDN analysis:

Given my past role in reviewing data and methods used for detecting systematic cheating, I was delighted to have the opportunity a week ago to review Ohio assessment data that was being used as part of a national study released today by The Atlanta Constitution-Journal and affiliated Cox newspapers. My review, however, yielded serious concerns about the data used, the methods of analysis employed, and the conclusions drawn.
In short, here are some of my concerns about the methods:
  • As noted, the analysis is based on school-level data and not individual student-level data. Accordingly, it was not possible to ensure that the same students were in the group in both years.
  • The analysis of irregular jumps in test scores should have been coupled with irregularities in erasure data where this data was available.
  • The analysis by Cox generates predicted values for schools, but this does not incorporate demographic characteristics of the student population.
  • The limited details available on the study methods made it impossible to replicate and verify what the journalists were doing. Further, the rationale was unclear for some of the steps they took.

He wasn't the only expert to consider the DDN findings. Stephen Dyer, a former newspaper reporter, architect of Ohio's prematurely abandoned evidence-based model, and think tank fellow, had this to say after discussing analytical shortcomings similar to those noted above:

If you're going to write a story that suggests massive, statewide (and in AJC's case, national) cheating on standardized tests, you'd better be prepared to name the offenders and feel solid enough in your methodology to refute the state's education agency and largest teachers union, both of whom knocked the papers' methods. If you have to spend a large chunk of your story having competing experts defend and knock your statistical analysis, you need to re-do the analysis. Though it showed integrity for the paper to allow those critical comments in the story.

As a former reporter, I can say these issues would invariably pop up before big stories ran. Sometimes, it means delaying your story for a day or two, or in a few cases, never run them at all. As a journalist, you, as a general rule, cannot spend any time in your story defending your story. If you have to, it means you don't have it nailed down yet; it needs more time in the oven.

The DDN spent almost the entirety of their story defending their story.

Greg Mild, over at Plunderbund, has an even harsher response, and points out some of the absurdities of the DDN analysis:

Furthermore, note that the “2,600 improbable changes” include spikes and drops in test results. These journalists are putting out this theory of irregularities and cheating by schools based on numbers that include falling scores! Right, because so many educators are interested in risking their careers by encouraging children to change their scores to incorrect answers to suffer a significant DROP in their test scores. Yet those numbers are touted by these “journalists” in their sweeping accusations of improbable scores and cheating.

We continue to believe that cheating is totally unacceptable and ought to be exposed when and where found, but the Dayton Daily News story, as they point out themselves, does not come close to demonstrating what they seem to want to sensationalize - widespread cheating, Atlanta style.

As we come to rely more and more upon student test scores to measure schools and teachers, suspicions are going to grow. A few might be borne out, but many will be baseless - and each accusation serves to undermine public education and people's trust in it. It's another unintended failing of the corporate education reform schemes we're currently pursuing.

The Truth About Teachers

Via American Society Today

Myth: “Teachers are overpaid.”

  • According to the report "What's It Worth: The Economic Value of College Majors" from the Georgetown University Center on Education and the Workforce, funded by the Gates and Lumina Foundations, education majors earned the least of all college majors among 15 sector groupings.
  • According to a 2008 report from the Organization for Economic Cooperation and Development (OECD), American primary-school educators work 1,913 hours a year, including hours spent on work at home and outside the classroom. Data from a Labor Department survey that same year showed that the average full-time employee in the United States worked 1,932 hours spread over 48 weeks. In other words, teachers work about the same number of hours as the average U.S. worker - refuting the argument that teachers should be paid considerably less because "teachers only work 9 months of the year."

Source: http://cew.georgetown.edu/whatsitworth/
More Info: Teachers Work the Same Number of Hours as Average U.S. Worker
More Info: US Teachers Work Longer Hours Than Anywhere In The World, While Earning Less

Check out American Society Today for more Myths vs Facts, or follow them on Twitter.

Teachers Work the Same Number of Hours as Average U.S. Worker

One of the most often repeated myths is that teachers don't do a full year's worth of work. It's no surprise to teachers, but let's set the record straight.

via American Society Today

According to a 2008 report from the Organization for Economic Cooperation and Development (OECD), American primary-school educators spend 1,913 hours working a year including hours teachers spend on work at home and outside of the classroom. Data from a Labor Department survey that same year showed that the average full-time employee in the United States worked 1,932 hours spread over 48 weeks.

The OECD reported that primary-school educators spent 1,097 hours a year teaching in the classroom--the most of any of the 27 member nations tracked.

This shows that teachers work roughly the same number of hours as the average worker in the United States.
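The figures quoted above make the comparison easy to quantify; a quick sketch using the 2008 numbers:

```python
# Annual working hours, from the 2008 figures quoted above.
teacher_hours = 1_913         # OECD: US primary-school educators, incl. work at home
average_worker_hours = 1_932  # Labor Dept: average full-time US employee

difference = average_worker_hours - teacher_hours
pct_difference = 100 * difference / average_worker_hours

print(f"Difference: {difference} hours, or {pct_difference:.1f}% of the average worker's year")
```

A gap of 19 hours over a full working year is a rounding error, not a shortened schedule.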

Check out the link for graphs
Link to OECD report: http://www.oecd.org/dataoecd/23/46/41284038.pdf
Link to Wall Street Journal Article: "U.S. Teachers' Hours Among World's Longest"