
Will Value-Added Reinforce The Walls Of The Egg-Crate School?

via the Shanker Institute

Academic scholars are often dismayed when policymakers pass laws that disregard or misinterpret their research findings. The use of value-added methods (VAMS) in education policy is a case in point.

About a decade ago, researchers reported that teachers are the most important school-level factor in students’ learning, and that their effectiveness varies widely within schools (McCaffrey, Koretz, Lockwood, & Hamilton 2004; Rivkin, Hanushek, & Kain 2005; Rockoff 2004). Many policymakers interpreted these findings to mean that teacher quality rests with the individual rather than the school and that, because some teachers are more effective than others, schools should concentrate on increasing their number of effective teachers.

Based on these assumptions, proponents of VAMS began to argue that schools could be improved substantially if they would only dismiss teachers with low VAMS ratings and replace them with teachers who have average or higher ratings (Hanushek 2009). Although panels of scholars warned against using VAMS to make high-stakes decisions because of their statistical limitations (American Statistical Association, 2014; National Research Council & National Academy of Education, 2010), policymakers in many states and districts moved quickly to do just that, requiring that VAMS scores be used as a substantial component in teacher evaluation.
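One of the statistical limitations those panels flagged is sampling noise: a teacher's value-added score is estimated from a single class's worth of test scores. A minimal simulation (hypothetical numbers, a deliberately simple gain-score model rather than any state's actual VAMS formula) sketches how noisy per-teacher estimates can be at realistic class sizes:

```python
import random
import statistics

random.seed(1)

# Hypothetical setup: 20 teachers, each observed with one class of 25 students.
N_TEACHERS, CLASS_SIZE = 20, 25
true_effects = [random.gauss(0, 1) for _ in range(N_TEACHERS)]

def observed_gains(effect, n=CLASS_SIZE):
    # Student gain = teacher effect + student-level noise (sd = 10),
    # a crude stand-in for test measurement error and student variation.
    return [effect + random.gauss(0, 10) for _ in range(n)]

# A naive value-added estimate: the mean student gain in the teacher's class.
estimates = [statistics.mean(observed_gains(e)) for e in true_effects]

# The estimation error is on the order of 10 / sqrt(25) = 2 points,
# larger than the spread of the true effects themselves (sd = 1),
# so single-year estimates can easily misrank teachers.
errors = [est - true for est, true in zip(estimates, true_effects)]
print(round(statistics.stdev(errors), 2))
```

With these assumed noise levels, the sampling error in a one-class estimate swamps the true teacher-to-teacher differences, which is one concrete reason the ASA and NRC panels cautioned against high-stakes use of single-year scores.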

While researchers continue to analyze and improve VAMS models, it is important to step back and consider a prior set of questions:

  • Does the wide variation in teachers’ effectiveness within schools simply mean that some teachers are inherently better than others, or is there a more complex and promising explanation of this finding?
  • Is the strategy of augmenting human capital one teacher at a time likely to pay off for students? Or will relying on VAMS for teacher evaluations have unintended consequences that interfere with a school’s collective efforts to improve?

In this column, I bring an organizational perspective to the prospect of using VAMS to improve teacher quality. I suggest why, in addition to VAMS’ methodological limitations, reformers should be very cautious about relying on VAMS to make decisions that have important consequences for both teachers and their students.

Why Is There Variation In Teacher Effectiveness Within Schools?

In his classic analysis, “Social Capital in the Creation of Human Capital,” James Coleman (1988) argues that individuals’ human capital is transformed for the benefit of the organization by social capital, which “inheres in the structure of relations between actors and among actors” (p. S98). In education, this suggests that whatever level of human capital schools acquire through hiring can subsequently be developed through activities such as grade-level or subject-based teams of teachers, faculty committees, professional development, coaching, evaluation, and informal interactions. As teachers join together to solve problems and learn from one another, the school’s instructional capacity becomes greater than the sum of its parts.

Unfortunately, U.S. schools were never designed to benefit from social capital. In fact, over 40 years ago, historian David Tyack (1974) and sociologist Dan Lortie (1975) depicted the school as an organizational “egg crate,” where teachers work in the isolation of their classroom. In egg-crate schools, teachers focus on their own students largely to the exclusion of others, and they interact minimally and intermittently with their colleagues. As a result, their expertise remains locked within their classroom (Darling-Hammond 2001; Hargreaves & Fullan 2012; Johnson 1990; Kardos & Johnson 2007; Little 1990). This egg-crate model was efficient for managing the “factory school,” but did not serve students well; nor does it support the instructional needs of today's teachers.

Therefore, when teachers in the same school continue to work in isolation, they cannot benefit from the social capital that their school might provide. As a result, wide differences in teachers’ effectiveness persist over time.

The Evidence On School-Based Improvement Efforts

Studies have persuasively documented the benefits of systematic efforts to improve student learning through school-based improvement initiatives (Bryk, Sebring, Allensworth, Easton, & Luppescu 2010; McLaughlin & Talbert 2001; Rosenholtz 1989). Successful efforts increase norms of shared responsibility among teachers and create structures and opportunities for learning that promote interdependence—rather than independence—among them. That is social capital at work.

Many who dismiss the potential of social capital to improve schools doubt that teachers can improve significantly over time. However, a recent study by Kraft and Papay (2014) showed that teachers working in more favorable professional environments—as rated by a school’s staff—improved throughout the ten years they analyzed, while those who worked in environments judged to be less supportive stagnated. This and other studies challenge the conventional view that teachers reach a “plateau” in their development relatively early in their career (Rivkin et al. 2005). Creating a school context that supports teachers’ work can have important, lasting benefits for students and faculty throughout the school, whereas simply swapping out low-scoring individuals for high-scoring ones without changing the context in which they work probably will not (Ladd & Sorenson 2014; Leana 2011; Lohr 2012).

Threats To School-Based Improvement Efforts

Not only are personnel policies based on VAMS scores likely to have, at best, modest effects on a school’s success, they may inadvertently undermine improvement efforts that are already underway. How so? Here, I suggest several possible unintended consequences of increasing reliance on VAMS (for a more detailed discussion see here).

(Continue reading at the Shanker Institute)

Just Another Day In The Wild Wild West of Ohio Charterland

The Dispatch has an article chronicling the almost comical doings of the Imagine Columbus Primary Academy charter school.

Let's go down the list of shenanigans:
six members who resigned in recent weeks amid ongoing concerns about a high-cost building lease, teacher turnover and adequate services for students.
Almost the entire board of the school resigned because they had so little control over the management company, thanks to Ohio's charter school laws, which grant operators more power than the boards. So what's this lease issue that caused these resignations?
Board members complained that the $700,000 annual lease consumes too much of the school’s $1.3 million annual budget. According to the Franklin County auditor’s office, the building, at 4656 Heaton Rd., is valued at $1,164,600.

Schoolhouse Finance purchased the building in 2005 for $1.5 million and made $2.6 million worth of improvements, according to the auditor’s website. SchoolHouse sold the building in 2006 for $5.2 million to a real-estate investment trust, then leased it back from the trust to charge rent to the school.
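A quick back-of-the-envelope check on those quoted figures (all numbers taken directly from the article; the rounding is mine):

```python
# Figures from the Dispatch article quoted above.
annual_lease = 700_000
annual_budget = 1_300_000
county_valuation = 1_164_600
purchase_2005 = 1_500_000
improvements = 2_600_000
sale_2006 = 5_200_000

lease_share = annual_lease / annual_budget            # ~54% of the budget
lease_vs_value = annual_lease / county_valuation      # ~60% of assessed value, per year
flip_gain = sale_2006 - (purchase_2005 + improvements)  # $1.1M gain on the sale

print(f"{lease_share:.0%} of the school's budget goes to rent")
print(f"annual rent is {lease_vs_value:.0%} of the building's assessed value")
print(f"sale-leaseback gain: ${flip_gain:,}")
```

In other words, more than half the school's entire budget goes to a lease that, in a single year, recovers roughly 60 percent of what the county says the building is worth.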
That's not a school; that's a real estate scam designed to bilk taxpayers. And if you need further proof, there's this:
The school opened in the 2013-14 school year, just months after another Imagine School that occupied the same building under a different sponsor was closed for poor academic performance.
Again, Ohio's ridiculous charter laws allow these for-profit companies to sponsor-shop, even after they have been closed down for terrible performance.

Why are the schools so terrible? Because with so much money being siphoned away in dodgy lease deals, there is nothing left to spend on paying teachers:
Sinoff said board members also tried to increase teacher salaries, concerned that low salaries of $30,000 a year were causing high turnover.
No wonder we're starting to see a number of charter school teachers consider joining teachers unions. So what of this school's sponsor, the North Central Ohio Educational Service Center?
In an interview with The Dispatch hours before threatening to close the school, the education service center’s Lahoski said he was unaware of the board departures but he, too, wanted answers about how operators were spending the money they receive.
Providing so little oversight, the sponsor was totally unaware of the resignations, and only when the situation reached comically high levels of absurdity did it even begin to think about closing the school.

So where is the Ohio Department of Education in all of this?
Ohio Department of Education spokesman John Charlton said the agency oversees the educational service center and other charter-school sponsors and encourages them to hold school operators accountable.

“If a school isn’t doing what they are supposed to, we want the sponsor to step in,” he said.
Passing the buck and ignoring the problem, as always. Meanwhile, 150 young children are not getting the education they deserve and that taxpayers have paid for. It's just another day in the wild wild west of Ohio Charterland.

Even Scaled Back, PARCC Still Has Big Problems

The Fordham Foundation has a good blog post detailing the impact that PARCC's proposal to scale back testing would have:
The spring 2015 testing window for PARCC extended from mid-February to mid-May. That’s a long time. Of course, schools were not required to administer exams throughout the full testing window—they could use as few or as many of the days within the window as they needed. But for students, parents, and educators, the three-month window probably made “testing season” feel unusually long and drawn out. (In contrast, Ohio’s old state exams were administered over the course of roughly one month.) It also meant that testing interrupted classroom instruction for more of the school year—and earlier.

The reason for the long testing window was fairly simple: The assessment system included two exams. The first, the “performance-based assessment” (PBA), was given in February–March, and the second— the “end-of-year assessment” (EOY)—was given in April–May. The PBAs focused on students’ application of knowledge and skills (e.g., solving multi-step problems, explaining mathematical reasoning), while the EOYs focused more on traditional assessment items like reading comprehension or straightforward multiple-choice math problems. See for yourself the differences in the sample PARCC exams.

But starting in spring 2016, PARCC will be administered in one thirty-day testing window, occurring in the traditional testing period of April–May. Importantly, while the earlier PBA testing window is erased, some of PARCC’s performance-based tasks will be preserved in next year’s summative exam.
This is clearly movement in the right direction, but it fixes just a fraction of the problems PARCC has, and has created. We detailed these problems, with our suggested fixes, a few weeks ago. We identified four broad areas that needed to be addressed: testing time, technology problems, test content problems, and high stakes.

Lawmakers cannot simply cut back on some tests and think the job is done. Will schools still struggle to deploy tech-heavy testing solutions on unstable software platforms with inadequate bandwidth? Combining the PBA and the EOY testing into a single test is likely to exacerbate the content problems, not alleviate them. And a system under constant change is no environment in which to attach mismatched stakes to students, teachers, and schools.

The Ohio legislature needs to address the full range of problems, not just the politically convenient changes PARCC has recently proposed in the face of losing millions of dollars in funding.

1917 Paper Highlights How Antiquated Modern Corporate Reformers Are

Corporate reformers may like to think many of their ideas are new and groundbreaking, but as this paper published in 1917 demonstrates, their ideas are old and antiquated. The paper, titled “Problems of Teacher Measurement,” was published in the Journal of Educational Psychology by B. F. Pittenger. The full paper can be read below. We've highlighted a number of relevant passages to demonstrate how what was once old is now new again.

Clearly, almost 100 years later, some are still trying and failing. Teaching is far too complex a discipline, with too many variable inputs, to be boiled down to a single number, no matter how fancy one's formula might be.

Problems of Teacher Measurement

HB74 Takes Axe To State Testing

HB74 passed the Ohio House 92-1, and includes a wide range of provisions affecting student testing.

HOUSE BILL 74 SUMMARY (As Reported by H. Education)

Academic content standards

  • Requires the State Board of Education, within 30 days of the bill's effective date, to provide an online opportunity on the website of the Department of Education to make comments on specific academic content standards.
  • Requires each academic standards review committee (established under current law), by September 30, 2015, to submit its review and determinations of the academic content standards and state assessments to the State Board and Department of Education.
  • Requires the State Board, by June 30, 2016, to review the current academic content standards, taking into consideration the input from the academic standards review committees and the comments posted on the Department's website, and to adopt revised academic content standards for each of grades K-12 in English language arts, mathematics, science, and social studies.

Achievement assessments

  • Requires the Department of Education, within 30 days of the bill's effective date, to issue a request for proposals to provide the elementary achievement assessments and the high school end-of-course examinations for administration by school districts and schools beginning with the 2015-2016 school year.
  • Prohibits certain multistate consortia, or their agents or subsidiaries, from being eligible to submit a proposal to provide the elementary assessments and end-of-course examinations.
  • Limits to three hours per assessment the duration of the administration of each state elementary achievement assessment beginning with the 2015-2016 school year.
  • Limits to three hours per year the duration of the administration of each high school end-of-course examination beginning with the 2015-2016 school year.
  • Specifies that the bill's time limits do not apply to (1) assessments for students with disabilities, (2) the nationally standardized assessments that measure college and career readiness, (3) the third-grade English language arts assessment, (4) any diagnostic assessment for students who did not pass the third-grade English language arts assessment, or (5) substitute examinations in science, American history, or American government.
  • Reduces, from twice annually to once annually, the administration of the third-grade English language arts assessment beginning with the 2015-2016 school year, and prohibits school districts from being required to administer that assessment in the fall.
  • Eliminates the requirement for school districts and schools to administer all of the writing diagnostic assessments in grades K-3, and the requirement for the mathematics diagnostic assessments to be administered in kindergarten and first grade.
  • Requires the Department to specify not less than two mathematics diagnostic assessments that are approved for (1) identifying students as gifted in mathematics and (2) the student academic growth component of teacher evaluations.
  • Requires the reading diagnostic assessment to be completed by September 30 of each year for students in grades one to three.
  • Limits the duration of the administration of the kindergarten readiness diagnostic assessment to one hour.
  • Specifies August 1, instead of "the first day of the school year" as under current law, as the earliest date by which a student may take the kindergarten readiness diagnostic assessment.
  • Permits a school district or school to administer the kindergarten readiness diagnostic assessment all at one time or in portions at different times, so long as the assessment has been administered in its entirety by November 1 of the school year.
  • Requires the Department, by July 1, 2016, to make available a kindergarten literacy assessment that districts and schools may use in lieu of the kindergarten readiness assessment.
  • Requires the Department, by December 31, 2016, to complete a study comparing nationally normed, standardized assessments approved by the Department for specified purposes and the state elementary assessments administered during both the 2013-2014 and 2014-2015 school years.
  • Eliminates the English language arts II and geometry end-of-course examinations.
  • Requires the State Board of Education, by March 1, 2016, to (1) compile a list of multiple assessments that are equivalent to the end-of-course examinations for use instead of the end-of-course examinations and (2) identify a table of corresponding score equivalents that correlate to the current achievement levels (advanced, accelerated, proficient, basic, and limited) for all end-of-course examinations.
  • Beginning with the 2016-2017 school year, requires a district or school to notify the Department of any assessment in a subject area that it elects to use as an equivalent examination, and requires that the notification be made by September 15 of each year.
  • Beginning with the 2016-2017 school year, authorizes a school district to use end-of- course examinations, substitute examinations, or equivalent examinations as final examinations for the related class or course of study.
  • Specifies that, for purposes of substitute examinations and equivalent examinations, a score of 2 on an Advanced Placement (AP) examination and a score of 3 on an International Baccalaureate (IB) examination are to be considered equivalent to a "proficient" score.
  • Prohibits a school district from charging a student for (1) any of the nationally standardized assessments that measure college and career readiness, (2) any end-of-course examination, (3) any substitute examination, or (4) any equivalent examination, unless the examination is an AP or IB examination.
  • Requires the Department to identify and approve at least two assessments that can be used for multiple purposes, including (1) a diagnostic assessment administered to third-grade students, (2) an assessment that permits a student to demonstrate an acceptable level of performance for purposes of the third-grade reading guarantee, and (3) an assessment used to identify students as gifted in specific academic ability fields in reading, writing, or both.
  • Requires the Department to develop a table of assessments that may be used for multiple purposes and for which a measure of student performance or aptitude is required, in order to reduce the total number of assessments administered by a school district or school.
  • Requires the Department, within 90 days of the bill's effective date, to determine which components of the resident educator performance-based assessment may be used as part of the teacher evaluation system.
  • Extends through the 2015-2016 school year a current provision prohibiting the Department from requiring school districts, other public schools, and chartered nonpublic schools to administer any state achievement assessment in an online format.
  • Requires the Department to conduct a comprehensive survey of the capacity and readiness of each school district for the online administration of the state achievement assessments based on recommended specifications for such administration of the assessments and to report the results of the survey to the Governor, the State Board of Education, and the chairpersons and ranking members of the House and Senate Education Committees by June 30, 2016.
  • Requires the Department to study the impact on student performance of the online administration of the state achievement assessments and submit results of the study to the General Assembly and Governor by June 30, 2016.
  • Requires the State Board, by November 1, 2015, to make a recommendation on whether to extend by one year the safe harbor provisions in effect for the 2014-2015 school year for students, public school districts and schools, and teachers.
  • Requires the Department, except as otherwise prescribed by federal law, to consider as an acceptable measure of technical skill attainment (1) an industry-recognized credential or (2) a license issued by a state agency or board for practice in a vocation that requires an examination for issuance of that license.
  • Prohibits the Department from requiring a student to take additional technical assessments regardless of whether the student has earned the credential or taken the licensure examination at the time the technical assessments would otherwise be administered.
  • Requires the State Board to periodically revise the nationally recognized job skills assessment that it selects for use as a pathway to high school graduation and to do so with input from individuals and educators who have a background in career-technical education.
  • Prescribes the manner in which the governing body of a school district, community school, STEM school, or educational service center must evaluate the student academic growth component of a teacher for purposes of teacher evaluations.
  • Requires, for the 2014-2015 school year only, a school district or school to use a different measure of student progress for purposes of teacher evaluations, if the district or school has entered into a memorandum of understanding with the teachers' labor union stipulating that the value-added progress dimension rating for the 2014-2015 school year will not be used when making decisions regarding dismissal, retention, tenure, or compensation.
  • Requires the State Board to submit recommendations to the Governor, to the chairperson and ranking members of the House and Senate Education committees, and to the State Board itself on how to revise by July 1, 2016, the framework for the evaluation of teachers to reduce the estimated time necessary to complete teacher evaluations.

State report cards

  • Specifies a schedule of deadlines by which the State Board of Education must adopt rules establishing the proficiency percentages required to be considered meeting performance indicators.
  • Removes the prohibition on the Superintendent of Public Instruction from establishing a performance indicator for passage of the third- or fourth-grade English language arts assessments that is based solely on the fall administration of those assessments.
  • Delays until July 1, 2017, the date by which the State Board must adopt the high school student academic progress measure.
  • Makes optional the inclusion of the high school student academic progress measure as an ungraded measure.
  • Delays until the 2017-2018 school year the assignment of a separate letter grade for the high school student academic progress measure and the inclusion of that grade in a district's or building's overall letter grade.
  • Requires that a district's or school's overall letter grades, component grades, and each performance measure grade be expressed as a percentage of total possible points, in addition to the required letter grades on the state report card.

The Full LSC analysis can be found below

HOUSE BILL 74 SUMMARY (As Reported by H. Education)

Senate Advisory Committee On Testing Recommendations

The Senate Advisory Committee on Testing, appointed March 4th by Senate President Keith Faber (R-Celina), has completed its recommendations to improve state testing for next school year. The 30-member committee, chaired by State Senator Peggy Lehner (R-Kettering), was created following the rocky rollout of the new state assessments in February. Teachers, parents, school leaders and policymakers serving on the committee were charged with providing advice to the Senate on how to improve state testing.

A written recommendation is currently being prepared. It will include the following components:

  • The new twice a year administration of tests that occurred this winter and spring should be scaled back to once a year and the tests should be shortened. The testing window should be closer to the end of the school year to provide more time for classroom instruction and less disruption in learning.
  • Accommodations for children with Individual Education Plans (IEPs) must be improved and more clearly communicated to parents and schools. Training must be provided for intervention specialists and paraprofessionals who assist students with IEPs.
  • Test results must be returned in a timely manner to benefit student instruction – although the group recognized that results from a writing test may not be able to be returned as quickly as the rest of the results.
  • Transparency – test questions and answers must be made available within a reasonable timeframe after the administration of the tests to ensure the tests are aligned to Ohio’s learning standards and that questions are developmentally appropriate for grade level.
  • Online testing is necessary and schools must plan to move in that direction; however, local schools must continue to have the option to administer paper/pencil tests for at least the next two school years. State funding for technology based on need should be considered.
  • A single technology platform is preferable for next year’s tests. Improvements in technology are needed to ensure smooth administration of the tests.
  • A “safe harbor” must be in place that allows results from this year’s tests to be reported but students, teachers or schools should not be penalized for results this year due to the transition to a new test and the concern that results may not accurately reflect a student’s achievement level.
  • A comprehensive communications plan must be developed to provide parents, teachers, school leaders and the general public with clearer information about the tests.
  • If the current vendors for the state tests, PARCC (Partnership for Assessment of Readiness for College & Careers) for the math and English language arts assessments and AIR (American Institutes for Research) for the science and social studies assessments, will not make changes to the tests for next year to accommodate these issues, the Ohio Department of Education must find a test vendor that will.
(c) Join the Future