Here's the response by his Democratic Party rival, Ed FitzGerald
This communication is to share the sentiment of the FMS staff as it relates to the Smarter Balanced Test. As you know, our staff used the December Early Release to take this test, with the goal that it would provide us with some insight into how we might incorporate the “common core” and the format of the Smarter Balanced Test into our instructional practices. We believe that we successfully incorporated the NECAP Test format and GLEs into our teaching practices, especially in our “Bell Activities.”
Although a few staff members shared that they believed our test scores would improve over time, I was surprised by the responses from the FMS teachers when we gathered to debrief after taking the test. I was hopeful that teachers would share test vocabulary, ideas for test taking, and strategies to help prepare our students for the 2015 Smarter Balanced Test. Instead, teachers shared the frustrations they had while taking the test, disappointment in the test format, and the difficulties they had trying to use their computers to take it.
The comments shared below come from successful, dedicated veteran teachers. I have much respect for this staff; I appreciated their honesty in responding to the Smarter Balanced Test, and I am hopeful that the Nashua School District will accept these responses in a positive way and not view the comments as “negative” or “unprofessional.” The FMS staff collectively believe that the Smarter Balanced Test is inappropriate for our students at this time and that the results will not measure the academic achievement of our students; rather, it will be a test of computer skills and of students’ ability to endure a cumbersome task.
Listed below are some of the concerns that were shared by our staff:
*I feel sad for the students who have to take this test; not many will be successful.
*Much is said about “depersonalizing” information as part of a learning strategy. This is not how students learn.
*There is too much “stuff” going on the screen at once. It is difficult to move the icons where you want them. Students don’t know how to use the mouse; for them, everything is “touch screen.”
*If you leave the screen for a short period of time, the information on the screen will be gone when you return. I tried the grade six math; it was humbling. It was scary.
*I had technology problems. If kids have these problems they’ll just quit.
*Double-wide monitors would help. I am a huge fan of concept maps, but the notepad on Smarter Balanced does not let you make them. You can’t even copy and paste from the notepad into the test.
*This was more of a test of computer skills than of math concepts. If I were a student, I would just pick out an answer and move on.
*Too tedious; kids will get sick of it and just guess to move on. Kids won’t even get past the computer directions.
These are just a sample of the concerns that were raised at this meeting. We did shift to “What do we have to do from now until the spring of 2015 to prepare students?” Sample answers include:
*Pay attention to the directions. Provide students with many opportunities to read directions for their assignments.
*You can’t just read this test and then respond. Students need to highlight and take notes, especially during the audio questions.
*Students need to learn to “read the question first”.
*Students need to be able to go back into the text passages to pull out data that will support their answers.
*Students need to read through the questions and all possible answers. Sometimes questions give the answers to other questions in the test.
*Kids need to know how to do “note taking”.
*We need to teach students “how to draw an inference”.
*Students need to learn how to write a transition sentence between two paragraphs.
*Students need to learn how to write using the speaker’s voice.
*Students need to memorize formulas for this test.
*Students will have difficulty writing in the expanding boxes because of the way the boxes expand.
*Students will have trouble reading and understanding the directions and what is being asked by each question. Is this test closely aligned to the “common core”? It is important that teachers know what the test will be assessing.
*I am concerned that the math test is not necessarily testing students’ math abilities, since there is so much reading. This test seems to assess how well students read the math questions more than their math skills. Thus, because of the amount of reading, I question the validity of any math ability score we receive.
*When Measured Progress developed the NECAP, there was a committee on bias to check for testing bias. Does Smarter Balanced do the same? Also, math teachers were asked to evaluate the questions to eliminate unnecessary verbiage so that the math itself was being tested.
*The opening pages of directions and computer information were ridiculous. I didn’t read them, and I’m sure my students won’t. Suggestions: We should have posters made of the most important and often-used keys to post in each math classroom. Students need to practice making equations in Word, including the fraction symbol. We need to teach students to distinguish between one correct answer and many correct answers; there are questions that tell the students to choose the correct answers.
*The test is difficult to navigate with so many keystrokes to juggle.
*The page layout is wearying on the eyes, even though you can expand the screen and zoom in and out.
*The passages are lengthy and time consuming, and made me consider just choosing “B” so I could move on. Some terms in the reading seemed outdated; “plumb crazy” and “millwright,” for example.
*I had to use multiple skills and multitask at the same time; the audio portions, for example, required me to listen while reading possible answers and constructing a well-written paragraph in my head.
*The test assumes that students are skilled in areas such as pre-reading the questions, and if they are not, it assumes they will learn while taking the test to read the questions in advance of the reading.
*There wasn’t a flow or cadence to the questions. The type or style of question changed from one to the next. The answers were not straightforward; for example, on the math test they did not want the answer to the equation, they wanted to know if the answer was 2/3 greater than what you started with. I understand this is important, but this test will be exhausting for the kids.
*The idea of a “best answer” when there are two or more good and appropriate answers felt like a trick. We’re going to look bad for a few years.
*I did 30 questions in an hour and then had to take a break. My eyes hurt and my shoulders felt strained. When I returned five minutes later, the work was gone.
*Each question is totally different from the one before it, which creates confusion for the test takers.
*Frustration builds as you take the test, creating mental despair; students will shut down.
*Many of the math questions seemed to have no basis in the real world and test skills that will never be used in life. Students will need to be taught the technology skills for the test: scrolling through screens, highlighting, scanning the questions, touch typing, and more.
*The test does not encourage students to use writing webs, brain maps, or organizers to assist with writing.
Summary: In my opinion, this test is a sad indictment of how disconnected the people who design the test are from the typical students in the classroom. Assessment is necessary, but it should be designed to be developmentally appropriate for the students being tested. Assessment should also allow for different methods to demonstrate competency rather than one computer model. This test is designed for one type of student: the verbal learner with exceptional executive functioning skills.
*I took the Grade 7 Language Arts test, which I believe is developmentally designed for adults, not seventh grade students. The questions were tedious and punitive. I’m not sure that any seventh grader in the state would be able to score well on this test. The worst part of this test was the directions. They were numerous and multifaceted. After observing middle school students take tests for over a decade, it is my firm belief that most kids will stop reading the directions because there are too many and they are far too complex. Students will fail this test, and the test will destroy their confidence at an important stage of their development. In addition, the results of this test will become a public relations nightmare for the schools and the school districts as children fail in large numbers.
How does a policy of school choice compare to other reform initiatives in their perceived efficacy for school improvement?
Figure 8 (below) shows the average perceived efficacy of each type of school reform after controlling for covariates. School choice in the form of vouchers is in the middle of the pack, with smaller class sizes, technology, and accountability perceived as more efficacious, and reducing teachers’ unions’ influence, merit pay, and longer school days perceived as less efficacious.
Even after an attempt to goose the results (p. 5: the “source comes from a poll conducted by the Friedman Foundation that included a nationwide sample and an oversampling of mothers of school-age children (‘school moms’)”), respondents didn't think much of merit pay, busting teachers’ unions, or vouchers. What they cared most about were smaller class sizes and better technology.
If those results weren't bad enough for our corporate reformers, there was more bad news in their poll:
Only 29% of respondents liked the idea of taxpayer-funded vouchers to pay for private schools.
The Columbus Dispatch reported in January that 17 charter schools in the city had failed this year alone.
Nine of the 17 schools that closed in 2013 lasted only a few months this past fall. When they closed, more than 250 students had to find new schools. The state spent more than $1.6 million in taxpayer money to keep the nine schools open only from August through October or November.
The problem is not isolated to Columbus, as the Toledo Blade reports:
Secor Gardens Academy’s short tenure in Toledo came to an abrupt end, as the school closed its doors over the weekend because of financial struggles.
Comically, the leader of this "school" couldn't even provide an enrollment figure. According to the Department of Education FY 14 Detail Funding Report, this charter was receiving $220,952.70 for FY 2014. We wonder if the leader of this school is unable to account for that, too.
The charter school, which opened in the fall, was based in the back of the Armory Church, 3319 Nebraska Ave. School Superintendent Samuel Hancock and others involved with the school realized the school’s finances had become untenable, said James Lahoski, superintendent of North Central Ohio Educational Service Center, the school’s sponsor.
According to the Dispatch, 29% of Ohio's charter schools have closed, and the pace of openings and closings is accelerating:
It took 15 years for Ohio’s list of closed charters to reach 134; then that number grew by almost 13 percent last year from charters closing in Columbus alone.
We expect the acceleration to continue. As the Cincinnati Enquirer notes, forty-five new charter schools opened in Ohio this academic year, but with only 600 new students. That isn't sustainable. Schools cannot properly run with just a handful of students attending them, and they certainly shouldn't be run out of the backs of churches. Students need a stable learning environment, with quality facilities, in order to thrive.
Charter schools in Ohio have been allowed to expand faster than the pool of students wishing to attend them, and oversight of the quality of the schools being opened has been cast aside.
The School Choice movement and the legislature in Ohio need to step up, put an end to this train wreck, and take a long hard look at the charter school authorizing process.