Our View: So long, CSAPs


Editorial Board, May 11 through Sept. 21, 2011

  • Scott Stanford, general manager
  • Brent Boyer, editor
  • Tom Ross, reporter
  • Laura Schmidt, community representative
  • Jim Miller, community representative

Contact the editorial board at 970-871-4221 or editor@SteamboatToday.com. Would you like to be a member of the board? Fill out a letter of interest now.

Last week the state released what should be the final Colorado Student Assessment Program test results. Count us among those who look forward to the promise of an improved standardized test that could help guide meaningful education reform.

But please forgive us for being skeptical.

State education officials are saying goodbye to the CSAP in favor of a new test — still being developed — that should be ready by 2014. The test, still unnamed, will incorporate new standards such as personal financial literacy, information management and social studies, among others.

Standardized tests for public school students can play an important role in education. They provide a measure of achievement for individual students as well as groups of students across grade levels, schools and districts. Ideally, they also provide a clear picture of where schools need to focus efforts in the classroom.

Unfortunately, standardized tests like the CSAP also can demonstrate much of what is wrong with the public education system in our country. Significant time and resources are spent every year on preparing students for the CSAP, and the test’s history showed very little overall improvement in student scores throughout time. If student achievement is the same this year as it was when the CSAP first began years ago, what has been gained?

Here in Routt County, the test results by district became nearly as predictable as the seasons. Steamboat Springs schools regularly scored above the state average in all content areas. In South Routt and Hayden, the results were consistently mixed. Some grade levels in some subject areas tested below the state average; some tested above. Every year was a roller-coaster ride, with little evidence of permanent improvement throughout time in any particular subject area or grade level.

Starting next year, Colorado public school students will take the Transitional Colorado Assessment Program, or TCAP. It will largely resemble the CSAP, with the exception of not including questions from old state standards that also won’t be part of the new state standards. The TCAP will be administered in 2012 and 2013 before being phased out in favor of the new standardized test.

For that new test to be more meaningful than the CSAP, it must be properly aligned with good standards. And there must be accountability for students, teachers and administrators. Until that happens, we'll suspect the new test is just the CSAP with a different name.

Comments

Scott Wedel 3 years, 1 month ago

"If student achievement is the same this year as it was when the CSAP first began years ago, what has been gained?"

Yes, blame the test for the results. The only thing wrong with the CSAPs was that they measured the same things year after year, and measured serious academic skills such as reading, writing and mathematics, so there was no evading how the schools performed on the test. The recent local CSAP results were quite revealing because there was such a difference in the HS math scores. SB scored 50% above the state average, as it has for a while, so it is pretty clear SB is doing something good with its math track. Soroco was about the state average. Meanwhile, Hayden was about 50% below the state average, as it has been for a while, and it clearly has issues.

Which is why the educational system likes to switch tests every few years, so all those upset by the test scores can evade accountability. An even better technique is giving a new test and using those scores as the official average. Then schools can teach to that test, and in a few years suddenly 90% of schools are scoring above average, which makes all the local school districts proud.

The CSAP was a pretty darn rigorous test. It is pretty remarkable that they were willing to set high standards, so state proficiency averages for many of the grade levels were less than 70%.

That various schools were unable to show improvements on the CSAP was not the fault of the test, but of the schools. And while all local schools did quite well at the 3rd-grade level year after year, SB can actually point to improvements in its high school scores over the years.

Actually, I think it is entirely possible that the editorial board fundamentally fails to understand statistics and thus believes that a test is faulty when it fails to show the desired results. That is the most plausible explanation for statistically ignorant statements such as "Some grade levels in some subject areas tested below the state average; some tested above. Every year was a roller-coaster ride, with little evidence of permanent improvement throughout time in any particular subject area or grade level." Statistically, that is exactly what you would expect to find, and, if anything, it generally proves the statistical validity of the test.
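The point about expected fluctuation is easy to illustrate with a quick simulation: with a grade-level cohort of only around 30 students and a fixed underlying proficiency rate, the measured percent proficient will bounce above and below the average from year to year even though nothing real has changed. A minimal sketch (the cohort size and proficiency rate are assumed for illustration, not actual district data):

```python
import random

random.seed(0)

TRUE_PROFICIENCY = 0.68   # assumed fixed underlying proficiency rate
COHORT_SIZE = 30          # roughly one grade level in a small district
YEARS = 10

# Each year's "percent proficient" is just a sample from a small cohort.
yearly = []
for _ in range(YEARS):
    passed = sum(random.random() < TRUE_PROFICIENCY for _ in range(COHORT_SIZE))
    yearly.append(passed / COHORT_SIZE)

print([round(p, 2) for p in yearly])
# With only 30 students, double-digit swings in the percentage are routine
# even though the underlying rate never changes.
```

A "roller-coaster" pattern around the average is exactly what small-sample statistics predicts, with no change in teaching quality at all.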


RPG 3 years, 1 month ago

Test results mostly reflect the motivation of the students and parents and how much education is valued and respected at home.


sodacreekpizza 3 years, 1 month ago

I agree, Scott. The editorial expresses skepticism about both the current test and the upcoming change, but then seems to support the reasons given for the change with weak arguments. This editorial is wide of the mark, or perhaps just written in a way that makes the State's reasons for changing tests read like the views of the editorial board.

"Ideally, they (test results) also provide a clear picture of where schools need to focus efforts in the classroom." The editorial starts out well with this statement but then wanders out into left field: " the test’s history showed very little overall improvement in student scores throughout time. If student achievement is the same this year as it was when the CSAP first began years ago, what has been gained?"

Isn't it possible that the test actually did exactly what it was supposed to do and the truth is that student achievement in our local districts has not changed?

This section is followed by a statement that includes the following: "Every year was a roller-coaster ride, with little evidence of permanent improvement throughout time in any particular subject area or grade level."

Does the paper have evidence from another source that "permanent improvement" was in fact taking place and the CSAP missed it?

Student performance is a combination of family and community expectations, programs, resources, curriculum, teacher qualifications, class size, facilities and other factors. If you want to change student performance, change some of those things!

No test is perfect, and the new one will not be either. But as Scott points out, we WILL get a new baseline. As long as districts are rewarded and "punished" for test results, the districts will adapt to teaching for results on that new test.

By all means, change tests. Change is good and, no doubt, the testing can be improved. But when, after a few cycles for the districts to adapt to the new test, the results again settle into a predictable pattern, let's talk about changing the quality of the education rather than the yardstick that measures it.


Scott Wedel 3 years, 1 month ago

The lack of rapid improvement in CSAP scores is not a fault of the test, but a demonstration of the difficulty of substantially improving a child's education.

The CSAP scores are typically presented in a flawed form: comparing a grade's current scores on this year's test with those students' scores on the different test for their lower grade the previous year. It is a flawed approach because it compares scores on two different tests. I do not have the data in front of me, but I am pretty sure that if someone were to graph proficiency for 12th grade over 10 years or so, you would notice general trends. In particular, improvements for SB schools.


Krista Monger 3 years, 1 month ago

I would like to point out a flaw that no one has brought up: there is no accountability on the part of the student. If a high-schooler chooses to turn in the test in 10 minutes with the written responses blank and the multiple-choice answers based on where the pencil landed, there is no repercussion. The scores aren't received until AFTER school is out for the summer. I have seen it happen, and I have known schools to threaten remediation courses based on the results. These kids are known as "won't do's," as compared to the "can't do's," who generally need remediation. With smaller budgets, these classes would require additional resources and teachers, and it's hard to justify spending extra on the "won't do's." In a school such as Hayden or Soroco that has 30 students in one grade level, one student can sabotage the entire school's data. I am not sure whether this happened, but it is something to keep in mind.

Another point to consider is students who receive some type of special-ed help. The state only allows 2% of the school population to take the modified test; anyone beyond that must take the regular test, or take the modified test and have it not be counted. Say Hayden's or Soroco's 9th and 10th grade classes total 60 kids: only 1 student can take the modified version of the test (2% of 60 is 1.2, which rounds down to 1). If there are actually 3 students who need the modified version, the 2 others will take the regular test, and their scores will be compared with those of their non-special-ed peers. Again, the school's overall CSAP scores will be affected.

Non-English-speaking students are required to take the test, too. The first two years, the scores are not counted (although they must take it). The third year, the scores are counted, and the school takes the hit. I'd like to talk to the legislator who decided an 8th-grade student from Mexico would be fluent in English by his or her 10th-grade year and pass the CSAP tests.

If a 10th-grade class of 30 had one "extra" special-ed student, one non-English speaker in her third year, and one kid who didn't give a flip, the school's scores would not be great, and those scores DO NOT give a good picture of the education that is happening.
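The small-cohort arithmetic here is easy to check. A minimal sketch (all counts are hypothetical, not actual district data):

```python
# Hypothetical 10th-grade cohort of 30 students (illustrative numbers only).
cohort = 30

# Suppose 21 of 30 students genuinely score proficient.
baseline = 21 / cohort
print(f"{baseline:.0%}")  # 70%

# Now three students are effectively lost to the factors above: one
# "won't do," one special-ed student over the 2% modified-test cap, and
# one third-year English learner whose score now counts. If all three
# miss proficiency, the same school reports 18/30.
affected = 18 / cohort
print(f"{affected:.0%}")  # 60%
```

Three students out of 30 move the reported proficiency rate by a full 10 percentage points, with no change at all in the quality of instruction.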


addlip2U 3 years, 1 month ago

"Non-english speaking students are required to take the test too. The first two years, the scores are not counted (although they must take it). The third year, the scores are counted, and the school takes the hit. I'd like to talk to the legislator that decided a 8th grade student from Mexico would be fluent in English by their 10th grade year and pass the CSAP tests."

A kid who comes to the US from another country and attends a US school should be able to become fluent in English in two years. Unless, of course, the schools are teaching them in Spanish or their native language.

But let's face the (not politically correct) truth: non-Mexican kids have "adapted" to speaking, reading and writing English much faster than Mexican kids, because they have to. No one in a US school would speak their language.


Krista Monger 3 years, 1 month ago

addlip2U -- I would like to see you write an essay on the causes of World War II in French with only 2 years of the language under your belt. Just because one can SPEAK a language does not mean one can read and write it, especially at a 10th-grade 'proficient' level.

If you care to research language acquisition, you will find it takes 2-3 years to be able to communicate in a non-native language, but 5-8 years to become fluent in academic language (the stuff that is tested on our CSAPs).

And your racist remarks that non-Mexicans can somehow buck the research and do it quicker are way off the mark. I have 6 years of experience working with Vietnamese and Mexican students, and both acquire the language at the same rate.

