A bill to suspend state school assessments this year was postponed indefinitely by the House Committee on Education on March 5. Representatives Emily Sirota and Barbara McLachlan, along with Senators Rachel Zenzinger and Don Coram, were prime sponsors of HB21-1125, which would have suspended assessments statewide, pending federal approval. The rationale for suspending the assessments was to counter the impact of COVID-19 interruptions on student test scores.

The bill aimed to suspend Science exams for students in grades 5, 8, and 11; Math exams for students in grades 3 through 8; English Language Arts exams for students in grades 3 through 8; and Social Studies exams for students in grades 4 and 7. The bill would also have prohibited a school district from using academic growth or student performance measures when evaluating teachers and principals for the 2020-2021 academic year.

Salida Schools Superintendent David Blackburn says he feels that state exams aren’t necessarily the best measure of each district’s needs.

“It is hard to use the state data to drive excellence in our programming because the data comes so late or it’s such large data,” he says. “It doesn’t really tell us how to help a kid. That’s why we do a lot of local testing measures that we have more control over to get data faster. It’s more granular.”

The Salida district uses an NWEA (Northwest Evaluation Association) system called MAP testing. The system provides a national perspective so schools can measure where each student is compared to national norms.

“We’ve done that [testing] this year, even through COVID,” Blackburn explains. “And we’ve actually been very happy because what we’ve discovered is we have been able to reduce most learning gaps locally.”

Blackburn and the district intend to follow what the legislature dictates. However, they hope to reduce the testing time so that students aren’t missing out on instructional minutes.

“We normally spread the tests out so we don’t have test exhaustion and so kids can concentrate better,” he says. “We’re more concerned this year about how many instructional minutes we have with kids. We’re going to reduce, however possible … the time it takes to test the kids so that we have more time to teach because the data will be fairly unreliable for us to make any decisions on.”

One of the biggest benefits of the state’s Colorado Measures of Academic Success (CMAS) testing is that it helps the state legislature and the state’s Board of Education direct money and resources toward bridging statewide gaps.

“We don’t want legislators creating laws, policies, or routing money to where they feel or have a gut instinct there’s a gap,” he says. “We want to make sure they have data to guide those decisions.”

Over the past year, Blackburn says the district has been far more focused on student and classroom-specific needs in order to make up for lost teaching time last spring, so there hasn’t been as much focus on statewide systems. 

“We want to know a name by name, behind the name, what is their need, and which adult is going to help them,” he says. “That is the driving conversation about how we look at things. It’s not about state tests, even though we do very well.”

In addition to MAP testing, the district also looks at students’ enrollment and test results in Advanced Placement (AP) courses at the high school level, PSAT exams, Accuplacer, and concurrent enrollment testing. State testing, while indicative of state norms, doesn’t always show how to help students quickly.

“We might find there’s a concept in math and understanding number sense, and we see that our entire system or program or school is lower than the norm,” he explains. “Then we know to really dig in and ask questions. It informs us about program-wide success as compared to the rest of the state. It doesn’t do a great job of telling us how to help a certain kid. The data tends to come too late.”

Another potential obstacle is the gap between this year’s testing and the 2019 results. Because 2020 testing was canceled, an entire year of data is missing, which may make this year’s data less reliable.

“You get state-level data out of the test, but it is going to be a less reliable answer to that question because of last year and having a gap,” he explains. “I mean, we’ve been in person, but the majority of the student body in Colorado has not been in person, so it’s going to have a really low level of reliability. Yet there is a strong argument that we want to make sure that our state lawmakers have some data to guide their decision-making because we are in an economic downturn. So there are hard economic decisions to be made and if you provide them good data that’s better than nothing.”

Blackburn feels that people often expect state assessments to answer questions they weren’t designed to address.

“When you look at a dataset and try and make it answer a question it wasn’t designed to answer, then you tend to come up with erroneous conclusions,” he says. “A parent needs to be asking the question that our teachers are asking: ‘What does my kid need to learn? Where’s my kid at? What’s the next growth level?’ Those are more important parental questions of excellence than how we scored on the state tests.” 

“The state testing measures are not the best system to answer the question of whether or not your local school system is successful,” he concludes. “Great, we’re highly successful on those things…. Our teachers are more interested in whether Johnny or Susie was successful, not whether we scored well on [classroom] tests. That’s where we spend our energy.”