Test scores for Missouri school districts are available from the state Department of Elementary and Secondary Education, and if you saw the story in Sunday's Globe, you'll have noticed that results for local districts were mixed.
Some district officials said they were pleased with the growth of individual students over last year, while others said their test scores, lagging behind the state average, need improvement.
But there's another problem with the data. The 2019 English and math scores can be compared only with the 2018 scores, because a new test was first administered in 2018; any comparison with 2017 or earlier would be apples to oranges.
A new science assessment was administered in 2018-19, so science scores, which will be released in November, can't be compared with previous years at all. The same goes for social studies, for which only a field test was given in 2018-19, with a new assessment coming this year.
It's good that parents and other members of the public have access to their district's 2019 scores. That information tells them how their schools performed last spring.
But is the information really that meaningful if you can't compare it with prior years to determine how your school district is performing over time? Shouldn't the public be able to assess whether their district is showing growth or getting worse over the years?
We think so, for accountability purposes.
That kind of long-term data also would be useful to the school districts themselves. Teachers, principals and administrators would benefit from being able to track test scores over time so they can spot trends and tailor their instruction for better student performance.
"We like to look for trend data," Joplin Superintendent Melinda Moss told the Globe last week. "If you're truly going to compare, you need some data to be consistent over time."
That consistency is key. Think of a test like the ACT. It rarely changes in format or content, so student scores can easily be compared apples to apples over time for a school district.
But the state keeps tweaking its assessments.
We recognize that happens because state officials are continually trying to devise an assessment that best measures what students have learned in an academic year.
However, there is value in keeping those assessments unchanged for a longer period — say, three years — so we can understand how our schools perform over time. We implore the state to consider this approach going forward.