Deciphering MCAS meaning
Some 15 years after students in this state took the first round of MCAS tests in 1998, the release of the most recent scores has once again prompted school officials, teachers, policy analysts and parents to find meaning in an avalanche of numbers, charts and trend lines.
That process will be ongoing, but preliminary reports contain some positive findings for our area.
Notable among them, Smith Vocational and Agricultural High School in Northampton made significant progress in narrowing achievement gaps between high-needs students and the general student population, while Sunderland Elementary School was among 64 in the state commended for showing high achievement and progress.
Before the dust settles on those results and others, we’d like to highlight what are, in our view, key points regarding the latest MCAS results.
Stephen Sireci, director of the Center for Educational Assessment at the University of Massachusetts Amherst, homed in on a key point, namely the benefits of taking a close look at schools that show improvement after faring badly.
“What are the schools coming out of Level 4 doing right? What are they doing differently? There’s a lot of data there and we should think about how it can be used,” Sireci told a Gazette reporter.
It’s worth noting that Sireci has expressed reservations before about aspects of the MCAS tests — as have scores of education officials, teachers and parents who over the years have raised important questions about the pros and cons, and the usefulness and reliability, of standardized testing.
That brings us to another key point: the reality that testing is by now part of the landscape of public school education — and that, even with its flaws, MCAS provides some information that can and should be studied and shared.
Many of the school officials interviewed in recent days by the Gazette were clearly doing just that. They are intent on scrutinizing the results in their own schools and others to glean new insights. Make no mistake, that’s a time-consuming and potentially frustrating, confusing process.
Local educators say they will analyze data to pinpoint weak areas and then make changes.
Making those changes will take money and resources — a crucial point made by educators last week that also bears repeating.
One local education official noted that sometimes improvements seem to come in one grouping of students at the expense of others, reflecting, perhaps, a redirection of grants and resources to problem areas.
In an era of chronically tight budgets, this very legitimate concern is unlikely to abate any time soon.
As the state continues down its chosen path of using standardized tests to target trouble spots, additional money and resources will be needed to address problems the tests reveal.
Without that, local school districts will be playing a game of Whac-A-Mole, fixing one problem, only to see others pop up.