ELLSWORTH — The results of the Maine Educational Assessments (MEA) for the 2018-19 school year are in, and they’re more detailed than ever. But are they useful?
Many administrators and curriculum developers say not so much.
George Stevens Academy Head of School Tim Seeley said in an email he is “not a fan” of the current MEA testing system, and that the state hasn’t released results for GSA or any private school he could find, so he couldn’t comment on the results.
“I think the tests Maine uses for high schools (the SAT for English and math), is a remarkably poor way to assess how a school with a broad range of students is doing,” Seeley said.
The state is required to have some sort of “state accountability system that includes standardized assessment data,” said Kelli Deveaux, director of communications for the Maine Department of Education (DOE) in an email. States use different assessments and scoring rubrics, said Deveaux, but must have their plans approved by the U.S. Department of Education.
Maine officials use a combination of exams to comply with those requirements. For the general MEA assessment, students take the eMPowerME exams in math and literacy (grades 3 through 8), the general MEA science exams (grades 5 and 8 and the third year of high school) and the College Board SAT (third year of high school). There are exemptions and alternate assessments for students learning English as a second language or those with learning differences.
It’s difficult to determine the overall cost of administering the tests, but figures provided by Deveaux indicate that Maine will pay roughly $10 million between Jan. 1, 2018, and Dec. 31 of this year to administer the general and alternate assessments, including testing for students in all grades who are identified as English language learners.
“This is a huge undertaking by the state,” said Jim Boothby, superintendent of Regional School Unit (RSU) 25.
“I think we could do something much better and much more pertinent.”
Seeley said using the SAT exams for high school students is particularly problematic, since the tests are designed to assess whether a student is ready for college (and they may not even be good at doing that).
SAT exams, said Seeley, “are notoriously limited in their ability to assess success in college” and “are not designed to assess how well a school does with its students across the board.” Students who can afford to do so get extra coaching for the tests, said Seeley.
“How does it assess how schools are doing with students who are not college-bound? Isn’t that important to measure?”
In a statement provided by Ellsworth Curriculum Coordinator Rachel Kohrman-Ramos, administrators in Ellsworth said they do find the data helpful, because it “allows us to identify trends or gaps in curriculum, as well as to differentiate individual student performance results by identifying specific areas where students are challenged and need additional instruction and/or support.”
The state changed the format of its results reporting this year, providing more information to the public, including figures on the number of students who are economically disadvantaged, chronically absent or considered to have a disability, as well as the amount spent per pupil in each district.
But Julie Meltzer, director of curriculum, assessment and instruction for the Mount Desert Island Regional School System, cautioned that some of the newly available public information, particularly the per-pupil spending, is “very, very misleading.”
“I think the intent was to draw some line between what you spend and what results you get,” Meltzer said. “I think it initially was to enhance investment in education. But of course, that’s not necessarily how towns and budgets and selectmen and voters and everybody have taken it.”
Because the tests are administered to the same grade level but a different cohort of students each year, said Meltzer, ups and downs in the results have “a lot to do with cohort differences,” rather than real changes in school performance.
“I think for our state assessment we try to answer too many questions with the same assessment,” Meltzer said.
And while the DOE provided more detailed public information, some administrators say they didn’t get the usual breakdowns of individual student results.
“I was disappointed to learn that for that reporting year, there are no individual student actual breakout items to review,” said Lew Collins, superintendent of Moosabec Consolidated School District (CSD) and Union 103, in an email.
Educators can view the student’s overall score, said Collins, “but not any meaningful information about the actual test results.”
Without that information, said Collins, “the MEA results are simply not helpful to us.”
Superintendents of several area schools, including Collins, Meltzer and Boothby, said that local assessment data and tests developed by the Oregon-based nonprofit organization NWEA are more useful in setting curriculum.
“We use the NWEA at the beginning and the end of the year to see if kids are meeting the growth target,” said Meltzer. “No matter where a kid starts, did they make a year’s growth? It gives us a view of our program and allows us to figure out what kids might need more help and what kids might need more challenge.”
Deveaux said in an email that the results of the MEAs are not intended to “be used in isolation to make classroom instructional decisions.” Administrators, said Deveaux, should use “local formative assessment data in conjunction with the state summative assessment data” to “determine trends and possible gaps in curriculum.”
Ronald Jenkins, superintendent of schools in Calais, said the tests “are useful but only one of many things we use to assess indicators of progress.”
Jenkins agreed that the exams should not be used to compare districts, calling them “nearly useless” in that regard.
“I believe it is good to hold schools accountable,” said Seeley, “and to try to find useful measures to do so, but Maine’s current system (the third, Seeley thinks, in the five years he has been at GSA) does not do a very good job, I don’t think.”