State Board of Education Hears Report On MAP Scores – Here’s The Rest of the Story
Yesterday the State Board of Education heard a report from DESE Assessment Director Michael Muenks about how our students did on last spring’s MAP (SBAC) test. The message was, “We did better than expected.” Well that ain’t sayin’ much.
This chart shows how our students did on the math portion of the test compared to the results expected by SBAC.
- The green bars represent SBAC field test results from spring 2014. This is when many districts had severe technical problems even administering the test (crashes, inability to log on, on-screen tools not working, etc.). The questions had no validity or reliability information, so we aren’t even sure what they were measuring. The cut scores were somewhat arbitrarily chosen, with SBAC’s stated goal to have only 30% of test takers ultimately ranked Proficient.
- The brown bars represent how Missouri students were rated based on the clunky, unreliable field-test data from the year before. The improvements could be due simply to better infrastructure management. Or they could be due to new cut scores. We don’t have that information from DESE.
- The chart only shows percentage of students rated proficient or advanced. We have no definition for what that means. Did they get 80+% of the questions right? Did they get 24+% of the questions right? We would look pretty foolish for patting ourselves on the back if the cut point for proficient was less than 50% of the questions right. The students could guess and be proficient.
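To put the guessing concern in numbers: on four-option multiple-choice items, a student answering completely at random will average about 25% correct, so a cut point anywhere near that range measures nothing. A minimal sketch of that arithmetic (the 40-item, four-option test here is a hypothetical, not the actual MAP design):

```python
import random

def guess_score(num_items=40, num_choices=4, trials=10_000):
    """Simulate a student guessing randomly on every item and
    return the average fraction of items answered correctly."""
    correct = 0
    for _ in range(trials):
        # Treat choice 0 as the correct answer on every item;
        # a random guess hits it with probability 1/num_choices.
        correct += sum(1 for _ in range(num_items)
                       if random.randrange(num_choices) == 0)
    return correct / (trials * num_items)

print(f"Average score from pure guessing: {guess_score():.0%}")
```

If "proficient" were set below that floor, the label would carry no information at all, which is exactly the point of asking DESE where the cut sits.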
- The charts compare Missouri performance with national results from last year. We do not have results from the other SBAC member states yet and won’t until November of this year. That’s a six-month delay for a test that was supposedly administered online so results could be turned around to teachers in a matter of days. That’s six months before we have a true picture of how Missouri did compared to other states on the same test.
- The chart shows a decline in proficiency as we approach 8th grade.
Muenks attempted to explain that decline by saying we had new standards, new curriculum, and new tests. Those students had only four years of exposure to the CC-aligned curriculum. Then he had to show this chart.
Same students, same length of exposure to the same conditions, but much better results. SBE member Mike Jones didn’t buy the spin. That’s when Muenks and Sharon Helwig attempted to explain the results by saying that students take either the 8th grade math MAP test or the End of Course Algebra 1 exam. Essentially, they said we have our best math students take a different test. If those students had to take the MAP test, our scores would have been higher.
But as Commissioner Vandeven told St. Louis Public Radio, “If all children means all children, we’re going to have to figure it out.” DESE cannot excuse poor math results from students on a slower math track who have not even had to master the advanced concepts of Algebra I. In theory, they should do even better on the 8th grade MAP math because expectations are already lower.
What DESE also had to admit yesterday was that one of the great promises of CCSSI, that these standards were going to help us reduce the performance gap, has not been borne out in test results. We still have the same gap in Super Sub-Group performance that we had before, in all content areas.
(The Super Sub-Group includes Black, Hispanic, English Language Learner, severely disabled, and Free and Reduced Lunch students.) All slides here.
There was discussion about how the testing situation will change over the next four years. DESE reported that we will be giving the same on-line test next spring. It will have the same look and feel as the one we just gave. Cut point validation for the test will not be done until next June. The law requires that we have a test aligned with our standards, and to the extent that our standards change this October, our test questions could change, but the format will not. We are scheduled to keep the same assessment plan in 2017 but potentially have a new test in 2017-18.

Sharon Helwig reported that she has met with several groups around the state to get their input on testing. She reported that they “like the on-line format and do not want to go back to the pencil and paper format.” She also told the board that the drag-and-drop format younger students use in the on-line test is “hard for adults but easy for kids.” (Sure would like to hear from teachers in the comments whether they agree with these two statements.)
The concern expressed by President Shields, and echoed by Commissioner Vandeven in STL Today, was that starting in 2016 we in Missouri will only be able to compare our students’ performance with their own in prior years. By changing our exam we “lose the ability to compare our results with other states.”
This is a kerfuffle based on fear mongering that DESE is only too happy to stir up.
First of all, as the board noted, we have both the NAEP and the ACT if we are still so bent on being able to compare our students to those in other states. There is no statistical need to test every child in order to do a basic comparison. That is Stats 101. The NAEP is a random-sampling test, which is statistically valid for comparison purposes. The ACT, which all our 11th graders will be forced to take, allows the state to see how it is doing with the final product of our public education production line. This is no different from how they do it in Finland, which everyone likes to point to as the utopia of education.
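The “Stats 101” claim can be made concrete: the margin of error of a proportion estimated from a simple random sample depends on the sample size, not on the population size. A minimal sketch (the sample sizes below are illustrative; NAEP’s actual design is clustered and adjusts accordingly):

```python
import math

def margin_of_error(sample_size, proportion=0.5, z=1.96):
    """95% margin of error for a proportion estimated from a
    simple random sample (worst case at proportion = 0.5)."""
    return z * math.sqrt(proportion * (1 - proportion) / sample_size)

# A random sample of 2,500 students pins the statewide proficiency
# rate down to roughly +/-2 percentage points -- no matter how many
# students are enrolled in the state.
for n in (1000, 2500, 10000):
    print(f"n = {n:>6}: +/- {margin_of_error(n):.1%}")
```

That is why a sampled test like the NAEP can support state-to-state comparison without testing every child.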
Secondly, how many times do we hear that the important thing is to make sure our children are moving ahead? Do we not get a measure of that when we test a child from year to year?
Thirdly, not every state was using SBAC. Some used PARCC, so we couldn’t really compare to all the other states now with this test anyway. Overall, the number of states staying in the consortium testing programs is dropping and will likely continue to drop, especially if Congress is true to its word in not mandating or incentivizing states to use any particular test. This interstate comparability is something testing companies like to keep promoting to states. It’s a tool for their marketing departments. Peter Herschend knows this. He said the tests are an “important tool” to help the state sell itself. Students become a tool for the state to help it attract businesses. Not sure how that squares with “wanting what’s best for children,” which he likes to sprinkle out there periodically to sound good.
I will add this for the board members who are afraid that Missouri might stand out as different from other states when it comes to changing tests. Consider this, from the AlterNet article a few days ago: “Switching to Questar means that New York students will be taking a different test for the third time in five years. Before Pearson began its testing in 2012, the state used McGraw-Hill.” New York is still looking to get it right rather than staying with a poor product just because other states are.
There were signs in the meeting that the board members are worried about the wrong things while claiming to be interested in improving education in our state. They worried that we had low numbers of teachers passing the new teacher certification exams in STEM content areas and thought that maybe the test was too hard. How do you worry about having high quality teachers and a goal for more students going into STEM, but consider making the certification exam a little easier so you can have greater numbers of less knowledgeable teachers to teach our students? How do you worry about educating our students to be college ready, but also consider sticking with a poor test because at least it lets you compare your students to those in other states who also took the poor test?
When you have a goal of making the state Top 10 by 2020 and you don’t know what that means, I guess these other concerns make sense.