There is a Michigan State University study by Schmidt and Houang, published in 2012, that is often cited as evidence that the Common Core standards are effective at improving student performance in math. The study looked at states’ 2009 scores on the National Assessment of Educational Progress (NAEP), comparing states whose standards were close to the Common Core with states whose standards were very different. The study’s first conclusion was that there was a correlation between alignment to the CCSS and improved performance on the NAEP. An update of that study throws that correlation out the window.

If you have done your homework on Common Core, you know that the basis for the standards was the American Diploma Project (ADP), which produced a final report in 2006 on college- and career-ready standards based on its economic analysis of growth business sectors in five states. Thirty-five states signed on as ADP Partners, and those states began work early to migrate their standards closer to the ADP’s. Because the ADP defined only end-of-school competencies, there was virtually no alignment to CCSS for the early grades, which the ADP competencies did not cover. Even so, by 2009, the year of the NAEP scores the MSU study examined, several states could be said to have good alignment to Common Core even though the CCSS would not be officially completed for another year. The study initially showed a correlation between those states and higher NAEP scores, which was used as evidence that CCSS math “deserved to be seriously implemented.” [William H. Schmidt and Richard T. Houang, “Curricular Coherence and the Common Core State Standards for Mathematics,” Educational Researcher 41, no. 8 (2012): p. 307.] If you would like to read an explanation of all the fascinating statistical analysis done by Schmidt and Houang, go to Tom Loveless’s coverage of the Brown Center Report here.

Every good statistician knows, however, that a single point does not a trend make. We now have NAEP data from 2009 to 2013 to consider for CCSS effectiveness. The Brookings Institution, using Schmidt and Houang’s methodology, re-evaluated the regression lines with the additional scores, and guess what they found? States with the highest alignment to Common Core did have the largest increase in NAEP scores. But if you remove Alaska from the group of five non-adopters, whose large drop in scale-score points drags down the average gain of that statistically small group, the non-adopters’ scores rose as much as those of the medium adopters. In addition, when Schmidt and Houang’s regression analysis is used to project score increases into the future, it predicts slower improvement in NAEP scores going forward than the historical average.

In the first 23 years of NAEP testing, there was a 22-point gain in scale scores. According to the new analysis, in the next 24 years there would be only a 7.62-point increase in NAEP scale scores. And while states aligned to CCSS did have a higher average gain than the states with no alignment, the difference, 0.035 standard deviations, was not statistically significant. Brookings points out that “a SD of .20 is not even noticeable let alone significant.” For all the cost and effort states are putting into implementing CCSS, they are receiving a less-than-noticeable impact on their NAEP scores.
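
To put those two rates side by side, here is a rough back-of-the-envelope comparison using only the figures cited above:

$$\frac{22\ \text{points}}{23\ \text{years}} \approx 0.96\ \text{points per year} \qquad \text{versus} \qquad \frac{7.62\ \text{points}}{24\ \text{years}} \approx 0.32\ \text{points per year}$$

In other words, the projected pace of improvement is roughly one third of the historical pace.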

An earlier Brown Center Report identified another significant statistical pattern in NAEP scores: “[M]ost variation on NAEP lies within states—between students, not between states. The standard deviation of state NAEP scores on the 2009 math test is 7.6 points. The standard deviation of the 2009 NAEP eighth grade math score, a statistic based on variation in student performance, is 36 points—four to five times larger.” [The 2012 Brown Center Report on American Education (Washington: The Brookings Institution, 2012)] This is yet more proof that all children learn differently and advance at different rates, and that there are extreme limitations on the ability of a single test, a single snapshot of a child’s educational progress, to demonstrate anything about the educational system as a whole. We would see a dramatic rise in NAEP scores simply by changing to a competency-based educational model that tested children on the NAEP based on their competency rating, not their age.
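
As a back-of-the-envelope illustration of how lopsided that split is (treating the 36-point figure as the total student-level variation), the share of variance that lies between states is roughly

$$\frac{7.6^2}{36^2} = \frac{57.76}{1296} \approx 0.045,$$

or about 4 to 5 percent; the remaining 95 percent or so of the variation is within states, between students.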

The Brown Center concluded:

Based on empirical analysis of the effects of state standards, the CCSS will have little to no impact on student achievement.  Supporters of the Common Core argue that strong, effective implementation of the standards will sweep away such skepticism by producing lasting, significant gains in student learning.  So far, at least—and it is admittedly the early innings of a long ballgame—there are no signs of such an impressive accomplishment.

I am reminded of the education reformers’ mantra that we should use data-based decision making in education. They also value critical thinking. It would appear that, according to the data, CCSS is not going to be the silver bullet for education, and critical decision making would not indicate that it is time for everyone to go All In.


Anne Gassel

Anne has been writing on MEW since 2012 and has been a citizen lobbyist on Common Core since 2013. Someday she would like to see a national Hippocratic oath for educators: “I will remember that there is an art to teaching as well as science, and that warmth, sympathy and understanding are sometimes more important than policy or what the data say. My first priority is to do no harm to the children entrusted to my temporary care.”
