How Common Core-Aligned Tests Are Turning College Students Into High School Students
By Sandra Stotsky | March 22, 2019, 8:13 EDT
State-mandated tests must by state law be based on a state’s official standards. That is why the tests currently given in the Bay State (aka MCAS 2.0) are aligned to Common Core. MCAS 2.0 tests are based on or aligned to the Common Core standards for English language arts and mathematics that were adopted by the state board of education in 2010 and slightly revised by the Massachusetts Department of Elementary and Secondary Education in 2016 for the four-year state education plan required by the Every Student Succeeds Act.
Despite the similarity in name, the new tests are totally unlike the original Massachusetts Comprehensive Assessment System tests. For example, MCAS 2.0 tests have no Open Response test items, which were useful for assessing content-based writing. On the original MCAS tests, there were four Open Response questions on every test given at every grade level. And, better yet, they were scored by live teachers, not computers.

The state’s new four-year plan was submitted to the U.S. Department of Education in 2017 for review and approval, in exchange for Title I money to help low-income school districts. Approval by the state legislature and local school committees was neither required nor obtained, and the state’s voters never debated or voted on the plan.
Because nearly all states today use Common Core-aligned tests, almost all schools (including public charter schools) teach to Common Core’s standards. It is not possible to understand the growing opposition to Common Core’s standards without understanding several key issues now being raised about the tests aligned to them.
A. Criteria Used for Selection of Passages for Reading Tests
It would have been reasonable for the two original testing consortia subsidized by the U.S. Department of Education, the Partnership for Assessment of Readiness for College and Careers (PARCC) and the Smarter Balanced Assessment Consortium (SBAC), to use the criteria that developers of the National Assessment of Educational Progress tests are supposed to use for reading tests. Why? Primarily because the chart showing the “percentage distributions” of basic types of reading passages that NAEP uses for its own reading tests is already in Common Core’s English language arts document. Common Core recommends the chart as a guideline for the school reading curriculum, even though these percentages were never intended by NAEP to guide the K-12 curriculum. NAEP documents tell us only that these percentages are for the various kinds of reading passages to be used on NAEP tests.
In fact, members of the Steering Committee for the National Assessment of Educational Progress were told that NAEP test developers deliberately do not assess dramatic literature (plays) on the grounds that test passages would have to be very long and would exceed word limits. NAEP’s decision to exclude assessment of dramatic literature makes it clear that the percentages of literary and informational passages recommended at various educational levels for a Common Core-based K-12 curriculum were not intended by NAEP to shape a K-12 reading and literature curriculum. (Dramatic literature — think Shakespeare — was considered by many to be the central genre to be studied in high school English. Indeed, selections from plays appeared in test items on all the original MCAS English language arts tests.)
Nor is there any research suggesting that a heavy dose of informational reading in secondary English classes develops reading skills as well as, or better than, study of the literary essays, biographies, and well-known speeches English teachers have always taught. David Coleman, lead writer for Common Core’s English language arts standards and now chief executive officer of the College Board, probably didn’t understand this, or know that in 2004 members of the National Assessment Governing Board had helped to develop criteria for the kinds of reading passages to be chosen by NAEP test developers.
Passage Source: Among other criteria, the NAEP document on item specifications for the 2009 NAEP reading assessments says that reading passages are to “reflect our literary heritage by including significant works from varied historical periods.” The U.S. Department of Education could easily have insisted on this criterion for the Common Core-aligned reading tests it funded, since several of Common Core’s high school standards require the study of this country’s seminal political documents, as well as significant texts or authors in American literary history. But so far, no sample test items for college and career readiness tests in reading can be found addressing this country’s seminal political documents. Released PARCC test items for reading and SBAC sample test items are available on the consortia’s websites. Apparently, few test developers and educators care what is assessed by Common Core-aligned reading tests.
Overuse of Informational Snippets: Many sample passages in grade 10 or 11 test items aligned to Common Core’s reading standards cannot assess college readiness because they are snippets from what could be a long curriculum unit in science or history with a heavy vocabulary load. Surely, if college readiness is to mean anything at all, it should mean the ability to follow the gist of long stretches of prose or poetry. It’s hard to see how college readiness can be determined by test items consisting chiefly of short informational articles drenched in subject-related vocabulary. It is certainly not clear whether any Common Core-aligned informational test items will be long enough to judge readiness for, say, reading a chapter in a frequently assigned college science textbook.
Secrecy Problems: As of March 2019, we still do not know who has vetted the test items in reading or mathematics, or how demanding the items are for high school, college, and career-readiness tests (or for the revised SAT or ACT tests now judged by the U.S. Department of Education as legally usable as high school exit tests). We do not know whether college teaching faculty in mathematics, science, engineering, and the humanities have been involved in determining cut-off or pass scores for college readiness. Nor do we know what the scores on these tests mean to academic experts in the subjects tested.
B. Low Expectations for College- and Career-Readiness
We must above all consider what Common Core means by “college readiness.” Common Core itself claims that by addressing its standards, students will graduate from high school able to succeed in entry-level, credit-bearing academic college courses and in workforce training programs. College readiness thus means that students will not have to take a remedial course in mathematics or English if they seek to attend a non-selective college or a community college.
In Mathematics: Yet, with respect to the coursework implied by the math standards themselves, college readiness reflects a relatively weak algebra II course, as mathematician James Milgram pointed out. Both logarithms and the standard algebraic analysis of conic sections are missing, according to his examination of the math standards. With only a few advanced (+) standards in trigonometry filling the void between the algebra II standards and introductory college mathematics, Common Core’s standards apparently cannot help to prepare students for science, technology, engineering, or mathematics careers, which require extensive high school coursework in trigonometry and/or precalculus.
In English: We know much less about what college readiness in English language arts means. Common Core’s English language arts standards suggest few specific texts to read, and the range of titles in Appendix B of its English language arts document, which illustrates the quality and “complexity” of what students should read from grade to grade, is so broad by the high school years that no particular level of reading difficulty above grade 5 or 6 can be discerned. Research studies suggest that the reading level of the average American high school graduate is now about grade 6. Moreover, we don’t yet know where the pass score has been set in English language arts or reading, or, if it has been set, who set it and what it means to English professors or anyone else.
C. What College Readiness Test Scores Tell Us
What, then, can college readiness test scores in mathematics and reading tell us? Since tests based on Common Core’s standards cannot address the mathematics requirements of selective public or private colleges and universities (because major topics in trigonometry and precalculus are not in Common Core’s standards, and state-mandated tests by law cannot address topics that are not in the state’s official standards), scores on Common Core-aligned tests can tell us only how many students may be ready for a non-selective or community college. It is unclear whether most colleges rely simply on a score on a presumably college-related test such as a literature or language Advanced Placement test. And now that AP tests are aligned to Common Core’s standards, it is not clear what AP test scores themselves mean.
What will we as a society have gained by the use of Common Core’s “college readiness” tests? We will likely gain a much larger number of college graduates, assuming that more students will complete a college degree program because they haven’t had to take remedial coursework in their freshman year. But they are unlikely to know any more than they would have known if they had had to take remedial coursework, because their for-credit college coursework will likely be adjusted downward to accommodate their lower level of high school achievement.
Recall that the level of college readiness in Common Core mathematics is, to begin with, lower than what is currently required for admission to most two- and four-year colleges in this country. What this means in effect is that our colleges will become expensive high schools.
College readiness tests based on Common Core’s standards will play two significant roles. First, they will guarantee the presence of credit-bearing courses with low academic expectations in mathematics, reading (English), and possibly other freshman subjects. Second, these tests will change more than the college courses students are enrolled in. How, we do not yet know. But it seems logical to expect large numbers of relatively low-performing high school students who have been declared college-ready based on a test with low expectations to have an impact on the other students in the college courses they are entitled to enroll in for credit. It’s apt to make colleges more like low-performing high schools that in effect depress the education of their higher-performing students.
Sandra Stotsky, former senior associate commissioner at the Massachusetts Department of Elementary and Secondary Education, is professor of education emerita at the University of Arkansas.