the blog of Carol Burris

Letter to the New York State Regents on CCR

February 25, 2011

Dear Regent Tilles:

You are presently engaged in the critical and thoughtful work of determining new requirements for high school graduation. Revised passing scores on Regents exams will in part determine which students will and will not graduate from high school. In addition, the creation of a new diploma indicating whether or not students are college ready will affect their perception, their parents' perception, and the public's perception of their life chances. Perceptions have already been altered following this month's publication of articles on the presumed lack of college readiness among our state's graduates. As a high school principal, I understand how powerful the phrase "college ready" is to my students and their families. Because of its power, it matters how it is measured and how it is used, especially within the context of high-stakes testing.

I am writing to express my concern that the scores you have chosen to indicate college readiness are highly problematic and do not measure the construct that you wish to measure. In this letter I will explain why I believe this to be so. Following that explanation, I will suggest an alternative measure of college readiness that would be simple to implement and that is grounded in research. In the end, we all want the same thing—increased student learning that will better prepare our graduates to meet the challenges of the 21st century.

In addition to serving as the principal of South Side High School for the past ten years, I have remained active in the research community. I research, write, and publish regularly on issues of educational equity. In two of these studies (Burris, Heubert & Levin, 2006; Burris, Wiley, Welner & Murphy, 2008), my co-authors and I used binary logistic regression analysis to measure the effects of curricular and grouping-practice reforms on student achievement. Binary logistic regression can provide very useful information when it includes a rich array of covariates that are entered into or excluded from the model based on their significance, their contribution, and their interactions with other covariates.
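To make that methodological point concrete, here is a minimal sketch of significance-based covariate selection (backward elimination) for a binary logistic regression. The data, variable names, and significance threshold are hypothetical illustrations of the general procedure, not the method or data from the cited studies.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: 800 students, one real reform effect, one real
# prior-achievement effect, and one covariate with no real effect.
rng = np.random.default_rng(42)
n = 800
df = pd.DataFrame({
    "accel_math": rng.binomial(1, 0.5, n),  # exposed to the reform (illustrative)
    "prior_score": rng.normal(0, 1, n),     # standardized prior achievement
    "noise_var": rng.normal(0, 1, n),       # an irrelevant covariate
})
logit_p = -0.5 + 0.8 * df["accel_math"] + 0.6 * df["prior_score"]
df["passed"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

def backward_eliminate(data, outcome, candidates, alpha=0.05):
    """Drop the least significant covariate until all remaining ones pass alpha."""
    kept = list(candidates)
    while kept:
        formula = outcome + " ~ " + " + ".join(kept)
        fit = smf.logit(formula, data=data).fit(disp=False)
        pvals = fit.pvalues.drop("Intercept")
        if pvals.max() <= alpha:
            return fit
        kept.remove(pvals.idxmax())  # discard the weakest covariate and refit
    return None

fit = backward_eliminate(df, "passed", ["accel_math", "prior_score", "noise_var"])
print(fit.params)  # noise_var is typically eliminated; the real effects remain
```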

As I understand the technical paper posted on your website, Dr. Everson used binary logistic regression to determine how ELA Regents scores and Math A Regents scores predict grades in college mathematics and freshman composition at CUNY. The dependent variable was binary: a grade below C in the math or composition course = 0, and a grade of C or above = 1. The model produced a probability curve for each course, associating each Regents score with the probability of earning a C or better. Although this is helpful information for CUNY, it is not sufficient to create a measure of college readiness. First, only New York City public school graduates' scores were included in the analysis. About 70% of CUNY students attended NYC public schools. Graduates of other New York State schools, who comprise most of the remaining 30%, would also have Math A and ELA Regents scores on their transcripts. They were not included, and no rationale was given for their exclusion from the study.
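For readers unfamiliar with this kind of model, the sketch below fits the same single-predictor structure the technical paper describes: one Regents score predicting a binary C-or-better outcome. The data and coefficients are invented for illustration; nothing here comes from the actual CUNY study.

```python
import numpy as np
import statsmodels.api as sm

# HYPOTHETICAL data: Regents scores and a binary college-grade outcome.
rng = np.random.default_rng(0)
n = 1000
math_a_score = rng.uniform(55, 100, n)

# 1 = grade of C or better in the college course, 0 = below C.
# The generating coefficients are invented for illustration.
true_p = 1 / (1 + np.exp(-(-8.0 + 0.1 * math_a_score)))
c_or_better = rng.binomial(1, true_p)

# Binary logistic regression of the outcome on the single score.
X = sm.add_constant(math_a_score)
model = sm.Logit(c_or_better, X).fit(disp=False)

# The fitted probability curve: P(C or better) at each possible score.
scores = np.arange(55, 101)
curve = model.predict(sm.add_constant(scores))
print(dict(zip(scores[::5], curve[::5].round(2))))
```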

This is problematic because using a select population limits the ability to generalize the results to all New York State students. For example, we know that school type can dramatically influence students' performance; David Liebowitz and Dan Koretz, in their July 2, 2010 memo to Dr. Steiner, discuss how different the resulting probabilities are when school type is taken into account. Findings that cannot be generalized cannot support a measure intended to apply statewide.

Validity, in its simplest definition, refers to whether we are, in fact, measuring what we intend to measure. In this regard, there are unique problems associated with the math college ready score, and there are problems common to both the math and ELA scores.

Considering mathematics, there was no covariate in the model to account for other high school mathematics coursework taken by the students included in the study. Dr. Everson used only Math A scores to find the score that indicates college readiness. The content that students learn in high school would certainly affect their college performance. Below is an example of the influence of other coursework that we can infer from the Everson technical paper.

According to Table 1:

  • Approximately 1 in 4 CUNY students had a Math B score.
  • Of those students who passed Math B with a score of 65-69, only about 14% needed to take a remedial course based on the CUNY test. The same remediation rate (14%) is associated with a score of approximately 90 on Math A.

Clearly, passing the Math B Regents, even at the bare proficiency level, has a positive effect on mathematical success at CUNY. Was it the score of 90 on the Math A Regents, or the knowledge gained in the subsequent math courses taken by the students who scored a 90, that made the difference? I suspect it was the latter, but it is impossible to know without a covariate that accounts for math courses taken beyond Math A. If we encourage students to chase a score rather than take more challenging math, we may increase the need for remediation and decrease readiness.

Second, the CUNY students in this cohort, who began at CUNY in fall 2008, may have taken the Math A exam in 2005, 2006, 2007, or 2008. There is no way of knowing, because the model does not control for the many versions of the exam (a possibility of 12 or more) given while that cohort was in high school. The Regents' own critique of state exam cut scores, as well as the analysis performed by Diane Ravitch, makes the scores on these exams an unreliable measure.

Finally, the Math A exam used in the CUNY study no longer exists. The Integrated Algebra exam measures different content, has cut scores established by a different bookmarking procedure, and, according to a memo on the SED website, it is more difficult to score an 80 on the Integrated Algebra exam than it was on the Math A exam (see the July 20, 2010 memo from Liebowitz and Koretz). Transferring the Math A score of 80 to a score of 80 on the Integrated Algebra Regents, as either a new passing score or a college readiness score, would result in a standard higher than the one set by the Koretz technical paper, even if none of the aforementioned problems existed.

Common to both the math and ELA logistic regression analyses is the problem of missing covariates. In addition to the lack of control for coursework mentioned above, the model did not include factors such as socioeconomic status, English language learner status, or whether the student entered CUNY immediately after high school graduation or later. Including students from schools other than NYC public schools could have measured the contribution of school type to the outcome. All of these covariates, including coursework taken, are available from admissions data. If covariates had been included in the model, it would have been possible to determine the contribution of the Math A and ELA scores themselves to the need for remediation, which would allow you to know whether changing the score might have any effect at all. From what was done, all we can say is that NYC public high school graduates who scored higher on the Math A and ELA Regents have a decreased probability of needing remediation at CUNY, and that we can make some predictions regarding their grades based on this single observation. What cannot be said is that low scores on the Regents cause a student to need remediation, or that improved scores would result in higher college performance.
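The sketch below extends the earlier single-score model with the kinds of covariates this letter calls for. The column names (highest_math_course, ses, ell_status, delayed_entry) are my assumptions about what admissions records might contain, and the data are invented; the point is only that, with covariates present, the coefficient on the Regents score estimates the score's contribution while holding coursework, SES, and entry timing fixed.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# HYPOTHETICAL data with the covariates the letter argues should be in the model.
rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({
    "math_a_score": rng.uniform(55, 100, n),
    "highest_math_course": rng.integers(1, 5, n),  # 1 = Math A ... 4 = precalc or beyond
    "ses": rng.normal(0, 1, n),                    # standardized SES index
    "ell_status": rng.binomial(1, 0.2, n),         # English language learner
    "delayed_entry": rng.binomial(1, 0.3, n),      # did not enter CUNY right away
})
logit_p = (-9.0 + 0.07 * df["math_a_score"] + 0.6 * df["highest_math_course"]
           + 0.3 * df["ses"] - 0.4 * df["ell_status"] - 0.2 * df["delayed_entry"])
df["c_or_better"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# With covariates in the model, the math_a_score coefficient now estimates
# the score's contribution net of coursework, SES, and entry timing --
# the quantity a college-readiness claim actually requires.
fit = smf.logit(
    "c_or_better ~ math_a_score + highest_math_course + ses"
    " + ell_status + delayed_entry",
    data=df,
).fit(disp=False)
print(fit.params)
```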

To make the case with another example: we know that the most reliable predictor of test scores is parental income, so Dr. Everson could have produced a similar probability curve with income as the independent variable. We would certainly not tell students that they are or are not ready for college based on their parents' income, although we could certainly find a threshold probability associated with a range of incomes. Likewise, none of us would believe that putting $10,000 in a student's pocket would make him do better at CUNY. The belief that raising a test score on the Integrated Algebra exam, which students generally take in Grade 9, will result in increased readiness is equally unsupported by the evidence presented. Without covariates, it is not possible to establish the role that scores on the exams play in preventing remediation.

Finally, there is no clear rationale for why these scores were chosen. The Math A score of 80 and the ELA score of 75 have different grade-of-C probabilities associated with them. For Math A, a score of 80 appears to generate a probability of 65% of obtaining a C or better in College Algebra. For ELA, a score of 75 generates a probability of 80% of obtaining a C or better in freshman composition. Even the ELA passing score of 65 generates a higher probability (75%) than does the score of 80 in mathematics. In the college readiness indicator discussed in The New York Times and the Wall Street Journal, the two scores are combined: a college ready student is identified as one who achieves the threshold on both. No evidence provided by Everson justifies that combination.
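To see how inconsistent those two thresholds are, the snippet below evaluates two logistic curves whose coefficients I have back-solved to reproduce the probabilities cited above. The coefficients are assumptions for illustration, not fitted values from the technical paper.

```python
import numpy as np

def p_c_or_better(score, b0, b1):
    """Probability of a C or better at a given Regents score on a logistic curve."""
    return 1 / (1 + np.exp(-(b0 + b1 * score)))

# Hypothetical curves shaped to reproduce the cited probabilities.
print(p_c_or_better(80, b0=-7.38, b1=0.1))     # ~0.65: Math A cut score of 80
print(p_c_or_better(75, b0=-0.767, b1=0.0287)) # ~0.80: ELA cut score of 75
print(p_c_or_better(65, b0=-0.767, b1=0.0287)) # ~0.75: even the ELA passing score of 65
# The two "college ready" cut scores sit at very different probability levels.
```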

After the embarrassment and confusion generated by the declining rigor of the state exams, it is critical for both our students and their teachers that we get the new graduation standards right. The Regents exams are used for purposes of graduation and diploma type, and will shortly be used for the evaluation of teachers and principals. Academic Intervention Services (AIS), which must be provided to students who are not at proficiency, require the re-allocation of funds from the arts and other enrichment programs. Students are subjected to drill and skill in an attempt to raise test scores and are denied access to more enriching learning experiences. Given the published claim that 85% of our African American and Latino students are not college ready, how long will it be before some commentators translate "not college ready" into "not college material"?

If an 80 on the Integrated Algebra Regents becomes the passing score for graduation, many students across the state (according to your data, the majority) will not progress into Integrated Geometry, Algebra 2/Trigonometry, and beyond. If 80 is considered college ready, students who plan to attend college may remain (or be pressured by schools to remain) in Integrated Algebra, chasing a score rather than taking the advanced math courses that research has established as a contributor not only to readiness but to college completion. The research is clear: a rigorous curriculum in high school, especially in mathematics, is the key to college success.

Examining longitudinal data from the High School & Beyond/Sophomore Cohort, 1982-1993,[1] researcher Clifford Adelman (1999)[2] concluded that enrollment in math beyond Algebra 2 (trigonometry and beyond) in high school "more than doubles the odds" that a student who enrolls in college will actually complete a bachelor's degree. He determined those odds using logistic regression, but he included covariates in his analysis.
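For readers who want the mechanics: in a logistic regression, "more than doubles the odds" corresponds to an odds ratio above 2, which is the exponential of the predictor's coefficient. The coefficient below is an assumed value chosen to illustrate the arithmetic, not Adelman's actual estimate.

```python
import math

# Hypothetical coefficient on reaching trigonometry or beyond (HIGHMATH),
# from a logistic regression of bachelor's-degree completion.
beta_highmath = 0.75

# The odds ratio is exp(beta): here ~2.12, meaning the odds of completing
# a degree more than double, holding the model's other covariates fixed.
print(math.exp(beta_highmath))
```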

Curriculum was one of three components that Adelman (1999), in his study entitled Answers in the Tool Box: Academic Intensity, Attendance Patterns and Bachelor's Degree Attainment, refers to as "academic resources," which strongly affect college completion. Of the three components Adelman included in academic resources—test scores, class rank/GPA, and curriculum—the curriculum students studied contributed more than either of the other two factors: curriculum contributed 41% to college completion, compared with 30% for test scores and 29% for class rank/GPA. Adelman identifies the highest level of math studied as having the most "obvious and powerful relationship" of all the components of curriculum. Furthermore, HIGHMATH, the construct he created to represent the highest-level math course taken, proved to be an even more powerful indicator of successful college completion than socioeconomic status.

Ironically, the greater importance of curriculum over scores is posted on CUNY's own website. From the site http://collegenow.cuny.edu/nextstop/finish_hs/:

“According to studies, which is the best predictor of college success?

  a. SAT scores
  b. Grade point average (GPA)
  c. the courses you took in high school

The correct answer is c. You’ll prepare yourself better by taking tough academic courses.”

Rather than rely on high test scores to measure readiness, I suggest that the better course of action is to create policies and incentives that reward students and schools for engaging students in challenging coursework. One possibility is to use the earning of the Regents Diploma with Advanced Designation as an indicator of college readiness. It encourages the completion of a sequence in mathematics that includes the first higher math course identified by Adelman as contributing to increased college success. It requires students to continue second-language studies or complete a sequence in CTE, the arts, or business, and to take at least two laboratory sciences and pass the associated Regents exams.

It is also interesting to note that this more comprehensive measure identifies greater numbers of minority students as college and career ready: the percentage of all students, both special and general education, who are Black, Hispanic, or Native American and who earned the diploma with Advanced Designation slightly exceeds the score-based college ready rates, which exclude special education students, published in The New York Times and the Wall Street Journal. An alternative way for students to demonstrate college and career readiness could be taking and passing a target number of IB or AP courses, a measure that is also in keeping with the research on readiness. SAT scores could be a third way to demonstrate readiness. Each student would then have multiple opportunities to show readiness, all with a research base.

It would still be incumbent upon the Regents to determine meaningful proficiency scores on all Regents exams through a thoughtful process grounded in what students should know and be able to demonstrate. Such a process would serve all of our students well, including those not yet at the present standard. I thank you for taking the time to read this letter and for your consideration of the thoughts contained within.

Sincerely,

Carol Corbett Burris, Ed.D.

Principal of South Side High School, Rockville Centre, New York

Cc: William H. Johnson, Ed.D., Superintendent of Schools

       Rockville Centre Board of Education


[1] High School and Beyond/Sophomore Cohort, 1982-1993, is one of three age-cohort studies designed and sponsored by the National Center for Education Statistics.

[2] Adelman, C. (1999). Answers in the tool box: Academic intensity, attendance patterns and bachelor’s degree attainment. Washington, DC: U.S. Department of Education, Office of Educational Research. Retrieved from http://www.eric.ed.gov/PDFS/ED431363.pdf
