What Else Can the Logue, Watanabe-Rose, & Douglas (2016) Experiment Tell Us? The Original Experiment and Some Follow-Up Data

In my blog post of January 20, 2017, I addressed the question of whether it makes sense to say that a course on subject A is easier or harder than a course on subject B. I concluded that it is not possible to make such a comparison for two qualitatively different courses; there is no single easiness scale on which to compare them. That post particularly concerned the research reported in Logue, Watanabe-Rose, and Douglas (2016; http://journals.sagepub.com/doi/pdf/10.3102/0162373716649056), which compared the performance of students (all assessed as needing remedial math) who took elementary (remedial) algebra (what we called Group EA) with that of students who took college-level introductory statistics with a weekly two-hour workshop (what we called Group Stat-WS).

Although we cannot compare the easiness of two such qualitatively different courses, we can determine whether different groups of students receive similar grades in the same course, an approach employed in one part of our 2016 experiment. We found that, exposed to the same grading criteria, the Stat-WS students performed significantly worse than students who had taken the same course but had been assessed as needing no elementary algebra. Nevertheless, the Stat-WS students performed significantly better than the EA students, with the majority of the Stat-WS students passing statistics but the majority of the EA students failing elementary algebra. In other words, students assessed as needing elementary algebra do not need to pass that course in order to pass introductory statistics with an accompanying weekly workshop. Further, our findings did not differ by instructor, by college, or by the race/ethnicity of the students, and the boost in pass rate from being assigned to Stat-WS rather than EA was approximately the same for students across the full range of placement test scores.

We also know that it was not necessary for the Stat-WS students to pass elementary algebra in order to pass other college-level natural and social science courses that they took.  Our 2016 publication showed that the Stat-WS students, during the year after the experiment was over, continued to earn more credits than the EA students, and the Stat-WS students were as likely to pass natural and social science general education courses as the EA students.

We now have 2.5-year follow-up data on the more than 900 students who were randomly assigned in our experiment to EA, Stat-WS, or elementary algebra with a weekly workshop (a total of 36 course sections taught by 12 different faculty at 3 CUNY community colleges, with each faculty member teaching one section of each course type; these long-term follow-up data were reported at the 2017 AACC conference, see p. 38 of http://www.aacc.nche.edu/newsevents/Events/convention2/Documents/97thAnnualAACCConventionProgram.pdf). Of the 907 randomly assigned students (approximately 300 in each group), the data show that, so far, whereas 9% of the EA students have graduated, 17% of the Stat-WS students have graduated (a statistically significant difference), and whereas 52% of the EA students have dropped out of college (i.e., are not enrolled in any college in the United States), only 47% of the Stat-WS students are in that category. Further, once again, our findings do not differ according to the race or ethnicity of the students. Clearly, having been assigned to statistics without first taking elementary algebra in no way harmed, and indeed benefited, the Stat-WS students' progress in all of their college courses.
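For readers curious how a difference like 9% versus 17% with roughly 300 students per group reaches statistical significance, here is an illustrative sketch using a standard two-proportion z-test. The counts (about 27 of 300 EA graduates versus about 51 of 300 Stat-WS graduates) are approximations inferred from the rounded percentages above, and the authors' own analysis may have used exact counts and different methods:

```python
import math

def two_proportion_z_test(x1, n1, x2, n2):
    """Two-sided two-proportion z-test with a pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    # Pool the two samples to estimate the common proportion under H0
    p_pool = (x1 + x2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF (via math.erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Approximate counts assumed from the reported percentages:
# Stat-WS: ~51 graduates of ~300; EA: ~27 graduates of ~300
z, p = two_proportion_z_test(51, 300, 27, 300)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these assumed counts, z comes out near 2.9 and the two-sided p-value well below 0.05, consistent with the significance reported above.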

Yet before we use these results as grounds to assign large numbers of students to college-level statistics with extra support instead of traditional remedial math, perhaps we should consider programs for associate's-degree students that may raise all students' graduation rates, programs so successful that we would no longer need to worry about the low graduation rates of students assigned to remedial math. For example, we now know, from the most rigorous evidence available for any higher education program, that CUNY's ASAP program (http://www1.cuny.edu/sites/asap/) doubles associate's-degree graduation rates, even for students who are assessed as needing some remediation. Perhaps if we put all students into ASAP we will no longer need remedial math reform.

However, ASAP's magnificent success does not mean that there is no work left to do on the problems caused when we assign students to traditional remedial courses. ASAP enrolls only students with no or slight assessed remedial need, yet there is a 13% difference in three-year graduation rates between those two groups of ASAP students, and only 48% of the students with slight assessed remedial need graduate within three years, leaving plenty of room for improvement. The fact that interventions such as ASAP help address the nonacademic reasons that students drop out does not mean that we should ignore the academic reasons, and remedial math is the single largest academic reason that students do not complete college. The majority of new community college students are assessed as needing remedial math, and the majority of those students never complete it.

Colleges may have an incentive to offer many sections of remedial math: such courses are among the most profitable because their instructors are among the lowest paid. However, alternative quantitative pathways, such as the one employed with Group Stat-WS in our experiment, can increase students' retention and graduation rates and thus maintain or increase valuable enrollments while simultaneously decreasing the cost per graduate.
