Higher Ed

Are Faculty Missing in Action?

Last fall, an article in Inside Higher Ed authored by Judith Shapiro, President of the Teagle Foundation and former President of Barnard College, made the following statement:

“For the most part, however, faculty members have simply been missing in action when it comes to dealing with campus upheavals around race and racism.”

I agree with this statement, but I would expand it to say that faculty members have frequently been missing in action with regard to all kinds of controversial issues.  At many (most?) institutions, faculty are rewarded with promotions, raises, and tenure first for their research (largely based on their individual efforts), second for their teaching (again, largely based on their individual efforts), and only third for their service, which would include working together with others to make their colleges congenial and productive places for the colleges’ diverse inhabitants.  The faculty who produce the most work of direct benefit to themselves are largely those faculty who keep to themselves, focus on their own work, and stay out of the way of college conflagrations.  Consistent with this statement, research has shown that faculty do not feel safe expressing views with which others may disagree until they have had the final promotion to full professor (not, as some people think, until they have tenure).

An example of these tendencies concerns credit transfer among the 19 undergraduate colleges of The City University of New York, where approximately 10,000 students transfer each fall alone.  Credit transfer is a controversial subject, one reason being that whether the receiving college counts the credits can directly affect the college’s, as well as a department’s, funds, and whether faculty members have sufficient enrollment to teach certain courses.  Although ensuring that credits transfer can benefit students, it can also mean depriving faculty and/or a college of something desirable to them.  Thus it is no surprise that, although for over 40 years problems with credit transfer were seen as the worst problems for CUNY students, and although the faculty issued some statements about those problems, the faculty took no action to solve them.  Only when the central administration finally instituted a system (known as Pathways) that guaranteed credit transfer for some courses, thus directly affecting some faculty members’ courses, did some faculty spend significant amounts of time on the credit transfer issue, with most of those faculty objecting to Pathways, including by filing lawsuits against it.  This prompted one CUNY Distinguished Professor, in his testimony at a public hearing on Pathways, to say to the faculty in the audience: “Where have you been?  Where have you been for 40 years?”

Although there is nothing wrong with working hard to benefit oneself, we also need to provide clear incentives for faculty to work together for the benefit of students, as well as for the rest of the higher education community.

There is more about these issues in my forthcoming book:  Pathways to Reform:  Credits and Conflict at The City University of New York, to be published early in the fall by Princeton University Press (https://www.amazon.com/Pathways-Reform-Conflict-University-Education/dp/0691169942/ref=tmm_hrd_swatch_0?_encoding=UTF8&qid=1494093848&sr=1-1).

Rewarding Students

“Nudging” (sending text and/or email messages to students about tasks that they should perform) and “early alert” systems (which include messages to students whose performance is inadequate, or at risk of becoming so) are gaining popularity in higher education. A blog entry by Matt Reed in Inside Higher Ed points out that, if students receive unpleasant messages, such as negative nudges and early alerts, they will stop reading them, and asks how to counter that.

Here I will not address the technological or other practical aspects of how to deliver messages that get read.  My purpose is only to review some of the findings from the field of behavioral science that help to inform what should be done to make these messages as effective as possible.

First, Reed is correct about the effect of aversive stimuli such as bad-news texts.  We avoid the sources of negative communications (including when the source is one’s professor or boss, and such avoidance can cause all kinds of problems).  Therefore, in general, when possible, the messages that we give to students should be about increasing what they are doing that is right, rather than decreasing what they are doing that is wrong—what is known as catching someone doing something well.

We also know from behavioral science that, in order to increase a behavior of a particular person over the long term, the feedback should be:

  • Targeted to the specific behavior that we want to increase (it should not just consist of an amorphous “good job”)
  • Delivered as close in time as possible to the occurrence of that behavior
  • Accompanied by the delivery of something (praise, points, food, etc.) that is of value to the particular person emitting the behavior, i.e., it should be accompanied by a reward (and, no, that will not cause problems by ruining the person’s internal reward system, whatever that is)
  • Not delivered after every instance of the particular behavior, but on an irregular pattern (if every instance is rewarded, then if reward ceases, the behavior will dissipate more quickly than if reward delivery has been irregular)

Some years ago, Hostos Community College of The City University of New York developed a points system that had many of these attributes.  The Hostos administration reasoned that there were many behaviors they wanted students to engage in, such as filing the FAFSA or filling out a college survey, for which there were no immediate rewards such as grades.  Hostos therefore awarded students points for these behaviors and then held lotteries in which students could win real prizes.  The more points a student had, the more chances that student had to win.  Although I never saw any data about the results, I was told that this reward system worked well.
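A scheme like the one described can be sketched in a few lines of code.  The behaviors, point values, and single-winner draw below are invented for illustration; I never saw the details of Hostos’s actual system:

```python
import random

# Hypothetical point values for behaviors a college wants to encourage.
POINTS = {"filed_fafsa": 10, "completed_survey": 5, "met_advisor": 5}

def award_points(ledger, student, behavior):
    """Add the points for a completed behavior to the student's running total."""
    ledger[student] = ledger.get(student, 0) + POINTS[behavior]

def draw_winner(ledger):
    """Lottery draw: each point is one ticket, so more points mean more chances."""
    students = list(ledger)
    weights = [ledger[s] for s in students]
    return random.choices(students, weights=weights, k=1)[0]

ledger = {}
award_points(ledger, "ana", "filed_fafsa")       # ana: 10 tickets
award_points(ledger, "ben", "completed_survey")  # ben: 5 tickets
winner = draw_winner(ledger)  # ana is twice as likely as ben to win
```

Because each point is one ticket in the draw, every additional rewarded behavior increases a student’s chances without guaranteeing a prize, which also gives the reward delivery the irregular character described in the list above.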

In conclusion, when nudging or using early alerts, we should try to send messages that are positive, not just negative, and our positive messages should be targeted to specific behaviors, be timely, be rewarding, and be irregularly delivered.

The Evolution of the Research on Mathematics Remediation Reform at CUNY and Elsewhere

Traditional mathematics remediation has been described as the largest single academic block to graduation in the United States.  Most new college students are assessed as needing it, and the majority of those students never complete it: most of the students who take it do not pass, and many students avoid taking it at all.  Without completing assigned mathematics remediation, students usually cannot satisfy many of their college-level course requirements, and so cannot graduate.

In 2008, when I became the chief academic officer of The City University of New York system, which includes 10 colleges that offer mathematics remediation (and 9 that do not), mathematics remediation was a big business.  At that time, CUNY was spending over $20 million per year on remediation, the majority of it on math remediation.  In recent years, I believe that figure has increased to over $30 million.

In 2008, CUNY was mostly delivering mathematics remediation as traditional courses.  These were courses covering only remedial material that were taught in a sequence.  Students could not take the next course in a sequence until they had passed the prerequisite course.  CUNY colleges offered two, and sometimes three, levels of math remediation that a student had to pass or test out of before being allowed to take many college-level courses.

However, research reports on various methods for delivering math remediation were appearing.  Some of those reports concluded that exposure to traditional remedial courses increased a student’s later college success, and others concluded that new methods, such as placing students assessed as needing math remediation directly into a college-level course with extra support, were more helpful.  Some of this research used what are known as quasi-experimental analytical techniques.  But it was hard to know what to make of the research as a whole.  As an experimental psychologist, I was appalled that we were spending such huge sums, on programs affecting the lives of many thousands of students, with so little evidence to guide us.

Therefore, in 2013, along with Mari Watanabe-Rose and later Dan Douglas, I conducted a randomized controlled trial with over 900 students assessed as needing elementary (remedial) algebra.  We randomly assigned these students to traditional remediation, to that course plus a weekly workshop, or to introductory, college-level, credit-bearing statistics with a weekly workshop.  Twelve faculty members each taught one section of each course type.  The results showed significantly higher pass rates in the statistics course; 2.5 years later, 8% more of the statistics students had graduated than of the traditional remediation students, and fewer of the statistics students had dropped out of college.  The students assigned to statistics demonstrated that they did not need to pass elementary algebra to pass statistics, nor to satisfy their college-level natural and social science requirements.
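The core of such a design, dealing consenting students at random into three equal arms, can be sketched as follows.  This is a simplified illustration, not our actual protocol, which also involved eligibility screening and the assignment of sections to the twelve faculty members:

```python
import random

# The three experimental arms of the trial.
ARMS = [
    "remedial_algebra",
    "remedial_algebra_plus_workshop",
    "statistics_plus_workshop",
]

def assign(students, seed=0):
    """Shuffle the students with a fixed seed, then deal them evenly across arms."""
    rng = random.Random(seed)
    shuffled = students[:]
    rng.shuffle(shuffled)
    return {student: ARMS[i % len(ARMS)] for i, student in enumerate(shuffled)}

# 900 hypothetical student IDs: each arm receives exactly 300 students.
groups = assign([f"s{n}" for n in range(900)])
```

Shuffling before dealing guarantees equal group sizes, while the shuffle itself makes each student’s assignment independent of any of their characteristics, which is what licenses the causal comparison among the arms.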

While we were following the ongoing performance of the participants in our experiment, the research literature on remediation grew large enough, including research conducted with rigorous methodology, that research reviews became possible.  Two recent reviews have both concluded that the weight of the evidence now shows that traditional remediation, in general, makes it more difficult for students to advance in their academic careers.

Thus, in contrast to the state of the research in 2008, we can now conclude, with a fair degree of certainty, that placing students into traditional remediation is, in most cases, not the best path for helping them to pass their college-level courses.  Tying math remediation to college-level courses when the remedial work is specifically needed to understand the college-level material, and streamlining and aligning required quantitative material, both seem to be more effective methods for helping students progress through their college-level requirements.

Does Increasing Competition for Transfer Students Always Help Them?

In the previous entry on this blog (March 13), I described an articulation agreement between a community college and a university in New Jersey, an agreement that seems to be founded on decreasing the competition for transfer students and instead promoting their transfer to one particular bachelor’s-degree institution.

However, there are also attempts in process by others to increase competition for transfer students, which should be to the benefit of those students.  An organization named Affordable College has developed an app, as well as other resources, for students seeking to transfer.  Bachelor’s-degree colleges pay a fee to participate with Affordable College.  Then Affordable College presents information about those colleges to students at community colleges who want to transfer.  The community college students can use the information from Affordable College to find out how their credits would transfer to each of the participating bachelor’s-degree colleges.  By participating, these colleges make sure that they’re in the mix for obtaining transfer students, and thus for increased enrollment and revenue.  In this way, assuming community college students actually use Affordable College, there is an incentive for bachelor’s-degree colleges to pay the fee to participate and to state that they will transfer many credits.  The community colleges, in turn, receive a “share of the revenue for each successful transfer,” thus providing them with an incentive to participate.

Such incentives work well in benefitting potential transfer students if those students have several possible destination colleges among which they can choose.  But at my institution, The City University of New York, choices are in fact often much more limited.  The 12 bachelor’s-degree colleges of CUNY do not all offer the same majors.  So, for example, a student can receive a Bachelor’s in Actuarial Science only from Baruch College.

Another constraint on CUNY transfer students is geography.  Although the 19 undergraduate colleges of CUNY are all located within the five boroughs of New York City, it can take up to two hours on public transportation to go from one part of the city, and from one CUNY college, to another.  For CUNY students, who are more likely than not to be Pell Grant recipients, and who often must live with relatives and work in order to make ends meet, a very long commute to and from class, home, and/or work can be prohibitive.

Then, too, some of CUNY’s bachelor’s-degree colleges are very selective, including for transfer students. Just because a student applies to transfer to a particular CUNY college doesn’t mean that that student will be admitted.

Thus, the choice of a destination college for a CUNY transfer student can be quite limited.

Benefitting transfer students by fostering competition among the destination institutions will only be effective if transfer students truly have choices as to where they will go.  Unfortunately for many students this is not the case.  Therefore we need additional mechanisms to help such students, including ways to help them transfer as many of their credits as possible.

Not All Articulation Agreements are Good for Students

This past fall an article in Inside Higher Ed described what initially sounds like an articulation agreement that could be wonderful for students: “a new partnership, called 3+1, between [Rowan College at Burlington County, a community college, and Rowan University], which allows students to remain on the community college campus while earning a Rowan University degree.  Participating students also get a 15 percent discount and are placed in guided degree pathways from the two-year institution that lead to a bachelor’s degree from the university.”

However, the article goes on to state that this “new program…has prompted the community college to limit any advertisements or promotion for other four-year colleges and universities on its campus.  RCBC will not host transfer fairs or information tables for other four-year programs.”

A bachelor’s-degree college or university, such as Rowan, can see itself as giving up something in making one of these agreements because it has less opportunity to refuse to transfer credits when a student transfers in, and thus less opportunity to earn revenue from transfer students.  However, such an agreement, particularly if it involves essentially eliminating the marketing of Rowan’s competitors, can give Rowan a leg up in obtaining transfer students as compared to other competing bachelor’s-degree institutions.  Giving up some credits may be worth it if, as a result, you get more students.

But what happens to students who, for reasons such as geographical constraints or subject matter interest (e.g., Rowan does not have majors in Anthropology or Architecture), don’t want to transfer to Rowan after completing their associate’s degrees?  It appears that the information that these students will have about other options will be limited, and they will have to do more to find their way in the transfer maze.  Perhaps there are few RCBC graduates who would prefer a bachelor’s-degree institution other than Rowan.  In that case, perhaps RCBC is doing the best thing that it can for the majority of its students in making this agreement, which could significantly help those RCBC graduates who want to attend Rowan to attain bachelor’s degrees.

Even so, it is unfortunate that what is good for each and every student is not the only criterion shaping these policies—that, due to existing incentive structures, self-protection and self-interest inevitably come to play in interactions between independently operating institutions, as they do in other areas of academe.

Is Remedial Elementary Algebra Harder Than College-Level Introductory Statistics?

My coauthors (Dr. Mari Watanabe-Rose and Dr. Daniel Douglas) and I have learned that our article on a college-level statistics course as an alternative to math remediation, published in Educational Evaluation and Policy Analysis (EEPA), was the most-read article in EEPA for 2016.  EEPA is an AERA (American Educational Research Association) journal.

Our research shows that when non-STEM majors assessed as needing math remediation (elementary algebra) are randomly assigned to college-level statistics (with extra support), they are more likely to pass, and to accumulate more college-level credits afterwards, than similar students randomly assigned to traditional remedial elementary algebra.  However, adoption of this alternative to math remediation has been slow at The City University of New York, where the research was conducted.

A few CUNY math faculty have questioned our results, saying that the higher pass rate in statistics was due to statistics being easier to learn than elementary algebra, or to statistics requiring only a 60% score to pass, whereas elementary algebra requires 74%.

Let us leave aside for the moment the fact that, in our experiment, students randomly assigned to statistics with extra support accumulated more credits in college afterwards than did students randomly assigned to elementary algebra, indicating that elementary algebra was not as necessary a prerequisite for students to satisfy their college-level course requirements as some have claimed.

For now, let us just consider the validity of the statement that statistics is easier to learn than elementary algebra, or the logic of comparing the percentage needed to pass statistics vs. elementary algebra.

First consider that, in grading any course, the percentage of correct answers that any student gets is completely arbitrary, because it is a function of the difficulty of the quizzes, exams, homework assignments, etc.  A faculty member can make those tasks really hard, so that even good students get few questions correct, or really easy, so that most students get everything correct.  With elementary algebra, there are CUNY-wide standards and tests, so you can be pretty sure that, if 20% of students consistently pass in one faculty member’s class and 60% in another’s, the latter faculty member is actually teaching better than the former, at least if the students in the two courses are similar at the beginning of the semester.  But if there isn’t a CUNY-wide syllabus, final exam, and grading rubric (and, in fact, none of these exists for statistics), you can’t tell which faculty member is teaching the material better without obtaining much more information.

Now consider the fact that statistics and elementary algebra are qualitatively different courses, which means that, by definition, they can’t ever have the same syllabus, final exam, and grading rubric.  Which means that the percentage passing, or the percentage you define as passing, simply can’t be directly compared across algebra and statistics.  Which means that there is no way to say that one is easier than the other.

Suppose that, in a sample of 100 students who took both statistics and elementary algebra, 60 scored at least 80% correct in statistics and 40 scored at least 80% correct in elementary algebra.  Does that mean that the statistics course is easier for the students than the elementary algebra course?  Perhaps in the simple sense of the students getting better grades in statistics.  But inherently easier?  No.  The faculty teaching statistics could simply make their exams much harder, the number of students obtaining 80% correct in statistics would plummet, and then an observer might say that statistics is the harder course.
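The arbitrariness of such comparisons can be made concrete with a toy calculation.  All of the numbers below, a band of student “abilities” and a difficulty penalty subtracted from scores, are invented for illustration:

```python
def pass_rate(abilities, exam_difficulty, cutoff=80):
    """Fraction of students whose score (ability minus exam difficulty) meets the cutoff."""
    scores = [a - exam_difficulty for a in abilities]
    return sum(s >= cutoff for s in scores) / len(scores)

# One fixed group of 30 students with abilities from 70 to 99 (invented numbers).
abilities = list(range(70, 100))

easy = pass_rate(abilities, exam_difficulty=0)   # 20 of 30 students pass
hard = pass_rate(abilities, exam_difficulty=10)  # only 10 of the same 30 pass
```

The students never change; only the exam does.  A course’s observed pass rate therefore tells us about the interaction of students, instructors, and assessments, not about some inherent difficulty of the subject.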

A more useful question for our research was whether the faculty in our experiment graded statistics according to the standards by which they usually grade it.  In our published paper we list nine pieces of evidence consistent with the hypothesis that the faculty in our experiment graded statistics as it is usually graded.  We can therefore reasonably conclude that many students, though assessed as needing remedial elementary algebra, can nevertheless pass college-level statistics, taught as it usually is except with some extra support (a weekly 2-hour workshop), as well as subsequent college-level courses.  Students are more likely to pass college-level statistics (taught as usual except with extra support) than remedial elementary algebra (taught as usual).

Of course, just because a student can pass college-level courses without having first passed elementary algebra doesn’t mean that no student should have to take elementary algebra, or higher-level algebra courses.  A college or university could decide that it is important for every graduate to demonstrate knowledge of at least elementary algebra, or that students majoring in certain disciplines need to do so.  But such statements are different from saying that every college student needs elementary algebra in order to be able to pass required college-level courses, a statement that our research does not support.

At CUNY there are some faculty who believe that every CUNY graduate should demonstrate knowledge of algebra, at least elementary algebra.  However, the CUNY-wide general education requirements do not currently require that every student know algebra.  Passing college algebra is sufficient for passing the mathematical and quantitative reasoning general education requirement, but it is not necessary.  Passing statistics can also satisfy this requirement, as can passing a quantitative reasoning course. And given that we now know that passing elementary algebra isn’t necessary in order to pass statistics (taught with extra support), it follows that there is no current requirement for all CUNY students to demonstrate knowledge of algebra.

Assuming that CUNY’s general education requirements do not change, all CUNY students who do not require algebra for their majors, and who have been assessed as needing math remediation, should have the opportunity to take statistics, or another quantitative alternative course, with extra support, instead of traditional elementary algebra.

Experiment in Math Remediation

Here is the YouTube video, made by the American Educational Research Association (AERA), of me talking about the experiment concerning math remediation that we just published in the journal Educational Evaluation and Policy Analysis: