Disinclined to Respond
At most CUNY colleges, the paper version of the student course evaluation is a thing of the past.
PSC members at several campuses have raised concerns that the low response rates of online-only evaluations yield an incomplete and skewed assessment of a class. The union has also noted a growing reliance on student evaluations in tenure and reappointment decisions, regardless of whether an evaluation is done on paper or online, which makes the effectiveness of the evaluation forms a matter of concern for many faculty.
Today only a few campuses, including LaGuardia Community College, City Tech, Bronx Community College and City College, conduct student course evaluations on paper rather than online. And with the shift to digital evaluations, faculty at campuses across CUNY say that the submission rate of completed evaluations at their schools has dropped dramatically.
Low Response Rates
“If student evaluations are placed in front of students at their desks, they are much more likely to complete these feedback forms with helpful commentary than if they have to remember to go to a computer terminal and complete an online form,” Paul Gammarano, an adjunct lecturer at Kingsborough Community College, told Clarion.
Other factors also contribute to dismal completion rates. Students who never activate their campus email accounts never receive the evaluation, and those without personal computers who read email on their phones are left to wrestle with a form that is not mobile-friendly.
In the end, those most motivated to complete an evaluation are students with strong opinions about the course and faculty member they’re asked to assess.
Biased Evaluations
“Most responses come from the extremes of the bell curve,” James Saslow, a professor of art history at Queens College, told Clarion. The feedback he receives, he said, usually falls into one of two categories: “[It’s] either ‘art history changed my life and he’s cute,’ or ‘he’s too demanding and nasty’ – the latter usually identifiable as the one student who earned a negligent failure.”
In the language of statistics, what Saslow described is termed “voluntary response bias,” and its effect on online student course evaluations is profound, he said. “I seldom see ratings of general competence or average performance,” Saslow explained.
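The dynamic Saslow describes is easy to simulate. The sketch below is a hypothetical illustration, not drawn from any CUNY data: the class size, the opinion distribution and the response probabilities are all invented for the example. It draws a class's true opinions from a bell curve, lets students at the extremes respond far more often than those in the middle, and compares the full class's mean with the respondents' mean.

```python
import random
import statistics

random.seed(1)

# Hypothetical class: true opinions on a 1-5 scale, clustered near the middle.
true_opinions = [min(5, max(1, round(random.gauss(3.4, 0.9)))) for _ in range(120)]

# Voluntary response bias: students with strong opinions (1s and 5s) are far
# more likely to bother completing an online form than students in the middle.
# These probabilities are assumptions for illustration only.
def responds(score):
    prob = {1: 0.7, 2: 0.2, 3: 0.1, 4: 0.2, 5: 0.7}[score]
    return random.random() < prob

responses = [s for s in true_opinions if responds(s)]

print(f"whole class:  n={len(true_opinions)}, mean={statistics.mean(true_opinions):.2f}")
print(f"respondents:  n={len(responses)}, mean={statistics.mean(responses):.2f}")
```

Because mostly the extremes reply, the respondent pool is small and its average drifts away from the class's true center, which is exactly the "ratings of general competence or average performance" gap Saslow describes.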
Kingsborough’s PSC chapter proposed that, for scores to be considered meaningful and included in faculty files, a class must meet a minimum number of submitted evaluations. The administration agreed in principle that scores from classes with low submission rates should not be counted, but it did not commit to a specific minimum threshold or to excluding such evaluations from faculty files.
After PSC members at York College raised concerns over response rates for the new online assessments, the administration began notifying faculty when online evaluation forms are posted so that instructors can prompt their students to fill them out.
At the Borough of Manhattan Community College, the administration cited an improved college-wide response rate since the move to online evaluations last year, saying it now gets evaluations from students in a greater number of classes. But BMCC faculty members told Clarion that the response rate in certain individual classes has often gone down by 15 percent or more since the switch.
Kathleen Offenholley, associate professor of mathematics at BMCC, says that fewer of her students are submitting the evaluations and that, since the shift to online forms, students rarely include written comments when evaluating her class. Since the evaluations went online, she said, only about one student per class has offered comments.
Fuzzy Math
City College switched back to paper forms in spring 2011 after five years of online evaluations, according to college officials. PSC Chapter Chair Alan Feigenberg recalls that when CCNY used the online forms, response rates were “terrible,” dropping below 20 percent. Faculty, department chairs and deans all raised concerns about the dismal completion rates, he said. Since the return to paper evaluations, the college reports, the response rate is 80 percent.
Feigenberg, a professor in CCNY’s School of Architecture, notes the arbitrary nature of how different departments use the evaluations in their own assessments. “Sometimes it’s looked at as an important issue and sometimes less so,” Feigenberg said, adding that there is not always a clear explanation or rationale for the difference.
While the PSC-CUNY contract does not explicitly mention the use of student course evaluations as a criterion for reappointment and promotion, Article 18 states that teaching faculty shall be evaluated on teaching effectiveness, and student course evaluations are currently used by colleges as one potential measurement of success.
Many faculty members say they have seen a troubling trend toward reliance on numbers alone, without context or interpretation. A case in point is LaGuardia, which still uses a 55-question Scantron multiple-choice evaluation form tabulated by Educational Testing Service (ETS). Faculty object to LaGuardia’s recent policy change mandating that specifics from student evaluations be included in faculty personnel files.
Before the new policy was implemented, “the chair would analyze and contextualize the numbers,” PSC LaGuardia Chapter Chair Sigmund Shen told Clarion. “But now the administration wants the exact number that is given by [ETS] to be part of your document.” That “exact number,” Shen explained, is simply an average of all the scores on the questions in each section of the evaluation.
“Almost every introductory statistics textbook will tell you: ‘Don’t take an average of that kind of data because the numbers that you assign to those categories are arbitrary,’” said Offenholley. A more accurate and useful approach, she suggested to BMCC administrators, would be to avoid reliance on overall averages and instead cite percentages, as City College does – stating the percentage of students who “strongly agree,” “strongly disagree,” etc., with each item on the questionnaire.
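A hypothetical example makes Offenholley’s point concrete. The responses below are invented, not BMCC or City College data: a polarized class in which nearly everyone either strongly agrees or strongly disagrees. The single average lands near the midpoint, at about 3.2, suggesting a middling result, while the per-category percentages show that only one student in twelve actually chose “neutral.”

```python
from collections import Counter

# Invented responses coded 1 ("strongly disagree") to 5 ("strongly agree").
responses = [1, 1, 1, 1, 1, 5, 5, 5, 5, 5, 5, 3]

labels = {1: "strongly disagree", 2: "disagree", 3: "neutral",
          4: "agree", 5: "strongly agree"}

# The single "exact number": an average of arbitrary category codes.
print(f"average score: {sum(responses) / len(responses):.2f}")

# Percentages per category preserve the bimodal shape the average erases.
counts = Counter(responses)
for code in sorted(labels):
    pct = 100 * counts.get(code, 0) / len(responses)
    print(f"{labels[code]:>17}: {pct:5.1f}%")
```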
Heightening Insecurity
Stripped of context and interpretation, and with each question arbitrarily given equal weight, such averages heighten the insecurity felt by adjuncts and untenured full-time faculty, some of whom feel increasing pressure to go soft on grades. As universities increasingly adopt corporate models of management, students are often referred to as a college’s “customers” – and many CUNY faculty members say that some administrators conclude that “the customer is always right.” In fact, a recent report on student course evaluations by the American Association of University Professors (AAUP) describes the typical student course evaluation not as a measure of teaching effectiveness but as one of “student satisfaction.” (See sidebar.)
Margaret Savitzky, an adjunct assistant professor at York College, says she grew nervous when she received a negative evaluation, fearing it would have an outsize effect on the assessment of her teaching overall. Savitzky told Clarion she didn’t understand why one student had said she was never available: though she has no paid office hours, she makes a point of arriving early and staying after class to answer students’ questions, and she replies diligently to student emails. Luckily, she had the support of her department – but not every adjunct does.
Effective Design
City College’s School of Education revamped its evaluations several years ago in order to make them more meaningful and more in line with the department’s mission. “The best possible assessment is one that informs teaching and supports learning,” Beverly Falk, a professor in the early childhood education program who helped write the new assessment, told Clarion.
The revamped evaluation includes more descriptive questions that get at specifics of teaching, such as to what extent “the instructor encouraged candidates to inquire, to problem-solve, to question their assumptions, and to think critically.”
“I think the folks who do the work and are in the classroom are the ones who should do the work of evaluating people’s chances for tenure and promotion and definitely be the main voice on whether or not people are reappointed,” said Shen, the chapter chair at LaGuardia, where PSC activists are pressing for reform.
“Faculty are best positioned to decide how student evaluations should be designed and used,” he said.