At the end of every semester, student inboxes are relentlessly pounded with course evaluation e-mails. It’s a guarantee for all St. John’s students as the semester comes to a close, and gripes about these evaluations are commonly heard around campus.
To find out just how unpopular course evaluations really are with students, I contacted Clover Hall, the vice-president of Institutional Research and Academic Planning. She told me that the student evaluation rate hovered at 43 percent last year. That’s a lowly percentage, and Hall’s office has been working hard to increase this number, she said.
I’m thrilled that the folks at the office of Institutional Research and Academic Planning have decided to condense these evaluations into a single e-mail this year, but I’m not sure it will be as effective as they hope in encouraging students to fill them out. I don’t think five e-mails at the end of every semester are responsible for 57 percent of students turning a cold shoulder. Students simply don’t have enough incentive to spend time on these evaluations.
Clearly, students enjoy the act of evaluating their professors’ performances, as evidenced by the popularity and usage of Ratemyprofessors.com. Every full-time professor at this University has a profile and student feedback, and students use the site religiously every semester. Yet, students continue to scoff at the institutionalized evaluations, even though in theory they should have more weighty repercussions.
Other than their professors asking politely in class, students really have no incentive to complete the course evaluations that are sent their way each semester. They can’t see the effects of their comments, and they can’t be sure that what they’re taking the time to write is going to be seriously considered and acted upon.
Many students might not even really know what is done with these evaluations or if their professors actually see their feedback (even though professors are in fact presented with their students’ feedback, according to Hall).
Let’s compare STJ course evaluations with Ratemyprofessors.com evaluations, considering both are tools designed to evaluate professor performance.
What’s fascinating is that institutionalized evaluations would seem to have the more practical effect. In theory, professors would be rewarded or punished for their performances. But as it is now, St. John’s students don’t see the results of their evaluation efforts, if any real results are actually taking place.
With Ratemyprofessors.com, there’s an instant, practical result that students engage in: they post their say, invaluably warning other students away from bad professors or encouraging them to take good ones. We like the stats, too, including a professor’s average difficulty rating and how many chili peppers they’ve received. Students can see their evaluations, watch them make a difference and contribute while feeling that their time and input are actually helpful. The administration needs to consider this if it is to push the evaluation rate over 50 percent.
Which is to say, why aren’t the results of student evaluations made public? Shouldn’t professors be held accountable for their performances with an overall grade? One thing is for sure: the University can do a significantly better job than Ratemyprofessors.com of screening evaluations for the ones that are eloquently written and genuinely meaningful, because the site doesn’t filter poorly submitted reviews at all. After all, if there’s one downfall to Ratemyprofessors.com, it’s the unfortunate effect a few dumb or bitter students can have on a professor’s general reputation.
To make matters worse, students are more likely to log onto Ratemyprofessors.com to post an evaluation after having a bad experience, so the site doesn’t always reflect the truth about a professor. In many cases, a poor review can simply reflect a challenging course, which in no way means the teacher is to blame.
The University has the ability to produce a product similar to Ratemyprofessors.com, only more polished and more accurate. Most importantly, the University then has the ability to use that information to improve the faculty.
It may sound harsh, but at the end of the day it’s imperative that the administration weed out professors who are proving ineffective, or professors who have used the same syllabus for 30 years and refuse to redevelop their courses. This is a key part of their role in improving the St. John’s experience.
If professors have nothing to hide, they’ll never object to this level of transparency. If they do object, a few questions need to be considered: what is that professor’s usual feedback? What effect would their evaluations have on their job security?
If you ask me, students already understand the concept of course evaluation. If we don’t feel we’re benefiting, there’s little reason to care.