Debunking Myths about RateMyProfessors.com and Course Evaluations

Eric Boccaccio
9 min read · Apr 18, 2018


At this point, statements like “The internet is revolutionizing the way that people communicate across the world!” have become dreadfully trite. Society as a whole is so accustomed to the internet as a way of life that people rarely stop to think about how far-reaching the web has truly become. Almost every decision we make nowadays, from the life-changing (which career path to pursue, where to live) to the mundane (where to eat tonight, how much is too much for a pack of staples), is made using information that we find online. Often, this information comes in the form of reviews.

Review websites have been a staple of the internet almost since the beginning, with the launch of ConsumerAffairs.com in 1998. Websites like ConsumerAffairs saw great success, quickly becoming invaluable to people making purchase decisions. Seeing an opportunity, John Swapceinski, a student at San Jose State University, created TeacherRatings.com in 1999 after having a bad experience with a professor[1]. This simple website extended the internet-review concept to university professors, creating a similarly invaluable tool for students deciding what classes to take and which professors to avoid. According to the mission statement, “The site does what students have been doing forever — checking in with each other — their friends, their brothers, their sisters, and their classmates.”

In 2001 the website was renamed RateMyProfessors.com, and it has become a fixture on college campuses in the time since. Its rapid growth has drawn backlash from many critics, from professors to journalists to even some students themselves. They cite “untrustworthy reviews,” and they say that “Students should not base decisions about their education on it … and professors should not get ideas from it.” They say that the reviews do not “represent the wisdom of crowds [2]” and that “only students with the strongest opinions — either for or against — post on [the] website.[3]” After investigating these claims, I have found the general consensus that “RateMyProfessors is inaccurate and does not matter in the real world” to be far from sound. Before I debunk some of these common myths, however, it is important to understand why professor reviews matter so much in the first place.

HOW DOES RATEMYPROFESSORS AFFECT THE STUDENT EXPERIENCE?

“It’s actually influenced my class choices,” a USC student said. “I definitely prefer [knowing how professors are], because then I know what I have to do to do better [in the course]. So if I know that one professor is a really hard teacher, then I know from the beginning to stay on top of the readings[4].” This student is not the only one who is strongly influenced by professor ratings. In a Texas A&M University study, 104 students were presented with two hypothetical professors (A and X): one with positive reviews and one with negative reviews. The researchers measured how attracted each student was to the professor (note: attraction in this context does not mean physical attraction, but rather how much the student favors the professor based on reviews) using the following questions: “How likely are you to choose a class taught by Professor (A/X), how likely are you to recommend this professor to another student, how likely are you to be satisfied with the numerical rating of Professor (A/X), and how likely are you to put this class as your first priority?[5]” The results showed, unsurprisingly, that “students [who] received a positive feedback review were more attracted to the professor.”

Clearly, RateMyProfessors has done a lot to alter whether students decide to take a professor. What’s more, if they do decide to take a professor’s course, studies show a marked change in how students who have looked at reviews experience that course. In a Western Michigan University study, undergraduates were presented “with ratings and comments about a professor, [shown] a 10–20 minute lecture by him, and assessed [on] their perceptions of the educational experience[6].” As in the Texas A&M study, neither the professor nor the reviews were real, although in this study participants were not told that the professor in question was fictitious. Importantly, this study had students watch a lecture after reading the reviews so they could judge the professor for themselves. Interestingly, even though all students watched the same taped lecture, students who were shown positive reviews “reported higher levels of affective learning (i.e., “liking” of the lecture material) and motivation to learn” than their counterparts who had viewed negative reviews, mixed positive/negative reviews, or no reviews at all. Students who were shown negative reviews reported the lowest levels of “affective learning.” Even more surprisingly, the study also found that “students who read positive ratings before viewing the lecture performed significantly better on a quiz [of] the content (a letter grade higher, on average) and reported a greater likelihood of actually engaging in the behaviors the professor recommended.” This suggests that reviews hold much more sway over the experiences of students than most people would expect.

Looked at together, the results of these two studies show that positive reviews cause students to be more “attracted” to professors, and that this attraction creates a positive perception of the professor. According to the Western Michigan study, this positive perception translates into tangible benefits like increased interest in the content, more motivation to learn, and even significantly better grades when tested on the material. The psychology involved here is fascinating, and it shows just how crucial reviews from other students can be in shaping a student’s experience of a teacher.

With this in mind, it makes sense that bad teacher ratings can seriously damage both a professor’s reputation among students and the academic performance of students in those classes. It is also very interesting to note that these reviews can create a positive feedback loop, in which positive reviews cause a student to have positive perceptions of a professor, thereby generating more positive reviews[7].

DOES RATEMYPROFESSORS AFFECT A TEACHER’S STANDING WITH THE UNIVERSITY?

Although RateMyProfessors reviews have a tectonic influence on how students see their professors, there is little evidence that these reviews affect how universities evaluate their professors. Andrew Coile, a former Computer Science professor at California State University and current Apple Inc. employee, says on an online forum that “For faculty RTP (Retention, Tenure, and Promotion) process, the official student evaluations were the only ones considered[8].” For context, most universities conduct official student evaluations at the end of every semester. These are used by the university to determine things like faculty tenure, and by professors themselves as constructive feedback. Most universities keep the results of these evaluations private, although there has been a recent movement to publicize them[9] (more on this later). Other professors on the forum echo the assertion that universities do not look at RateMyProfessors when discussing hiring, promotion, or tenure. It is worth noting, however, that official course evaluations may be indirectly affected by RateMyProfessors reviews through the positive feedback loop discussed above; if so, RateMyProfessors could have a subtle effect on a professor’s standing with his or her university.

ARE RATEMYPROFESSORS REVIEWS ACCURATE?

Although universities themselves do not use them, ratings posted on RateMyProfessors.com clearly have a strong bearing on professors’ reputations among students. Given that this is the case, it is important to ask whether it is fair for online reviews to shape professors’ perceived teaching ability in such a significant way. Whether or not it is reasonable for online reviews to sway professor reputations depends on the overall accuracy of these reviews. While the accuracy of subjective comments can never be perfectly measured, analysts have been able to gain access to official university course evaluations as a benchmark to compare against.

The results of these benchmarks are surprising. One study published in the Journal of Education for Business includes data from five different universities and 1,167 faculty members on many different facets of RateMyProfessors ratings[10]. The study finds, among other things, that RateMyProfessors summary ratings actually “correlate highly” with official course evaluation summaries. A Lander University study was able to replicate these results: “Easiness website ratings were significantly positively correlated with actual assigned grades. Further, clarity and helpfulness website ratings were significantly positively correlated with … the institutionally administered … forms[11].” This same correlation was found in other studies as well[12]. Although accuracy relies on many factors, these studies at least provide evidence that RateMyProfessors rankings generally mirror the results of official course evaluations, which university administrations accept as accurate. This will, of course, vary from professor to professor depending on how many reviews have been left. Some researchers have found that 10 reviews is enough to get a rough consensus[13], and some have found accuracy with only a single review[14], although more reviews are always better.
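To make the benchmarking idea concrete, here is a minimal sketch of how an analyst might correlate RateMyProfessors summary ratings with official course-evaluation averages for the same set of professors. The numbers and variable names below are hypothetical stand-ins, not data from the studies cited above, and the actual studies may have used different software and statistical methods.

```python
# Minimal sketch: correlating RateMyProfessors overall ratings with official
# course-evaluation averages for the same (hypothetical) professors.
# All numbers below are invented for illustration only.
from statistics import mean
import math

rmp_overall   = [4.8, 3.2, 2.5, 4.1, 3.9, 1.8, 4.5, 2.9]  # RMP "overall quality" (1-5)
official_eval = [4.6, 3.4, 2.8, 4.3, 3.5, 2.1, 4.4, 3.1]  # university evaluation average (1-5)

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson_r(rmp_overall, official_eval)
print(f"Pearson r between RMP and official ratings: {r:.2f}")
# A value near +1 on real data would echo the "correlate highly" finding
# reported in the studies above; a value near 0 would suggest the two diverge.
```

A real analysis would of course involve far more professors per institution and a test of statistical significance, but the basic comparison works along these lines.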

UNIVERSITIES SHOULD RELEASE COURSE EVALUATIONS

Although RateMyProfessors already shows general accuracy when compared to university course evaluations, one way to improve accuracy further would be for universities to release their own course evaluations publicly. The way I see it, universities have no excuse not to disclose this data. Most universities do not give reasons for withholding course evaluation results, and the few arguments that I could find hardly hold water. At USC, Vice Provost for Academic and Faculty Affairs Elizabeth Graddy cites “University privacy concerns” and “incivility” in online responses[15]. To be fair, this is a good argument for not releasing specific student responses to open-ended questions, but course evaluations typically include few open-ended questions, if any at all. In my experience, course evaluations are mostly done on bubble sheets with 10–20 multiple-choice questions about the class and professor. The privacy argument makes little sense as a reason to withhold multiple-choice data.

Another argument Graddy gives against disclosing course evaluations is that her department is “trying to figure out how we can improve our instrument” because the evaluations found “a bias against women and faculty of color.” Interestingly, the Faculty Senate at the University of Michigan cites the exact same reason as an argument against releasing course evaluations[16]. The fact that two universities found the same race/gender bias in their course evaluations likely means that these skewed results are not a problem with the “instrument,” but rather an indication of an overarching issue that needs to be addressed. Trends like this are actually a good reason to release the data, because doing so can facilitate academic discussion about the issue. Keeping data under lock and key simply because the results are unpleasant is not something that the scientific community tolerates, and prestigious universities like UMichigan and USC need to set an example. To UMichigan’s credit, some of their course data was eventually released to students a year later[17]. If you are interested, there are many more bulleted arguments against releasing course evaluations on this page. I will not address all of them here, both for the sake of brevity and because I feel that the remaining arguments are weaker than the ones already discussed.

CONCLUSION

Universities should absolutely release course evaluations, but at universities that don’t, RateMyProfessors is the only place for students to get intel on a professor they are unfamiliar with. Because of this, I say that students should go ahead and look at the reviews if they want to. Contrary to common belief, students are relatively safe in consulting RateMyProfessors reviews, because the reviews are fairly accurate. They should be aware, however, of the effects (positive and negative) that reviews can have on student psyche and performance. RateMyProfessors is certainly not perfect. Some potential problems include the fact that deceitful professors can underhandedly rate themselves and that students can submit more than one rating. Because analysts have found reviews to be generally accurate, however, these are likely problems only in rare, isolated cases. Most professor pages with at least a few reviews have a high chance of being trustworthy. This website really does have the power to dramatically change college enrollment and student interest in their teachers and course material. I hope that all of this knowledge is helpful to you in making an educated decision about courses. Good luck in your professor-hunting, and may the enrollment gods ever be in your favor!

[1] https://www.wired.com/techbiz/media/news/2005/09/68941?currentPage=all

[2] https://www.nytimes.com/2010/03/14/magazine/14FOB-medium-t.html

[3] https://sites.sju.edu/blogs/2015/11/05/7-reasons-why-i-hate-rate-my-professor/

[4] https://dailytrojan.com/2018/02/20/rate-professors-influences-course-choices/

[5] https://static1.squarespace.com/static/5443d7c7e4b06e8b47de9a55/t/59b49ca6f7e0abec5d24cd2b/1505008807272/Who-Needs-Rate-My-Professor3.pdf

[6] https://www.natcom.org/communication-currents/instructors-corner-1-think-RateMyProfessorscom-doesnt-matter-think-again

[7] https://doi.org/10.1080/02602938.2011.594497

[8] https://academia.stackexchange.com/questions/101009/do-professors-care-about-their-rating-on-websites-such-as-RateMyProfessors-or-koo

[9] https://dailytrojan.com/2018/02/20/rate-professors-influences-course-choices/

[10] https://www.tandfonline.com/doi/abs/10.3200/JOEB.84.1.55-61?src=recsys

[11] https://doi.org/10.1080/02602930802079463

[12] https://doi.org/10.3200/CTCH.57.2.89-92

[13] http://pareonline.net/pdf/v15n5.pdf

[14] https://www.tandfonline.com/doi/abs/10.3200/JOEB.84.1.55-61?src=recsys

[15] https://dailytrojan.com/2018/02/20/rate-professors-influences-course-choices/

[16] https://record.umich.edu/articles/faculty-senate-passes-resolution-course-evaluation-data

[17] https://www.michigandaily.com/section/news/art-20-gives-students-more-information-about-courses
