It’s been a rough semester, but we’ve almost cleared this term’s mountain of work.
As you prepare for your final exams with a groggy smile and large coffee in your hand, your inbox dings with a new email from the school with the subject line: “Carleton Teaching Evaluations.”
You already know this email is one of at least three you will be receiving over the next week and choose to ignore it as you go about your day.
A couple of days later, your inbox dings again with the same email. Whether or not you liked your professors, you know the evaluation would only take about five minutes to complete.
With 13 questions asking you to rate everything from how audibly your professor spoke to whether they returned assignments promptly, the whole experience seems tedious, so you choose to ignore it again.
Besides, rating your professors won’t have any impact on them anyway, except for the optional anonymous comments at the end, right?
Clicking that small ranking out of five for how you assess your professor overall won’t actually do anything, right?
It turns out, you actually have a lot more power than you may think.
A matter of instructional life or death
Dan Preece, a political science contract instructor and vice-president (internal) of CUPE 4600, said, “We live or die with our courses on those teaching evaluations.”
“It has a huge impact on us.”
The final question on each evaluation, asking how you rate the teacher overall, determines the fate of contract instructors and is taken into account in award and pay decisions for professors.
Contract instructors work on either a four-month or year-long contract, leaving them potentially jobless at the end of these short terms. Professors have more job security, with an indefinite contract, which gives teaching evaluations less power over their long-term jobs.
If contract instructors receive less than a 4.0 average on students’ answers to the last summative question about their overall performance, they may be required to create an improvement action plan with their department chair. If they again receive an average of less than 4.0 for the same course, they are “essentially fired from that course,” according to Preece.
Andrew Robinson, a contract instructor of physics at Carleton, received a 4.6 average on his student evaluations last term and made his scores public online. He said it’s important for students to know how critical evaluations are in whether or not he has a job next term.
“Students hold an enormous amount of power over contract instructors,” Robinson said. “Student evaluations create a good picture so you can pick up on areas that students are finding difficult, but can be misused because they can be just used as an exercise in popularity.”
Although student input is vital, Robinson said the results can be based on external factors, and there should be other methods of measuring the teaching and learning process.
“The students should have some say, but really it’s a question of co-ordinating it across the course to make sure that the teaching quality isn’t just based on student experience,” he said.
After students fill out teacher evaluations, they are sent to Carleton’s Institutional Research & Planning department where the data is collected “so appropriate people see the results,” said Bruce Winer, assistant vice-president of the department.
Winer explained his department facilitates the process and is “like the Elections Canada,” in that its job is to report results to the various parties as set out in collective agreements with the unions.
Having an axe to grind
In 2012, the university began moving evaluations online, causing an eight per cent decrease in the number of students filling them out compared to evaluations completed in class.
Winer said response rates always fall when transitioning to online forums, but numbers didn’t decrease as much as expected.
While professors have the option to have students fill out evaluations in paper or online, contract instructors are only allowed to have their students complete their evaluations online.
Now, about 45 per cent of students at Carleton fill out teacher evaluations.
Although Winer said it would save time for the people responsible for the data collection for all evaluations to be online, Preece said this move would be dangerous.
“If you do it online, you never have to step in the classroom. Perhaps you’ve never gone to class and handed in assignments, got bad grades and have an axe to grind,” Preece said.
He emphasized online evaluations cannot control the environment students are in when they decide to fill them out, which can be problematic.
Sarah Ficca, a third-year law student at Carleton, filled out all her teacher evaluations early, but said she doesn’t think that’s the case for all students.
“I’ve heard that people only fill them out if they’ve had a bad experience. With the online ones, I feel like the importance isn’t there anymore,” she said, adding how with her in-class paper evaluations she has found her teachers are more likely to “give the spiel” on why they matter.
Preece said contract instructors have seen comments from online evaluations become more personal, rather than focused on the teaching and learning of a course.
“It creates that other barrier, that distance between individuals, which—much like Internet comments—are actually allowing a greater feeling of antagonism to come out,” he said, adding that he has seen other contract instructors receive comments that are “borderline racist or sexist.”
With the danger of this turning into an open forum for students to be hateful, Preece said these evaluations could become like the RateMyProfessors site.
But Robinson said this may not necessarily be a bad thing.
Robinson, who posts all his formal teaching assessments and links to his RateMyProfessors page on his own website, said there is increasing evidence that shows the data from the two assessments coincide.
Keeping the public informed
In 2007, researchers at the University of Maine published a study comparing RateMyProfessors.com evaluations to in-class student evaluations of teaching and found there was a reasonable correlation for people who get high evaluations on traditional surveys.
The researchers suggested the study’s policy implication was that “higher education institutions should make their data publicly available online.”
“Although students doubtless would applaud this move, many faculty would oppose it because of genuine concerns about privacy and the negative consequences that published data may bring . . . But privacy is a thing of the past,” the study reads.
“Moreover, by not making data available to students, the negative consequence is greater still: Students will rely on what is publicly available,” the study continues, outlining the problematic features of RateMyProfessors, including not knowing the environment in which a rating was given or the legitimacy of the person who wrote the comments.
Preece and Robinson both said there is room for improvement within the current evaluation system. Both contract instructors create a mid-term evaluation for students to fill out halfway through the term and have interactive opportunities throughout the course to get student feedback.
This feedback involves more than ranking multiple-choice answers out of five. It includes targeted, long-answer questions that Preece said he can take into account in his teaching.
Student evaluations, peer review, and self-assessment are the three pillars of the triangular approach to assessment that Preece said teaching evaluation should be based on.
Peer review of classes by other professors happens infrequently and isn’t done in a systematic way, according to Preece.
“It doesn’t get at the quality of our instruction,” he said. “Or an assessment on whether we’re teaching effectively.”
Taking responsibility
Although the system of teaching assessment is different for tenured professors, teacher evaluations from students are still used in the “process leading to decisions on the award of career development increments to faculty members, thus affecting annual salaries,” said André Plourde, Dean of Carleton’s Faculty of Public Affairs.
Evaluation comments are only seen by the teacher of the course, but Plourde said the data is shared with departments and the dean to “build and improve teaching and pedagogical capacity.”
These 13 questions asking you to rank your teacher from one to five could take you less than five minutes to complete. But the work done with these evaluations spans various departments and is a central deal-breaker in faculty decisions.
Although Robinson said there are flaws with the current assessment system, he said it’s important to make students aware of just how much power they have.
“This is a particularly poor evaluation tool, but it’s the one I have to deal with,” Robinson said. “I now tell students this is the only way I get rated, so think responsibly.”
Erica Howes asked students about evaluating their professors. Check it out here: