UB rolls out new course evaluation system
Students can evaluate classes in new uniform system for all departments
UB’s making it easier to share how you feel about Psych 101 – and every other class you may have sat through this semester.
A new way to evaluate courses hits campus Monday, allowing students to fill out evaluations in a single, uniform online system. Previously, six separate evaluation systems were used across 14 schools, costing the university more than $50,000 and leaving UB unable to use assessment data to its “full potential,” according to Scott Weber, senior vice provost for Academic Affairs.
On Nov. 17, students will receive an email with a link to the new evaluations, which will provide faculty with an immediate report of the results.
“There was no ability to look at data across the university and amalgamate it in a way that’s really interesting and helpful to our faculty,” Weber said of the old systems. “And this provides a much stronger assessment ability to sort of tease out what students are saying, and how we interpret [responses] is just much stronger in this platform.”
In March 2013, the Faculty Senate created the Committee for University Wide Course Evaluations to implement the new system. The group came up with a proposal for the new program and the Faculty Senate passed the proposal in April 2014.
In the old system, the School of Architecture and Planning and the Law School still used a paper system for their evaluations. The total cost needed to maintain the six systems was $51,444, according to a report created by the committee.
The university chose Campus Labs, a specialized assessment program, to provide the new system. It has a total cost of $49,754, according to the same report.
Carol Van Zile-Tamsen, associate director of education innovation and assessment, said a pilot program ran this past summer. The School of Nursing, the Graduate School of Education and select departments in the College of Arts and Sciences participated in the pilot.
Robert Cenczyk, assistant to the academic dean of the School of Nursing, said the response rates “exceeded the electronic evaluations we collected in previous evaluations.”
The School of Nursing was the only school that had run summer evaluations before the pilot, and its response rates improved by about 10 percentage points, according to Van Zile-Tamsen. The previous response rates were about 30-35 percent, and they increased to 40-45 percent, she said.
After students fill out the evaluation, a report will be created for the faculty. In the past, it would take a long time for the information to be consolidated and some reports were created too late to be applicable for the next semester, she said.
Dr. Debra Street, a sociology professor and chair of the department, said the major difference in the new system is the ability to ask more course-relevant questions.
“The previous evaluation system was rather rigid, pretty much ‘same questions same way,’ no matter what the course was – large or small, physics or theatre, hands on or theoretical – as if all courses were alike and could be evaluated on identical criteria,” she said in an email. “The new system offers more flexibility to ask questions that are course-relevant.”
Each survey can have up to 29 questions. Students in every course will be asked 16 core questions, which cover aspects like general satisfaction with the course, instructors and facilities. Each department can add 10 custom questions that also address general course satisfaction. Faculty members also have the option of adding three “qualitative or quantitative” questions, which are customizable by each instructor.
Van Zile-Tamsen said students averaged 10 minutes to complete the survey over the summer.
“This is really a chance for students to provide feedback that is meaningful,” Weber said.
Street said she uses student evaluations to make changes to her classes for the next semester.
“The better the feedback, the better my insight into what students already know, what they still need and want to learn, and what they like and don’t like about my instructional approach once the semester ends,” she said.
Doaa Ahmed, a junior pre-pharmacy major, said having the evaluations in one place helps students avoid “digging around” for the links.
She said the evaluations are helpful when she wants to let a professor or teaching assistant know about how the course went.
Nick Oddo, a junior political science major, said he does not see the evaluations having a large impact.
“In a big school like this, it’s hard to develop that sort of personal relationship,” he said.
He said most students probably don’t think the evaluations affect their academics, or don’t care about the assessments at all.
Ahmed said she doesn’t know where the information goes after filling out the evaluation.
Cenczyk said the system should continuously be evaluated even after the launch.
“What I feel is important in this process is the university’s commitment to constantly revisit the system and make improvements from year to year,” he said in an email. “It does us no good to develop this system just to let it get outdated and collect data that nobody uses.”
The course evaluation system can be filled out online and is mobile-friendly.