Rudy is home! He is a bit groggy (for him), but aside from that, he seems fine. He can't eat or drink anything until tomorrow, and that is going to be a challenge.
All university classes use a multiple-choice course evaluation that is machine-read. We generally don't get results for a few months, and the data is not all that illuminating, although most students do fill it out. For several semesters, I have also used an online assessment, the Mt. Royal College FAST. I can design up to 20 questions myself, and the questions can be open-ended short answer, Yes/No, and so on. Usually about half the students participate in any given semester.
I get a lot of valuable feedback this way, and I have used student comments to make revisions to the class, but you really have to develop a thick skin! I never review the responses until after the grades are in, partly because time is limited, but also because I don't want the assessment to color my attitude toward the class. Anyway, this semester, in my on-campus section, two students responded that, given the choice, they would not take the class again. This might not have bothered me so much except that it probably wasn't related to the expected grade (the lowest grade in the on-campus cohort was a C-), and one of the students wrote the nastiest comments wherever there was the option! Just as an example, one question was "What did you like most about this class?" The answer? "Nothing, aside from the fact that it was once per week."
Why is it that all the other wonderful comments, and the constructive criticism, never have quite the same impact?