Student Assessment of Learning Gains

The College of Charleston has recently moved to a paperless, online-only system for course and instructor evaluations. The obvious benefit of the new system is that instructors no longer have to use class time for student evaluations, and students no longer have to shuttle sealed envelopes from one building to another once the evaluations are complete. I’m a big proponent of technology-enhanced learning, and while I appreciate the time (and environmental) savings of the new system, I find myself frustrated with it. One recurring issue is the very low response rate each semester. Any of our Math 104 (“Elementary Statistics”) students can tell you about the problems with a voluntary response sample.

But the low response rate isn’t my main problem with the evaluations. In an ideal world, the course evaluations would provide statistically meaningful data that helps me guide course design, structure, and content. Unfortunately, they don’t. For example, one question asks students to rate (on a Likert scale) the statement, “The instructor showed enthusiasm for teaching the subject.” Yes, I am enthusiastic in my classroom (both about teaching and about mathematics), and I am happy that my students notice and enjoy my enthusiasm. But this doesn’t help me teach the course better. I would prefer student feedback on statements like, “In this course I learned to work cooperatively with my peers to learn mathematical concepts.”

Overall, my issue with the evaluations is that the questions posed are teacher-centered instead of learner-centered. For example: rate the statement “Overall this instructor is an effective teacher.” This statement removes the student’s responsibility for their own learning. Compare with the following: rate the statement “Overall in this course I developed skills as an effective learner.” My biggest goal in a mathematics course is to give students problem-solving skills they can use beyond my classroom. If a professor gives a fantastic lecture, that’s great; but that may not be helpful to students five years from now. Instead I hope to give students skills, practice, and experience in critical thinking, problem solving, complex reasoning, and so on. Rating whether or not they’ve learned these skills is more important than rating “Overall, the required textbook was useful.”

Of course, figuring out how students have grown academically or intellectually is difficult. In this semester’s Precalculus classes, I’m working with another instructor to design course content. One of the things we decided to do was to use something similar to the Student Assessment of their Learning Gains (SALG) tool in an attempt to gather data on student progress through the course. The students begin by taking a benchmark SALG survey, and they will repeat a similar survey two or three times throughout the semester. We are hoping to gather meaningful data on the growth of their skills by tracking things like whether they are in the habit of “using systematic reasoning in the approach to problems” or “using a critical approach to analyze arguments in daily life.” Hopefully this data will prove useful as we continue to tweak the course moving forward.
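As a rough illustration of the kind of comparison we are after, here is a minimal sketch in Python of computing a mean gain per survey item between the benchmark survey and a later follow-up. The item names and scores are invented for illustration, and this is an assumed workflow on our end, not the official SALG reporting tools.

```python
# A minimal sketch (assumed workflow, not the official SALG reporting tools)
# of comparing paired Likert responses between a benchmark survey and a
# later follow-up. Item names and scores are invented for illustration.
from statistics import mean

# One 1-5 Likert score per student, in the same student order in both surveys.
benchmark = {
    "using systematic reasoning in the approach to problems": [2, 3, 2, 4, 3],
    "using a critical approach to analyze arguments in daily life": [1, 2, 3, 2, 2],
}
followup = {
    "using systematic reasoning in the approach to problems": [3, 4, 3, 4, 4],
    "using a critical approach to analyze arguments in daily life": [2, 3, 3, 3, 4],
}

for item, before in benchmark.items():
    after = followup[item]
    gain = mean(a - b for a, b in zip(after, before))
    print(f"{item}: mean gain {gain:+.2f} on a 5-point scale")
```

Whether those per-item averages turn out to be meaningful with our class sizes remains to be seen, but they at least give us something concrete to compare across the semester.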

Wordle

Earlier today, Derek Bruff (@derekbruff) tweeted a link to a Wordle made by graduate student Jessica Riviere. Jessica blogged about her Wordle, so check out this link for what she had to say. Her Wordle was built from her teaching evaluations, specifically the comments her students had written. This was clever and fun, and it inspired me to make one as well.

I used my course evaluations from College of Charleston students during the last academic year (Fall 2011 through Summer 2012). Altogether I have data from eight courses (covering several sections of Elementary Statistics, Pre-Calculus, and Linear Algebra) for a total of 114 evaluations. To make the data collection easier, I restricted my focus to just the “Comments on Instructor” and “Comments on Teaching” prompts. This meant ignoring the comment sections on “Organization,” “Assignments,” “Grading,” “Learning,” and “Course.”

The most frequently used words were: and, the, to, I, is, she, a, class, was, of, her, Owens, with, Dr. Several of these were removed by Wordle since I had chosen to “Remove common English words.” I also removed my first name and corrected some misspellings (e.g., “explaiend” to “explained”). I enjoyed the following word counts: awesome, 6; funny, 5; humor, 5; and enthusiastic, 9.
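For anyone curious about the mechanics, the tally itself is straightforward. Here is a rough sketch in Python of the kind of stop-word filtering and frequency counting involved; the sample comments, stop-word list, and misspelling fixes are stand-ins, not my actual evaluation data or Wordle’s internals.

```python
# A rough sketch of the word-frequency tally that feeds a Wordle-style cloud.
# The sample comments, stop-word list, and misspelling fixes below are
# stand-ins, not the actual evaluation data.
from collections import Counter
import re

comments = [
    "Dr. Owens was awesome and enthusiastic",
    "She explaiend the material with humor",
]

stop_words = {"and", "the", "to", "i", "is", "she", "a", "was", "of", "her", "with"}
fixes = {"explaiend": "explained"}  # correct known misspellings before counting

counts = Counter()
for comment in comments:
    for word in re.findall(r"[a-z']+", comment.lower()):
        word = fixes.get(word, word)
        if word not in stop_words:
            counts[word] += 1

print(counts.most_common(10))
```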

Wordle: Eval Cloud