We were told many times how wonderful, how absolutely amazing, the new program was, and how well loved it was by the students. But, looking around online, students had something different to say:
Faculty: “Did you see her students’ comments on Ratemyprofessors.com?”
Me: “Crap, I was going to tell you about it. I didn’t think anyone else goes there.”
Faculty: “I thought I’d do some investigation.”
--Faculty are not stupid; we know better than to take anything from administrators at face value. We also know that when administration wants us to “say something about” their new ideas, it’s only because it’s easier to cram things down our throats if we open our mouths first.
My online search also turned up what students had to say about her, and indirectly about the program, at Ratemyprofessors.com. Despite the presentation’s claims of great teaching skill, she has miserable ratings there. Now, I grant that only the most irate (and probably failing) students go to an online site to post their complaints, but after being forced to watch videos of student testimonials about how life-changing the new program is (I’m not exaggerating their claims!), I think it’s fair to present testimonials biased in the other direction.
She’s rated in several categories at Ratemyprofessors.com, and “easiness” is by far her best category. Granted, “easiness” isn’t on a collegiate student evaluation of teaching form, but this website caters to student desires. An administration actually curious about whether students are being challenged (critical to education!) would have such a question on its evaluations, and it’s very telling that no such question appears. Qualitatively, here are some abbreviated quotes, some obviously from students taking the program with this teacher:
“Hardly a
teacher at all. All she does is talk, talk, talk. The second day of class she
taught us how to turn on our calculator. The tests don't make sense.”
“HORRIBLE!!!
she is so rude and doesnt seem like she cares about anyone including her TAs.
She acts as if she is better than everyone and her lectures are completely
WORTHLESS. STAY AWAY!”
“A lot of people don't go to the lectures because they are extremely boring and attendence doesn't count, but some test questions come straight from lecture. The labs are never fully coordinated. Tests are all about theory more than solving problems. Use the answer choices to solve test questions.”
“Her tests
are always about concepts with no actual calculations of answers.”
“Her tests
are all on principles and alghorithms. There were hardly any problems you had
to actually solve”
“stay away
from this teacher taking this class with her will be the biggest waste you will
ever come across”
“Wholly
education/grant-funded class. Does a horrible job explaining in class. Try to
get a different professor unless you enjoy being taught like you're in middle
school.”
Not a single student gave a positive review, not one, though I’ll happily concede considerable bias at that site. Looking too closely at negative student evaluations usually isn’t a good idea, but when the same specific complaints keep coming up over the course of years, there’s usually some truth to them. Complaints like these had to have turned up in the student evaluations administered at her school…and yet administration never wonders why students consistently say the course is strange in some way, instead looking only at passing rates. Reviewing her complaints and ratings, a more reasonable conclusion about her higher passing rates is that she simply gives easy multiple-choice tests where you can get the answer just by reading the questions; the program has nothing to do with it. All administration saw was higher retention; issues like the course being a “waste of time” are insignificant next to the only goal of merit to an administrator.
After the pitch, the head pitchman tells us that when this program is tried at other institutions, retention rates vary from 25% to 77%; in other words, outside of controlled situations, the co-requisite plan doesn’t seem to be working at all. He asks that if we get high retention, we contact him and tell him what we did, so they can put it in their recommendations for the program. Some faculty drove hundreds of miles to be told how to implement this plan (sic) to improve retention, and the people pitching it already know it doesn’t work and are looking for help.
Time and again, bad statistics, using methods anyone with a basic knowledge of the subject would know to be invalid, are used to justify the newest change. Is it any wonder that the education of remedial students hasn’t improved under any of these plans?
It's nuts.