So I'm watching this Mermaid show on Animal Planet, and it's basically one long w.t.f.
Not one of these supposed scientists is outraged? Granted, all the alleged scientists really look like actors and speak in the same hesitant, earnest tones.
Does anybody really buy this crap? Was Animal Planet so limited in imagination that they couldn't come up with *anything* interesting in the ocean, so they had to air this garbage instead? Seriously, why would they run such a show?
Sunday, May 26, 2013
Friday, May 24, 2013
Been working to get more coverage...
So now I'm on Rense:
Think about that,
Title: An Implication From the Penn State Scandal that Major Media Missed
A recent article in the Chronicle of Higher Education discussed the highest paid university president in 2011: President Graham B. Spanier. While the name might not be familiar, his institution was in the headlines at the time: Penn State. That's right, Mr. Spanier was for many years president of the school at the center of the most grotesque (known) scandal in higher education, the Sandusky Affair.
Mr. Spanier stands to face multiple criminal charges for his part in ignoring the many specific complaints that the Penn State administration received over a decade regarding, well, some most unpleasant behavior going on in the showers. Thus, he was "let go" from Penn State, receiving a golden parachute package of nearly 3 million dollars (insert "Dr. Evil" laugh here).
He'll also get to keep his $600,000-a-year professorship in the department of human development and family studies, after a one-year, fully paid sabbatical. For such a vast sum of money, this must be a pretty important position. That said, I've been in institutions of higher education for nearly 30 years now, and I've never even heard of such a position before (much less a whole department).
It's quite possible Mr. Spanier didn't know what was going on in the showers, so maybe he's entitled to such a huge severance package, but there's a larger issue that nobody in the media has managed to figure out. Allow me to connect some dots now:
One detail not fully addressed in the sordid Sandusky affair concerns the years and years during which he was allowed to engage in his "inappropriate" behavior despite witnesses reporting it to administration on multiple occasions: nobody has asked how this could happen. The answer is simple: stopping this activity is not in the administrator's job description. Administrators have two key goals, which trump all other considerations: retention and growth. Retention means keeping students in the system as much as possible, and a professor having low (with administration defining "low") retention will generally not keep his job. Since the easiest way to lose a student is to fail a student, "Failure is not an option" has taken on new meaning in higher education. Growth generally means growth of the student base, as more students means a larger institution, and a larger institution means more administrative pay; thus are standards annihilated nowadays, as standards cut into growth.
So, it was no grand conspiracy that administrators would cover up even the most grotesque of behavior on campus. The scandal, if revealed, could easily drive students away, reducing both retention and growth. It was no conspiracy…administrators were merely doing their job, nothing more.
But here's the thing missed in this debacle: administrators are not physically attached to a single school. There's a bit of a revolving door in higher education; an administrative career is marked by going to an institution, improving retention and growth, and then being hired/promoted to another institution. Job descriptions for open positions almost invariably describe how candidates must have demonstrated success, in growth and retention, in previous positions.
There's nothing special about the administration of Penn State as compared to the administration of NYU, SLU, or UCLA. This behavior would've been covered up anywhere else.
The administrators at Penn State were not some good ol' boys with the same last name or with 20 years of experience working together; they were a collection of administrators who had all moved up through the system the same way, via numerous institutions, by improving retention and growth, sacrificing everything else about education for these goals and these goals alone. Thus, like an ancient Greek phalanx, each administrator knew what to do without there necessarily being any specific order given from above: retention and growth are paramount, and the sodomization of prepubescent boys in the showers on campus is nothing next to those goals.
Now we come to the seed I must plant: if a simple random sample of administrators has so little concern for integrity and human decency that implicitly condoning the sodomizing of children does not cause even one administrator to resign in disgust over the course of years, is it possible that their void of integrity fails the cause of education in other ways?
Think about that,
Monday, May 20, 2013
So, last time around I spoke of the new plan to fix remediation, which the majority of students entering higher education need (and I remind you that 90% of these remedial students get nothing from their time here but a lifetime of debt). The new program was headed by someone who misled us about her qualifications, while administrators, who had to know about the deception, stood by and said nothing.
We were told many times how wonderful, absolutely amazing, the new program was, and how well loved it was by the students. But, looking around online, students had something different to say:
Faculty: "Did you see her students' comments on Ratemyprofessors.com?"
Me: "Crap, I was going to tell you about it. I didn't think anyone else goes there."
Faculty: "I thought I'd do some investigation."
--Faculty are not stupid; we know better than to take anything from administrators at face value. We also know that when administration wants us to "say something about" their new ideas, they only do so because it's easier to cram things down our throats if we open our mouths first.
My online search also found what students had to say about her, and indirectly about the program, at Ratemyprofessors.com. Despite the professor's presentation claiming great teaching skills, she has miserable ratings there. Now, I grant, only the most irate (and probably failing) students would go to an online site and post their complaints, but after being forced to watch videos of testimonials from students about how life-changing--I'm not exaggerating their claims!--the new program is, I think it's fair to present testimonials biased in the other direction.
She's rated in several categories at Ratemyprofessors.com, but "easiness" is by far her best category. Granted, "easiness" isn't on a collegiate student evaluation of teaching form, but this website caters to student desires. An administration that was actually curious about challenging students--critical to education!--would have such a question on the evaluations, and it's very telling that there is no such question. Qualitatively, here are some abbreviated quotes; some are obviously from students in the program with this teacher:
"Hardly a teacher at all. All she does is talk, talk, talk. The second day of class she taught us how to turn on our calculator. The tests don't make sense."
"HORRIBLE!!! she is so rude and doesnt seem like she cares about anyone including her TAs. She acts as if she is better than everyone and her lectures are completely WORTHLESS. STAY AWAY!"
"A lot of people don't go to the lectures because they are extremely boring and attendence doesn't count but some test questions come straight from lecture. The labs are never fully coordinated. Tests are all about theory more than solving problems. Use the answer choices to solve test questions."
"Her tests are always about concepts with no actual calculations of answers."
"Her tests are all on principles and alghorithms. There were hardly any problems you had to actually solve"
"stay away from this teacher taking this class with her will be the biggest waste you will ever come across"
"Wholly education/grant-funded class. Does a horrible job explaining in class. Try to get a different professor unless you enjoy being taught like you're in middle school."
Not a single student gave a positive review, not one, but I'll happily concede considerable bias at that site. Looking too closely at negative student evaluations usually isn't a good idea, but when the same specific complaints keep coming up over the course of years, that usually means there's some truth to them. Complaints like these had to have turned up in the student evaluations administered at her school…and yet, administration never wonders at students consistently saying the course is strange in some way, instead looking only at passing rates. In reviewing her complaints and ratings, a more reasonable conclusion about her higher passing rates would be that she just gives easy multiple-choice tests where you can get the answer just by reading the questions; the program has nothing to do with it. All administration saw was higher retention; issues like "waste of time" in the course are insignificant next to the only goal of merit to an administrator.
After the pitch, the head pitchman tells us that when this program is tried at other institutions, retention rates vary from 25% to 77%; in other words, outside of controlled situations, the co-requisite plan doesn't seem to be working at all. He asks that if we get high retention, we contact him and tell him what we did, so they can put that in their recommendations for the program. Some faculty drove hundreds of miles to be told how to implement this plan (sic) to improve retention, and the people pitching it already know it doesn't work and are looking for help.
Time and again, bad statistics, based on methods anyone with basic knowledge of the subject would know to be invalid, are used to justify a new change. Is it any wonder at all that the education of remedial students hasn't improved under any of these plans?
It's nuts.
Saturday, May 18, 2013
A quick interlude...
Hey, remember Penn State? Seems like something pretty horrific happened there, so bad that the President of the University had to leave his position.
Even with such a scandal, however, that didn't keep him from being the highest paid university president last year, receiving close to 3 MILLION bucks.
It's a system of plunder, folks, little more. Now, I'll grant the guy at the top doesn't necessarily know everything that goes on in an institution, even if, in this case, he'd been there for at least a decade, so maybe he doesn't need to be penalized. The top guy at my own institution doesn't seem to know all the levels of corruption and incompetence going on at the lower tiers, either.
Still, a $3 million golden parachute is pretty sweet under the circumstances. Wonder what my severance package will be like when it comes.
Friday, May 17, 2013
Remediation is a disaster...so let's look at the new plan (har)
Ever-Changing Remediation: A Specific Example
"We've been doing it wrong, focusing on getting remedial students to pass remedial courses. This new process puts remedial students directly into college classes, and passes them from there!"
--Administrator, explaining the latest amazing new idea on August 6, 2012
Administrators are well aware that remedial students tend to fail college classes, and rather than make the choice of integrity (stop loaning money to so many remedial students), they are always very eager to try something, anything, to get these students to pass. Improved passing, improved retention, is the only definition of a good education an administrator can understand, all those unemployed but degree-holding people occupying Wall Street notwithstanding. Almost every year I endure yet another cockamamie plan to improve retention, so allow me to cover the most recent one in detail:
On August 6, 2012, I attended six hours of meetings to address the latest method: co-requisite courses for remedial math and English. I can't even guess at the money spent merely to have English and math faculty, along with administrators, travel from all over the state to Baton Rouge for this catered meeting. First, administrators present an array of statistics regarding how amazing the new idea is: at every level, "education" is ignored, and instead all improvement is measured in passing rates. The entire definition of a successful program is whether it passes students or not. Every statistic presented to us is all about retention. After a flood of such data, impressive to administrators and irrelevant to educators, we're told of the new plan.
The new plan this time around? Instead of having students take remedial classes before taking college classes (for example, taking College Preparatory Algebra II before being allowed to take College Algebra), students now take both courses, remedial and college level, at the same time, and are forced to get extra support. The remedial class is more like a lab, with teaching assistants giving more personal help. And the statistics show it works! Yay, passing rates are much higher!
It's so, so trivial to manipulate statistics and mislead about the effectiveness of these programs, however, and most everyone outside of administration knows it--all the grey-haired faculty simply chuckle behind the administrators' backs, having seen this stuff dozens of times over their careers. It takes nothing to pass more students; for example, one could just make "C" the lowest possible grade in the course.
The new English remedial program was particularly amusing--first, the salesman tells us how 15% of remedial students never take a college-level writing course even after passing remedial writing. Then, he tells us his new program has 0% of these students not taking a college writing course. How does he achieve it? The new program forces all remedial students directly into college-level writing courses. At least he knew that his success wasn't exactly surprising, given the requirements. Still, if success is defined as taking a college-level course, a program that forces the taking of a college-level course is guaranteed to be successful. It is, comparatively speaking, one of the more honest plans for improved retention I've seen.
The math co-requisite program likewise wasn't above manipulation to show how successful it was, although at least it put some effort into being misleading. Like almost all these programs, students were self-selected; only those who accepted the warnings that it would require hard work and dedication could enter the co-requisite program. Sure enough, this program showed greatly improved retention rates: 80%, as opposed to the more typical 50%.
Restrict my classes to only hardworking, dedicated students, and I'll probably do quite a bit better than average, too, no matter what special program I try. In twenty years of being subjected to these pitches, I've yet to see a program where they divided self-selecting students like these into two groups, with the program only being used on one group. That would go some way toward showing it's the program, and not the self-selected students, that is making the difference. Why is this basic, rudimentary, obvious idea seldom, if ever, implemented? I imagine it's because legitimate studies that use such controls don't show any major improvements among hardworking students relative to anything else (since the material at this level is learnable by most anyone who puts effort into it), and thus aren't sold to administrators.
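To see how strong the self-selection effect alone can be, here's a minimal simulation sketch (the 80% and 50% retention figures come from the pitch; the pass rates and everything else in the code are my own invented assumptions): a program that does literally nothing still looks like it "improves" retention if only the dedicated students opt in.

```python
import random

random.seed(1)

# Invented assumption: dedicated (self-selecting) students pass at 80%,
# typical students at 50%, and the program itself adds nothing at all.
PASS_RATE = {"dedicated": 0.80, "typical": 0.50}

def retention(group, n=1000):
    """Simulated passing/retention rate for n students from one group."""
    passed = sum(random.random() < PASS_RATE[group] for _ in range(n))
    return passed / n

print("co-requisite section (self-selected):", retention("dedicated"))  # ~0.80
print("regular sections (everyone else):    ", retention("typical"))    # ~0.50

# The honest comparison: randomly split the SAME self-selected pool into a
# program group and a no-program group. Under these assumptions both land
# near 0.80, showing the students, not the program, drive the numbers.
```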
Next, we were shown how the co-requisite math students had much higher grades and passing rates, using a variety of poorly demonstrated interactive techniques and group projects. Group projects are a red flag that something is bogus about the coursework; there's just no real learning in group projects, merely, at best, a theoretical division of skills by students doing what they already know how to do. Nobody pays attention to the new activities that supposedly help so much, because everyone knows the ones running the program can give whatever grades they want and pass everyone on a whim. Higher grades and passing rates are meaningless in a controlled system where the person selling it gives out the grades.
Even administrators and the folks selling the new plan know passing rates are easily manipulated upward, so they back up the improved passing rates with an important-sounding statistic. Every incoming student had to take a standardized placement test, and the scores were recorded. The new students had an average score of 211 on the test. At the end of the course, they re-administered the test to the students that passed, and those students got an average score of 232. A 21-point improvement on a standardized test! The program works! Yay!
The pitchmen gloss over the fact that 250 is the placement test's cutoff score for entering College Algebra, so all they've done is pass students out of College Algebra who still don't qualify to take College Algebra, despite having just passed it. Instead, the improved scores are celebrated. Without any question from administration, statistics like these are always taken as evidence, but a bit of critical thinking reveals these numbers from a standardized test still don't show anything worth believing.
Much like with the self-selected students, there's no control group, no set of students outside the program to compare test scores against. For all I know, a typical student who passes College Algebra shows a 30-point improvement on that placement test, or perhaps simply taking the test repeatedly produces a 30-point improvement. In the latter case, numbers like the ones administration provided could be taken as evidence that the co-requisite program is a massive disaster, with students worse off than if they'd never entered the program. For some reason, administration never thinks to ask such a basic question or administer this rudimentary control. It would cost a hundred bucks or so to double-administer those tests to a class or two of students, but I guess there's hardly any money left over after the many thousands of dollars spent implementing the program and pitching it to various states.
Even without a single control, there's another issue here. Only the students that pass the course retake the placement test, and yet it's taken as evidence of improvement when the average score goes up. The students that failed (or dropped) don't retake the test, so we don't have their scores. Gee whiz, you take out all the F students, and the average grade increases. Practically everyone on the stage selling this to us has a Ph.D., a research degree, in Education, and none of them see this issue.
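Here's a minimal sketch of that survivorship effect, assuming invented score distributions (only the 211 average, the retest of passers, and the 250 cutoff come from the presentation; every number in the code is my own toy assumption): even if the course teaches nothing, retesting only the students who pass drives the average up by roughly the amount being celebrated.

```python
import random
import statistics

random.seed(2)

# Invented pre-test scores, centered near the reported 211 average.
pretest = [random.gauss(211, 30) for _ in range(500)]

# Toy assumption: the course teaches nothing, so a retest is just the
# pre-test score plus noise, and the weaker half of the class fails or
# drops and never retakes the test.
survivors = sorted(pretest)[len(pretest) // 2:]
retest = [score + random.gauss(0, 10) for score in survivors]

print("pre-test mean, all students: %.0f" % statistics.mean(pretest))  # ~211
print("retest mean, passers only:  %.0f" % statistics.mean(retest))    # ~235

# The "improvement" of 20-some points appears purely because the failing
# students were dropped from the second average, not because anyone learned
# anything -- and the survivors still average below the 250 cutoff.
```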
One of the pitchmen dares to complain that they're still looking for better ways to get statistics, but none of the faculty bother to help with any of the above suggestions about controls or factoring out the failing students.
An empowered faculty would point and laugh at administrators using such poorly compiled statistics in a vacuum to justify a major reworking of course programs. Instead, the faculty sit quietly and take it, for the most part. One brave soul asks, "How do you expect us to teach the college material with the unprepared students sitting right there, doing nothing?"…the people giving the pitch mumble out a response that the answer to the question will come later, but it never does. Similarly, a question is asked about the failing students in the program, and we're told that the only students that failed in this program were the ones that didn't want help, disregarding how that conflicts with the self-selection process.
This program could affect every remedial student, as well as the non-remedial students who get put in co-requisite "college" courses along with the remedial students. In short, faculty are being strongly encouraged to follow a program that could affect a clear majority of college students, and the best argument given to us for buying into this plan, beyond the meaningless "better retention," is a single feeble statistic from an uncontrolled experiment. Administration knows they can do whatever they want anyway, and just goes through the motions of convincing faculty that it works. Someone with a legitimate research degree, to say nothing of a statistician, would be embarrassed to present this sort of isolated statistic as useful, and yet one of the people pitching the new plan proudly tells us she's an actual mathematics professor, instead of saying that her doctorate is in Education. She gave no hint of her degree in the presentation or in any of the handout materials, as though she were embarrassed about it. I had to scramble around online to find that detail out. I found something else online, too.
But I'll save that for next post.
Thursday, May 9, 2013
More stories...
Wow, it seems like it's only been a few days since I've posted; I've been putting off the latest story of how nuts remedial coursework is. My main distraction has been final exams and tests.
Every semester, I get a student with test grades of F (23 out of 100), F (35 out of 100), F (29 out of 100), and F (41 out of 100). Maybe not those exact numbers, but straight F's. That's not the weird part; the weird part is that the student then takes the final exam, and scores--you guessed it!--an F.
Then, the student comes to my office..."What's my average in the class?" I then explain that an F, F, F, and F, together with the final exam grade of F, comes out to an average of F.
Sometimes the student will turn in an exam with most of the questions unanswered, literally left blank, but STILL waits patiently to see his/her grade. And is still disappointed with an F. I have to wonder what goes on in other courses, that turning in a nearly blank paper could reasonably be considered passing work.
"Is there anything I can do to pass?" asks the clueless student. Naturally, I respond negatively, explaining that after 4 months of failing, about the only thing that can happen is failing.
Sometimes the student complains to admin, who naturally sides with the student. I actually took a penalty on my evaluation last year, from a student who failed all assignments, missed a month's worth of classes, and complained to the Dean, who didn't like that I wouldn't help out the student in some way.
This semester I had over half a dozen failing students like this. One had an average of 45, but really wanted to pass. "Can you at least give me a C, I'm applying to a medical program."
"You're 25 points away from a C...are you sure med school is a good option when you're struggling with 10th grade material?"
"I'll study harder. But can I get that C?"
I do what I can to avoid him complaining to the Dean, but I sure don't want this guy having anything to do with my medical care. Maybe he could take care of the Dean or her children when they get sick.
I'm not a monster, but I still think it should be possible to fail. I'm not in the minority, I hope. I might be, though; due to glitches in the system, students accidentally get enrolled in classes that they know nothing about. Since they don't know about them, they don't go to class or do any assignments.
The registrar explained to us that every semester she gets such students. Two-thirds of these students fail and are angry about the F. They complain that the grade should be removed and their money refunded. The students have a point, and the college complies.
"2/3 fail courses they don't know they're in. What about the other 1/3?" one can ask. They get A's in the course, but still want the course dropped and money refunded, because they'd rather that loan money be spent on courses of some use to their degree. Again, the college complies, but admin never considers the implication here.
Think about that a minute: empirically, 1/3 of the courses taught on campus (and I'm talking about a state, non-profit institution) are so BOGUS that students can literally do nothing...and still get an A. A thinking person might conjecture that more than 1/3 of courses are bogus, since it's possible a student would sometimes take the free A rather than complain about it, so "1/3" is a minimum estimate of the proportion of college courses that are bogus.
So, yeah, I could be in the minority when it comes to thinking that it should at least be possible to fail a course.