By Professor Doom
I was checking
some sources while I was following the money in higher education in my last
essay, and came upon a few tidbits worth sharing.
Most awesome was an article that listed requirements for accreditation over 100 years ago. Much as it’s fascinating to look at tests from the 19th and early 20th centuries to see how much schools have changed, I find this list of accreditation requirements from North Central accrediting (there are numerous accrediting bodies, but they are all fairly similar) to be amazing:
1. follow respectable entrance requirements
2. offer courses selected from the classics
3. ensure a minimum of eight departments headed by full-time instructors, each possessing at least a master’s degree
4. provide a good library
5. properly prepare students for post-graduate study
6. have a maximum class size of 30
7. have a productive endowment of at least $200,000.
Let’s go over this list line by line, comparing it with the institutions I’m directly and personally familiar with, to see how
much has changed in a century. For the bureaucracy wonks, I have a line
by line analysis of what accreditation is today…dozens of pages of pure
bureaucracy with very little relating to education, unlike the above, which is
brief and mostly about education.
1. follow
respectable entrance requirements
It’s hard to believe that there was a time
when an accredited school had to have entrance requirements. Now, the vast
majority of schools have no entrance requirements, and it’s quite common to
have coursework appropriate for an 8-year-old, as I’ve discussed elsewhere.
I hate to sound elitist, but entrance exams need to come back. Too many ruthless administrators are taking way too much advantage of people who have no interest in education and no understanding of what it means to take on student debt. Too many dubious fields like Education have blossomed, and they thrive primarily by scooping up the suckers taken in by administrators.
Imagine if, instead, the only people on campus were those who wanted to learn something, wanted to work hard, and could show that they could study and learn. Bogus courses would be laughed off campus, bogus departments wouldn’t exist, and the sniveling sycophant faculty such things create might exist in smaller numbers. Perhaps I’m wrong…but has the open system of today really created a much more educated populace, or just a much more indebted one?
What of those who can’t pass the exams? Well, this is just accreditation; there was a time not that long ago when a non-accredited school could still be a good school, and a school that focuses on high school and lower material probably shouldn’t claim to be “higher education” anyway. I imagine that with the fat government loan checks out of the picture, such schools would actually be cheaper…and a serious student can always just go to the many (thousands?) of free sites on the internet that have such information.
2. offer
courses selected from the classics
I had to laugh reading this, since
institutions no longer practice these ideas. This is from such a bygone era. It
used to be, students had to learn Latin in higher education (heck, they used to
need to learn it in school). I grant that Latin isn’t nearly as critical to the modern world as it was a few centuries ago, so it didn’t bother me when students were instead required to learn some foreign language in lieu of Latin. That requirement has been removed, too, replaced by a Mickey Mouse “computer skills” course where students “learn” the skills they already know from using a cell phone…and now even that has been removed (computers being so expensive), so students don’t have to learn anything about any other culture or language.
There are a few holdout courses, though
“classic” math has been reduced to 10th grade math, and most
“classic” courses in other departments have similarly minimal requirements,
like “Western Civilization”, a course that’s gone from “read a few books” in a
semester to “read a few chapters.” The problem, of course, is educators no
longer decide what is “classic.” Instead, administrators make such decisions.
So, now it’s “classic” to have “Gender Studies” courses and “White People are
Evil” courses, and “Home Economics” courses.
Is it really so elitist to think that
scholars should determine what is scholarly, instead of administrators?
3. ensure a minimum of eight departments headed by full-time instructors, each possessing at least a master’s degree
This one is another big laugh for me, as an institution I was at for a decade never did have any departments at all; instead, an administrator with no scholarly skills determined what the “departments” did. Those days are gone, at least for newer institutions.
The first advantage to having departments is that it’s much harder for an incoming faculty member to be completely bogus, to know nothing, to be a fraud, and still operate in a department with legitimate scholars. Administrators honestly seem to prefer frauds, and I would often have to work with ignorant “scholars” who clearly did not know what they were supposedly teaching.
The second advantage is that a department won’t have bogus courses; at that institution with the non-scholar admin, the students got their accredited two-year degrees…but when they went to a four-year school, they learned that it would take another four years to get a four-year degree, because nothing in the two-year degree was of sufficient scholarly merit to apply. A department run by people who are experts in their field (instead of one filled by admin with cherry-picked educationists) can stop that from happening.
The reference to a master’s degree, as opposed to a doctorate, is again from the olden days, when you didn’t have to have a research degree to teach. Nowadays, there are way too many doctorates in every field, so it’s no surprise that it’s now common to require a doctorate. I totally respect research degrees, but for jobs-based degrees, the requirements should probably count actual industry experience as equal (if not superior) to pure research.
4. provide
a good library
This, too, is funny, but only because my
school was forced to buy a bunch of books to satisfy the “good library” clause
that’s still in accreditation. In days of yore, absolutely, a big collection of
books was rather important for learning.
Nowadays? Not so much. You’re reading
this, so you know about the internet, and you can buy a book and have it
cheaply delivered to your door in a few days, tops (except for stupid-expensive
textbooks, but that’s a scam for another day)…it was a very different world a
century ago, and having a big library on campus made much sense back then. It’s
hysterical that the only clause that could have been removed from accreditation
hasn’t been removed, even as so many of the others are gone now.
Halfway through the list, and it’s all
howlers from the perspective of an educator in the 21st
century—alas, not howlers because the ideas from a century ago were so stupid
and ignorant, but because they’re generally good ideas that have been abandoned
in favor of the stupid and ignorant system of today.
I’ll address the others next time. Until
then, consider that the American higher education system was the envy of the
world in the 20th century…are we sure that getting rid of these
simple rules and replacing them with massive bureaucratic requirements is such
a good idea?
Here in Canada, it used to be required that in order to go on to university or technical college, one had to study a second language in high school. Most opted for French, for obvious reasons--a large part of the country's population is French-speaking, mainly in Québec and New Brunswick.
However, nowadays, most high school graduates don't have to. This is in a country which is *officially* bilingual and where one has the right to receive government services in either English or French.
That's part of "General Education" requirements, which are slowly getting annihilated. I coincidentally mention that in a post I'm putting up some weeks from now.