It’s More than Just PC
The traditionalist critique of the university — I made it myself over thirteen years ago in the co-authored Who Killed Homer? — was that somewhere around the time of the Vietnam War, higher education changed radically for the worse. Note I am talking mostly about the liberal arts. America remains preeminent in math, physics, the hard sciences, medicine, and engineering, subjects that are largely immune to politicization and to race, class, and gender relativism. The top students, and often the harder-working ones, gravitate to these fields; indeed, in my general education courses on the ancient world, I often noticed that math and science students did far better than their sociology or anthropology counterparts.
Such excellence in math and science explains why the world’s top-rated universities in all the most recent rankings are overwhelmingly American. (Indeed, liberal arts professors piggyback on such findings and often, in a sense quite fraudulently, point to these polls as if to confirm their own superiority.)
I spent a great deal of my life in the university as a student and professor and now as a researcher. Higher learning in the arts and humanities has enriched American life for 200 years. Small liberal arts colleges like Hillsdale, St. John’s, St. Thomas Aquinas, and dozens of others continue to be models of enlightened learning. But all that said, public universities and the larger private institutions have increasingly become morally and fiscally bankrupt. Here are some reasons why.
Monotony of Thought
By 2011 we all know that faculties are overwhelmingly liberal. That in and of itself would not be so alarming if they were not activist as well. By that I mean academics are interested not just in identifying supposed past American sins, but in turning disinterested instruction into political advocacy, especially along race, class, and gender lines. Rosie the Riveter, the Japanese internment, and Hiroshima all deserve study, but they are not the sum total of World War II. Today’s average undergraduates may know that African-Americans were not integrated into American units during World War II, but they have no clue what the Battle of the Bulge, a B-29, or Iwo Jima was. They may insist that global warming is real and man-caused, but would have trouble explaining what exactly carbon is.
The effect of politicized learning on the quality of education was unfortunate, and cyclical. The more “–studies” classes saturated the curriculum, the less time there was for classical approaches to literature, philosophy, language, or history. The more the profile of the student body mattered over its preparation, the more these classes had to be watered down, as if thinking the right thoughts could justify the absence of the old rigor.
Deans began quoting the ethnic profiles of the incoming classes, the supposedly expanded diversity of the faculty, and their own commitment to various progressive causes, while keeping absolutely mum about the average GPAs and SAT scores of the new student body or the content of the new curriculum. And why not? No provost was ever fired for graduating fewer students with fewer skills; many were fired for not “reaching out” to “underrepresented” groups.
A Blank Check
We know all the other pathologies of the modern university. Tenure metamorphosed from the protection of unpopular expression in the classroom into the ossification of thought and the proliferation of the mediocre. Faculty senate votes did not reflect a raucous diversity of thought among secure professors, but were analogous to Saddam’s old plebiscites in their one-sided tallies. Tenure created the notion of a select cloister, immune from the tawdry pursuit of money and the neurotic worry over job security so common on the crass “outside.”
Campus ethics and values were warped by specialization in both faculty instruction and publication. The grandee who butchered a graduate class every semester was deemed more valuable to the university than the dynamic lecturer who enthused and enlightened three undergraduate introductory classes each term — on the dubious grounds that the former serially “published” peer-reviewed expansions of his dissertation in journals that at most five or ten fellow academics read.
Not teaching at all was even preferable to teaching very little, as a priestly class of administrators evaded the “burdens” of instruction. The new bureaucrats were often given catchy titles: “Assistant to the Provost for Diversity,” or “Associate Dean for Cultural Studies,” or the mundane “Special Assistant to the President for Internal Affairs,” in the manner of late Soviet apparatchiks or the power flow charts of the more mediocre corporations. Although the faculty was overwhelmingly liberal, it was also cynical, and understood that the avalanche of self-serving daily memos it received from the nomenklatura need not be read. I used to see entire trash cans filled each morning with reams of xeroxed pages, as professors started off their days by nonchalantly dumping the contents of their mail slots. Most of the memos read just like those “letters” congressmen send to their constituents, listing a dean’s or vice-provost’s res gestae and detailing how they were “working for you.”
Self-invention proliferated. Under the system of “faculty governance” (analogous to carpenters assuming the roles of the contractor and architect), curriculum, hiring, promotion, and firing were managed by peers. An article “in progress” or “under review” was passed off by committees as good as published. (And why not? You, in hand-washes-hand fashion, might be on the other end of a faculty committee and need the same life raft someday.) Linda Wilson-Lopez, a third-generation, one-quarter Mexican-American, was deemed as much a victim as if she had just crossed the Rio Grande. Old white guys in their sixties, who were often hired sight unseen in the early 1970s, suddenly demanded diversity hires — with the assumption that when the music stopped in the 1980s they had already found chairs and the new discrimination did not apply to the already tenured. (Had affirmative action involved replacing sixty-something, full-professor white males, it would have had a very different reception.) Proposals for envisioned research on sabbaticals were as common as post-sabbatical reports of actual work were rare.
Careers were destroyed by charges of “racism,” “sexism,” or “homophobia,” rarely by actually smearing a Mormon in class or by skipping a week of instruction to junket at a conference. All of the above is well known, as hundreds of exposés over the last thirty years have explained quite well why college graduates are both so politicized and so lacking in knowledge and the inductive method. We see them screaming in videos at Occupy Wall Street demonstrations — full of self-pity, it is true, but also in a sense worthy of pity as well. Nothing is worse than to be broke, unemployed, and conned.
Money is the Game Changer
There is a new element in the equation: debt. Almost every year, tuition climbed at a rate higher than inflation. It had to. Higher-paid faculty taught fewer classes. “Centers,” run by professors who did not teach and staffed with new hires, addressed everything from declining literacy to supposedly illiberal epidemics of meanness. Somewhere around 1980, the university ceased to be simply a place to learn and became a sort of surrogate parent, eagerly taking on the responsibility of ensuring that students were happy, fit, right-thinking, and committed. That required everything from state-of-the-art gyms replete with climbing walls, to grief counselors, to lecture series and symposia on global warming and the West Bank. All of that was costly.
To pay for it, the federal government guaranteed student loans, and universities charged what they wished — with the hook that the interest need not be paid until after graduation. For an 18-year-old, taking on debt was easy, paying it back something to be dealt with in the distant future — especially when the university promised higher-paying jobs and faculty reminded college students that their newly acquired correct thinking was in itself worth the cost of education. There was little competition. Trade schools were still looked down upon, and online instruction was in its infancy.
The result, as we now know, was a huge debt bubble, one of nearly $1 trillion in aggregate borrowing that rivaled the Freddie and Fannie frauds. And yet the debt no longer comes with guarantees that the liberal arts or social science graduate will find employment, either of the sort that he was trained for, or necessarily more remunerative than the federal clerk’s or the union tile setter’s. Working at Starbucks from 7 to 7 each day will not pay off that Environmental Studies degree from UC Irvine.
As the economy cooled, cash-strapped parents increasingly had little money to ease the mounting burdens. What was once a rare $10,000 student loan became a commonplace $50,000 or more in debt. Living at home until one’s late twenties is in part attributable to the mounting cost of college and the accompanying dismal job market — and to the admission that many college degrees are no proof of reading, writing, or thinking skills. (Note as well that the themes and ethos of the university were not “life is short, get on with it,” but rather population control, abortion, careerism, metrosexism, etc., which contributed to the notion that one’s 20s and even 30s were for fun and for exploring alternatives, but most certainly not for marrying, having children, getting a job, buying a house, and running the rat race.)
I noticed about 1990 that some students in my classes at CSU were both clearly illiterate and yet beneficiaries of lots of federal cash, loans, and university support to ensure their graduation. And when one had to flunk them, an entire apparatus was in place at the university to see that they in fact did not flunk. Just as coaches steered jocks to the right courses, so too counselors did the same with those poorly prepared but on fat federal grants and loans. By the millennium, faculty were conscious that the university was a sort of farm and the students the paying crop that had to be cultivated if it were to make it all the way to harvest and sale — and thus pay for the farmers’ livelihood.
How could a Ponzi scheme of such magnitude go on this long?
Lots of reasons. The university was steeped in a faux-morality and a supposed disdain for lucre. “College” or “university” was sort of like “green” — an ethical veneer for almost anything imaginable, without audit or examination (whether a Joe Paterno-like exemption, something akin to Climategate, or the local CSU campus where the student body president recently boasted that he was an illegal alien and dared authorities to act — to near-unanimous support from the university). Since World War II, a college degree was rightly seen as the key to middle-class upward mobility. That belief was enshrined, and so we forgot to ask whether everyone was suited for college, or whether the college-educated per se were always more important to the economy than the self-, union-, or trade-schooled welder, concrete finisher, or electrician.
If Only They Were as Fair as Wal-Mart …
The “part-timer” or “adjunct faculty” now became a sort of Messenian helot, squaring the circle of universities lacking the resources to meet their pretensions. With dozens of Ph.D. applicants for each liberal arts or social science tenure-track job (graduate schools likewise turned out far more doctorates than were needed, given their own desire for prestige and the smaller load of graduate instruction), universities found plenty of cheap labor. When the full professor retired, his courses could be outsourced to itinerant part-time lecturers for thousands of dollars less per class in salary and benefits. That the faculty hated Wal-Mart and yet treated its campus employees far worse than did the retailing bogeyman was assumed, but never acknowledged. In some sense, those hired in the 1960s and 1970s before the “Fall,” like senior California public employees now ready to retire, were the proverbial rat in the snake’s belly that had to make its way out, with the understanding that never again would anything like it make its way in.
But what cannot go on will not go on — at least for most universities without billion-dollar-plus endowments. The present reckoning is brought on not by introspection, self-critique, or concern for our increasingly poorly educated students, but by money, or rather the lack of it. Higher education is desperately searching for students with cash, loaned or not. It is, of necessity, panicking, and will ever so slowly start changing. For-profit tech schools, online instruction, and the two-year junior college deliver a cheaper “product,” and no longer necessarily an inferior one, given the nature of the contemporary university curriculum and the values of the faculty.
It used to be that one did not dare go to a DeVry or Phoenix for-profit school for computer certification or accounting, because one would miss out on the rich undergraduate experience, both social and intellectual — best exemplified by the core curriculum of some 50-60 units in the liberal arts and sciences. But if the university serially subsidizes panels about global warming, lauds Palestinian activists, and runs workshops on homophobia (all without balance or counter-opinion), and if its required GE courses, whether so titled or not, are too often little more than the melodramatic obsessions of over-specialized, ranting professors who otherwise would have small audiences, then why spend the money and go through the charade of classically liberal instruction, especially given that the trade school is cheaper and more honestly pragmatic?
Much that was good will fall along with more that was bad. But it was a comeuppance long overdue. With hubris comes nemesis — leading to atê or ruin.
The End of Sparta — a postscript on the sources
The historian Xenophon, in his apparent anger at the rise of a democratic and powerful Thebes, makes no mention in his Hellenica — our primary historical source for the events of earlier fourth-century-B.C. Greece — of the presence of Epaminondas at Leuktra. He is silent also about Epaminondas’s role in the first invasion of Sparta, and about the Theban effort to free
the helots of Messenia and to found the citadel of Messenê. Xenophon does, however, in his Anabasis (“The March Up-Country”), talk of a Boiotian Proxenos who had advised Xenophon to join the Ten Thousand, though he says nothing of our son of the same name. The loss of Plutarch’s “Life of Epaminondas,” together with Xenophon’s bias, explains in large part why today we do not fully appreciate the reasons why the classical Greeks and Romans considered Epaminondas the greatest man of the age.
In contrast, the Roman-era Diodorus — based on the lost histories of an Ephoros, Xenophon’s contemporary — much more frequently mentions and praises Epaminondas and his invasions to the south. Thanks to Ephoros — I have no idea whether he had long yellow hair and lisped and was fond of the Boiotians — and the lost historians Theopompos and Kallisthenes, something of the achievement of Epaminondas survives in bits and pieces in the Roman-era traveler Pausanias and in Plutarch’s Life of Pelopidas.
Much of what we know about siege warfare of the age is found in “On the Defense of Fortified Positions,” written by one Ainias of Stymphalos, a shadowy general of the Arkadian federation. His larger corpus, “On Military Preparations,” is unfortunately lost and we otherwise know very little of the general and writer Ainias Taktikos, who may have played a considerable role in the politics of the Peloponnesos in the mid fourth century B.C.
We don’t know all the reasons why Plato (Platôn) so distrusted democracy and favored the Spartans, but it was more than just the democracy’s execution of Sokrates and his own exile. “The Oration on the Messenians” (Logos Messêniakos) by Alkidamas does not survive either, but a fragment of that great speech on the liberation of the helots, “No man is a slave by nature,” seems to be the only explicit condemnation of slavery that survives from classical Greek literature. Perhaps Aristotle had Alkidamas in mind when he later attacked those who taught that there was no such thing as a man suited to slavery at birth.
We hear from Plutarch and others that an adolescent Philip of Makedon spent a year as a hostage among the Thebans. Though it is not recorded that he was known at Thebes as Melissos, the adult Philip bore no antipathy toward the Messenians, and when, more than thirty years later, he invaded Boiotia, he spared the helot city to the south after his victory at Chaironeia. He did, however, finish the job of subjugating Greece by destroying the Sacred Band there — but supposedly lamented the sight of their corpses littering the battlefield. Scholars are still unsure why Philip erected a proud lion on the battlefield to honor the dead of the Sacred Band, but the monument sits there today, guarding the old road to Thebes as it skirts the foothills of Mt. Parnassos.
Pausanias says that in his own day, in the second century A.D., there was an iron monument of Epaminondas in Messenia. Both Pausanias, and Plutarch in his life of Agesilaos, record that the offspring of Antikrates were forever known as the “swordsmen” for the thrust of their ancestor that killed the hated Epaminondas. They add that the great liberator was brought alive out of battle to die in 362 B.C. on the hilltop of Skopê, overlooking the battlefield of Mantineia, after Epaminondas’s fourth and last invasion of the Peloponnesos, more than nine years after the victory at Leuktra. They mention none who died with him, not even Mêlon, son of Malgis, farmer of Helikon.
Black limestone steles of the heroes of Boiotia can be seen in the modern museum of Thebes, carved, we believe, by the sculptor Aristides. Archaeologists argue about the architecture of the great cities of Mantineia, Megalopolis, and Messenê, but by general consent the stones seem to reveal the work of now anonymous Boiotian architects, whose work resembles the contemporary rebuilt walls of Plataia and Thespiai. Much of the massive Arkadian Gate at Messenê survives, though no one has yet found in that best-preserved of Greek cities any fragments of the two stone lions with the likenesses of Chiôn and Proxenos — nor the iron statue of Epaminondas himself.
Of the final end of Phrynê, little is known. Athenaeus in the thirteenth book of The Deipnosophists relates a tradition that she returned to Thespiai and offered her own great riches to rebuild the city walls after Alexander the Great had torn them down — if only they would inscribe her own name on the fortifications.
I have hiked over much of Hesiod’s Mt. Helikon, but so far I have not discovered the highland farm of Mêlon, son of Malgis, father of the good Lophis — the master of godlike Chiôn and Nêto, hero of Leuktra, slayer of Kleombrotos, who in the following decade went south three more times after the founding of Messene to fight the Spartans and, more than nine years after Leuktra, to die on Skopê above Mantineia at the side of his friend — and of his savior — Epaminondas, son of Polymnis, general of Thebes, first man of Greece.