Higher Education As We Presently Know It Is Obsolete

Steven Yates
Oct 17

And not simply because of technological change.

Photo by Paul Bergmeir on Unsplash

Many of us have followed the campus disruptions over free speech. Some students as well as faculty appear ready to do away with free speech altogether, if it permits “microaggressions” or offends their sense of “wokeness.”

But focusing on a handful of dramatic, well-publicized incidents of this sort doesn’t scratch the surface of the problems plaguing higher education today.

Just to note: I spent over 17 years teaching in colleges or universities at various levels before departing academia in 2012. Some of my jobs were at flagship state universities; one was at a liberal arts college; a few were at technical colleges and business schools.

I mention this only so readers will know I’ve seen higher education from the inside, and can report from a boots-on-the-ground perspective.

And from my perspective, these institutions are, with rare exceptions, obsolete. They are tied to educational models that existed not just before today’s era of rapid technological change but that predate the industrial revolution itself.

This, though, is just one kind of problem. Other problems involve the kinds of moral corruption and intellectual decay that led to the free speech crisis, and worse.

One thing at a time….

The 4-year curriculum leading to a standard degree (a BA, a BS, or some equivalent) is obsolete, if only because many skills imparted to entering freshmen will be outdated when they graduate. And that’s when the educational process is functional, with students actually learning something.

There are detailed studies (such as Richard Arum and Josipa Roksa’s Academically Adrift) showing that even this is questionable.

Upshot of that study: students learn little in “general education” courses. The fact that observers don’t agree on why is itself a problem. There’s a lot of analysis paralysis.

Recently, two authors, Jason Brennan and Phillip Magness, put their heads together and laid out their diagnoses in a book I recommend if you think this is important. Their book is entitled Cracks in the Ivory Tower: The Moral Mess of Higher Education (2019). I don’t agree with them on every point, but they’ve done some original investigating and produced an incisive critique worth wading through.

If I had kids in college or approaching college age, I’d take the time to wade through it!

One by one, the book walks us through the problems plaguing higher education. Most aren’t dramatic like the free speech wars. Since their full list of higher education’s maladies would take a series of articles, I’ll just hit the high spots. This will suffice for my purposes.

One of the alarming features of their book — alarming for those who might still believe in this kind of education — is that they end without offering much in the way of a solution.

There is a solution, and in the final section, we’ll take a look at it.

Key to the Brennan-Magness diagnosis: people respond to incentives. There are many perverse incentives built into higher education today. As a result, the institutions are basically broken. Anyone undertaking to fix them is likely to end up frustrated.

Let’s see why.

Skyrocketing Tuition

Student loan debt is now over $1.6 trillion, and defaults are at record highs. This is probably the second most visible crisis in higher education, and it definitely affects far more people than the free speech wars do.

When I was an undergraduate, a four-year degree at a good public institution could be had for under $4,000. Today, many institutions cost more than ten times that. Tuition grew by an inflation-adjusted average of 66.7 percent from 1993 to 2007 alone.

You can graduate from more advanced professional programs with six figures of debt.

Some of those who do not default (defaulting has obvious negative consequences of its own) will be paying down debt for much of their working lives. The need to pay off student loans has already delayed millennials’ home purchases and otherwise been a drag on the economy.

One problem is the ease of obtaining federally guaranteed student loans. Loan officers may well approach students with offers before students approach them. Students, most of whom know next to nothing about personal financial planning (there are no courses in the subject in any high school I know of), are encouraged to buy into buy-now-pay-later, short-term thinking.

Universities get the money. They have no incentive to do anything to keep tuition down. The federal government has them covered. Nor, as we’ll see shortly, do they have much incentive to deliver value. They do have every incentive to keep students enrolled, because if students either flunk out or fail to re-enroll for some other reason, the money flow from those students’ loans stops.

Administrative Bloat

Most universities have plenty of money. They get it from the federal government, via student loan payments and for other purposes; through alumni donations and other private gifts; and from corporations. They have endowments of various sizes.

Large sums of this money have gone into administrative bloat.

From 1993 to 2007, the number of full-time administrators per 100 students grew by 39 percent, while faculty grew by just 18 percent. Spending per student on administration grew by 69 percent adjusted for inflation, while equivalent spending for instruction grew by 39 percent (source).

The Chronicle of Higher Education reports that administrative bloat drove 28 percent of the “boom” in hiring from 2000–2012. Much of this hiring was in student services.

Administrators will justify this by saying they’ve had to expand to comply with increasing federal and state regulations as well as widening accreditation rules.

But administrators also gain prestige through the number of people under them, in their divisions. (In fairness, this happens in government agencies and private corporations as well.)

Salaries for administrators have also gone up, while faculty income has actually gone down relative to the cost of living. According to the College and University Professional Association for Human Resources, as of 2012–13, two-year institutions were paying their presidents or chief executive officers an average of $291,132; four-year institutions were paying an average of $370,470; and doctoral institutions were paying an average of $431,575. (source)

There are, of course, “perks” with such positions, so that actual income at the very top of the academic pyramid is much higher and can exceed $1 million!

Salaries and “perks” have manifestly not gone up for faculty!

The Adjunct Crisis

Also over the past three decades or so, we have seen a process of replacing tenured and tenure-track faculty with a low-paid contingent workforce of adjuncts, who typically work part-time with no benefits and no realistic expectation of advancement.

They often have no summer work, and so must budget with great care or find jobs as they did when they were students.

In a now-remote past, adjunct faculty tended to be older, retired people who taught classes to share their work experience and expertise. They desired to continue to serve. Some just wanted to do something with the oceans of spare time retirement made possible. They were usually paid a small courtesy stipend which, as recipients of pensions on top of Social Security, they didn’t actually need.

The situation for today’s adjuncts is vastly different.

The Chronicle article cited above also noted that the number of full-time faculty and staff members per professional or managerial administrator declined 40 percent, to around 2.5 to 1, from 2000–2012.

Full-time faculty members lost ground to adjuncts, who by that time composed roughly half of instructional staff at most types of colleges — more at “lower tier” technical colleges.

The figure now most often cited is 70 percent. Around 70 percent of college and university faculty, that is, are contingent: part-time, usually paid by the course, with wages ranging anywhere from around $2,500 to around $4,000 per course. An adjunct teaching four courses a semester at, say, $3,000 per course earns $12,000 per semester, with pay spread across roughly four months, or about $24,000 a year before taxes. This is not a livable wage in most cities and towns in the U.S. today.

There are cases of adjunct faculty who are actually homeless and living out of their cars, sneaking into student dorms or public facilities to shower and change clothes. Many more are one car or health emergency away from unemployment and homelessness.

To earn enough to live on, many adjuncts teach on more than one campus at more than one institution. The freeway flier teaches a class at one location, jumps into her car, and hightails it 20 or sometimes 100 miles to meet a class in another city.

An adjunct can reach living wage status by cobbling together five or six such classes.

Adjuncts generally do not have private office space or computers, and have to meet with students in a “commons” area filled with other adjuncts.

They are often unavailable outside of class, though, because of the need to travel to their next job.

In recent years, adjuncts have begun organizing contingent-faculty unions and bargaining for better wages and working conditions. Administrators have tried to ignore or downplay the significance of such activities, but have lost ground as unions with backing from national organizations such as the SEIU have taken root.

Unfortunately, this won’t solve the problem, which is also structural and based on perverse incentives like many other problems that, taken together, are rendering higher education increasingly dysfunctional.

Too Many PhDs? Misplaced Priorities? Or Both?

The official narrative within university administrations is that they have no money to pay adjuncts more. And that if they were forced to pay more, most adjuncts would be forced out of the academic labor force.

I was once told by a dean, “We can use you, but we can’t afford you.”

A mindset of scarcity dominates.

But as we’ve seen, these institutions have the money, if they can pay their presidents and top administrators six figures. The head football coach usually makes even more.

At an institution where I taught for several years, the business school wanted a new building and got it. Once the dust settled, the plush new tower had cost roughly $150 million.

This wasn’t the only such case. There seemed to be plenty of money for new buildings.

There was also money for all the latest technology including centers helping faculty and students figure out how to use it, new gyms and gym equipment, campus beautification projects, and so on.

We are not talking about no money, in other words. We are talking about misaligned priorities for allocating it.

It is true that there are too many PhDs. This reflects multiple perverse incentives. For starters, for a while now people outside STEM fields have had an incentive to stay in school to avoid a hostile job market. Universities have had every reason to keep them enrolled. The more PhD-granting programs a university has, the greater its prestige, and the more PhDs a given department can graduate, the greater its prestige — and the better its chances for attracting grant money, new tenure-track faculty positions, and so on.

The root cause of the adjunct crisis is an academic marketplace unable to absorb all the new PhDs. Variations on this problem go back to the 1970s, when the academic job market basically collapsed after a long period of expansion.

We can argue endlessly about what happened. Conservatives will say the market reached saturation. Liberals will claim state funding for colleges and universities dropped, forcing them to go to corporations for money. The latter proceeded to reshape them to reflect a more corporate image.

These explanations aren’t mutually exclusive, and there is probably truth to both.

An oversupply of anything drops its marketplace value. Economics 101. That includes labor.

Someone will doubtless say, finally, that the head football coach should be paid more since he brings more revenue into the school.

The problem is: the school’s primary business is not entertainment.

Perverse incentives, extending from higher education systems out into the surrounding society, feed on each other. As a low-education population grows, its priorities are reflected in how money is spent. Good books, for example, place mental demands on readers. For most television viewers, watching sports does not.

Serious education thereby slowly wastes away from lack of resources.

If only an over-emphasis on sports were the worst problem.

For at least the past three decades, academics have done a very good job of hurting their credibility without any help at all from sports enthusiasts.

Worthless Degrees

Many degrees being offered at universities these days, especially at the MA and PhD level, are not only worthless for employment purposes, they are worthless from the standpoint of scholarship.

Take “gender studies,” for example.

Such subjects attract women students whose politics already tilt leftward, and their courses reinforce this. Academic leftists see themselves and others of their “tribe” as victims. They graduate, having imbibed this mindset, angry, entitled, and locked into a kind of terminal adolescence.

Their writing skills consist of overwrought expositions on “intersectionality” and rants against the “patriarchy.”

Even assuming they haven’t dyed their hair purple or covered their arms with tattoos … if you’re an employer, would you hire one of these women?

If you’re a woman in charge of hiring in a government agency or a nonprofit, are you going to hire a younger woman whose degree, appearance, and overall demeanor suggest a divider instead of a uniter?

A chronic complainer or resentful victim instead of a problem solver?

If you’re a man, are you going to want to work alongside or supervise someone who might interpret constructive feedback as a “microaggression,” as creating a “hostile work environment” or as “gender discrimination,” who might then haul you in front of a board of Title IX bureaucrats?

In this #MeToo era, we now have men refusing to meet with women alone, or travel with them. Can you blame them?

Whether anyone likes it or wants to admit it or not, radical feminism has badly damaged relations between the sexes, be they for courtship or professional purposes.

When I was still teaching, I’d encountered enough horror stories that my own meetings with female students were conducted with the door open and, preferably, with my office mate there as a witness.

Not to single out “gender studies” or the effects of feminism. There are plenty of other majors, including traditional ones, in today’s smorgasbord that don’t really educate. That is, if the goal of an education is something of lasting value.

Today’s novelties, often based on what is exciting or “transgressive,” academic equivalents of clickbait, become outdated tomorrow.

They are replaced by the next titillation to come along.

The humanities have been rife with fads for a very long time. Many have been self-reinforcing via the herd mentality. Go against this flow, and you might not even graduate, much less find employment in your field.

The perverse incentive here is to pursue what is trendy and superficial instead of deep and truly meaningful. One specific result is a flood of meaningless “research” that is forgotten as soon as it reaches print.

Meaningless or Biased “Research”

This proliferation of meaningless “research” merits further comment.

“Publish or perish” was the mantra my generation came up with.

Today, of course, you can publish and still perish.

But that’s an aside. How much of what is published is actually worth reading? In all honesty: very little.

The expression, “it’s academic,” wasn’t coined for no reason at all.

Most academic research contributes little or nothing to solving real human problems. This is not its goal. Its goal is to win tenure, promotion, or accolades for its author. Sometimes, in today’s environment, its goal is just to win its author a job.

Even if a book or article accomplishes that goal, it will often not be read again by anyone. Oftentimes the neglect is justified.

Most academic disputation is utterly pointless. It doesn’t even make sense as intellectual curiosity.

I once sat through a session at a conference of academic philosophers and listened to two professors debate the exact interpretation of a single logical symbol used by an early twentieth century academic philosopher.

Embarrassing. Even more so was the seemingly rapt fascination of the 25 or so academic philosophers in attendance. That’s the herd mentality at work.

What is published in refereed journals is all over the map in quality — from excellent to complete rubbish.

What the brochures never tell you is that in getting published, just like getting hired, who you know really is far more important than what you know.

I’m not saying that out of resentment as a rejected author, although, like all of us, I’ve had my share of rejection notices.

I’m saying it from experience as a published author with over a dozen articles in refereed journals.

Back in the late ’80s and early ’90s, I published articles that got through because I’d built a personal network. One was arranged through an editor I talked up at a meeting. Another was accepted for publication because the referee it went to was a buddy of mine who had requested it (he told me about it later).

These articles, seen through the lens of 20/20 hindsight, were rubbish. Even thinking about them today makes me cringe.

The network I built back then deteriorated over time. Some of its members passed away long ago. One of the unfortunate consequences is that what is probably my best philosophical work, the product of a more mature thinker instead of an enthusiastic beginner just a few years out of graduate school, is languishing on flash drives. Most if not all of it will appear on blogs or through self-publication platforms, or not at all.

Pseudo-scholarship, Political Biases

Pseudo-scholarship is designed to look like scholarship but is not, because discovering and communicating truth is not its aim.

Flattering the political biases of journal editors is a way to get pseudo-scholarship into print. This can be done even if you don’t have the personal contacts.

Alan Sokal, a physicist, proved this with his infamous hoax back in the mid-1990s. He submitted an essay to Social Text, a leading journal of postmodernism. His article drew fake “connections” between quantum physics and postmodernist fashion, supposedly casting doubt on the “objective knowability” of the “external world.”

His essay employed all the right jargon, made all the right political noises.

Social Text accepted it.

On the eve of its appearance, Sokal went public, announced the hoax, and “deconstructed” his own product as total BS.

A major row ensued. Had Sokal behaved ethically? Or had he done the world a service by exposing academic fashionability for what it was?

His piece fit the tenor of the publication (I’ve read it).

Or, for that matter, the tenor of the majority of refereed journals in the humanities as they’ve existed for the past three decades.

You see, nothing changed. If anything, matters have gotten worse.

This is a sign of how much crap gets published because it flatters the political biases of its evaluators. Most of these biases are cultural left, of course.

(Did I mention? As far as tenurable academic jobs go, no conservatives need apply. There are no guarantees that more traditional liberals can survive in today’s environment either. Ask Bret Weinstein.)

There are other sorts of biases. What used to be called confirmation bias is very strong in academia. Intellectuals are as prone as anyone to confusing reality with their beliefs about it. Maybe even more so. Because they believe themselves “smarter” than nonintellectuals, they don’t see the danger.

Especially if their conclusions have political or major policy implications.

One has to wonder how much academic consensus, assuming it exists, is just plain wrong.

Man-made global warming? It’s all the rage today. But is the hysteria factually justified? If you’re not a scientist and don’t even begin to have time to wade through all that is written on this, who do you believe?

A consensus is just a collective agreement, and subject to all the problems we’ve seen, including confirmation bias, the herd mentality, ostracizing or even bullying dissidents, and so on.

As a Watergate teenager and a mid-to-late-’70s undergraduate, I have clear memories of a flurry of hysteria back then over global cooling.

We were on the verge of a new Ice Age!

We were all gonna die …

Didn’t happen, obviously.

Nor did we all die because of a hole in the ozone layer that became the academic-sponsored eco-fashion a few years later.

So why do I get the feeling that again when all is said and done, Miami will still be very much above water in 2050?

Higher Education’s Materialist Bias

To believe in God in academia is to be seen as backward. This is because for a very long time now, materialism of one form or another has been the reigning metaphysical dogma.

Metaphysical here means: having to do with the fundamental nature of reality.

These are not truths that any science can claim to have established.

Materialism may seem to follow from naturalism as a method, which tells us to assume that everything we see has a natural cause.

Both are forms of operational atheism. Operational here just means: all one’s actions make that assumption.

The assumption is that nothing exists except the physical universe in space and time. God does not exist.

Do we see Him?

That depends on whether we’re looking for Him.

Most who are looking no longer go into higher education, of course. Some go into the ministry or some other form of church work or a related occupation. Or just into something where they can make a difference.

This, though, abandons supposedly truth-seeking enterprises to the operational atheists.

Thus the number of visible Christians in higher education outside obscure Bible colleges is negligible, and there are very few people on the inside to draw students’ attention to the problems materialism and naturalism raise.

For example, the naturalist view of the universe requires that life came from nonlife. If God does not exist, there is no other option. Scientists know this, and for over 100 years now, they’ve made exhaustive efforts to recreate supposed circumstances of an early Earth that could have generated living and replicating entities from electrical discharges over a “primordial soup” in a chemically volatile environment.

While these efforts have produced simple organic molecules — amino acids, for example — if their aim was to lead to something able to replicate itself viably, without error — surely a condition of saying something is alive — they have failed.

In other words, there’s no hard scientific evidence that life either came from, or can come from, nonlife….

Statisticians have also worked on this problem. Their conclusion: the odds of its happening are trillions to one.

Against.

Another thing is seldom mentioned: there’s no hard physical evidence (geological, for example) that the conditions being replicated ever existed. Given this, even if organic life could be created under laboratory conditions, directed by human intelligence, what would be proven is that life can be created in the laboratory by intelligent design, not that it actually happened under natural circumstances, absent any intelligent direction.

Oops! I just committed a heresy, uttering that phrase intelligent design.

But why do practically all scientists think the idea of a Designer behind the order we can directly observe in the cosmos, and in life, is nonsense — their favorite word is pseudo-science?

Most are in universities, were trained in them. They were exposed to, or leaned instinctively towards, materialist and naturalist thinking as students. Now, as faculty members, their research makes these assumptions, and they are members of professional organizations and associations that continually reinforce them.

Science as a human enterprise has its own sociology.

The historian and philosopher of science Thomas S. Kuhn showed this clearly in his classic The Structure of Scientific Revolutions (orig. 1962). Scientific investigations are not neutral investigations of reality. They presuppose what he called paradigms: roughly speaking, bodies of ideas, methods, key observations perhaps, and sets of achievements taken for granted as able to guide further research.

Scientific communities organize around paradigms. Paradigms are practical, because as Kuhn had no trouble showing, without them every generation would have to start all over again. Paradigms give them assumptions they can take for granted without continued testing.

Where do paradigms originate? With those who can be described as thought leaders in various disciplines. In the past, such leaders stood up like giants: Aristotle in ancient cosmology, Newton in physics, Lavoisier and Dalton in chemistry, Lyell in geology, Darwin in biology, Mendel in genetics, Einstein again in physics, and so on.

Such figures are scarce for obvious reasons, and it should be clear from the above material why we rarely see such folks today.

Forging a new paradigm, whether it turns out partly right or mostly wrong, is a massive undertaking of thought. In an environment driven by the superficial and rife with fads, it won’t happen.

Contrary to many postmodernist types, Kuhn’s work did not imply the silly idea that we never “really” know anything about “objective truth.”

But it did open a door to the idea that institutions, hierarchies, and even widespread consensus, could block important paths of questioning and discovery if dominant voices confuse objective truth with their collective beliefs or consensus about it, and suppress findings that do not support those beliefs.

The upper echelons of today’s academic professions, usually based in the Ivy Leagues, typically set the agenda for the various academic disciplines, the sciences included. Where they lead, others follow, and teach their students to follow. Conformity dominates as always. Dissent is weeded out, especially in an overcrowded academic “buyer’s market.”

This has produced some results even worse than what we’ve already seen.

The Cold War on Truth

We come to something that, for all I know, could get me kicked off this platform before I’ve even really established a presence here. I’ll take my chances, because as a writer I have alternatives and because truth matters to me more than popularity, virtue-signaling, or political correctness.

As a rule, if something is trendy and is defended by virtue-signaling and appeals to popularity (and, if those fail, by bullying, threats, and career sabotage), you can assume it’s probably wrong and may even be ludicrous when exposed to the (increasingly rare) light of reason.

I’m using cold war here as a metaphor for the ongoing battle to establish dominant narratives and marginalize dissent. Its weapons are not guns but language, which is easily weaponized and used to bully, intimidate, or even destroy.

Consider the now-trendy idea that you can choose your own “gender” from a smorgasbord of possibilities.

Where did this bizarre idea come from?

According to mainstream academic philosophy and “gender studies,” at least since radical feminist Judith Butler put the idea forth in her book Gender Trouble (1990), gender is a “social (or cultural) construct.” This implies ideas about gender can be changed and gender itself, changed — increasingly, at will.

To those of us who took biology in high school as it was taught back in our Watergate-teenager days, the idea of changing your gender makes no sense at all. Your sex is fixed genetically, by your chromosomes. If you have an X and a Y chromosome, you are born male. If you have two X chromosomes, you are born female. Period.

There are cases of persons whose chromosomal properties are ambiguous, but these cases are statistically very rare and don’t invalidate what is otherwise obvious: you can’t change your sex through willpower, “gender-reassignment” surgery, or even via academic fad.

This bit of biological common sense is now heresy for those who distinguish biological sex from their “social construct,” gender, and then (as far as I can tell) forget the former even exists.

Hence visible professors who openly question the new sacred writ about gender are increasingly rare.

Ask David Deming, a geophysicist at the University of Oklahoma who has committed numerous academic heresies, including this one. He has contended that biologically rooted differences between the mindsets and psychologies of the sexes better explain workplace ratio differences between men and women in fields such as physical science and engineering than gender discrimination does. He advised against affirmative action programs aimed at hiring more women faculty.

For this, and for a range of other offenses against cultural left dogma (on global warming, guns / gun control, homosexuality, and others), he effectively lost his classes and his office in the geology department, resulting in two bouts of litigation the university settled.

His advice has been for people to get rid of their victim mindsets and assume responsibility for their lives.

To his credit, he’s consistently stood up to the bullies.

We can also ask James Damore, fired from his job as an engineer at Google for issuing the same challenge in a well-publicized memo initially circulated within the corporation.

This is proof that the sacred writ that began in higher education spread to the upper echelons of Big Tech and now controls its corporate policy.

What goes on in higher education matters! It tends to affect what goes on in the rest of society, because the rest of society has placed higher education on a pedestal.

Or possibly an altar.

Upshot: Present-Day Higher Education Is FUBAR

There are topics I haven’t dealt with because otherwise I’d be writing till the cows come home (and you surely wouldn’t be reading!).

Take student evaluations, which — it was clear to me when I was still an undergraduate — are a joke. I knew even then they were used by students to retaliate against faculty members who’d given out low grades.

They said so openly.

Faculty are expected to submit to them anyway, as part of the end-of-the-semester ritual, and there’s now a Yelp-style online version entitled RateMyProfessors.com.

More demanding profs and instructors who are “old school” and just dispense information without trying to entertain tend to get poorer evaluations.

They are described as “boring,” or worse.

This can be bad news, if you’re an adjunct instructor on a semester-by-semester contract, and your superiors are looking only at your “evals” because it’s easier than doing a real faculty review.

A few bad evaluations, or a complaint from a student, and you’re history.

Since in this digital world bad “evals” can follow you elsewhere, if that happens your teaching career may be history.

Remember: to the administration, students are money. The last thing administrators want is for a student to drop out, including because of low grades.

Your perverse incentive is to make your courses easy!

Or to be less polite, to dumb them down!

If your subject is philosophy, as mine was, it helps if your classroom demeanor is a cross between Socrates and, say, Jerry Seinfeld.

Most students like to be entertained, after all.

They view themselves as consumers-in-training, for a consumer culture.

Entertaining them may be the only thing that gets them to put down their phones.

It’s helpful to your reputation to be more exciting in the classroom than their Twitter feeds.

FUBAR, as most everyone knows, is an acronym for f****d up beyond all repair.

The label fits because the higher education my generation grew up with is gone. The buildings remain, but the spirit of learning is all but dead.

Higher education’s structural problems were long in the making. They range across teaching and its evaluation, the proliferation of “research” no one reads, the overproduction of PhDs most of whom will never find decent academic work, bloated administrations with inflated salaries and a lust for prestige, the many resulting misplaced priorities, and the many people on campuses tied exclusively to their political agendas.

Four-year higher education is ridiculously overpriced for the value it delivers, and one has to wonder how long it will be before employers figure this out and begin looking at resumes for concrete accomplishments instead of four-year degrees.

I mentioned that the Brennan-Magness book doesn’t propose any concrete solutions.

I’m not sure that’s a fair criticism, because if there’s a fix-from-the-inside for all this, I sure don’t see it. Perverse incentives are self-reinforcing and eventually feed on the system itself, destroying it from the inside.

Until the buildings are standing but their insides are hollow.

Because the intelligent and independent have voted with their feet, as they did under Communism. They took early retirement, resigned their positions, dropped out of graduate programs, or never enrolled to begin with.

Or just lowered their heads as well as their spirits, did as little as possible, and lived out their lives of quiet desperation on evenings and weekends, like millions of other workers in consumer society.

What’s the Solution?

Present-day higher education should be abandoned instead of reformed.

What is FUBAR, by definition, can’t be fixed.

How do we do this, and what’s next?

R. Buckminster Fuller, the twentieth-century author, inventor, systems theorist, and sage, said it best:

“You never change things by fighting the existing reality. To change something, build a new model that makes the existing model obsolete.”

Higher education has done much to make itself obsolete.

What is needed — badly — are new models, founded on new premises about education, civilization, and the future. Or occasionally on old ones. New institutions have already appeared for special purposes, and I’ve been involved with a couple (like these guys). So far they are highly niched. What we need are models able to revisit fundamentals and apply them to real and living problems.

Who are we? What gives our lives value? How do we learn to see all persons as having intrinsic value?

How do we have the global conversations we need to have, conversations that celebrate our intellectual diversity?

How do we persuade people to stop thinking and acting like victims, and take charge of their lives?

Can we persuade them that truth is what it is, and that sometimes we just plain get things wrong?

How do we negotiate our way past the divisions of the moment into an uncertain future?

And should we plan the future, or just leave things to chance as we’ve been doing?

How’s that working out, anyway?

These questions may seem overwhelming, or just rhetorical, but facing them is essential.

Should we revisit the basics of philosophy and theology, and work with the idea that removing God from our “map of reality” was a bad idea?

Just to see where such a conversation might take us?

Remember what operational atheism is. I think it’s more widespread than most people realize, including in the church. What’s a “Sunday Christian”? A person who goes to church on Sunday and back to wheeling and dealing on Monday without giving spiritual matters a second thought.

Much of modernity worked under the assumption that God either doesn’t exist (communism) or has somehow “blessed” our secular endeavors whatever they are but otherwise stays out of the way (capitalism).

We have a massive case study right in front of us. It’s called the twentieth century. And the twenty-first, so far.

Given the legacy of brutal dictators, fascist or communist, who saw themselves as gods and their ghastly ideologies as paths to heaven-on-Earth, we still don’t have an accurate body count.

The capitalist legacy is better, but far from ideal. The idea that “the social responsibility of business is (solely) to increase its profits” has allowed corporations to enrich themselves by foisting harmful products on unsuspecting consumers, damaging the environment, and shrugging off such results as “externalities.”

Democratically elected governments have been brought down so that extractive enterprises could continue pillaging a nation’s resources and taking the profits out of the country. Does the name Mohammad Mossadegh ring any bells?

If not, try Iran, 1953.

The Iranians happen to have memories, even if we don’t.

There are places in the world where history is cared about, even if America is not one of them.

There are other problems.

Built-in obsolescence is the order of the day in the tech world, as constant unnecessary upgrades compel consumers to buy, or fall behind when older versions of a product or device are no longer serviced.

Peer behind all these curtains, break through the “matrix” (choose your metaphor), and secular capitalism’s track record is less than stellar!

It’s true, people have fought each other over religion, but was removing systems of faith the answer? Modern history suggests not.

A different metaphor, throwing out the baby with the bathwater, may apply.

We have more conveniences and creature comforts than ever, and we’re more “wired in” than ever before. But simultaneously, and ironically, we’ve never felt more frustrated, suffered from more stress, or had a greater sense of being alone.

We’ve never had more health problems, mental as well as physical. We’re probably the most drugged up society in history.

Big Pharma is chuckling all the way to the bank.

Maybe we should look at all this in the context of a secular scientific-technical and consumer culture — modernity — and question it.

Perhaps we should see modernity as a stage Western civilization has passed through: an important stage but not its best or final state (if it has one).

Can such questions be built into new educational institutions — or learning, problem solving, and leadership communities?

At the other end of the ideas spectrum is the realization that many people just aren’t suited for what some of us think of as the “life of the mind.” But they can live happy and fulfilled lives working with their hands, provided they can get training for a fair price. Civilization needs plumbers, electricians, gardeners, and family farmers, among others, no less than it needs computer programmers and network administrators and support technicians. At present there are not enough such people — partly because of the myth that Everyone Should Go To College.

What will the next stage of civilization look like?

How will it view technology? The way it does now, as something to be consumed — a source of information overload, distraction, and chronic stress?

Or as a force for good, capable of producing abundance instead of maintaining systems based on a presumption of scarcity, its products to be sold exclusively for profit?

Imagine technological systems aimed at doing what Fuller once envisioned: producing sufficient food for everyone on the planet (and simultaneously dropping its price to zero or almost zero!)

Talk about disruptive technology!!

The only way to find out is to begin thinking strategically about what it will take, what we must learn, to build this kind of future.

Many if not most of these ideas, should they begin to gain ground, will doubtless be resisted. Truly disruptive ideas always are.

This won’t be for everybody. There’s no point in pretending there will be huge numbers of early-adopters.

We may have to form our learning communities in relative isolation, using our own resources, drawing clients in the traditional fashion: starting with offers of truly superior value and clear benefit — offers that inspire them!

The place to begin is by creating new educational models and starting up new problem-solving communities of learners and thought leaders.

Join me!

I invite readers who have waded all the way through this admittedly challenging saga to send me your email addresses, so you might receive more information when it becomes available, or share ideas and information of your own.

If we do not assume ownership of the future (i.e., our futures), others will do it for us. Let’s begin planning a future that will be better than the recent past or present. Let it begin with at least one new learning, problem solving, and leadership community that replaces the obsolete model represented by present-day higher education.

Steven Yates

Writer, editor, copywriter, philosopher (ex-academic), futurist, Stages of Civilization theorist, aspiring visionary.
