Reprinted with permission from the Mises Institute. By Chris Calton.
Jeff Deist recently posed the question “Is College Worth It?” My first thought when I opened the article was that he could have reduced the entire piece to a single word: “no.” This cynicism might seem odd coming from somebody who is nearing the end of a PhD program in history, which takes an average of eight years to complete, not including the years spent acquiring a BA and, for many students, an MA before even starting on the doctorate. But this experience has given me perspective, disillusioning as it has been, on the value of higher education in its current state.
The ever-sober Deist, of course, did not simply make a blanket argument that higher education is never worth it. Rather, he acknowledged the important reality that however far the university system has fallen, society still needs at least some of what college programs offer. Most people would prefer that the engineers designing our bridges and the doctors operating on us have some formal training, and majors in STEM (science, technology, engineering, and mathematics) tend to correspond to higher incomes for graduates. In other words, market signals still show that there is value in at least some degrees.
Even for the humanities and social sciences, education is—or at least should be—generally valuable. While these fields are less likely to translate into direct vocational returns, they help students cultivate transferable skills, such as critical reasoning and writing. This type of education does not necessarily have to come through formal training at a university, but even the best autodidacts benefit from having somebody who can guide them through the learning process.
This is all to say that, despite my growing disillusionment with higher education, even I am not ready to chuck the whole system. But more people have begun to recognize that the costs of attending college often exceed the benefits, both individually and socially. It is not that people are devaluing education itself, but that they are increasingly questioning the value of formal education in its present form.
There is no single factor behind the issues plaguing higher education, so this is the first article in a four-part series devoted to the college problem. I should acknowledge at the outset that my assessment is based heavily on personal observations and anecdotal experiences, as both a student and teacher, that have contributed to my disillusionment with higher education. The present article deals with the cultural factors contributing to the declining value of college, and the rest will respectively address the public-policy, ideological, and institutional problems of higher education.
The most basic problem facing higher education today is the plummeting value of the bachelor’s degree. Undergraduate students face increasing tuition costs—often covered by sizeable student loans—to earn degrees that in many cases no longer provide any advantage on the job market. This is basic economics: more demand (for access to college) leads to higher prices (tuition), while greater supply (of college graduates) leads to declining prices (in postcollege salaries). The fall in market value also corresponds to a decline in the quality of education, as professors often rubber-stamp students through courses, chipping away at the value signal degrees are supposed to convey.
While discourse over higher education tends to center on policy and politics, it is important to first recognize the cultural aspects of this problem. The push to widen access to college education took off after World War II, when tuition costs were modest and a bachelor’s degree in any field was scarce enough to give graduates a career advantage. The devaluation of the degree was gradual, and it largely went unnoticed as long as the advantage of earning a diploma continued to exceed the costs. Even as college enrollment became more accessible, a bachelor’s degree continued to be a ticket from blue- to white-collar lifestyles for many Americans.
Acceptance into a university was not yet a given. Entry was competitive, even for state schools of little prestige, but the rarefied status of being college educated signified middle-class respectability. The degree was as much a status symbol as it was a value signal for employers. Trade skills and manual labor, such as welding or bricklaying, were not to be looked down upon—after all, this was the type of honorable labor the fathers of first-generation students did—but college offered the possibility of upward mobility. The American dream suddenly included not only the suburban home and white picket fence, but also a university diploma.
It is understandable, then, that parents began to encourage their children to “study hard and stay in school,” so they could have the opportunity to attend college. This was, at least for a time, good advice. The blue-collar father worked in the Ford factory so his children wouldn’t have to. “School is your job” became the refrain of the growing group of parents who discouraged their children from getting part-time jobs in high school and college so that they would have nothing distracting them from their studies.
But as first-generation graduates gave way to second- and third-generation students, something started to change. College became an experience. University enrollment swelled with the maturation of baby boomers, many of whom—enmeshed in hippie culture—graduated with more than a diploma. They left with memories of an active social life, full of dalliances, experimentation, and liberation. Most outgrew this lifestyle, and their degree still had market value, but college had become more than an instrument for increasing somebody’s earning potential. They remembered, perhaps too fondly, their youthful excursions and used the prospects of such experiences to encourage their own children to look forward to college.
I once told my advisor that students today predominantly attend college not to learn, but to have sex, do drugs, and watch football. He laughed courteously, but I wasn’t joking. The vast majority of students today are hardly engaged in the educational aspect of college, and while it’s easy to write this off as the immaturity of youth (which is not entirely untrue), it is also the natural outcome of treating college as something more than preparation for the job market.
Many parents today champion the college experience as a way of cultivating good citizens and well-rounded adults, but the reality is that it is largely a euphemism for the extension of adolescence. Middle-class parenting culture developed an almost religious faith that college is inherently valuable, regardless of the income-earning potential of a degree, and college became a goal for its own sake.
The problem wasn’t merely that students weren’t motivated to learn; even for parents and teachers, the questions of whether and what students learned were relegated to positions of secondary importance. And as college increasingly became an appendage of public schools, with states adopting policies mandating automatic admission to state universities for any high school graduate, the mantra of “study hard so you can go to college” lost all meaning. College became an expectation rather than an ambition; it became less about educational and economic outcomes, and more about the “college experience” gained merely by attending.
Little thought was given to the opportunity cost of the college experience. Not all parents embraced the ethos that school was their child’s “job” until they graduated college, but many accepted this logic uncritically. That our children could afford not to work was thought to be one of the luxuries of modern middle-class life, and too many parents failed to consider that work is a form of education in its own right.
When I was an undergraduate student at Marshall University, I befriended a remarkably intelligent young woman whose parents discouraged her from seeking even part-time employment until she finished college. Bright and attractive in both appearance and personality, she graduated with a degree in biology at age twenty-three.
Suddenly thrust into the job market, she had not only to learn how to conduct a job search for the first time in her life, but also to market herself with absolutely zero employment history. By almost any other standard, she should have been tremendously employable, but instead of weighing competing job offers, she was forced to confront the reality that employers are reluctant to hire an applicant with an empty résumé. Even with a STEM degree, this mid-twenties college graduate landed her first job as a part-time employee at Bed Bath & Beyond.
My friend’s story is, sadly, far from unusual. But while it may be easy to dismiss her situation with “she made her bed, and now she has to lie in it,” this fails to recognize the nature of the problem. Students in this situation are not simply making bad choices, as young people often do. They are doing exactly what they have been taught is the wise and responsible choice by their parents, their teachers, their pastors, their neighbors.
There are many factors contributing to the college problem, but I chose not to begin this series with the policy or political issues that dominate the conversation because the cultural factor is the one that people are most often blind to. This is not true for everyone, and much of the interest in the college problem is driven by a growing recognition that the conventional wisdom of the twentieth century doesn’t work for the twenty-first. Nonetheless, these ideals remain powerful, despite the poor outcomes they often lead to, and even the soundest policy reforms will have little effect on cultural failings.
Fortunately, parents and students can address the cultural problems even without political change. The first principle to follow is to treat the cost-benefit analysis for college as an economic decision: How much will it cost financially, how will you pay for it, and what can you reasonably expect in terms of long-term returns on income?
Students who begin at a two-year community college may sacrifice parties and football, but they enjoy better four-year completion rates and lower tuition costs, and they graduate with the same degree as their counterparts who immediately enroll in a state university. The choice of major also matters. Most students, thankfully, choose not to pursue ridiculous degrees such as “feminist studies,” but many long-standing fields of study—including even some STEM degrees—leave students with few employment opportunities.
Perhaps even more importantly, parents should encourage their children to get part-time jobs in high school and continue working while in college. Not only do working students show better completion rates than students who are taught to view school as their job, but the work experience leaves them more prepared for the postgraduation job market. Employment is a form of education in its own right, and it’s proving to be more valuable than the formal education universities offer. Children who are taught to treat college as an economic decision and part-time employment as valuable experience—rather than the reverse—will almost invariably find themselves better prepared for adulthood than most young people are today.
Chris Calton is an economic historian and a former Mises Research Fellow. He was the writer and host of the Historical Controversies podcast.