...I have argued that higher education, properly understood, is distinguished by the absence of a direct and designed relationship between its activities and measurable effects in the world.
This is a very old idea that has received periodic re-formulations. Here is a statement by the philosopher Michael Oakeshott that may stand as a representative example: “There is an important difference between learning which is concerned with the degree of understanding necessary to practice a skill, and learning which is expressly focused upon an enterprise of understanding and explaining.”
Understanding and explaining what? The answer is understanding and explaining anything as long as the exercise is not performed with the purpose of intervening in the social and political crises of the moment, as long, that is, as the activity is not regarded as instrumental – valued for its contribution to something more important than itself...
This ideal, he says ruefully, is on its way out:
Except in a few private wealthy universities (functioning almost as museums), the splendid and supported irrelevance of humanist inquiry for its own sake is already a thing of the past.

He goes on to discuss the argument of a new book, "The Last Professors: The Corporate University and the Fate of the Humanities," by Frank Donoghue. One of Donoghue's main pieces of supporting evidence is the growing use of adjunct and temporary faculty. That may reflect shifting priorities, but I would partly attribute it to Baumol's cost disease: the relative cost of personally delivered services tends to rise over time because their productivity growth is slow. That is, the price of professors relative to cars has risen, because the manufacture of cars requires much less labor than it did a generation or two ago, while teaching requires about the same amount of human effort.
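The Baumol mechanism can be sketched in a few lines of arithmetic. This is a toy model with hypothetical numbers, not an estimate: assume wages in both sectors track productivity growth in the progressive sector (car manufacturing, say 3% per year), while teaching productivity stays flat.

```python
def relative_price(years, mfg_growth=0.03, teach_growth=0.0):
    """Price of a unit of teaching relative to a unit of manufacturing.

    Unit cost = wage / productivity. Because labor is mobile, wages in
    both sectors grow at the manufacturing productivity rate, so the
    unit cost of teaching rises while manufacturing's stays flat.
    """
    wage = (1 + mfg_growth) ** years               # economy-wide wage level
    mfg_cost = wage / (1 + mfg_growth) ** years    # always 1.0
    teach_cost = wage / (1 + teach_growth) ** years
    return teach_cost / mfg_cost

print(round(relative_price(0), 2))   # 1.0 at the start
print(round(relative_price(30), 2))  # ~2.43: after 30 years, teaching
                                     # costs ~2.4x as much relative to cars
```

The point of the sketch is that the rising relative price of instruction requires no change in universities' priorities at all; it falls out of differential productivity growth alone.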
I think the broader problem, what Fish describes as a "shift from a model of education centered in an individual professor who delivers insight and inspiration to a model that begins and ends with the imperative to deliver the information and skills necessary to gain employment," is more a consequence of the increasing college wage premium.
[Figure: the college wage premium over time, swiped from Goldin and Katz.]

Because the gap in earnings between those who hold college degrees and those who don't has grown significantly over the past several decades, young adults (and their parents) have come to regard college as a stepping stone - or obstacle - to "success." It is not surprising that they therefore believe that universities should deliver some sort of "useful" job-specific knowledge.
Unfortunately, many people inside the university seem to make the same error. There are several reasons why they are wrong:
The economics argument would be that college is really a signaling mechanism, as Christopher Caldwell explained in the Times:
But the education kids are rewarded for may not be the same education their parents think they are paying for. Economists would say that a college degree is partly a “signaling” device — it shows not that its holder has learned something but rather that he is the kind of person who could learn something. Colleges sort as much as they teach. Even when they don’t increase a worker’s productivity, they help employers find the most productive workers, and a generic kind of productivity can be demonstrated as effectively in medieval-history as in accounting classes.
Another argument, which I have made before, is that liberal education develops broadly applicable skills, like critical thinking and writing.

But Fish's view is the one that I like best: seeking to understand and interpret the world is an enterprise of intrinsic value. I hope someday the college wage premium will disappear, so we can overcome the confusion about what we are really here for.