“I’M GONNA WASH THAT MAN RIGHT OUTTA MY HAIR,” I sang in a full voice from the back row of a University of Texas lecture hall, over the heads of fifty cringing undergraduates. It was the spring of 1995, and I was the oldest student (by at least five years) in a history course called United States Culture, 1945–Present. That day we had a guest lecturer, an American-studies professor who had produced award-winning books on documentary expression in the 1930s and on postwar Broadway musicals. His lecture was on the importance of the latter. He had just asked the room if any of us knew any Rodgers and Hammerstein numbers. Swept away by the enthusiasm of the moment more than by my affection for Oklahoma! or South Pacific, I raised my hand and sang my reply.
Professor William Stott smiled and held his arms akimbo. He paused. Then responded. “I’m just a girl who can’t say no.” His voice was rich and joyful. We had broken the fourth wall of academic performance protocols; the expert’s lecture had somehow threatened to become a song swap. The already befuddled younger students in the class were now on the verge of horror, as this pair of aged show-tune enthusiasts shared a moment of mutual recognition with passion, confidence, and a complete lack of embarrassment.
That’s when I first began to recognize my calling as a scholar of the humanities—a vocation that these days is steeped in a corrosive identity crisis, seemingly never-ending job insecurities, and no small amount of wider cultural embarrassment. But from my outsider’s perch as a hitherto aimless humanities student rebounding from a dismaying false start to my career in the no-less-precarious field of journalism, this fugitive communion in song persuaded me, on some other-than-conscious level, that there was real joy to be had in the academic calling. And when I met with Professor Stott in his office after class, his ebullient description of his scholarly passions helped me grasp this crucial point more clearly.
As his office swelled with piano music from a cassette player—it was Randy Newman’s alternately brooding and soaring sound track to The Natural, he informed me—I told him I was in the process of pulling the plug on five undistinguished years of daily reporting. I had worked for several Texas newspapers during the early 1990s, a time of existential angst in the media industry, significant layoffs, and the closure of dozens of newspapers around the country. It was a bad time to be just beyond entry level in the newspaper business. I feared being stuck on the three-to-midnight police beat for the rest of what was destined to be a life full of ibuprofen abuse, 1 AM medicinal margaritas, and an unrelenting reporting diet of fatal car crashes and absurd murders.
I explained that I was back in school to figure out how I could learn to write books. I had bigger and different questions in my head than my current writing outlet would accommodate. And while I had no interest in being a professor—it was the family business, and I had been running from it for years—I had also spent weeks making use of the office hours of professors who had written books I admired, like Stott. I needed a road map.
“Why don’t you apply to graduate school in American studies?” Stott suggested. I listed all my excuses. But Randy Newman’s piano seemed to taunt my objections as soon as I voiced them, rendering them harmless; what chance did a mundane litany of half-formed career complaints really stand against the day’s unlikely sound track of ordinary American strivers triumphing against formidable odds? I didn’t know it at the moment, but I had answered the calling.
“The calling” is the term that members of the clergy use to describe their mystical pull to the priestly vocation. And they didn’t enter this line of work after a cost-benefit analysis. They dedicated themselves to thought, reading, prayer, sermons, and service because their entire life trajectory fit the duties and never quite fit anywhere else in society.
This is why I have helped dozens of students over the years avoid going to academic graduate school and a handful pursue it with enthusiasm. Very few students truly feel the calling. Being good at school is the worst possible reason to continue going to school. The decision to forgo years of retirement contributions, steady income, and full social and professional engagement in exchange for seven (or more) years of monastic life should not be taken lightly. But some things, like a calling, are pretty heavy. Most people who start MA programs never get close to the delicious drudgery of dissertation research and writing. And that’s probably for the best. Graduate-school attrition is a feature, not a bug, as they say on the West Coast.
That my calling came under such particular conditions, that it could not apply to anyone else, should make it clear that there is no algorithm or checklist one may hand young people when they consider the path. That I romanticize my story and mention none of the privations, absurdities, and compromises of my graduate education should serve as an appropriately cautionary sign. It’s not for everyone. It’s not valuable because of the status conferred by the credential, or the job at the end of the apprenticeship—which is a rare-to-endangered commodity these days and, even when landed, is almost never as great as advertised. It can be just as valuable an experience for those who abruptly hit the eject button, or gradually give up amid the petty indignities of adjuncting or departmental infighting. But the university life will only really exert a lasting appeal for those who feel truly called to the immersive pursuit of knowledge.
Should I Go to Grad School? makes these points abundantly clear. This charming collection of very personal essays by writers, artists, scholars, and filmmakers who have considered, and in most cases enrolled in, MFA and Ph.D. programs does the great service of offering no generalizable advice. Once you finish the forty or so short pieces, you will certainly like almost all of these interesting and accomplished people. You will remember their funny, self-deprecating lines. You will appreciate their candor about failure. You will note the ubiquity of good luck in the stories by the few tenured scholars who contributed. You might even, as I did here, yearn to spin your own myth about the inspiration and perspiration of graduate school. But if you are considering grad school, you will not find your answer here. And that’s a good thing.
You will find that each happy ending is happy in its own way. You will find that some graduate programs put students into massive debt. Others did not. Some graduate experiences offered fun and brilliant, life-changing conversations. Others did not. Some people benefited from finding out the hard way that academia was not for them. Others found the opposite and followed the calling.
In the United States, and increasingly in the world at large, we tend to reduce the conversation about the value, role, and scope of the scholarly life to how it serves short-term and personal interests like career preparation or job training. Sometimes we discuss higher education as an economic boon, attracting industry to a particular location or employing thousands in a remote town. Or we probe it as an engine of research and innovation. And sometimes we use academia as a tableau for satire or social criticism when we expose the excesses of the lazy and self-indulgent professoriat or giggle at the paper titles at the annual meeting of the Modern Language Association.
But none of these appraisals of the life of the mind gets at the real heart of the matter: the now quaint-sounding notion of the university’s “mission”—the bigger-picture question of what our institutions of higher learning do for and with the world. In sizing up such issues, every account is a vignette. So sometimes the best we can do is assemble the widest array of vignettes and try to maintain proper critical distance.
So here is another vignette—less personal, more theological. Within every great American university, even MIT, there is a monastery. It’s at its core. Sometimes the campus walls and spires make that ancestry undeniable. More often, the stadiums, sweatshirt stores, laboratories, fraternity houses, and career-placement offices mask the monastery. But it’s still there. European universities emerged from the network of monasteries that had accumulated, preserved, copied, and catalogued texts and scrolls over centuries. The transformation from cloistered monastery to slightly less cloistered university occurred in fits and starts over three centuries. But by the eighteenth century, universities throughout Europe were able to converse about this new thing called science and reflect on the meaning and utility of ancient texts that bore new meaning at the dawn of an industrial age.
Early American colleges and universities were likewise religious institutions built to train clergy to serve a sinful people. Soon they took on an additional role: exposing idle sons of the landed gentry such as James Madison and Thomas Jefferson to dangerous books coming over from Europe.
Jefferson and Benjamin Franklin—among others—founded new, ostensibly secular institutions that they naively hoped would civilize slaveholders into a life of enlightened public service based on science and deliberation rather than on superstition and tradition. But they never fully overturned the monastic traditions. Instead, they invented traditions of their own. In the nineteenth century, public land-grant universities taught farmers the latest breakthroughs in agriculture and developed remarkable feats of engineering and analysis; they thus served as catalysts of economic growth and national expansion. After World War II, Americans got the idea that anyone could work her way into college and that higher education could be an engine of social mobility. And, for less than a third of Americans in 2014, the four-year degree has been just that.
Now, however, that promise of mobility appears to have stalled out. Since the economic collapse of 2008, we have encountered tirade after tirade, book after book, lamenting the ways the American university fails to serve society yet succeeds in indulging itself. The university, our cohort of brave new digital pundits tells us, is due for “disruption,” like the music business before it. It has to adopt a new “business model.” It’s “broken”—like everything else that someone does not like.
Almost all of these accounts blame the university or its cultures and traditions and invoke either a personal cost-benefit analysis (should one borrow to pay nearly $240,000 for the honor of attending NYU?) or a societal one (should we starve universities so they abandon the last few non-market-based trappings, such as tenure, academic freedom, high grading standards, and the pursuit of knowledge for its own sake?).
None of these recent rants appreciates the monastery or understands the calling. When they acknowledge such passions or traditions, it’s with derision. These new John Calvins hope to dismantle the cultural status of the academy by exposing its inefficiency and inaccessibility—as if those factors were not at the very heart of the academy. Calvin believed in following one’s calling as well, but that calling had to be useful and active. So like the great Genevan divine who helped (inadvertently) weld the Protestant ethic to the spirit of capitalism, they promise that a personal engagement with the text will liberate Truth from its monastic prison. (Only, in the age of the hotly touted Massive Open Online Course, the direct explication of a text or a discipline is usually reduced to a ten-minute Web video offered through a commercial Silicon Valley start-up with no revenue but ample faith in its own disruptive potential.)
And like Calvin, they claim that salvation and damnation are predestined, and we may only discover how it all unfolds after our fates are sealed. So the academy is predestined to collapse. Because disruption.
Technology pundit and tenured NYU professor Clay Shirky made this Calvinist case in a blog post in January. “There is no longer enough income to support a full-time faculty and provide students a reasonably priced education of acceptable quality at most colleges or universities in this country,” Shirky writes. Things have changed, Shirky explains in the telltale weary passive-voice diction of the true professional fatalist—as if this situation were not the direct result of overt political decisions to shift the burden of paying for the production of knowledge from the society in general to a collection of desperate young adults who have yet to benefit from it.
Shirky considers calls to restore the level of public investment in higher education to even 2001 levels “unpersuasive.” Why? Because he’s not interested in persuasion. That’s too hard and takes too long. Shirky loves a simple story of guilds, castes, inefficiencies, economics, and technology—you know, like the music business. Times have changed. Universities have not.
Except they have. For better or worse, universities seeded the very changes—neoliberal economics, technological innovation, and the education of the Calvinists who now declare a reformation—that Shirky sees as inevitable Oedipal threats to the academy. Agency, contingency, and complexity never figure into the baby-simple analysis of discredited public monopolies of knowledge falling before the spontaneous liberating sway of crowdsourced intellectual inquiry. Predestination, or maybe some oracle, explains everything.
The market surely can’t—and shouldn’t. The richest nation in the history of the world subsidizes all sorts of luxuries and inefficiencies: football stadiums, bridges to nowhere, bases and planes that even the military does not want, churches, temples, cathedrals, and vacation homes. Yet in the present consensus on the future of our higher learning, the notion that perhaps we can afford a reasonable level of public investment in the inefficient institutions that gave us the Green Revolution and Google is deemed unrealistic. The public debate is locked on measurable outputs. But the opportunity costs of failing to reinvest never come up. What is the public expense, for instance, if we continue to gut funding for research on communicable diseases or climate change? How do we measure the cost of failing to inspire and guide the student who might write the next great work of political thought that can guide us safely through the challenges of this century? Why can’t the richest country in the world afford to adequately support passionate potential scholars in the pursuit of their calling? We make explicit value choices in this republic. We have chosen tax breaks over history, poetry, and science. Nothing is inevitable. We can choose otherwise.
When we scholars explain our passions—the deep satisfaction we feel when we help a nineteen-year-old make a connection between the Mahabharata and the Iliad, or when our research challenges the surprising results of some medical experiment that the year before generated unwarranted headlines—many of our listeners roll their eyes, as my fellow students did back in that classroom in 1995. How embarrassing that people find deep value in such uncountable things.
It’s been a couple of decades since any American faculty member could engage in the deep pursuit of knowledge untethered from the clock or calendar. But many of us still write for the guild and the guild only, satisfied that someday someone might find the work a valuable part of a body of knowledge. But if that never happens, so be it—it’s all part of the calling’s steep price of admission.
Meanwhile, out in what’s laughingly known as the real world, self-regarding pragmatist pundits like Nicholas Kristof disagree. In early February, Kristof devoted his New York Times column to what seemed like the millionth elegy for the American public intellectual. Kristof doted mainly on the tired canard that American professors have shirked their duty to make their work accessible and relevant to the public and policy makers.
More than twenty years after Russell Jacoby made the same unfounded, nostalgic claims in The Last Intellectuals, Kristof’s content-free piece sparked the defensive riposte we’ve come to expect from the academic wing of the culture wars. Defenders of the American university as it’s presently configured provided a thorough list of scholars who do, in fact, engage with the public and influence policy.
That’s fair enough, as far as it goes. But it misses the point. There is no single way to be a scholar. Critics can’t acknowledge the diversity of work, roles, talents, and duties in the American academy without dissolving their own rants in the process. And, of course, scholarship itself is all about applying the solvents of depth, history, complexity, and rigor to rants that seem to explain too much, far too easily.
In the past three years, only one of my undergraduate students has asked me to write a letter of recommendation for graduate study. For most of the previous decade I would receive five or more such requests per year and agree to write one or two. I don’t know if my experience is typical or representative. But I know my students tell me that the word is out: Graduate school is for suckers.
Rodgers and Hammerstein’s final musical, The Sound of Music, is the story of a young woman who enters the vocation for the wrong reasons yet still manages to find her path and passion. Maria in 2014 might enroll in a graduate program instead of a convent. She might quit. But she could never solve her own “problem like Maria” without that experience, that place, and that time to think and sing. Maria the graduate-school dropout would still be a graduate-school success. So pardon me while I put this aside for a chorus of “Climb Ev’ry Mountain.”
Siva Vaidhyanathan is the Robertson Professor of Media Studies at the University of Virginia and the author of The Googlization of Everything (and Why We Should Worry) (University of California Press, 2011).