Looking back on too many years of education, I can identify one truly impossible teacher. She cared about me, and my intellectual life, even when I didn’t. Her expectations were high — impossibly so. She was an English teacher. She was also my mother.
When good students turn in an essay, they dream of their instructor returning it to them in exactly the same condition, save for a single word added in the margin of the final page: “Flawless.” This dream came true for me one afternoon in the ninth grade. Of course, I’d heard that genius could show itself at an early age, so I was only slightly taken aback that I had achieved perfection at the tender age of 14. Obviously, I did what any professional writer would do; I hurried off to spread the good news. I didn’t get very far. The first person I told was my mother.
My mother, who is just shy of five feet tall, was normally incredibly soft-spoken, but on the rare occasions when she got angry, she was terrifying. I’m not sure if she was more upset by my hubris or by the fact that my English teacher had let my ego get so out of hand. In any event, my mother and her red pen showed me how deeply flawed a flawless essay could be. At the time, I’m sure she thought she was teaching me about mechanics, transitions, structure, style and voice. But what I learned, and what stuck with me through my time teaching writing at Harvard, was a deeper lesson about the nature of creative criticism.
First off, it hurts. Genuine criticism, the type that leaves an indelible mark on you as a writer, also leaves an existential imprint on you as a person. I’ve heard people say that a writer should never take criticism personally. I say that we should never listen to these people.
Criticism, at its best, is deeply personal, and gets to the heart of why we write the way we do. Perhaps you’re a narcissist who secretly resents your audience. Or an elitist who expects herculean feats of your reader. Or a know-it-all who can’t admit that stylistic repetition is sometimes annoying redundancy. Or a wallflower who hides behind sparklingly meaningless modifiers. Or an affirmation junkie who’s the first to brag about a flawless essay.
Unfortunately, as my mother explained, you can be all of these things at once.
Her red pen had made something painfully clear. To become a better writer, I first had to become a better person. Well before I ever read it, I came to sense the meaning of Walt Whitman’s “Song of Myself.” And I faced the disturbing suggestion that my song was no good.
The intimate nature of genuine criticism implies something about who is able to give it, namely, someone who knows you well enough to show you how your psychic life is getting in the way of good writing. Conveniently, they’re also the people who care enough to see you through the traumatic aftermath of this realization. For me the aftermath took the form of my first, and I hope only, encounter with writer’s block.
It lasted three years.
Franz Kafka once said: “Writing is utter solitude, the descent into the cold abyss of oneself.” My mother’s criticism had shown me that Kafka was right about the cold abyss: when you make the introspective descent that writing requires, you’re not always pleased by what you find. But, in the years that followed, her sustained tutelage suggested that Kafka might be wrong about the solitude. I was lucky enough to find a critic and teacher who was willing to make the journey of writing with me. “It’s a thing of no great difficulty,” according to Plutarch, “to raise objections against another man’s oration, nay, it is a very easy matter; but to produce a better in its place is a work extremely troublesome.” I’m sure I wrote essays in the later years of high school without my mother’s guidance, but I can’t recall them. What I remember, however, is how she took up the “extremely troublesome” work of ongoing criticism.
There are two ways to interpret Plutarch when he suggests that a critic should be able to produce “a better in its place.” In a straightforward sense, he could mean that a critic must be more talented than the artist she critiques. My mother was well covered on this count. (She denies it, but she’s still a much, much better writer than I am.) But perhaps Plutarch is suggesting something slightly different, something a bit closer to Cicero’s claim that one should “criticize by creation, not by finding fault.” Genuine criticism creates a precious opening for an author to become better on his own terms — a process that’s often excruciating, but also almost always meaningful.
My mother said she would help me with my writing, but first I had to help myself. For each assignment, I was to write the best essay I could. Real criticism isn’t meant to find obvious mistakes, so if she found any — the type I could have found on my own — I had to start from scratch. From scratch. Once the essay was “flawless,” she would take an evening to walk me through my errors. That was when true criticism, the type that changed me as a person, began.
She chided me as a pseudo-sophisticate when I included obscure references and professional jargon. She had no patience for brilliant but useless extended metaphors. “Writers can’t bluff their way through ignorance.” That was news to me — I’d need to find another way to structure my daily existence. She trimmed back my flowery language, drew lines through my exclamation marks and argued for the value of understatement. “John,” she almost whispered. I leaned in to hear her: “I can’t hear you when you shout at me.” So I stopped shouting and bluffing, and slowly my writing improved.
Somewhere along the way I set aside my hopes of writing that flawless essay. But perhaps I missed something important in my mother’s lessons about creativity and perfection. Perhaps the point of writing the flawless essay was not to give up, but to never willingly finish. Whitman repeatedly reworked “Song of Myself” between 1855 and 1891. Repeatedly. We do our absolute best with a piece of writing, and come as close as we can to the ideal. And, for the time being, we settle. In critique, however, we are forced to depart, to give up the perfection we thought we had achieved for the chance of being even a little bit better. This is the lesson I took from my mother: If perfection were possible, it wouldn’t be motivating.
John Kaag is an associate professor of philosophy at the University of Massachusetts Lowell and former visiting assistant professor of expository writing at Harvard. He is the author of the forthcoming book “Finding Westwind: A Story of American Philosophy.” And yes, Becky Griffith Kaag, his mother and a former high school English teacher, took her editing pen to this essay.
Draft is a series about the art and craft of writing.
Professor of psychology at Vanderbilt University.
Genius is arguably one of the rarest phenomena, if not the rarest, in the human condition. In Murray’s (2003) compelling analysis of Human Accomplishment, genius describes individuals who generate products that transform humanity. When leaders in the field examine their creative contributions, a frequent response is: “How could a human being have done that?” Because genius is such a rare phenomenon, some have questioned whether it is meaningful to attempt to study it scientifically. Given some estimates suggesting that only about 400 individuals over the past 2,000 years could meet the unassailable criteria across all domains (literature, the military, and science & technology, to name just a few), are there ways to scientifically examine these rare occurrences?
Darrin M. McMahon
Professor of history at Dartmouth and author of Divine Fury: A History of Genius (Basic).
If you’ve been to the movies lately, chances are you’ve learned something about what makes a genius a genius. Benedict Cumberbatch’s stunning depiction of the British mathematician Alan Turing in The Imitation Game and Eddie Redmayne’s no less compelling performance as the physicist Stephen Hawking in The Theory of Everything provide master classes on the particular virtues long believed to set these rare creatures apart.
Both films present men of daunting intelligence as creative visionaries and highly original minds. Turing, in cracking the Nazi Enigma code, conjures computers before there is any such thing; Hawking stares into space to perceive events on the horizon of possibility, where others see nothing at all.
Neuropsychologist and assistant professor of neurosurgery at the University of New Mexico.
In 1984, when I was 20 years old, I felt a burning sensation in my stomach that would not go away. I was a sophomore in college, taking courses in finance and trying to get a young lady in one of my classes to take notice of me. The gnawing in my gut grew worse as it became increasingly evident that her affections were elsewhere. I went to the doctor, dutifully swallowed the pink “barium” milkshake, and was diagnosed with a duodenal ulcer. The doctors told me that this ulcer was due to “stress,” a penchant for spicy food, and various other defects of my character that I must correct, immediately, in order to rid myself of this disease.
I would like to say that I won the heart of that young lady, that I gave up spicy foods, and that I abandoned my stressful ways. None of that happened. Instead, my ulcer gradually dissipated, I finished my degree, and life trundled on. But around the same time, something wonderful was happening.
Rabbi Lord Jonathan Sacks
Global religious leader, philosopher, author and moral voice for our time.
In 1756, Voltaire wrote a sharply anti-Semitic essay on the Jews. They had, he said, contributed nothing to civilization. Their religion was borrowed, their faith superstitious, their originality non-existent. They were “an ignorant and barbarous people.” Still, he added, “we ought not to burn them.”
In the course of the next two centuries, Jews (or individuals of Jewish descent) became pioneers in almost every field of endeavour: Einstein, Bohr, Durkheim, Levi-Strauss, Freud, Adler, Klein, Spinoza, Bergson, Wittgenstein, Mahler, Schoenberg, Heine, Bellow, Agnon. The litany has become a cliché: though they make up less than a fifth of a percent of the world’s population, Jews have won 22 percent of all Nobel prizes.
What led to this efflorescence of genius?
John Steele Gordon
Author and scholar of business and economic history.
As playwright Peter Shaffer brilliantly elucidates in Amadeus, the distance between competence and genius is, paradoxically, at once minuscule and infinite. Antonio Salieri was a good, competent, and popular composer—but Wolfgang Amadeus Mozart was a genius. So Salieri is forgotten; Mozart, immortal.
We will probably never know what makes the difference between competence and genius in the arts. But genius is found in every field of human creativity, and in many, its essence can be perceived.
Rabbi Alon Goshen-Gottstein
Founder and director of The Elijah Interfaith Institute.
All religions recognize there are outstanding individuals, whose spiritual insight and power surpass those of others. These individuals help create, define, drive, reform, and inspire their traditions. To a large extent, they serve as models that others hope to emulate, the bodily manifestation of ideal traditions. What makes these individuals more than the ordinary teacher or the successful practitioner is that they bring something novel to the religious community. They thereby facilitate a regeneration of the tradition and a spiritual renewal in the lives of its adherents.
The religious genius has the capacity to apply intuition and intellect to bring about a new understanding, one that is grounded in awareness of a broader existential dimension that leads to a deep personal transformation. The new understanding offered by the religious genius provides creative and constructive solutions that help solve religious and spiritual problems. A religious genius will accordingly have high positive output, effectively addressing challenges that are fundamental to a tradition or, more universally, to being religious.