I have had a breakthrough in my thoughts on the nature of poetry. To recap, in the last episode of this blog, I stated that over the past twenty years or so, I had somehow decided that unless I really knew what poetry was, I had no business writing it. Despite having taught more poetry than you can shake a spear at, I didn’t feel I could actually define poetry. It couldn’t be just the use of creative language, because that’s used in the best prose; nor could I say it was in the idea of moving the reader to feel a specific emotion, because that’s the motivation behind all different kinds of prose, too. What was left was simply the form of poetry, which meant that a poem is a poem because the person who created it says it’s a poem and delineates its appearance, using line breaks and stanzas, in such a way as to suggest that it is a poem.
That’s fair, of course, but not very satisfying. So I came up with the idea of busting apart the entire idea of genre, and asking if it really matters what we call a piece of writing. Whether it’s prose or poetry, if we feel moved by it, if it elicits a vivid picture or sensation or thought, then it’s good writing. But something in me was left unsatisfied, and so I did what I always do when I have a tricky little intellectual problem: I simply tried to forget about it.
But a few days ago I had an idea about the motivation behind writing poetry. Perhaps, I postulated, that’s what really differentiates a poem from a prose piece: the writer’s motivation. By chance, I was helped along in this line of thinking–about the whole idea of why we write and read poems–by, of all things, a very fine science writer named Ed Yong.
You might remember Yong from his insightful articles on the Covid-19 pandemic, which were published in the Atlantic. I knew Yong to be an excellent writer, so when I saw his book An Immense World: How Animal Senses Reveal the Hidden Realms around Us (2022), I picked it up and read it.
But how does a book on natural science relate to poetry? Bear with me a few minutes and I’ll explain.
Yong’s book is all about the way in which animals’ perceptions are different, sometimes starkly, from our own. It’s also about how human beings have misunderstood and misrepresented the way animals perceive things for millennia because we’re so immured in our own self-contained perceptual world. In other words, by thinking of animals in purely human terms, we limit our view of them. We also limit our view of the world itself. What we perceive, Yong argues throughout the book, determines in large part what we think and how we feel–and, most important of all for my point here, how we process the world we live in.
Yong uses the term “Umwelt” throughout the book to refer to an animal’s perceptual world; the word means “environment” in German but has taken on a new flavor thanks to the scientist Jakob von Uexküll, who first used it in this specific sense in 1909. A dog’s “Umwelt,” then, reflects the way it perceives the world, a world in which different colors are highlighted, scents linger in the air long after their source has moved away, and so on.
So how does this all relate to poetry and why we read and write it? Simply this: I propose that a poem’s primary task is to present an Umwelt for its reader. To do this, the poet creates a piece of writing that closely reflects (if she is lucky) the way she sees the world and presents it to the reader as a gift. If the reader accepts the gift, his reward for reading the poem attentively is being able to glimpse the world afresh through an Umwelt that is different from his own. In other words, the reader gets to see the world, or at least a piece of it, through a different perceptual grid, an experience that can be entertaining, sometimes unsettling, often thought-provoking, and, at its best, revelatory.
Is this different from prose? Perhaps not too much, but I’d argue that the very choice to write a poem instead of an essay, short story, or novel indicates something–I’d say something vitally important–about the writer’s Umwelt. The other forms of writing have messages they want to relay. The poem, however, exists simply to allow its reader to step into its author’s Umwelt for a few moments in order to experience the world differently.
So there you have it. For me, at least for now, discovering why we write poems has given me a new understanding and appreciation of poetry. It means I don’t have to decide whether I like or dislike a poem, nor do I have to justify my reaction to it. Poetry simply is; there’s no more point in arguing whether a poem is good or bad than there is in arguing with my dog Flossie whether her way of experiencing the forest we walk through every morning is better than mine, or whether mine is better than hers. If I got the chance to experience the world through her senses, you can bet I’d take it. Curiosity alone would drive me to it.
At the most basic level, then, I write poetry to demonstrate how I experience the world. I read poetry to discover how other people experience the world. In the end, we read and write poetry to bridge the gap between ourselves and others. It’s about sharing our Umwelten, which, in the end, means it’s all about breaking out of our own little self-contained worlds and joining together to form a bigger, better sense of the world we live in.
The United States is a mess right now. Beset by a corrupt president and his corporate cronies, plagued by a — um — plague, Americans are experiencing an attack on democracy from within. So just how did we get to this point in history?
I’ve given it a bit of thought, and I’ve come up with a theory. Like many theories, it’s built on a certain amount of critical observation and a large degree of personal experience. Marry those things to each other, and you can often explain even the most puzzling enigmas. Here, then, is my stab at explaining how American society became so divisive that agreement on any political topic has become virtually impossible, leaving a vacuum so large and so empty that corruption and the will to power can ensure political victory.
I maintain that this ideological binarism in the United States is caused by two things: prejudice (racism has, in many ways, always determined our political reality), and lack of critical thinking skills (how else could so many people fail to see Trump for what he really is and what he really represents?). Both of these problems result from poor education. For example, prejudice certainly exists in all societies, but the job of a proper education in a free society is to eradicate, or at least to combat, prejudice and flawed beliefs. Similarly, critical thinking skills, while amorphous and hard to define, can be acquired through years of education, whether by conducting experiments in chemistry lab or by explicating Shakespeare’s sonnets. It follows, then, that something must be radically wrong with our educational system for close to half of the population of the United States to be fooled into thinking that Donald Trump can actually be good for this country, much less for the world at large.
In short, there has always been a possibility that a monster like Trump would appear on the political scene. Education should have saved us from having to watch him for the last four years, and the last month in particular, as he tried to dismantle our democracy. Yet it didn’t. So the question we have to ask is this: Where does the failure in education lie?
The trendy answer would be that this failure is a feature, not a bug, in American education, which was always designed to mis-educate the population in order to make it more pliable, more willing to follow demagogues such as Trump. But I’m not satisfied with this answer. It’s too easy, and more important, it doesn’t help us get back on track by addressing the failure (if that’s even possible at this point). So I kept searching for an explanation.
I’ve come up with the following premises. First, the divisions in the country are caused by a lack of shared values–this much is clear. For nearly half the American people, Trump is the apotheosis of greedy egotism, a malignant narcissist who is willing to betray, even to destroy, his country in order to get what he wants, so that he can “win” at the system. For the other half, Trump is a breath of fresh air, a non-politician who was willing to stride into the morass of Washington in order to clean it up and set American business back on its feet. These two factions will never be able to agree–not on the subject of Trump, and very likely, not on any other subject of importance to Americans.
It follows that these two views are irreconcilable precisely because they reflect a dichotomy in values. Values are the intrinsic beliefs that an individual holds about what’s right and wrong; when those beliefs are shared by a large enough group, they become an ethical system. Ethics, the shared sense of right and wrong, seems to be important in a society; as we watch ours disintegrate, we can see that without a sense of ethics, society splinters into factions. Other countries teach ethics as a required subject in high school classes; in the United States, however, only philosophy majors in universities ever take classes on ethics. Most Americans, we might once have said, don’t need such classes, since they experience their ethics every day. If that ever was true, it certainly isn’t so any more.
Yet I would argue that Americans used to have an ethical belief system. We certainly didn’t live up to it, and it was flawed in many ways, but it did exist, and that’s very different from having no ethical system at all. It makes sense to postulate that some time back around the turn of the 21st century, ethics began to disappear from society. I’m not saying that people became unethical, but rather that ethics ceased to matter, and as it faded away, it ceased to exist as a kind of social glue that could hold Americans together.
I think I know how this happened, but be warned: my view is pretty far-fetched. Here goes. Back in the 1970s and 1980s, literary theory poached upon the realm of philosophy, resulting in a collection of theories that insisted a literary text could be read in any number of ways, and that no single reading of a text was the authoritative one. This kind of reading and interpretation amounted to an attack on the authority of the writer and the dominant ideology that produced him or her, as it destabilized the way texts were written, read, and understood. I now see that just as the text became destabilized with this new way of reading, so did everything else. In other words, if an English professor could argue that Shakespeare didn’t belong in the literary canon any longer, that all texts are equally valid and valuable (I’ve argued this myself at times), the result is an attack not only on authority (which was the intention), but also on communality, by which I mean society’s shared sense of what it values, whether it’s Hamlet or Gilligan’s Island. This splintering of values was exacerbated by the advent of cable television and internet music sources; no one was watching or listening to the same things any more, and it became increasingly harder to find any shared ideological place to begin discussions. In other words, the flip side of diversity and multiplicity–noble goals in and of themselves–is a dark one, and now, forty years on, we are witnessing the social danger inherent in dismantling not only the canon, but any system of judgment to assess its contents as well.
Here’s a personal illustration. A couple of years ago, I taught a college Shakespeare class, and on a whim I asked my students to help me define characters from Coriolanus using Dungeons and Dragons character alignment patterns. It was the kind of exercise that would have been a smashing success in my earlier teaching career, the very thing that garnered me three teaching awards within five years. But this time it didn’t work. No one was watching the same television shows, reading the same books, or remembering the same historical events, and so there was no way to come up with good examples that worked for the entire class to illustrate character types. I began to see then that a splintered society might be freeing, but at what cost if we had ceased to be able to communicate effectively?
It’s not a huge leap to get from that Shakespeare class to the fragmentation of a political ideology that leaves, in the wreckage it’s produced, the door wide open to oligarchy, kleptocracy, and fascism. There are doubtless many things to blame, but surely one of them is the kind of socially irresponsible literary theory that we played around with back in the 1980s. I distinctly remember one theorist saying something to the effect that no one has ever been shot for being a deconstructionist, and while that may be true, it is not to say that deconstructionist theory, or any kind of theory that regards its work as mere play, is safe for the society it inhabits. Indeed, we may well be witnessing how very dangerous unprincipled theoretical play can turn out to be, even decades after it has held sway.
I’ll be honest: for a moment I thought about entitling this post “Reflections on Re-reading the Iliad,” but aside from the fact that such a title sounds very dull, I will admit that I’m not sure I ever did read that pillar of Western Literature in college. Of course, like most other people, I’d heard of it. I’m old enough to have gotten my first and greatest dose of mythology–Greek, Roman, and a small bit of Norse myth–from Edith Hamilton’s Mythology almost fifty years ago, back when I was in high school.
To be honest, I’ve always wondered why American schools even bother to teach mythology. For a long time, I thought it was just to provide an introduction to the basis of Western culture, but then I realized, with a shock, that mythology in high-school English curricula actually had no point; rather, it was an oversight, a leftover from previous educational imperatives. Our insistence on teaching mythology to bored high school students, in other words, is something like having an appendix in our guts: there is no real purpose for it. While it once did have a function, it now simply dangles there without any reason for existing.
Here’s my version of why we have mythology in high school. It certainly isn’t for students to become acquainted with stories of Greek and Roman gods and goddesses. After all, these stories are brimming with violence and sex, and are totally unsuitable for young learners. How do you explain the rape of Leda by Zeus–in the shape of a swan, no less–to high school students? Yet this is where the Trojan War, and the Iliad, really begins, as William Butler Yeats reminds us in his masterful poem “Leda and the Swan.” No, the reason we teach such things is because they were once vehicles for learning Latin and Greek. All language learners know that it’s no fun simply to do exercise after exercise when you’re trying to acquire a second language; you want to get to stories and dialogues, no matter how puerile or simplistic. (Incidentally, the language-learning computer platform Duolingo has figured this out and now provides an entire block of lessons with short stories to keep its learners interested. It’s worked for me.) Since a truly educated person, from the Renaissance to the early twentieth century, needed to know at least some Latin and less Greek (as the poet Ben Jonson rated Shakespeare’s knowledge), schools were obsessed with drumming classical languages into recalcitrant students’ heads. What better way to get them to learn than to present them with violent, prurient tales of heroes and heroines? For generations, apparently, the scheme worked. But gradually the need and desire to showcase one’s Latin and Greek knowledge wore off, and these languages ceased to be taught in schools.
But the mythology remained. And thank goodness it did.
A few years ago, a friend of mine and I decided to read the then newly published translation of The Iliad by Caroline Alexander. We never got past the first few books then, but Covidtimes provided us a new opportunity, and we started over. I began by being less than impressed with the story, but I have to admit that now I am pretty much hooked. The world that it presents is violent and nasty, but there are some moments of real beauty, too.
Yet what has really caught my attention is that the world of the Iliad is totally random. Things happen for no reason, or for reasons well beyond the control of the humans involved. You may think you’re winning a battle, but then a god shows up, sometimes disguised as a human, sometimes in a fog, and things go to hell in a handbasket quickly, and suddenly you’re terrified and hiding by your ships wondering if you should push off for home. Events kaleidoscope by and you can’t do anything about them, because even if you do take action, often it has the opposite effect you intend.
In other words, life as represented in The Iliad is something like life in a pandemic. Covid seems to hit randomly, and to hurt randomly. We don’t know why some people are barely affected by the virus while others are struck down, killed or incapacitated by it. We don’t know how long the pandemic will last. We don’t know what steps the government will take to protect us from it. We are like the characters in the Iliad, taking action in good faith but knowing in our bones that anything can happen.
Nowhere is this brought out more poignantly than in a relatively insignificant scene in Book 8, which takes place in the middle of a raging battle. The Trojan Paris shoots an arrow at the Greek Nestor, which hits the seasoned warrior’s horse in the head, “where a horse’s forelock / grows on the skull, and where is most fatal” (lines 84-85). Then something truly odd happens; the narrative perspective changes and instead of watching sweeping actions–men swinging swords and throwing spears, horses stamping over bodies, chariots careening and crashing about–suddenly we are watching a single arrow as it plunges into a horse’s head. We watch, transfixed, as the arrow skewers the poor horse, who in his death agony “flung the horses with him into panic as he writhed around the arrow point” (line 87). We go from big action (battle), to smaller actions (arrow shooting), to an even smaller action (the arrow penetrating the horse’s brain). The center of focus has contracted to the tiny tip of an arrow, and we, like the horse itself, are flung around this arrow, orbiting it just as the earth orbits the sun. We have changed our perspective, from large heroic actions taken by men, to a single arrow around which a horse rotates. It’s as if we’re inhabiting a kaleidoscope, living on the inside of it, subject to its twists and turns at any moment. The effect on the reader is disorienting, just as it is meant to be, because it reinforces the sense that events in the story are random, uncontrollable, and largely unpredictable, while at the same time suggesting that mere perspective determines our allegiance and our ideology.
This is why I find reading The Iliad right now so very meaningful. This is a poem that was written ages before the Enlightenment values of logic, continuity, causality–in short, Reason–had been adopted in Western culture. These values are being tested right now in our daily lives, and reading this ancient epic reinforces the sense that values come and go, that worldviews shift and change, and that our sense of primacy is, and should be, rather fragile. If there is anything that Covid-19 has taught us, it is that, at least in the short run, we are all at the mercy of the gods, whoever and whatever those gods may be, and we must, like Odysseus, Agamemnon, Hector, and Achilles, simply get along as best we can in the face of a world we cannot control, even if we desperately want to believe we can.
As it happens, recent research suggests that the appendix does, in fact, have a function: to protect and nurture healthy bacteria until they are needed in the gut. Perhaps teaching mythology serves a similar purpose; perhaps, appendix-like, it preserves and protects various ideas, attitudes, and perspectives that, while outmoded and seemingly unnecessary in modern life, can provide us some kind of insight in difficult times. At any rate, reading The Iliad has certainly given me food for thought these past few weeks.
Most of us who have taken (or, as the case may be, taught) literature classes understand that stories are made up of three components: plot (what happens); setting (when and where it happens); and characters (whom it happens to). And what makes the study of literature so fascinating to us is that these things aren’t present in equal amounts. Picture a series of knobs, like those on a complex sound system. Say you slide the plot knob way high, turn down the setting knob, and leave the character knob in the middle region. This configuration might describe a detective novel, in which what happens (plot) is of paramount importance. But if it’s Sherlock Holmes stories that you like, then the knob settings will be different, because it isn’t their compelling plots that draw you in, but rather the unique character of Holmes himself, or the foggy, turn-of-the-century setting of London, with its hansom cabs, gas lighting, and general ambiance. A book’s literary mix, in other words, can reflect a variety of combinations of plot + setting + character.
Certain writers tend to excel at one or another of these three elements. (Of course, there are more elements of story out there beyond plot, character, and setting; for example, I haven’t discussed “voice,” the teller of the story, and there may be some elements I haven’t thought of or read about. But for the purposes of this blog post, we can just focus on the standard three components of story.) To illustrate my point, I’ll just say that Thomas Hardy, who created an entire English county (Wessex) for his novels, is great with setting, that Agatha Christie is ingenious as far as plot goes, and that Jane Austen produced amazing characters. Some writers are wonderful at two of these, but fail with the third. For example, Charlotte Bronte is great with setting and characters, but her plots are pretty much bat-shit crazy. (I still love her works, by the way.) A few highly talented writers, like Charles Dickens, manage to work all three elements in equal portions. But for today, I’d like to talk about stories in which nothing much happens, those novels which are virtually plot-less, and why they can be a source of comfort and entertainment to readers today.
I am now going to alienate half of my readers (sorry to both of you!) by saying that I place Jane Austen squarely into this category. But just think about it: not a whole lot happens in Pride and Prejudice. I mean, the only really exciting part of the novel I can remember (and I’ve read it many times) is when Lydia elopes with Wickham. And that scandalous event doesn’t even happen to the main character. That’s not all: to be honest, I cannot even remember the plot of Sense and Sensibility, which suggests that it scarcely has one. But that’s okay–Jane Austen isn’t about plot. If you want excitement and adventure, don’t read Austen. Read Sir Walter Scott instead. But be advised: Walter Scott himself, author of Ivanhoe and Waverley, those early, action-packed adventure novels so beloved by the Victorians, openly admired the newfangled work of Jane Austen, his opposite in so many ways, as he clearly indicated in an unsigned review of her book Emma. As far as nineteenth-century English writers go, Austen is not the only plot-eschewing literary giant, either; if you’ve ever read an Anthony Trollope novel, you’ll know that few dramatic scenes ever occur in his novels. In fact, when something dramatic does happen, it often occurs offstage, leaving the characters to deal with the effects of momentous and emotional events without ever allowing the reader to witness them herself.
Now this type of novel might be dull and frustrating for most readers, but I will admit that I take great pleasure in books in which very little happens, especially nowadays, when I must brace myself anytime I dare to look at news headlines, with crisis after crisis occurring at breakneck speed. Thankfully, in the world of literature, there is a whole category of books in which minimal plots highlight setting or characters, or both, in order to produce a delightful and soothing reading experience. I will share some of these works below, with the express intention of spurring my readers to make their own suggestions in the comments section, thereby helping me find more of these little treasures to place on my personal reading list.
First, there are the Mapp and Lucia novels of E.F. Benson. I am a late-comer to these books, having just finished the first in the series, Queen Lucia, in which nothing really happens beyond village residents in early twentieth-century England trying to one-up each other and claim dominance within their social circle. The very pettiness of these maneuvers is highly entertaining, however, and the characters are drawn well. The writing is as precise as a well-built chronometer, with an Austenian feel to it. Earlier this year, I attempted to listen to Mapp and Lucia, which was a mistake, I think; I stopped listening because it was too acerbic. I think that with Queen Lucia under my belt, I will be much more appreciative of the sharp wit with which Benson portrays a character that not even he likes that much. (Sidenote: Agatha Christie wrote a book called Absent in the Spring, under the name Mary Westmacott, in which she also created a very unlikable character. It’s worth reading, but very different from her usual detective novels.)
Another novel quite similar to Benson’s work is D.E. Stevenson’s Vittoria Cottage. Stevenson was a first cousin of Robert Louis Stevenson, author of swashbuckling novels like Kidnapped and Treasure Island, but she specialized in what was termed “light” fiction. Now, I’m not taking anything away from Robert Louis, but I believe it takes real talent to write about the trivial; as Hamlet says, “There’s a special providence in the fall of a sparrow” (V.ii). D.E. Stevenson possesses this talent, and it is a delight to delve into the world she has created, in which nothing happens, and little seems to change.
The Kindle version of Vittoria Cottage has an introduction by Alexander McCall Smith, which is highly appropriate, since Smith’s works offer an excellent contemporary example of the minimally plotted novel and fit precisely into the category I’ve identified here. Sure, the Sunday Philosophy Club books are detective stories, but they are the subtlest mysteries imaginable. One could say the same thing about the Number One Ladies’ Detective Agency series; we don’t read them for plot, but rather for the delightful characters they introduce, such as Precious Ramotswe and Grace Makutsi, as well as for the simply drawn but well-evoked setting of Botswana. Smith’s 44 Scotland Street books have more plot, but only because they depend on coincidence and absurdity to move their stories forward. I could sum it up this way: in Smith’s novels, there is scarcely any climax, but instead a gentle descent to the concluding pages. And far from condemning or critiquing such a structure, I will praise it here, in an attempt to celebrate these minimally plotted novels that allow us to focus on, and take delight in, both setting and character instead of plot.
Now, readers, it’s up to you: do you have any suggestions for books of this type? I look forward to more discoveries.
Legacy’s Lady Camilla, fresh from her Crufts win in March, 2019.
You may notice, if you are a regular reader of this blog, that I have posted much less frequently in the last few months. The reason is this: I have taken some time to stop writing and really think about what writing does, what it can do, and what it should do. In other words, I have taken a self-imposed sabbatical from writing while I contemplate the job of writing, and, more personally, how–and even if–I want to continue writing at all.
I have sorted some ideas out in my head, and I’m beginning to get to a point where things are making a bit more sense than they did a few months ago. One thing that galvanized me was an experience I had with a good friend, an experienced writer who kindly volunteered to help me with a short story I was working on. He gave me some excellent advice on how to make the story better: more polished, more focused, and ultimately more ready for publication. I could tell that his advice was spot on. I knew that he was right about the changes he suggested. And yet, almost as soon as I heard his suggestions, I also knew I would not take his advice. Despite knowing that he was right about these suggested improvements, I could not bring myself to make them.
Now, every writer knows that you are supposed to “kill your darlings”: writers should never get so attached to their work that they are not willing to chop it all up in order to mix it back together, or even trash it and begin anew if necessary. I knew that my story wasn’t perfect, so my reason for not making those changes wasn’t that I thought it was good enough as it was. At the time, I didn’t know why I resisted my friend’s excellent advice. In fact, as it turned out, I had to think a good, long time before I could discover why I had had such a profound and visceral reluctance to tinker with it. And now, some three months later, I think I have found the answer.
But in order to explain it, I have to refer to a world that is far removed from writing. My husband shows dogs (you can find his website here), and over the years we have noted something interesting about the way kennel clubs and dog shows operate. The winning dogs all correspond closely to a perceived (but not fully agreed upon) standard. No surprise here: this is, of course, to be expected. The dog who looks closest to the standard, in the judge’s opinion, is the dog that takes home the trophy. Of course, the key words are “in the judge’s opinion”: there can be a wide variety of opinions, which is why different dogs might win in different shows with different judges. Yet it is the corollary to this rule which is most interesting, and most troubling for the future of all pedigree show dogs. If dogs are penalized for deviating from the norm, then that inherently means all the show winners must look more alike–must be more alike–than different. And because it is largely the show dogs that are responsible for propagating the breed, it naturally follows that genetic diversity is always shrinking, because of this desire to create a puppy that follows the breed standard to a tee. In other words, the very act of judging the dog makes it so that all people participating in showing will want their dog to look just like the “ideal” dog–the breed standard–and they will take great pains to make sure that the puppies they produce, and sell, and buy, will be more similar to this perceived ideal than different from it. (This has dire consequences for the sustainability, and even the survivability, of pedigree dogs, but that is a matter for other blogs.)
It’s human nature to want to win, whether in dog shows (where surely the dogs don’t care if they are Best of Show) or in the world of writing, which we know as publishing. Publishers–and by extension, readers–are like the dog show judges: they are looking for the best combination of words and anecdotes to hit a home run in the marketplace. They have an ideal standard, which is why all the fiction published in literary journals and The New Yorker ends up feeling the same over time. In other words, what is published will always come to look a great deal more like everything else that is published than it will look like something individual and unique.
And so, being “good,” in the sense of getting published, means that a writer may have to close off options that will divert his or her work into an unfamiliar, perhaps even an uncomfortable, form. It could mean that a writer has to compromise on whatever artistic integrity he or she has developed, getting rid of archaic words, semicolons, and–yes–even adverbs in favor of a more widely accepted style of writing. In short, it means that a writer might have to second-guess his or her own writerly instincts in order to fit into a “breed standard” that is instantly recognizable and appreciated by publishers, readers, and critics alike.
I am not saying that writers should write what they want and not worry about revision. Nor am I saying that all writing is good writing. I am just saying that with the market set up as it is today, it could be very easy to miss unique and talented writing in favor of writing that resembles what we’ve already seen. The danger in this situation is that we may, tragically, fail to recognize authentic writing, and worse still, fail to cultivate writers who strive for authenticity.
It’s time for another clarification. I remember the first time I ever heard the expression, “I’d rather be lucky than good.” I recall the momentary surprise I felt when I thought about it and then realized it actually made sense. One could be as good as possible, only to be in the wrong place at the wrong time. Luck, as Malcolm Gladwell points out in his book Outliers, is often an integral component of success. I want to offer a variation of this saying for writers–or rather, for those writers who are serious about exploring the world of the imagination, about the craft of writing (whatever that may be), about creating something that is meaningful rather than successful. Here goes:
It is better to be authentic than to be good.
I’ve come to this maxim by thinking about the novels I love, those books that I have re-read throughout a half-century of reading: Jane Eyre, David Copperfield, The Sun Also Rises, Never Let Me Go, Till We Have Faces, Mrs. Dalloway–and the list goes on. These books are not perfect. In some places, they are not even good. It is not difficult to find passages in any of them (with the possible exception of Never Let Me Go) that make readers cringe with frustration and/or embarrassment for the author. But each one of these novels is, to a certain degree, memorable and authentic, which is why I am compelled to read them again and again throughout the years.
Certainly the term “authentic” is fraught, and I will need to define what I mean by it. I will try to do this in a timely manner; readers may look forward to a subsequent post in which I take a stab at my definition of authenticity in writing. But for now, I simply pave the way for that post by explaining why I am resisting the idea of writing good stories for the time being, even if that means rejecting the advice of a talented and well-meaning friend. And I invite my readers to weigh in on this topic, half-formed though it is at the present time, as I try to figure out just what it means for writing to be authentic.
Please note: This is a very long post. It is based on a talk I gave yesterday (October 28, 2017) at the C.S. Lewis Festival in Petoskey, Michigan. Consider yourself warned!
The study of myth seems to me to take three different paths:
Anthropological / Archeological: the study of classical mythologies (Bulfinch’s Mythology, Edith Hamilton)
Religious / Transcendent: the spiritual meaning of myth (Karen Armstrong, Joseph Campbell, Sigmund Freud)
Structuralist: the study of structures that recur across myths (Northrop Frye, Joseph Campbell, Roland Barthes)
This is all interesting, but I would like to back up a moment. I feel like I’ve arrived at a dinner party and somehow missed the first two courses. I feel as if I might get some kind of mental indigestion if I don’t start over at the very beginning.
The fact is, I want to know something more fundamental about myth and its function.
I want to know what it is and how it works.
Specifically, I want to know what distinguishes myth from other forms of story-telling.
Because for me, Story-Telling is what distinguishes human beings, homo sapiens, from all other species on this planet, as far as we know.
Studies have shown that crows have memories
Studies have shown that chimpanzees use tools
Philosophers are now beginning to agree that animals do indeed have consciousness
But we—we should be known not as homo sapiens (wise man, the man who knows), but as homo narrans—the speaking man, the man who tells, who narrates—story-telling man. Because it is clear to me that we humans communicate largely through story-telling, and this story-telling function, this tendency to rely on narration, is what makes us human.
I’m going to ask you to bear with me for a little while as I tease this out. I’d like to say that by the end of this essay, I’ll have some answers to the questions I posed (what is myth, and how does it work, and what is the difference between a really good story and a myth)—but I’m pretty sure I won’t. I may, however, ask some more questions that might eventually lead me to some answers.
So here goes. To begin with, a few people who weigh in on what myth is and what it does:
Roland Barthes, the French post-structuralist literary theorist, says that myth is a type of speech, a system of communication, a kind of message. In a way, Barthes and JRR Tolkien are not really different on this point, incredible as it is to think of Barthes and Tolkien agreeing on anything at all, much less something so important to each of them.
They are both incredibly passionate about and devoted to the concept of language
Barthes, in his book Mythologies, which I have shamelessly cherry-picked for this essay, says that the object of a myth’s message is not really important; it is the way in which the myth conveys that message that is important.
He says that “the knowledge contained in a mythical concept is confused, made of yielding, shapeless associations” (119).
But this isn’t as bad as it sounds, because myths actually don’t need to be deciphered or interpreted.
While they may work with “Poor, incomplete images” (127), they actually do their work incredibly efficiently. Myth, he says, gives to its story “a natural and eternal justification…a clarity which is not that of an explanation but that of a statement of fact” (143).
Myth is a story in its simple, pure form. “It acts economically: it abolishes the complexity of human acts, it gives them the simplicity of essences…” (143).
You can see how this view of myth kind of works with the myth-building that Tolkien does in The Lord of the Rings, which operates with simple efficiency even though its very images are incomplete to the point of needing clarification in Appendices and further books like The Silmarillion. Yet even without having read these appendices and other books, we grasp what Tolkien is getting at. We know what Middle-Earth is like, because the myth that Tolkien presents needs no deciphering, no real interpretation for us to grasp its significance.
Tolkien, I think we can all agree, was successful in creating a myth specifically for England, as Jane Chance and many other scholars have now shown to be his intention. But is it a novel? Some might argue it isn’t—myself included. In fact, what Tolkien created in The Lord of the Rings is less a myth (I would argue that we only use that term because Tolkien himself used it to describe his work and his object—think of the poem “Mythopoeia,” which he dedicated to C.S. Lewis) than it is a full-blown epic.
For my definition of epic versus novel, I’m going to my personal literary hero, Mikhail Bakhtin, a great thinker, a marvelous student of literature, a man who wrote with virtually no audience at all for many years because he was sent into internal exile in the Soviet Union. In his essay “Epic and the Novel,” Bakhtin attributes these characteristics to epic:
It deals with an absolute past, where there is little resemblance to the present;
It is invested with national tradition, not personal experience, arousing something like piety;
There is an absolute, unbridgeable distance between the created world of epic and the real world.
The novel, says Bakhtin, is quite the opposite. It is new, changing, and it constantly “comes into contact with the spontaneity of the inconclusive present; this is what keeps the genre from congealing. The novelist is drawn toward everything that is not yet completed” (27).
I think the three characteristics of epic described by Bakhtin do in fact match up nicely with The Lord of the Rings: absolute past, national tradition, distance between the actual and the created world. But here’s another thing about epic as described by Bakhtin: “The epic world knows only a single and unified world view, obligatory and indubitably true for heroes as well as for authors and audiences” (35). It would be hard, indeed impossible, to imagine The Lord of the Rings told from a different point of view. We need that distant narrator, who becomes more distant as the book goes on. As an example, imagine The Lord of the Rings told from Saruman’s point of view, or from Gollum’s. Or even from Bilbo or Frodo’s point of view. Impossible! Of course, we share some of the point of view of various characters at various points in the narrative (I’m thinking specifically of Sam’s point of view during the Cirith Ungol episode), but it couldn’t be sustained for the whole of the trilogy.
The interesting thing here is that in The Lord of the Rings, Tolkien took the novel form and invested it with epic. And I think we can say that against all odds, he was successful. On the other hand, C.S. Lewis, in his last book Till We Have Faces, took a myth (the story of Cupid and Psyche), which is certainly more closely related to epic than it is to novel, and turned it into a successful novel. This isn’t the time and place to talk about Till We Have Faces, although I hope someday that we can come together in the C.S. Lewis Festival to do that very thing, but I couldn’t help mentioning this, because it’s striking that Lewis and Tolkien, while they clearly fed off each other intellectually and creatively, started from opposite ends in writing their greatest creative works, as they did in so many other things. It’s almost amazing that you can love both of them at the same time, but of course you can. It’s the easiest thing in the world to do.
But I’m losing the thread of my questions here. What is myth? Can we actually have modern myths? Can someone actually set out with the intention of creating a myth? And can a mythic work spontaneously just happen? Another question needs to be posed here: if this long book, which is probably classified in every bookstore and library as a novel, touches on myth but is really an epic, can a novel, as we know it, become a myth? This forces us to tighten up our definition of what a myth is and asks us to think about what myth does.
Karen Armstrong, I think, would say yes to all of these questions. In her book A Short History of Myth, Armstrong follows the trajectory of myths through time and argues that the advent of printing and widespread literacy changed how we perceive and how we receive myth. These developments changed myth’s object and its function—and ultimately, they changed the very essence of myth.
Armstrong points out that myths and novels have similarities:
They are both meditative
They can both be transformative
They both take a person into another world for a significant period of time
They both suspend our disbelief
They break the barriers of time and space
They both teach compassion
Inspired by Armstrong and by Bakhtin, I’m going to go out on a limb here and make a stab at answering my questions. And I’ll start by defining a modern myth as a super-story of a kind: a novel (or a film, because let’s open this up to different kinds of story-telling) that exerts its power on a significant number of people. These stories then provide, in film professor and writer Stuart Voytilla’s words, “the guiding images of our lives.”
In short, a modern myth has these characteristics:
It belongs to a certain place and time. Like epic, it is rooted in a time and a place. It might not be far removed from the actual, but it cannot be reached from the actual.
It unites a group of readers, often a generation of readers, by presenting an important image that they recognize.
It unites a group of readers by fostering a similar reaction among them.
It contains identifiable elements that are meaningful to its readers/viewers. Among these might be important messages (“the little guy can win after all,” “there’s no place like home,” “the American Dream has become a nightmare”).
In other words, a mythic story can be made intentionally, as Star Wars was by George Lucas after he considered the work of Joseph Campbell; or it can happen accidentally. Surely every writer dreams of writing a mythic novel—the Great American novel—but it’s more or less an accident. The Adventures of Huckleberry Finn was a mythic novel of America, until it was displaced by To Kill a Mockingbird. And I would note here that having your novel go mythic (as we might term it—it is, in a way, like “going viral,” except mythic stories tend to last longer than viral ones) is not really such a good thing after all. Look at Harper Lee—one mythic novel, and that was the end of her artistic output—as far as we know. A mythic novel might just be the last thing a great writer ever writes.
Anyway, back to our subject: a modern myth gets adopted rather than created. Great myths are not made; they become. So let’s think of a few mythic novels and see how they line up with my four characteristics:
Frankenstein
Star Wars
The Wizard of Oz
The Great Gatsby or Death of a Salesman—take your pick.
Casablanca
The Case of Local Myths—family or friend myths, references you might make to certain films or novels that only a small number of people might understand. A case in point would be the re-enactments of The Rocky Horror Picture Show that take place each year around Halloween.
In essence, my answer, such as it is, to the questions I posed earlier comes down to this:
Modern myths are important stories that unite their readers or viewers with similar emotional and intellectual reactions. Modern mythology works by presenting recognizable and significant images that unite the people who read or view them. As for what distinguishes modern myths from other forms of story-telling, what tips a “normal” novel or film over into the realm of “mythic”—I don’t have an answer for this. I only have a couple of vague, unformed theories. One of my theories is this: Could one difference between myth and the novel (“mere” story-telling as such) be that myth allows the reader/listener to stay inside the story, while the novel pushes the reader back out, to return to the actual world, however reluctantly?
And let’s not forget what Karen Armstrong wrote about myth: “It has been writers and artists, rather than religious leaders, who have stepped into the vacuum [created by the loss of religious certainty and despair created by modernism] and attempted to reacquaint us with the mythological wisdom of the past” (138). Armstrong’s closing sentence is perhaps the most important one in the book: “If professional religious leaders cannot instruct us in mythical lore, our artists and creative writers can perhaps step into this priestly role and bring fresh insight to our lost and damaged world” (149). With this in mind, perhaps it’s time to go and read some more, and find more myths that can help us repair and restore ourselves, our faith in our culture, and in doing so, the world itself.
Several months ago, I had what seemed like a fantastic idea: now that I was retired from teaching English at a community college, I could engage in critical research, something I’d missed during those years when I taught five or more classes a semester. I had managed to write a couple of critical articles in the last few years of my tenure at a small, rural two-year college in Northern Michigan, but it was difficult, not only because of the heavy demands of teaching, but also because I had very limited access to scholarly resources. Indeed, it is largely due to very generous former students who had moved on to major research institutions that I was able to engage in any kind of scholarly research, a situation which may seem ironic to some readers, but which is really just closing the loop of teacher and student in a fitting and natural way.
And so last fall, on the suggestion of a former student, I decided to throw my hat in the ring and apply to a scholarly conference on Dickens, and my proposal was chosen. In time, I wrote my paper (on Dickens and music–specifically, on two downtrodden characters who play the flute and clarinet in David Copperfield and Little Dorrit, respectively) and prepared for my part in the conference.
It had been close to 25 years since I had read a paper at a conference, and so I was understandably nervous. Back then, there was no internet to search for information about conference presentations, but now I was able to do my homework, and thus I found a piece of advice that made a lot of sense: remember, the article emphasized, that a conference paper is an opportunity to test out ideas, to play with them in the presence of others, and to learn how other scholars respond to them, rather than a place to read a paper, an article, or a section of a book out loud before a bored audience. Having taught public speaking for over a decade, I could see that this made a lot of sense: scholarly articles and papers are not adapted to oral presentations, since they are composed of complex ideas buttressed by a great many references to support their assertions. To read such a work to an audience seemed to me, once I reflected on it, a ridiculous proposition, and would surely bore not only the audience, but any self-respecting speaker as well.
I wrote my paper accordingly. I kept it under the fifteen-minute limit that the moderator practically begged the panelists to adhere to in a pre-conference email. I made sure I had amusing anecdotes and witty bon mots. I concocted a clever PowerPoint presentation to go with the paper, just in case my audience got bored with the ideas I was trying out. I triple-spaced my copy of the essay, and I–the queen of eye contact, as my former speech students can attest–I practiced it just enough to become familiar with my own words, but not so much that I would become complacent with them and confuse myself by ad-libbing too freely. In short, I arrived at the conference with a bit of nervousness, but with the feeling that I had prepared myself for the ordeal, and that my paper would meet with amused interest and perhaps even some admiration.
It was not exactly a disaster, but it was certainly not a success.
To be honest, I consider it a failure.
It wasn’t that the paper was bad. In fact, I was satisfied with the way I presented it. But my audience didn’t know what to do with my presentation. This might be because it was very short compared to all the other presentations (silly me, to think that academics would actually follow explicit directions!). Or it could be because it wasn’t quite as scholarly as the other papers. After all, my presentation hadn’t been published in a journal; it was, as C.S. Lewis might have called it, much more of a “supposal” than a fully-fledged argument. Perhaps as well there was something ironic in my stance, as if I somehow communicated my feeling that research in the humanities is a kind of glorified rabbit hunt that is fun while it lasts but that rarely leads to any tangible, life-changing moment of revelation.
Yet this is not to say that humanities research is useless. It isn’t. It develops and hones all sorts of wonderful talents that enrich the lives of those who engage in it and those who merely dip into it from time to time. I believe in the value of interpreting books and arguing about those interpretations; in fact, I believe that engaging in such discussions can draw human beings together as nothing else can, even at the very moments when we argue most fiercely about competing and contrasting interpretations. This is something, as Mark Slouka points out in his magnificent essay “Dehumanized,” that STEM fields cannot do, no matter how much administrators and government officials laud them, pandering to them with ever-increasing budgets at the expense of the humanities.
And this is, ultimately, why I left the conference depressed and disappointed. I had created, in the years since I’d left academia, an idealized image of it that was inclusive, one that recognized its own innate absurdity. In other words, sometime in the last two decades, I had recognized that research in the humanities was valuable not because it produced any particular thing, but because it produced a way of looking at the world we inhabit with a critical acuity that makes us better thinkers and ultimately better citizens. The world of research, for me, is simply a playground in which we all can exercise our critical and creative faculties. Yet the conference I attended seemed to be focused on research as object: indeed, as an object of exchange, a widget to be documented, tallied, and added to a spreadsheet that measures worth.
Perhaps it’s unfair of me to characterize it in this way. After all, most of the people attending the conference were, unlike me, still very much a part of an academic marketplace, one in which important decisions like tenure, admission to graduate programs, promotions, and departmental budgets are decided, at least in part, by things like conference attendance and presentations. It is unfair of me to judge them when I am no longer engaged in that particular game.
But the very fact that I am not in the game allows me to see it with some degree of clarity, and what I see is depressing. One cannot fight the dehumanization of academia, with its insistent mirroring of capitalism, by replicating that capitalism inside the ivory tower; one cannot expect the humanities to maintain any kind of serious effect on our culture when those charged with propagating the study of humanities are complicit in reducing humanities research to mere line items on a curriculum vitae or research-laden objects of exchange.
I can theorize no solution to this problem beyond inculcating a revolution of ideas within the academy in an effort to fight the now ubiquitous goal of bankrupting the study of arts and humanities, a sordid goal which now seems to characterize the age we live in. And I have no idea how to bring about such a revolution. But I do know this: I will return to my own study with the knowledge that even my small, inconsequential, and isolated critical inquiries are minute revolutions in and of themselves. As we say in English studies, it’s the journey that’s important, not the destination. And in the end, I feel confident that it will take far more than one awkward presentation at a conference to stop me from pursuing my own idiosyncratic path of research and inquiry into the literature I love.
The title is a misnomer of sorts: most contemporary book reviews, I’ve noticed, are little more than marketing ploys designed to get you to buy the book they’re reviewing. If the reviewer is quite brave, the review might actually critique the book, but the point remains the same: to weigh in on a book that has grabbed, or wants to grab, the attention of a large body of readers.
That is not my goal in writing book reviews.
Am I alone in wailing and moaning over the lost art of reading? Certainly not. Yet I am advocating here a certain kind of reading, a way of reading which demands thoughtful yet emotional responses to a book. This kind of reading and critiquing is not systematic, like a college paper; it is not formulaic and profit-generating, like a Kirkus book review; and it is certainly not aimed at gaining a readership for a book, or for this blog, either, for that matter. I am simply modeling the behavior I would like to see in other readers. I want to log my emotional and intellectual responses to certain books, to join or create a critical discussion about the works I’m reading. Some of these works will be current, but many more will be older. As I used to tell my literature students, I specialize in works written by long-dead people. Long mesmerized by the works from the nineteenth century and before, I have, one might say, a severe case of century deprivation.
But today I am starting with a book by Susan Sontag, The Volcano Lover: A Romance. Published in 1992, it is a historical novel set in Naples, Italy, at the end of the eighteenth century, focusing on Sir William Hamilton and his second wife Emma, destined to become the mistress of Horatio Nelson.
Let me say that I have not read many of Sontag’s essays, and now I feel I don’t really have to, because this book seems in many ways much more an essay than a novel. There’s a good story in the lives of Sir William, Lady Hamilton, and Lord Nelson, but Sontag pushes it into the background, allowing her narrator’s cynical distance to diminish the reader’s ability to connect with the characters and events portrayed in the novel. Sontag gets in the way of the story far too much. Egotism has no place in the act of telling a story; unfortunately, this is a lesson many writers are slow to learn, and some never learn it at all.
The true protagonist of the novel emerges only in the last eight pages. Sontag has had her revenge on the prurient reader who has picked up this novel only to delve into the lurid details of one of the most famous threesomes in British history. She pulls out a minor character, one given only the most fleeting mention earlier, and hands her some of the best scenes to narrate. By playing hide-and-seek with her story in this way, Sontag regrettably implodes her own narrative.
In the end, Sontag is much too clever a storyteller, and this hurts her novel–irreparably, in my view. There is one sentence in the novel that I think is worth remembering, however. Long after her own death (yes, Sontag does this, time-hopping with impunity), Sir William’s first wife describes him in a single-sentence paragraph: “Talking with him was like talking with someone on a horse” (376). That’s a clever description, and I will give Sontag her due by calling attention to it.
In the end, though, I am left feeling frustrated and annoyed by The Volcano Lover. I have no idea how it can be construed as a romance, just as I have no idea why this novel, with its sly undercurrent of critical attitudes–towards the characters, the actions, and perhaps even the very nature of novel-writing–should hold a reader’s attention. Sontag’s work, described on the jacket as “a book of prismatic formal ingenuity, rich in speculative and imaginative inventiveness and alive with delicious humor,” is in reality a self-absorbed narrative, filled with annoying commentary, strained attempts at originality, and a smug disregard for its readers’ desire to like the book they’re reading.
I will admit it: after the election in November, I succumbed to a sense of defeat. What is the point, I moaned, if autocracy and tyranny are not merely accepted but welcomed by the masses, if the great ideal of a democratic country is systematically dismantled before our eyes? Why bother with anything, much less with the last fifty pages of a novel that no one will ever read?
At the time, I was working through the last part of a story I’d begun a couple of years earlier, and I was ready to give it up, because, well, why would I finish it when the world as I know it is coming to an end? (My feelings arose not only because of the U.S. election results or the ensuing realization that a foreign power had tinkered with our “free elections,” but also because of the global rise of a dangerous populism, coupled with imminent global climate change.)
But a good friend gave me some advice, and I soldiered on and completed the draft. Right now, I am steadily working on it, revision after revision. And I am doing this not because I think my novel can change the world. It certainly won’t; it won’t be read by more than a hundred people, and that’s if I’m lucky.
But this short essay is not about the art of writing without readers; I will deal with that in a future post. For now, all I want to do is to encourage everyone who reads this blog to go on and continue their artistic activities. I say this not as a writer, or even as a reader, but as a scholar. And I have a very simple reason for doing so.
Art is the residue left by human culture. When civilizations disappear, when lives and institutions have crumbled into the dust, what remains is the art they created. Some of this art arises from genius, like the works of Mozart and Shakespeare; some of it comes from normal people, like the rest of us. But we need it all–every last scrap of it, not only the wonderful pieces that make us cry with joy or sadness, but even the average and ungainly works of art, because even bad art is an expression of human experience, and in the end, it is the experience of being human that binds us together on this lonely little planet.
So go ahead with your art. Draw, paint, weave, write, compose or play music. Do not worry that you are fiddling while Rome burns. Rome will, ultimately, burn–history tells us that. But what is left behind are the murals that take your breath away, the mosaics, the epic poems, the statues and monumental structures. Don’t worry about whether your art will be appreciated; it is the act of making it that is important, not whether it is celebrated. Think of that lonely monk who copied out Beowulf; he was probably scared shitless that his Anglo-Saxon culture would be erased by the next Viking invasion, but he fought off this feeling of futility and kept going, thank goodness. Remember his small act of courage, try to emulate it, and above all, keep going.
Do not be afraid of working in the darkness; you may not be able to dispel it, but your work could provide light for others, not only now, but in the future as well.
Fair warning: this post is not political. It is for all the writers out there who hate revising their work.
Guys, I know the feeling. You labor over something for weeks, months, even years, and when you reach the end, or what you think is the end, it’s so very tempting to stop, put down your pen or push aside your keyboard, and break out the champagne. You love what you’ve written, if only because (1) it’s finished and (2) it meets your expectations, which, let’s be honest, have been systematically lowered over the course of your project. The last thing you want to do is pick over every word and line you’ve sweated over in what feels like a pointless effort to tear it apart.
I used to feel that way, too. In fact, I suppose a pretty substantial part of me still does. But today, on the eve of 2017, at the end of a year that so many people are calling a very bad year, if not a catastrophic one, I pause in my own revision work to offer other writers a new way of looking at revision.
I am learning to love this part of writing, because I see it as a perfect marriage between creativity and analysis. Note that I am using the word “analysis,” not the word “criticism,” because the latter is too negative for what I think we do in revision. The job of revision is to help make something better, not to tear it apart. (Tearing it apart should come later, during the critical review, but only in as much as the critic must tear something apart in order to see what it’s made of and how it works. A good critic will always put the work back together again after she does the work of criticism.)
My secret to loving revision, then, is this: revising a work demands a willing, even enthusiastic attitude. The writer must approach the task with excitement, because this is the part of writing that truly reveals craftsmanship, the part that separates those who write for fun (whether they are published authors or not) from those who write because they are compelled to do so. But how can a writer change their attitude toward this pain-in-the-ass time sink? I’ve devised a very simple solution. Instead of hoping that your work contains few mistakes and needs minimal revision, assume that it houses many mistakes, some of them not easy to find. Rather than bewailing the need to revise, growing bored and frustrated with hunting down surface errors, learn to use revision as a sonar device that locates the buried mistakes as well as the superficial ones. Once found, even deep mistakes are usually fairly easy to fix–much easier than most writers would think. I’ve found that when you let go of the desire not to have to fix anything and give yourself over to the idea that fixing things is not only a good thing to do but an entertaining and satisfying part of the job, revision loses its drudgery. It becomes a pleasant, even delightful stage in the work of creation, and it invites the best use of the problem-solving tactics–and creativity–a writer possesses.
There you have it. Stop avoiding revision. (You know you have.) Change your attitude–for real. Love revision and all it offers. Because it’s revision, and not the mere act of writing itself, that makes us real artists. Any third-grader can write. Only a real writer has the ability, and the drive, to revise.
–Offered on this last day of 2016 with a minimum of revision