Spring has finally come to Northern Michigan, where I live. One might think that would make things easier, that creative juices would flow as freely as the sap in the trees and plants that are absorbing the sunshine. But unfortunately that’s not how it works. Spring is a dicey time here, and not just because of the mud left behind by the melting of the snow. (Another thing that’s left behind is shovel-loads of dog feces, which the receding snow banks offer up as yet another sacrifice to the disappearance of winter.) The truth is that when the weather clears and the outside world looks brighter, sometimes it’s disconcerting when your internal world hasn’t kept pace. It can be depressing, because it’s hard to kick yourself into gear to get things done, and in springtime, you have no excuse not to.

So when I saw that a local store was offering a poetry workshop during the month of April in honor of National Poetry Month, I signed up for it on a whim. I don’t know whether I will write any poems as a result of this workshop, but that’s not really the point. What I’d like to happen is for me to rekindle my creative impulses, and so far, though I’m still wrestling with SI (Springtime Inertia), I think I can detect the beginning of some movement towards more of a creative flow.

But the workshop has reminded me of an important question I’ve had over the last few years–one that may be unanswerable but still deserves to be asked:

What makes good writing?

It’s a question I’ve been pondering seriously, even though it might sound like I’m being flippant. Having taught literature throughout my professional career, I should be able to answer that question without too much trouble. For example, as a writing instructor, I’d say, “Good writing is clear, succinct, and precise. It shows consideration for the reader by adhering to the commonly accepted rules of grammar, spelling, and punctuation. It connects the ideas it presents in a way that is easy to read and understand.” I think that’s a good start for a college composition class, anyway.

But clearly this will not work for most creative writing. Poets, for example, often show little consideration for their readers. In fact, I’m not sure contemporary poets actually write with readers in mind; often they seem to be jotting down notes to themselves for later reading. Not that there is anything wrong with that at all–this is, after all, why I am interested in poetry at this point in my life. I’ve realized that there are certain subjects and ideas I want to explore that are better suited for poems than for short essays like this one, and I think it’s worth the time and effort to try to articulate them in poetic form.

However, let’s get back to the question: what does make good creative writing? I am having a hard time formulating an answer. As I get older, I seem to be suffering from the reverse of the Dunning-Kruger Effect: I am less sure of everything I think about, even questions whose answers I once felt certain of. But as far as good writing goes, I have come up with a provisional answer, and although I don’t find it very satisfying, I thought I’d try it out here.

I will begin by saying that the question itself is misguided. That’s because there is no such thing as good writing–only good reading. When we ask the question “what makes good writing?” we’re actually looking through the wrong end of the telescope. A good reader, I submit, is able to read almost anything and be enriched by the experience. A good reader will read any text, be it a poem, essay, novel, or piece of non-fiction, and find connections to other works. Of course, this is not to say there is no such thing as bad writing–I think we all know that it does exist–but that is a different issue. Seeing examples of bad writing will help us understand what not to do, but it won’t really help creative writers learn what to do to create good writing, so once again, I think it’s best to turn the question on its head and focus on what makes a good reader rather than what makes good writing.

After all, it has to be far easier to learn the skills required to be a good reader than to learn to be a good writer. And there are all sorts of advantages for the good reader–not only personal and professional, but social and political, as well. I think I’ll have to ponder on this one for a week or two, however, before I begin to identify how to create good readers and what makes good reading. For now, though, I’ll end with the suggestion that the world would surely be a better place if there were more good readers in it. I’ll go even further and add that maybe we’d all better get to work to see how we can do our part to create good, solid readers, because good readers make good citizens, and we can surely use a great many more good citizens in our world right now.

Smith versus Shelley: A Tale of Two Poems

Yesterday, I co-led a poetry discussion group at one of the area retirement communities, something I’ve done for the last few years. It’s been a really interesting experience–there’s so much to learn and discuss about even mediocre poems, and I enjoy hearing the participants share their ideas about the poems, as well as the stories and memories these poems evoke.

I choose the poems at random, with very little rhyme (pardon the pun) or reason to my choice. One of the poems yesterday was Percy Bysshe Shelley’s “Ozymandias.” Yes, I proffered that old chestnut to the group, even though I’d read it thousands of times and have taught it in many classes. I just wanted another look at it, I guess, and it’s fun to do that with company. What I wasn’t expecting, however, was my co-leader bringing in another poem on the same exact topic, written at the same time.

It happens that Shelley had a friend, the prosaically named Horace Smith, and the two of them engaged in a sonnet writing contest, on the agreed-upon subject of Ancient Egypt and, presumably, Rameses II, also known as Ozymandias. We remember Shelley’s poem: every anthology of 19th-century British literature probably contains it. However, Smith’s sonnet is largely forgotten. In fact, I’ll offer a true confession here: despite having taught Brit lit for decades, I’d not heard of Smith’s version until a couple of days ago.

It turns out that Smith was himself an interesting fellow. He wrote poetry, but was not averse to making money, unlike his younger friend Shelley. Smith was a stock-broker, and made a good living, while also, according to Shelley, being very generous with it. He sounds like a generally good guy, to be honest, something which Shelley aspired to be, but was really not. For all intents and purposes, Shelley was a masterful poet but a real asshole on a personal level, and a bit of an idiot to boot. (What kind of a fool goes sailing in a boat that he didn’t know how to operate, in a storm, when he didn’t even know how to swim?) Smith knew how to make and keep friends as well as money, two things that Shelley was not very good at, by all accounts.

At any rate, I thought it might be interesting to compare the two poems. Of course, we assume Shelley’s poem will be better: it’s the one that is in every anthology of 19th-Century British literature, after all, while I–with a Ph.D. in the subject, for whatever that’s worth–didn’t even know of the existence of Smith’s poem until a few days ago. But maybe, just maybe, there’s something valuable in the stockbroker’s poem that has been missed–and wouldn’t that make a fine story in and of itself?

So here are the two poems, first Shelley’s, and then Smith’s.

Ozymandias (Shelley)

I met a traveller from an antique land,
Who said—“Two vast and trunkless legs of stone
Stand in the desert. . . . Near them, on the sand,
Half sunk a shattered visage lies, whose frown,
And wrinkled lip, and sneer of cold command,
Tell that its sculptor well those passions read
Which yet survive, stamped on these lifeless things,
The hand that mocked them, and the heart that fed;
And on the pedestal, these words appear:
My name is Ozymandias, King of Kings;
Look on my Works, ye Mighty, and despair!
Nothing beside remains. Round the decay
Of that colossal Wreck, boundless and bare
The lone and level sands stretch far away.”

Ozymandias (Smith)

In Egypt’s sandy silence, all alone,
Stands a gigantic Leg, which far off throws
The only shadow that the Desert knows:—
“I am great OZYMANDIAS,” saith the stone,
“The King of Kings; this mighty City shows
The wonders of my hand.”— The City’s gone,—
Naught but the Leg remaining to disclose
The site of this forgotten Babylon.

We wonder — and some Hunter may express
Wonder like ours, when thro’ the wilderness
Where London stood, holding the Wolf in chace,
He meets some fragment huge, and stops to guess
What powerful but unrecorded race
Once dwelt in that annihilated place.

Now, I’d say Shelley definitely has the advantage in terms of poetic language, as well as the narrative situation. His words are sibilant and flowing, and it’s a stroke of genius to have the story come not from the speaker of the poem but from a traveler from an antique land; it makes the scene seem even more authentic. The alliteration in the last two lines (“boundless” and “bare” as well as “lone” and “level”) is a deft touch as well.

I’d also say that Shelley’s choice of the half-shattered face is much better than Smith’s. There’s something much more poetic about a sneering face, even if it’s only half a face, than a gigantic leg. There’s no way on earth Smith could have made a gigantic leg sound poetic, and that hampers the poetic feel of his sonnet, which is a bit of a shame.

Or is it?

Perhaps Smith wasn’t going for poetic feel here at all. In fact, I’d argue that he definitely wasn’t thinking along the same lines Shelley was. There are obvious similarities between the two poems. We still get the empty site, the desolation of the “forgotten Babylon” that powers so much of Shelley’s version, but it turns out that Smith is interested in something completely different. Where Shelley’s poem comments on the nature of arrogance, a human pride that ends in an ironic fall, Smith’s presents the reader with a different kind of irony. His version is much grander. In fact, it’s a cosmic irony that Smith is grappling with here, as the poem comments on the inevitable rise and fall of human civilization. What I find astounding is that in 1818, just as England was beginning its climb up to the pinnacle of world dominance for the next two centuries, Smith was able to imagine a time when the world he knew would be in tatters, with nothing remaining of the biggest city on earth, save as a hunting ground for the presumably savage descendants of stockbrokers like himself. Smith’s imagination was far more encompassing than Shelley’s, given this kind of projection into the far future.

All told, Shelley’s poem is probably the better one: it’s more quotable, after all, and no matter how much I love Smith’s message and projection into the future, he just doesn’t have the choice of words and rhythm that Shelley does. But need we really limit ourselves to just one of these poems, anyway? I’d say we’ve gleaned about as much as we can from Shelley’s “Ozymandias.” Perhaps ours is an age in which we can appreciate Smith’s vision of a far distant future. Empires rise and fall, waters ebb and flow, and civilizations come and go. Smith, with his Hunter coursing through what was once London, paints this idea just as well as Shelley does with his decayed Wreck. There’s room for both of these poems in our literary canon.

A Speech

I’ve been absent from this blog for the past few weeks, but it hasn’t all been basking in the glory of my math prowess. In fact, I barely had time to celebrate the fact that I had actually passed my College Algebra course when I came down with Covid, despite getting all recommended vaccines and being oh-so-careful. At any rate, I’m just about back to normal now, but Covid is not a walk in the park. The initial symptoms aren’t too bad–pretty much the same as the side effects from the vaccine–but the aftermath of fatigue, lethargy, and depression lasted for a few weeks. My takeaway is that, to avoid getting Covid, it’s definitely worth taking all the precautions that most of the rest of the world now seems to have blithely abandoned.

At any rate, I emerged from my Covid quarantine a few weeks ago, just in time to address the local League of Women Voters unit at their annual meeting. My days of being a candidate are behind me, but that makes me all the more appreciative of the people who are still active and who are working to improve the political landscape of the United States. To be honest, I feel more than a little guilty at not joining in their efforts more actively, so the least I could do, I told myself, is to speak to them when they ask me to. Then I decided that my speech, such as it was, could make a good blog post, so here goes. The topic is, as the Belle of Amherst would call it, that “thing with feathers that perches in the soul.”

We face many problems in our world today: a lingering pandemic, mass shootings, long-standing prejudice, violence, partisan hatred… the list goes on and on. But perhaps the most serious one, because it affects so many others, is the decay of democracy in our country. I’ll come back to this in a moment, but for now, I just want to say that even though this is a humdinger of a problem, the message I’d like to give today is that there is good reason to hope for change, because change is always possible. Good things as well as bad things are happening in the world, and so optimism should not be banished from the range of emotions we feel as we confront our future. Hope, as author Rebecca Solnit points out in her book Hope in the Dark: Untold Histories, Wild Possibilities, is one of many viable responses to the dire problems that we face. We must not be afraid to hope. It’s easy to be attracted to pessimism when we are afraid to hope. Hope is frightening, because we realize that when we hope for the best, we might well be proven wrong when our hopes fail to materialize. And yet we must not be afraid of being proven wrong. Frankly, the world would be a much better place if we all were more willing to take a chance on being wrong. After all, fear of being proven wrong often prevents us from acting to make the changes we so desperately need and desire.

I also want to point out that the problems with democracy that we’re now experiencing should come as no surprise to us. There’s a kind of odd comfort in realizing that as far back as 1840, the French diplomat Alexis de Tocqueville identified some serious issues in democracy in his book Democracy in America. He pointed out that while popular sovereignty (or democracy) could work very well at the local level, where people find it easy to be well informed on issues and where power is limited, three problems lie in wait at the national level to mar the democratic experiment:

  1. Competent people are willing to leave politics in the hands of less competent people;
  2. People’s belief in the idea of innate equality could give them a false sense of their capabilities and a dangerous sense of omnipotence;
  3. Excessive individualism and the pursuit of material wealth could result in apathy.

I think it’s fair to say that we have seen all three things come to pass in recent years. And I, like many other people, have often been tempted to throw my hands up in disgust and divorce myself from the political realm. But, as Naomi Klein says in her book This Changes Everything, “If we are to have any hope of making the kind of civilizational leap required of this fateful decade, we will need to start believing, once again, that humanity is not hopelessly selfish and greedy — the image ceaselessly sold to us by everything from reality shows to neoclassical economics.”

But here’s the interesting thing: that picture of humanity as innately selfish and greedy is beginning to change. We’re beginning to realize that it’s an imperfect picture, one that was built on a misunderstanding, or at the very least on an overemphasis, of a Darwinian belief in the survival of the fittest. We need to offset this view of human nature with Peter Kropotkin’s view, as he presented it in his work Mutual Aid, of evolution depending as much on cooperation as on competition. Scientists and philosophers are now working on amending our view of nature to correct this faulty emphasis on competition; for example, biologists like Suzanne Simard (Finding the Mother Tree) have shown that natural systems are much more physically connected than previously thought, just as primatologist Frans de Waal (The Age of Empathy: Nature’s Lessons for a Kinder Society) has demonstrated that empathy and cooperation have contributed as much as, if not more than, competition to the survival of humanity through the ages.

So there is reason to hope for change, for a different perspective. What we need right now is enough hope, and determination, and endurance to get us through these rapidly changing times. We need to remember that while we ourselves might not be around to enjoy the things these changes will bring, our children will, and so will their children. And we need to be willing to lay the foundation for those changes right now.

Change is something that can be difficult to navigate. Back in 1952, Edna Ferber wrote a passage in her book Giant (so much better than the movie) in which the main character’s wise father talks to his daughter, who is troubled by all she’s seen and experienced in Texas:

“The world will [change]. It’s changing at a rate that takes my breath away. Everything has speeded up, like those terrific engines they’ve invented these past few years… Your [husband] won’t change, nor you, but your children will take another big step: enormous step, probably. Some call it revolution, but it’s evolution, really. Sometimes slow, sometimes fast. Horrible to be caught in it, helpless. But no matter how appalled you are by what you see…, you’re still interested, aren’t you?”

“Fascinated! But rebelling most of the time.”

“What could be more exciting! As long as you’re fascinated, and as long as you keep on fighting the things you think are wrong, you’re living. It isn’t the evil people in the world who do the most harm, it’s the sweet do-nothings that can destroy us. Dolce far niente–that’s the thing to avoid in this terrible and wonderful world….”

So first of all, we need to buckle in for a wild ride while true change has a chance to occur. But we also need to nurture a fierce belief in the possibility of this change actually happening. And for that, I’ll point to another writer who gives me the tools to hope: Rutger Bregman. In his book Utopia for Realists, he has this to say about the power of belief, which is closely linked to our ability to hope:

Those who swear by rationality, nuance, and compromise fail to grasp how ideas govern the world. A worldview is not a Lego set where a block is added here, removed there. It’s a fortress that is defended tooth and nail, with all possible reinforcements, until the pressure becomes so overpowering that the walls cave in.

If we want to change the world we live in, then, we need to apply that pressure constantly, relentlessly, until we begin to destroy those walls, that fortress of belief that prevents us from restoring the democratic values we believe in. As Bregman says, “if we want to change the world, we need to be unrealistic, unreasonable, and impossible.”

And yet, as important as it is to hope, to believe in this change, Robert Reich reminds us that “hope is not enough. In order for real change to occur, the locus of power in the system will have to change.” Nevertheless, Reich himself is hopeful about the future. In his book The System: Who Rigged It and How We Fix It, he explains why:

History shows that whenever we have stalled or slipped, the nation’s forward movement has depended on the active engagement and commitment of vast numbers of Americans who are morally outraged by how far our economy and our democracy have strayed from our ideal and are committed to move beyond outrage to real reform.

He goes on to remind us that we need to “be organized and energized, not just for a particular election but for an ongoing movement, not just for a particular policy but to reclaim democracy so an abundance of good policies are possible.” What we need, he argues, is “a common understanding of what it means to be a citizen with responsibilities for the greater good.” He ends the book with a rousing pep talk: “Your outrage and your commitment are needed once again.”

These are powerful words, and they are therapeutic in restoring a sense of hope. I can do little more than echo them. So I’ll just leave you with the following few thoughts.

For those of you engaged in the fight to restore our democratic values, I urge you to stay engaged, enraged, and determined to change the structure of American politics from the ground up.

On a personal note: Take care of yourself. Pace yourself. Do what you personally can, and don’t feel bad about what you cannot do. Don’t focus on the negative. And take time to remind yourself of the successes you’ve had, no matter how small. Be willing to celebrate and share them.

And most of all, have hope! We are all, in a variety of ways, fighting the good fight. And in this fight, hope may well be the most important weapon we have. In the words of the Welsh literary critic Raymond Williams:

To be truly radical is to make hope possible rather than despair convincing.

Thank you for your commitment to democracy. Keep up the good fight, and keep hoping for positive change.

Random Thoughts about TV Shows

A few random thoughts about television shows this morning, since the end of a long winter is in sight, and I have survived it largely by knitting, reading, and–you guessed it–watching television shows and movies.

On Mindless Murder: Why do detective shows always, without fail, focus on murder? Based on the detective shows I watch (admittedly, most of them are British), it seems that all cases in which both police and private detectives are called are murders. Hence the Cabot Cove paradox: a small town, Cabot Cove, Maine, has the highest murder rate in the world, because Jessica Fletcher lives there and she must solve a new murder every week. (Don’t get me wrong–I love Murder, She Wrote, but I think that if a detective is good at solving murder cases, she ought to be good at solving other kinds of cases as well.) What about the cases in which no murder has occurred? Much of a detective’s job, after all, involves sitting and watching people, trying to get evidence of adultery, or perhaps finding a missing person (who often, I would hope, turns out not to be murdered). Even Sherlock Holmes occasionally worked on cases that did not involve a murder of any kind. I would love to see a detective show that doesn’t focus exclusively on that most brutal of crimes. In fact, I find it deeply troubling that so much of our entertainment comes from postulated murder, as if the only way we can amuse ourselves is by imagining the ultimate violence done to another human being. If detective shows would only sprinkle some non-murderous episodes in with their usual fare, I think it would be more realistic, for one thing, as well as more humane, and it would do those of us who watch them a lot more good.

On Evil Collectives: Why is a collective always represented as something bad? Take Star Trek: Voyager. While I find the Star Trek series creative and thoughtful, the Borg (a hive-mind collective that forcibly assimilates/absorbs all entities it encounters) quickly becomes a predictable and hackneyed antagonist. Of course, someone had the brilliant idea of “rescuing” Seven of Nine and integrating her into Voyager’s crew–kudos to whoever came up with that one–but the problem remains that we seem to be unable to imagine a collective association of human beings as anything but profoundly threatening to creativity, kindness, and mutual aid. Perhaps this stems from our Western distrust of collective societies and our American horror of communism. Yet this cannot be only an American issue, since the Daleks–from the Dr Who series–are also portrayed as an evil, voracious collective society. My question is this: is it possible to imagine a non-threatening collective, one that is humane and caring? Why is it that we never see such a collective portrayed on television or in films? If we could imagine one (and of course non-aggressive collective societies do indeed exist in nature, among bees, for example, and many other kinds of animals, so we needn’t go far for inspiration), perhaps we could aspire to replicate this kind of mutual aid society in our world.

On Emo SciFi: While I’m on the subject of science fiction, here’s a question that I’ve often pondered: Why are science fiction shows almost always dark? Of course, there’s a really easy answer to this question: it’s dark in outer space. I get that, but why is it that we can only imagine space travel as something in which disasters, emergencies, and threatening events occur? Wouldn’t it be more realistic to sprinkle some humor into the plot of a scifi show sometimes? I realize that we’re living in difficult times, as we move closer to tyranny and nuclear war threatens to erupt in Europe, but isn’t that itself a reason to provide entertainment that is uplifting and amusing as well as thoughtful? For that matter, why must “thoughtful” always mean “something dire is about to happen and the whole crew, or planet, or species could die?” I would very much like to see a science fiction show that occasionally has an episode focusing on disagreements between crewmates (because God knows that would happen on a long voyage–just ask any sailor who’s ever been on deployment), on equipment malfunctions, on anything but the mission ending in a fiery ball of disaster due to an out-of-control collective that is intent on committing murder.

In other words, it would be nice if someone out in TV Land got hold of a new blueprint for their plots instead of recycling the same old trite themes. But maybe that’s my own problem for expecting real creativity from an overburdened medium….

It’s pretty bad when one has to resort to doing math problems to get exposure to new ideas!

Searching for the Most Beautiful Word


I find it odd that J.R.R. Tolkien believed that the most beautiful sound in the English language was the phrase “cellar door.” To be honest, I just can’t agree with him: to me, at least, these words don’t sound lovely or inviting. Mysterious? Yes. Intriguing? Perhaps. But certainly not beautiful.

So I’ve tried my best to identify a word I do consider beautiful, and I think I’ve found one: “senescence.” I love the way the sibilant “s” sound eases through my lips. I had a serious lisp as a child, going to speech therapy throughout my early elementary school years, so maybe the word “senescence” has the attraction of forbidden fruit to me. Whatever the reason, I find “senescence” to be an elegant word, yet at the same time both humble and understated. It truly is a lovely word, with a soft, inviting sound that charms the ear.

The unpleasant reality rests in the meaning of the word: “the condition or process of deterioration with age.”

Oops. Looks like I’ve picked a word as fraught with problems as Tolkien’s “cellar door.”

But since I’m on the subject anyway (see how I did that?), let me discuss the most moving story about senescence I’ve ever encountered. Surprisingly, it’s not about human beings, but rather about octopuses. (And yes, the plural of “octopus” is “octopuses,” not “octopi.” This short article explains why, while cleverly pointing out the irony in the whole debate, since octopuses live as solitary creatures and so presumably one might never really need to use the plural of the word in a natural setting.)

Sy Montgomery’s The Soul of an Octopus must be a good book, because I still remember it clearly, several years after I listened to an audio version of it. The part that I found most memorable is Montgomery’s discussion of the senescence of her octopus friend. It is one of the most beautiful, and one of the saddest, descriptions of the natural world I’ve ever encountered.

While an octopus’s nervous system is organized very differently from ours, with most of its neurons distributed throughout its arms rather than concentrated in a central brain, octopuses do seem to experience consciousness. Recent films, for example, have documented the friendships that certain octopuses have formed with human beings. Clearly, they have the capacity to make memories, as well as to perform other complicated mental functions. For example, this video segment shows an octopus dreaming. The takeaway here is that despite its alien appearance, the octopus is much more than a scary-looking sea monster; it is a creature with feelings and opinions, at least as much as the other animals we live with, such as dogs and cats.

But an octopus has a very short lifespan, living only three or so years. And the last thing a female octopus does, as she enters this final stage of life, this period of senescence, is to produce a collection of lacy, bundled eggs and festoon her den with them. Below is an image of an octopus with her eggs from an NPR article:

Stuart Westmorland/Corbis

The octopus will then spend the rest of the time remaining to her caring for these eggs, and then, with her last bit of energy, her final breath, so to speak, she will launch these eggs into life, just as she herself leaves it.

Now here’s the thing about Sy Montgomery’s book: the octopus that Montgomery befriended was a female, so she produced eggs and draped them in her aquarium home, but they were never fertilized, because she had been acquired too early in her life to have mated. Yet that made no difference to her. She cared for those empty egg sacs just as assiduously as if they had had baby octopuses within them.

Perhaps she just didn’t know the difference. But I choose to believe that there is a powerful lesson here. That octopus did what she had to do: her drive to create was inborn, and she could no more resist that urge to lay eggs and then to take care of them than she could resist the urge to eat, or to sleep, or, when the time came, to die. And here’s where I find an important parallel between the octopus and us, one that has nothing to do with our role as parents, but rather as creators.

Look at it this way: one of the functions of human beings is to create things, all sorts of things, depending on who we are and what kind of gifts we develop in ourselves. We might create stories, as Shakespeare did, or important bodies of research, as Jane Goodall has, or structures, like the Great Wall of China. We might create an epic poem, like Milton’s Paradise Lost, or we might make a baby blanket out of yarn and a set of knitting needles. We might build a beautiful bench, or craft a powerful speech. We might create relationships that continue into the next generation. It doesn’t matter what shape it takes; one thing that humans do, without fail, is create. The least talented of us cannot go through this life without having created something at some point during the time allotted to us on this earth.

The problem is, many of us don’t honor our creations. We don’t think our creations could possibly matter, so we fail to protect and nurture them. We throw them away, making them disposable, ultimately discounting their importance.

But the octopus teaches us a different lesson. She shows us that whether there are baby octopuses within the eggs or not, it’s important to treat them all with respect. She demonstrates that it’s the act of creation and our response to that act that matters, and not whether the product of our creative urge is a success or a failure.

This realization hit me powerfully when I first listened to Montgomery’s book. In fact, as I walked down a sunny street in Dallas, tears coursed down my cheeks, and I didn’t care whether the other people on the path around White Rock Lake noticed or not. I cried at first because the futility of the octopus’s gesture struck me like a gale-force wind. It all seemed so useless, so empty. Was life really so cruel and hopeless?

But within a few minutes I realized that the important thing here was the act of creation, not the product of creation, and there’s a big difference. It didn’t stop my tears, but it did change the cause of them. The octopus’s actions seemed so selfless, so beautiful, that her death made me ache as if I’d known her myself. Her senescence, her final actions, these seemed to me worthy of a Verdi opera or a tenth symphony from Beethoven.

Because the beauty of the octopus’s dying gesture more than balances the tragedy of it.

And now, some years later, entering my own period of senescence, I realize what we human beings share with that octopus. Some of us create viable things that go on to have a life of their own; some of us create the equivalent of empty egg sacs. But it doesn’t matter. We all have engaged in the act of creation, and that’s what makes us alive.

I might never achieve an existence as beautiful as that of an octopus, but I can keep the memory of her–of her senescence combined with her act of creation–in my mind so as to give me a sense of peace as I go about my own small acts of creation, and as I proceed with my own decline into old age.

In short, I’ve discovered that senescence can be beautiful both in its sound and in its meaning. Take that, Mr. Tolkien!

Private Clavel: My Private Marathon

One of the things that kept me going through the dark days following Trump’s election was translating an entire French novel, as I wrote about here. I started my translation at the end of November, 2016, and finished it in December of 2017, so it took slightly more than a year of work. Yet I never knew quite what to do with my translation. I made a few half-hearted attempts to publish it, submitting a chapter to several reviews, but nothing took, and so I put it high up on my shelf and tried to forget about it.

However, last summer I discovered that a translation of the book had been published, back in 2019. I greeted this news with mixed feelings, as can well be imagined. I had long determined that no one else was interested in Leon Werth’s Clavel Soldat, that it was too dated or obscure for publication. I also knew that I was a novice translator, and that my chances of publication were very slim. But seeing that someone else had managed to get their version into print still evoked a spasm of writerly envy–short-lived, true, but envy nonetheless–and made me, for about a day or so, sullen and bitter.

Then, however, I did what any honest writer/translator would do: I ordered the book from its publisher, Grosvenor House Publishing Ltd. Then, in the brightest days of summer, I crushed my sour, envious attitude, and when the book arrived, I placed it on my desk, determined that when winter came and I wasn’t busy with gardening, hiking, mushrooming, and visitors, I would read Michael Copp’s translation (which he calls Private Clavel’s War on War) and compare it to mine, word for word. I was convinced that there would be much to learn from this exercise, and I felt that Mr. Copp, as well as Leon Werth, deserved this much attention from me.

For the last two months, I’ve been engaged in this activity, and I have indeed learned a great deal. True, there were times when it seemed a pointless exercise, but then I realized that many people engage in pointless activities for fun and for health. As an example, consider running. Lots of people run several times a week, working to increase their endurance. What was I doing, if not working to increase my mental endurance, my ability to use every atom of intelligence and memory and reasoning I had in my poor, beleaguered brain in order to make it stronger? So I compared what I was doing to training for a marathon. After all, most runners never expect to win the marathon races they enter–merely finishing is the point. For me, finishing my translation of Clavel Soldat had to be the point, not publishing it, and reading Copp’s translation in conjunction with mine would prove that I had, indeed, completed my own private marathon.

So what have I learned? First of all, on a purely practical level, I learned to use the Immersive Reader / Read Aloud feature in MS Word. This function allowed me to listen to my version of the translation at the same time that I read Copp’s book, speeding up the whole process. I can see how the Read Aloud function would be a real benefit to anyone proofreading their own work, and I’m sure I’ll use it again.

As far as the actual translation goes, here are a few things that I’ve learned. Most important, translation is an art, not a science. This is a truism, but it bears repeating here. I will just post two versions of the same passage from Chapter VII (page 182 of the original) to illustrate:

The next day, Clavel receives a package of newspaper clippings. He knows. Those who write far from the front lines fight in their logical citadels, everyone for his or her own lie. He knows now that there is nothing but an immense vertigo within a great cataclysm. He is in the midst of this cataclysm that the people look at from a distance, like a tourist watching the eruption of a volcano from several kilometers away.

The next day Clavel received a packet of newspaper cuttings. He knows. Those who write in the rear carry on their fight in their citadel of logic, each one supplying his own lie. He now knows that there is only a great frenzy in a great catastrophe. He is in the middle of the catastrophe that the people in the rear contemplate, as a tourist contemplates the eruption of a volcano from a distance.

And another, longer, passage, this one from the last page of Chapter XV (page 300) of the original:

The division headquarters, with its gleaming officers and its clerical workers. A field near the cemetery is chosen for the execution of Private P., from the colonial infantry.
“What did he do?...”
“He didn’t want to go into the trenches…”
It is dawn. Six hundred men are lined up: his company and parts of other units.
An ambulance wagon has been prepared in case Private P.  faints or resists.
The wagon is not needed. Private P. walks to his spot. Twenty men, bayonets at the ready, escort him. He has just as much the look of a soldier as the other men. The only difference is that he doesn’t have a rifle. He looks straight ahead. He has the face of a sick man being taken out of the trenches. 
Private P. and his escort come to the field where the troops are waiting at attention. 
Private P. is there with the other twenty men. No one has come yet to take him. 
A warrant officer orders: “Left side, line up…”
Then, “Right side, line up…”
And Private P., who is going to die, seems bothered only by not knowing how to stand. He turns his head to the right, puts his left fist on his hip. Private P. follows the order “Right side, line up” with the other soldiers.
Twelve soldiers have fired. Private P. is dead.
It's the division with its gleaming officers and its pen-pushers. A field near the cemetery has been chosen for the execution ceremony of soldier P.... of the colonial infantry.
'What did he do?'...
'He didn't want to go to the trenches'...
It is dawn. Six hundred men are drawn up; his company and parts of other troops.
An ambulance has been prepared in case soldier P.... should faint or resist.
The vehicle is not needed. Soldier P....marches to his rank. Twenty men, with fixed bayonets, escort him. He looks a soldier, just like the others. He has no rifle, that's all. He looks straight ahead. A sick man, coming back from the trenches, has this look. 
Soldier P...is there with the other twenty. They haven't yet come to take him. 
An adjutant gives the order: 'Left turn'...
Then: 'Right turn'...
And soldier P...., who is going to die, seems bothered by not knowing where to stand. He turns his head to the right, puts his left fist on his hip. Soldier P...., along with the others, carries out the order: "Right turn."
Two soldiers fired. Soldier P... is dead.

The differences are minimal, but they are there. The only major difference is a bona fide mistake in the second selection, where the French “douze” is translated as “two.” This is something I noticed by comparing translations: mistakes do happen. Sometimes words are mistranslated, and not only when there is debate or obscurity about what the word means. Even more unsettling, sometimes whole lines or short paragraphs are left out: both Copp and I are guilty of this error. Translating an entire novel is a laborious task, so it makes sense that such mistakes happen.

But this led me to another discovery, one that unsettled me more, if possible, than finding that someone else had beaten me to the punch and had published an English translation of Clavel Soldat. Mistakes such as the ones I noted above are inevitable in a long scholarly work, but editors should be able to find and eliminate them; after all, that’s what they’re paid to do. Why had this not happened in Copp’s translation? The answer is simple: I believe Copp had no editors, because it turns out that Grosvenor House Publishing Limited is what was once called a “vanity press”: it is essentially the same as self-publishing on Amazon (which I have done myself and, to a certain extent, now regret), and there appears to be little quality control. This discovery floored me, for reasons I’ll explain in a moment. But regarding the errors in the text, I’d still argue that Copp did an excellent job on his translation. The fact that it differs from mine attests to the finesse and subtlety required in translation itself. Like so much in life, there are no right or wrong answers, and it is important to remember that diversity is a gift, not a curse. What this does mean, however, is that any time we read works that have been translated, the translator has made choices, most of them unconscious, that reflect how he or she sees the world, and this inevitably skews the purity, so to speak, of the original words. Again, that is not necessarily a problem; it’s just important to be aware of it when reading literature in translation. When a translator creates a translation, it’s as if all his or her past reading, thinking, even life experiences, work to color the words he or she chooses, and so it makes sense that each translation would be as individual as the person who produced it.

What more have I learned from this grand, marathon-like exercise of mine? I still think Clavel Soldat is a good book, and an important one. Leon Werth created a character who despised war and dared to write about it during the war. His depiction of life at the Front in 1914 is ruthless in its clarity and in the sense of betrayal Clavel feels as he witnesses both the horrors of war and the hypocrisy of those participating in it. I understood the First World War much better after reading the novel, and so I am despondent and, to be honest, disgusted about the fact that its translation appears to be unpublishable today and that self publishing is the only recourse for a novel of this type. Consequently, few English speakers will ever read it. My conclusion — which I hope is not the result of a sour-grapes attitude — is that publishing, like so many things today, is a grand game of popularity and attention-grabbing. In times past, there was room for less popular works, if they were deemed important. Now, however, we live in an attention economy, and important works are bypassed for those works that get a bigger, louder splash.

We lose so much by this. History fades away, covered up by the clamor of contemporary voices, all competing for the biggest slice of an economic pie that really doesn’t matter in the long run. What we lose is access to history, the ability to understand, so to speak, what the long run is and how it affects us. We become more provincial in our thinking and less capable of forming big ideas because we are only able to access those works deemed most likely to get the biggest bang for publishers’ bucks. It’s a tragic situation, and I’m not sure what we can do to fix it.

In the end, I have to be selfish and say that I’m glad I spent a year plowing through Clavel Soldat, as well as the six additional weeks comparing my translation to Michael Copp’s. True, it may be time that I’ll never get back, but it was time well spent, because it has enriched my knowledge of history, literature, and not least of all, the art of translation. All of these things are valuable, and because of that, I’m satisfied.

All is Not Well

I have been writing much less frequently, for the simple reason that I find I have nothing much to say, perhaps because it’s been a busy summer filled with outdoor activities and a new puppy, or because I’ve been in reading rather than writing mode. I used to push myself to write here in order to present material, as a kind of gift, to my readers. That was before I realized that my readers are ephemeral, ghost-like entities who may or may not exist in the real world. Since that realization, I’ve not only given up on gift-giving of this sort, but also actively discouraged (if you can count de-linking this blog from Facebook as discouragement, which I do) readers from finding The Tabard Inn. I did this originally in a fit of pique, but now I believe that it was a healthy thing to do, and the sum total of this paragraph is this: if you have somehow found this blog and are reading it now, you are one of the few, the special–not to mention the exceedingly strange–people who actually read what I write. So thank you for that. I think.

Anyway, I have something to say this morning, which explains this post. Having seen an advertisement for Mona Awad’s new novel All’s Well (Simon and Schuster), I decided to read it, and even convinced a friend (thanks, Anne!) to read it as well. And now I’m moved to write about it, not because it’s good, but because I hate it.

Fair warning: the book may indeed be very good, so don’t look upon this as a bad review. After fifty-odd years of reading critically, after a career in teaching literature at the college level, after immersing myself in the world of books and reading for my entire life, I find I no longer have any confidence in my own judgments on literary works. I mean, I know that I personally think Tintern Abbey is one of the greatest pieces of writing ever written, just as I know that I personally love pretty much any book by Dickens or any Bronte (but not Anthony Trollope, who can sometimes be a huge arschloch)–but I don’t know if that constitutes great literature, or something that other people will enjoy or find value in. I seem to be entering a period of extreme intellectual solipsism, which is worrisome, yet not too worrisome considering all the crap that’s going down in the world at this point in time.

So, to continue, I hated All’s Well for several reasons. First, and most intensely, because Awad does what I have tried to do in the two novels I’ve written: identify a literary subtext and play a textual game of cat-and-mouse with it while developing the characters, setting, and plot. For Effie Marten, it was of course Jane Eyre; for Betony Lodge, it was Far From the Madding Crowd, or perhaps The Woodlanders, or any of several Thomas Hardy novels (other than Tess of the D’Urbervilles or Jude the Obscure–I know enough to leave those two novels alone). Seeing someone else do what I’ve tried to do with uneven success sets my teeth on edge, which may not be charitable of me. To be honest, I don’t think Awad was any more successful than I was, and maybe that’s the problem.

It bothers me, too, that Awad chose a Shakespeare play (or really two, perhaps even more) as a subtext, not because Shakespeare is inviolable or holy, but because she spins her novel out of the most pedestrian, superficial reading of All’s Well That Ends Well possible. I have long held the opinion that most Shakespeare plays are monumentally misunderstood by modern audiences, a misunderstanding that is exacerbated, and perhaps even caused, by the fact that the plays are by and large mis-titled. The Merchant of Venice, for example, is not about the Merchant Antonio–it’s more about Shylock, or even Portia, than it is about Antonio. Is Othello about Othello or about Iago? Julius Caesar seems to focus much more on Brutus than it does on Caesar, who is killed fairly early in the play. As for the comedies, the titles are simply throwaway phrases designed to get attention.

When I used to teach Shakespeare, I would tell my students that the plays we studied could be boiled down to one word. This may or may not be true, but it is a good way to get students into reading and understanding a Shakespeare play. I’ll give a few examples below, but it’s important to realize that there is no one “right” word to describe a play. You can use this method like a tool–something like a slide rule or a kaleidoscope to lay over each play, dial up a word suggested by the play, and get to work interpreting it.

Much Ado About Nothing: Interpretation

The Merchant of Venice: Gambling

Romeo and Juliet: Obedience

Whether this method works or not isn’t the issue here. What matters to me with respect to Awad’s novel is that she picks the limpest, flimsiest interpretation of All’s Well That Ends Well possible. Granted, it is a problematic play (though I disagree with the tendency to call it a “Problem Play,” as if, like an unruly child, this label can explain everything and short-circuit any attempt to make sense of it). The whole plot, in which the heroine Helena falls in love with the idiotic but presumably handsome Bertram, who rejects her until the last line of the play, is pretty distasteful and downright stupid. But that, I would argue, is not the point of the play. Rather, I believe the play is about how Helena empowers herself in a patriarchal system, ending up in a far more powerful position by using the very tools of patriarchy to do so, while also helping other women “beat” patriarchy at its own game on the way. Granted, this limited victory is nowhere near as satisfying as it would have been had Helena smashed patriarchy to smithereens and performed a wild dance upon its writhing body parts, but that kind of action was simply not possible in the world depicted by Shakespeare. Helena, I’d argue, did the best she could in the world she found herself in.

So, to get back to Awad’s novel, my biggest problem is that it rests on a sophomoric interpretation of the play. And so, what I thought would be a witty and erudite use of All’s Well That Ends Well became a kind of albatross that made me wince while reading the book. In other words, I thought I might be getting Shakespeare ReTold (a really fine set of retellings produced by the BBC), but instead I got a mashup of Slings and Arrows plus “The Yellow Wallpaper.” It felt cobbled together, and, frankly, kind of pointless. In the end, Awad uses a kind of trick to grab her readers’ attention, then spins off into a tale that is full of sound and fury, but ultimately signifying nothing.

That, however, seems to be how I see a great deal of contemporary literature these days, full of sturm und drang but ultimately useless in my trek through life. As I said above, I don’t have the confidence or the desire to argue that my approach is the correct one–rather, I question my own judgment, wondering whether I’m the only one who feels this way. And so, rather than push my own view of this novel, I’m satisfied to register my objections to it here, acting like King Midas’s barber, who whispered into a hole in the ground that his employer had donkey ears, just because he had to tell someone his grand secret.

Donkey ears? That would be A Midsummer Night’s Dream, wouldn’t it?

University Days–Redux


When I was teaching college English courses, my best students, the ones who really paid attention and were hungry for knowledge and ideas, would often come up to me after a class and say something like, “You brought up the French Revolution today while we were talking about William Wordsworth. This morning, in my history class, Professor X also talked about it. And yesterday, in Sociology, Professor Y mentioned it, too. Did you guys get together and coordinate your lectures for this week?”

Of course, the answer was always “no.” Most professors I know barely have time to prepare their own lectures, much less coordinate them along the lines of a master plan for illustrating Western Civilization. It was hard, however, to get the students to believe this; they really thought that since we all brought up the same themes in our classes, often in the same week, we must have done it on purpose. But the truth was simple, and it wasn’t magic or even serendipity. The students were just more aware than they had been before, and allusions that had passed by them unnoticed in earlier days were now noteworthy.

I’ve experienced something of this phenomenon myself in recent days, while reading Colin Tudge’s book The Tree and listening to Karl Popper’s The Open Society and Its Enemies–two books, one on natural science and the other on philosophy, that would seem to have few if any common themes. In this case, the subject both authors touched on was nomenclature and definitions. Previously, I would have never noticed this coincidence, but now I find myself in the same position as my former students, hyper-aware of the fact that even seemingly unrelated subjects can have common themes.

There’s a good reason why I am experiencing what my students did; I am now myself a student, so it makes sense that I’d see things through their eyes. All of which leads me to my main idea for this post: University Redux, or returning to college in later life. It’s an idea that I believe might just improve the lives of many people at this very strange point in our lives.

I happened upon the concept in this way: after five or so years of retirement, I realized that I had lost the sense of my ikigai–my purpose in life. I am not exactly sure how that happened. When I took early retirement at the end of the school year in 2015, I had grand ideas of throwing myself into writing and research projects. But somehow I lost the thread of what I was doing, and even more frightening, why I was doing it. The political climate during the past few years certainly didn’t help matters, either. And so I began to question what it was that I actually had to offer the wide world. I began to realize that the answer was very little indeed.

Terrified at some level, I clutched at the things that made me happy: gardening, pets, reading. But there was no unifying thread between these various pursuits, and I began to feel that I was just a dilettante, perhaps even a hedonist, chasing after little pleasures in life. Hedonism is fine for some people, but I’m more of a stoic myself, and so the cognitive dissonance arising from this lifestyle was difficult for me to handle. And then, after drifting around for three or four years, I discovered a solution.

A little background information first: I have a Ph.D. in English, and my dissertation was on the representation of female insanity in Victorian novels. I’ve published a small number of articles, but as a community college professor, I did not have the kind of academic career that rewarded research. (I should say I tried to throw myself into academic research as a means of finding my ikigai, to no avail. I wrote about that experience here.) As a professor, I taught freshman English, as well as survey courses, at a small, rural community college. Most of my adult life revolved around the academic calendar, and as a retiree I usually felt aimless, even bereft, when my old colleagues returned to campus in the fall while I stayed at home or headed off on a trip.

A year and a half ago, however, I found my solution, and although I’ve had a few bumps in the road, I am generally satisfied with it. Armed with the knowledge that I was, intellectually at least, most fulfilled when I was a college student, I have simply sent myself back to college. Now, I don’t mean by this that I actually enrolled in a course of study at a university. I did, in fact, think about doing so, but it really made little sense. I don’t need another degree, certainly; besides, I live in an area that is too remote to attend classes. Yet I realized that if there was one thing I knew how to do, it was how to create a course. I also knew how to research. So, I convinced myself that living in the world of ideas, that cultivating the life of the mind, was a worthy pursuit in and of itself, and I gave myself permission to undertake my own course of study. I sent myself back to college without worrying how practical it was. I relied on my own knowledge and ability (Emerson would be proud!), as well as a certain degree of nosiness (“intellectual curiosity” is a nicer term), and I began to use my time in the pursuit of knowledge–knowing, of course, that any knowledge gained would have no value in the “real” world. It wouldn’t pay my rent, or gain me prestige, or produce anything remotely valuable in practical terms.

This last bit was the hardest part. I was raised to believe, as are most people in American society, that one must have practical skills, the proof of which is whether one can gain money by exercising them. If you study literature, you must be a teacher of some kind. If you play music, you must get paying gigs. If you like numbers, then you should consider engineering, accounting, or business. The rise of social media, where everyone is constantly sharing their successes (and academics are often the worst in this respect), makes it even more difficult to slip the bonds of materialism, to escape the all-consuming attention economy. My brainwashing by the economic and social order was very nearly complete: it was, in other words, quite hard for me to give myself permission to do something for the sake of the thing itself, with no ulterior motives. I had to give myself many stern lectures in an effort to recreate the mindset of my twenty-year-old naive self, saying for example that just reading Paradise Lost to know and understand it was enough; I didn’t have to parlay my reading and understanding into an article, a blog, or a work of fiction. (Full disclosure: having just written that, I will point out that I did indeed write a blog about Paradise Lost. You can’t win them all.) One additional but unplanned benefit of this odd program of study is that it fit in quite well with the year of Covid lockdown we’ve all experienced. Since I was already engaged in a purposeless aim, the enforced break in social life really didn’t affect me that much.

What does my course of study look like? Reading, mainly, although I know YouTube has many fine lectures to access. I read books on natural science (trying to fill a large gap produced during my first time at college), as well as history; this year, the topic has been the Franco-Prussian War and the Paris Commune. I study foreign languages on Duolingo (German, French, a bit of Spanish) while occasionally trying to read books in those languages. I have participated in a highly enjoyable two-person online reading group of The Iliad and The Odyssey (thanks, Anne!). Thanks to my recent discovery of Karl Popper, I foresee myself studying philosophy, perhaps beginning with Plato and Aristotle. I’ve taken FutureLearn classes on Ancient Rome, Coursera classes on The United States through Foreign Eyes, and several others. I’ve listened and re-listened to various In Our Time podcasts. I have taxed the local library with my requests for books from other network libraries, and I swear some of those books haven’t been checked out in a decade or more. To be honest, I don’t understand a good part of what I read, but this doesn’t bother me as it did the first time around. If I’ve learned one thing from serving on the local city council, it’s that you don’t have to understand everything you read, but you do have to read everything you’re given. Sometimes understanding comes much later, long after the book is returned–and that’s okay.

I’m not sure where this intellectual journey will lead, or if it will in fact lead anywhere. But I’m satisfied with it. I think I’ve chanced upon something important, something which society with its various pressures has very nearly strangled in me for the last thirty years: the unimpeded desire for knowledge, the childlike ability to search for answers just because, and the confidence to look for those answers freely, unattached to any hope of gain or prestige. It takes some getting used to, rather like a new diet or exercise program, but I’m pleased with the results at last, and I am enjoying my second dose of college life.

How We Got Here: A Theory

The United States is a mess right now. Beset by a corrupt president and his corporate cronies, plagued by a — um — plague, Americans are experiencing an attack on democracy from within. So just how did we get to this point in history?

I’ve given it a bit of thought, and I’ve come up with a theory. Like many theories, it’s built on a certain amount of critical observation and a large degree of personal experience. Marry those things to each other, and you can often explain even the most puzzling enigmas. Here, then, is my stab at explaining how American society became so divisive that agreement on any political topic has become virtually impossible, leaving a vacuum so large and so empty that corruption and the will to power can ensure political victory.

I maintain that this ideological binarism in the United States is caused by two things: prejudice (racism has, in many ways, always determined our political reality), and lack of critical thinking skills (how else could so many people fail to see Trump for what he really is and what he really represents?). Both of these problems result from poor education. For example, prejudice certainly exists in all societies, but the job of a proper education in a free society is to eradicate, or at least to combat, prejudice and flawed beliefs. Similarly, critical thinking skills, while amorphous and hard to define, can be acquired through years of education, whether by conducting experiments in chemistry lab or by explicating Shakespeare’s sonnets. It follows, then, that something must be radically wrong with our educational system for close to half of the population of the United States to be fooled into thinking that Donald Trump can actually be good for this country, much less for the world at large.

In short, there has always been a possibility that a monster like Trump would appear on the political scene. Education should have saved us from having to watch him for the last four years, and the last month in particular, as he tried to dismantle our democracy. Yet it didn’t. So the question we have to ask is this: Where does the failure in education lie?

The trendy answer would be that this failure is a feature, not a bug, in American education, which was always designed to mis-educate the population in order to make it more pliable, more willing to follow demagogues such as Trump. But I’m not satisfied with this answer. It’s too easy, and more important, it doesn’t help us get back on track by addressing the failure (if that’s even possible at this point). So I kept searching for an explanation.

I’ve come up with the following premises. First, the divisions in the country are caused by a lack of shared values–this much is clear. For nearly half the American people, Trump is the apotheosis of greedy egotism, a malignant narcissist who is willing to betray, even to destroy, his country in order to get what he wants, so that he can “win” at the system. For the other half, Trump is a breath of fresh air, a non-politician who was willing to stride into the morass of Washington in order to clean it up and set American business back on its feet. These two factions will never be able to agree–not on the subject of Trump, and very likely, not on any other subject of importance to Americans.

It follows that these two views are irreconcilable precisely because they reflect a dichotomy in values. Values are the intrinsic beliefs that an individual holds about what’s right and wrong; when those beliefs are shared by a large enough group, they become an ethical system. Ethics, the shared sense of right and wrong, seems to be important in a society; as we watch ours disintegrate, we can see that without a sense of ethics, society splinters into factions. Other countries teach ethics as a required subject in high school classes; in the United States, however, only philosophy majors in universities ever take classes on ethics. Most Americans, we might once have said, don’t need such classes, since they experience their ethics every day. If that ever was true, it certainly isn’t so any more.

Yet I would argue that Americans used to have an ethical belief system. We certainly didn’t live up to it, and it was flawed in many ways, but it did exist, and that’s very different from having no ethical system at all. It makes sense to postulate that some time back around the turn of the 21st century, ethics began to disappear from society. I’m not saying that people became unethical, but rather that ethics ceased to matter, and as it faded away, it ceased to exist as a kind of social glue that could hold Americans together.

I think I know how this happened, but be warned: my view is pretty far-fetched. Here goes. Back in the 1970s and 1980s, literary theory poached upon the realm of philosophy, resulting in a collection of theories that insisted a literary text could be read in any number of ways, and that no single reading of a text was the authoritative one. This kind of reading and interpretation amounted to an attack on the authority of the writer and the dominant ideology that produced him or her, as it destabilized the way texts were written, read, and understood. I now see that just as the text became destabilized with this new way of reading, so did everything else. In other words, if an English professor could argue that Shakespeare didn’t belong in the literary canon any longer, that all texts are equally valid and valuable (I’ve argued this myself at times), the result is an attack not only on authority (which was the intention), but also on communality, by which I mean society’s shared sense of what it values, whether it’s Hamlet or Gilligan’s Island. This splintering of values was exacerbated by the advent of cable television and internet music sources; no one was watching or listening to the same things any more, and it became increasingly harder to find any shared ideological place to begin discussions. In other words, the flip side of diversity and multiplicity–noble goals in and of themselves–is a dark one, and now, forty years on, we are witnessing the social danger inherent in dismantling not only the canon, but any system of judgment to assess its contents as well.

Here’s a personal illustration. A couple of years ago, I taught a college Shakespeare class, and on a whim I asked my students to help me define characters from Coriolanus using Dungeons and Dragons character alignment patterns. It was the kind of exercise that would have been a smashing success in my earlier teaching career, the very thing that garnered me three teaching awards within five years. But this time it didn’t work. No one was watching the same television shows, reading the same books, or remembering the same historical events, and so there was no way to come up with good examples that worked for the entire class to illustrate character types. I began to see then that a splintered society might be freeing, but at what cost if we had ceased to be able to communicate effectively?

It’s not a huge leap to get from that Shakespeare class to the fragmentation of a political ideology that leaves, in the wreckage it’s produced, the door wide open to oligarchy, kleptocracy, and fascism. There are doubtless many things to blame, but surely one of them is the kind of socially irresponsible literary theory that we played around with back in the 1980s. I distinctly remember one theorist saying something to the effect that no one has ever been shot for being a deconstructionist, and while that may be true, that is not to say that deconstructionist theory, or any kind of theory that regards its work as mere play, is safe for the society it inhabits. Indeed, we may well be witnessing how very dangerous unprincipled theoretical play can turn out to be, even decades after it has held sway.

Convent-ional Trends in Film and Television

Lately I’ve been spending quite a bit of time with the Aged Parent, and one thing we do together–something we’ve rarely done before–is watch television shows. My mother, deep in the throes of dementia, perks up when she sees Matt Dillon and Festus ride over the Kansas (it is Kansas, isn’t it?) plains to catch bad guys and rescue the disempowered from their clutches. Daytime cable television is filled with Westerns, and I find this fascinating, although I’ve never been a fan of them in the past. Part of my new-found fascination is undoubtedly inspired by Professor Heather Cox Richardson’s theory–presented in her online lectures as well as her Substack newsletter–that the United States’s fascination with the Western genre has a lot to do with the libertarian, every-man-for-himself ideal most Westerns present. I think she’s got a point, but I don’t think that this alone explains our fascination with Westerns. This, however, is an argument I’ll have to return to at a later date, because in this blog post, what I want to talk about is nuns.

Yes–that’s right–Catholic nuns. What was going on in the 1950s and ’60s that made the figure of the young, attractive nun so prevalent in films and television? Here, for example, is a short list of the movies that feature nuns from the late 1950s and 1960s:

  1. The Nun’s Story (1959) with Audrey Hepburn
  2. The Nun and the Sergeant (1962), itself a remake of Heaven Knows, Mr. Allison (1957)
  3. Lilies of the Field (1963) with Sidney Poitier
  4. The Sound of Music (1965), no comment needed
  5. The Singing Nun (1966) starring Debbie Reynolds
  6. The Trouble with Angels (1966) with Rosalind Russell and Hayley Mills
  7. Where Angels Go, Trouble Follows (1968), the sequel to #6
  8. Change of Habit (1969), starring the strangely matched Mary Tyler Moore and Elvis Presley (!)

The fascination with nuns even bled over into television, with the series The Flying Nun (1967-1970), starring a post-Gidget Sally Field. This show, with its ridiculous premise of a nun who can fly, seems to have ended the fascination with nuns, or perhaps its bald stupidity simply killed it outright. From 1970 until 1992, when Sister Act appeared, there seemed to be a lull in American movies featuring nuns. Incidentally, the films I’ve mentioned here all feature saccharine-sweet characters and simple plots; in a typically American fashion, many of the difficult questions and problems involved in choosing a cloistered life are elided or simply ignored. There are, however, other movies featuring nuns that are not so wholesome; Wikipedia actually has a page devoted to what it terms “Nunsploitation.” These films, mostly foreign, seem more troubling and edgier. I leave an analysis of such films to another blogger, however, because what I really want to investigate is this: why was American culture so enamored, for the space of a decade, with nuns and convent life? I’ve argued previously that popular culture performs the critical task of reflecting and representing dominant ideologies, so my question goes deeper than just asking, “Hey, what’s with all these nuns?” Rather, it seeks to examine what conditions caused this repetitive obsession with nuns in a country that prided itself on the distance between religion and politics and, at least superficially, religion’s exclusion from American ideology.

I have some ideas, but nothing that could be hammered together neatly enough to call a theory to explain this obsession, and so I will be looking to my readers to provide additional explanations. Surely the box-office success of films starring Audrey Hepburn, Debbie Reynolds, Sidney Poitier, and Julie Andrews counts for something: Hollywood has always been a fan of the old “if it worked once, it should work again” creative strategy. But I think this might be too simple an explanation. I’ll have another go: perhaps in an era when women were beginning to explore avenues to power, self-expression, and sexual freedom, the image of a contained and circumscribed nun was a comfort to the conservative forces in American society. It’s just possible that these nuns’ stories were a representation of the desire to keep women locked up, contained, and submissive. On the other hand, the image of the nun could be just the opposite, one in which women’s struggle for independence and self-actualization was most starkly rendered by showing religious women asserting their will despite all the odds against them.

I think it’s quite possible that both these explanations, contradictory as they seem, might be correct. Certainly the depiction of women who submit to being controlled and defined by religion presents a comforting image of a hierarchical past to an audience that fears not only the future but the present as well (we should remember that the world was experiencing profoundly threatening social and political upheaval in the late 1960s). Yet at the same time, the struggle many of these nun-characters undergo in these films might well be representative of non-religious women’s search for meaning, independence, and agency in their own lives.

As I said, I have more questions than answers, and I will end this post with an obvious one: what effect did these films have on the general public? We’ve briefly explored the idea of where such movies came from and what they represent in the American ideology that produced them, but what did they do to their audiences? Was there any increase in teenage girls joining convents in the 1970s, after these films played in theatres and later, on television? What did the religious orders themselves have to say about such films? I’d be interested in learning the answers to these questions, so readers, if you have any ideas, or if you just want to compare notes and share your impressions, please feel free to comment!