Smith versus Shelley: A Tale of Two Poems

Yesterday, I co-led a poetry discussion group at one of the area retirement communities, something I’ve done for the last few years. It’s been a really interesting experience–there’s so much to learn and discuss about even mediocre poems, and I enjoy hearing the participants share their ideas about the poems, as well as the stories and memories these poems evoke.

I choose the poems at random, with very little rhyme (pardon the pun) or reason to my choice. One of the poems yesterday was Percy Bysshe Shelley’s “Ozymandias.” Yes, I proffered that old chestnut to the group, even though I’d read it thousands of times and have taught it in many classes. I just wanted another look at it, I guess, and it’s fun to do that with company. What I wasn’t expecting, however, was my co-leader bringing in another poem on the same exact topic, written at the same time.

It happens that Shelley had a friend, the prosaically named Horace Smith, and the two of them engaged in a sonnet writing contest, on the agreed-upon subject of Ancient Egypt and, presumably, Rameses II, also known as Ozymandias. We remember Shelley’s poem: every anthology of 19th-century British literature probably contains it. However, Smith’s sonnet is largely forgotten. In fact, I’ll offer a true confession here: despite having taught Brit lit for decades, I’d not heard of Smith’s version until a couple of days ago.

It turns out that Smith was himself an interesting fellow. He wrote poetry, but was not averse to making money, unlike his younger friend Shelley. Smith was a stock-broker, and made a good living, while also, according to Shelley, being very generous with it. He sounds like a generally good guy, to be honest, something which Shelley aspired to be, but was really not. For all intents and purposes, Shelley was a masterful poet but a real asshole on a personal level, and a bit of an idiot to boot. (What kind of a fool goes sailing in a boat that he didn’t know how to operate, in a storm, when he didn’t even know how to swim?) Smith knew how to make and keep friends as well as money, two things that Shelley was not very good at, by all accounts.

At any rate, I thought it might be interesting to compare the two poems. Of course, we assume Shelley’s poem will be better: it’s the one that is in every anthology of 19th-century British literature, after all, while I–with a Ph.D. in the subject, for whatever that’s worth–didn’t even know of the existence of Smith’s poem until a few days ago. But maybe, just maybe, there’s something valuable in the stockbroker’s poem that has been missed–and wouldn’t that make a fine story in and of itself?

So here are the two poems, first Shelley’s, and then Smith’s.

Ozymandias (Shelley)

I met a traveller from an antique land,
Who said—“Two vast and trunkless legs of stone
Stand in the desert. . . . Near them, on the sand,
Half sunk a shattered visage lies, whose frown,
And wrinkled lip, and sneer of cold command,
Tell that its sculptor well those passions read
Which yet survive, stamped on these lifeless things,
The hand that mocked them, and the heart that fed;
And on the pedestal, these words appear:
My name is Ozymandias, King of Kings;
Look on my Works, ye Mighty, and despair!
Nothing beside remains. Round the decay
Of that colossal Wreck, boundless and bare
The lone and level sands stretch far away.”

Ozymandias (Smith)

In Egypt’s sandy silence, all alone,
Stands a gigantic Leg, which far off throws
The only shadow that the Desert knows:—
“I am great OZYMANDIAS,” saith the stone,
“The King of Kings; this mighty City shows
The wonders of my hand.”— The City’s gone,—
Naught but the Leg remaining to disclose
The site of this forgotten Babylon.

We wonder — and some Hunter may express
Wonder like ours, when thro’ the wilderness
Where London stood, holding the Wolf in chace,
He meets some fragment huge, and stops to guess
What powerful but unrecorded race
Once dwelt in that annihilated place.

Now, I’d say Shelley definitely has the advantage in terms of poetic language, as well as the narrative situation. His words are sibilant and flowing, and it’s a stroke of genius to have the story come not from the speaker of the poem but from a traveler from an antique land; it makes the scene seem even more authentic. The alliteration in the last two lines (“boundless” and “bare” as well as “lone” and “level”) is a deft touch as well.

I’d also say that Shelley’s choice of the half-shattered face is much better than Smith’s. There’s something much more poetic about a sneering face, even if it’s only half a face, than a gigantic leg. There’s no way on earth Smith could have made a gigantic leg sound poetic, and that hampers the poetic feel of his sonnet, which is a bit of a shame.

Or is it?

Perhaps Smith wasn’t going for poetic feel here at all. In fact, I’d argue that he definitely wasn’t thinking along the same lines Shelley was. There are obvious similarities between the two poems. We still get the empty site, the desolation of the “forgotten Babylon” that powers so much of Shelley’s version, but it turns out that Smith is interested in something completely different. Where Shelley’s poem comments on the nature of arrogance, a human pride that ends in an ironic fall, Smith’s presents the reader with a different kind of irony. His version is much grander. In fact, it’s a cosmic irony that Smith is grappling with here, as the poem comments on the inevitable rise and fall of human civilization. What I find astounding is that in 1818, just as England was beginning its climb to the pinnacle of world dominance it would hold for the next two centuries, Smith was able to imagine a time when the world he knew would be in tatters, with nothing remaining of the biggest city on earth, save as a hunting ground for the presumably savage descendants of stockbrokers like himself. Smith’s imagination was far more encompassing than Shelley’s, given this kind of projection into the far future.

All told, Shelley’s poem is probably the better one: it’s more quotable, after all, and no matter how much I love Smith’s message and projection into the future, he just doesn’t have the choice of words and rhythm that Shelley does. But need we really limit ourselves to just one of these poems, anyway? I’d say we’ve gleaned about as much as we can from Shelley’s “Ozymandias.” Perhaps ours is an age in which we can appreciate Smith’s vision of a far distant future. Empires rise and fall, waters ebb and flow, and civilizations come and go. Smith, with his Hunter coursing through what was once London, paints this idea just as well as Shelley does with his decayed Wreck. There’s room for both of these poems in our literary canon.

An Idea for Math Teachers: Learning Logs

Image taken from a Guardian article

It’s been a busy summer for me, and I’m just getting into the swing of the school year. Some of my readers may recall that last year, for some strange reason, I decided to send myself back to school to acquire the math knowledge I never managed to master as a young adult. It’s been a struggle, but I haven’t given up yet, probably for three reasons: first, I’m a non-traditional (old) student, so I have a lot of experience being a student as well as a teacher; second, I have more patience than I did as a young person; and third, I have a lingering professional interest in how students learn. It occurred to me today, after a particularly frustrating experience which involved a trigonometry test and my concomitant inability to answer what seemed to be the most basic questions, that one thing math teachers could do is something we writing teachers have been doing for quite a few years now: asking our students to submit journals on their learning experiences. I actually think this is a somewhat brilliant idea, so I decided to test it out here, on my blog. So from here on out, I guess I should consider this my first entry in my math learning log.

But before I get started, a brief warning. I am horrifically bad at keeping journals. I am probably even worse at journaling than I am at math, and that’s saying a lot. From the outset, I warn any reader of this math learning log that there will be huge lacunae here, because I often start a journal and then completely forget about it. Nevertheless, I am going to forge ahead with this journaling idea, because one of the reasons I started this whole math learning process in the first place was to see what happens when a person of normal intelligence but sparse math knowledge attempts to learn enough math to get to calculus. Is such a progression–from basic College Algebra to Calculus I–possible? Today, after my test this morning, I have serious doubts. But I have to set those doubts and my growing frustration aside, and remember to regard myself not as a frustrated and humbled learner who knows she should have been able to do better on this test, but as a subject in an experiment. In a sense, I’m like that scientist in a black-and-white film who decides to test out a vaccine, or an antivenom, on himself. I need to maintain a sense of calm detachment, even while knowing that the results of this test could be disastrous. And, getting back to the journaling idea, documenting my journey through writing may well be more instructive than documenting it through grades.

For the moment, I will not mention my other reason for taking math courses–the existential, ideological, philosophical, or, if you like, the religious reason for this foray into mathematical studies. However, I write about it here.

Now, on to the subject at hand: the trig test on basic functions. I thought I had mastered about 2/3 of the concepts for this test, but I was wrong. Even those concepts I felt sure of slipped away from me as I began the test, in exactly the same way the memory of a dream evaporates upon our waking. By the end of the test, working frantically against the clock (and I have to add here that although I’ve always been a relatively fast test-taker, today I worked up to the last second), I was both frustrated and ashamed. Yet, to be fair, my poor performance (I do expect to pass the test, but only because of partial credit and because I wrote down everything I thought could be pertinent to each problem) is due neither to laziness nor to lack of interest. I had to miss four classes because I was taking care of my grandson in a different city (okay, so that was a delightful experience, and I wouldn’t have missed it for the world, to be honest). I did go to the tutoring center when I got back, but after an hour or so there, I deluded myself into thinking that I actually understood what had been covered. I also watched about three hours total of various YouTube videos (Khan Academy, Organic Chemistry Tutor, Dennis Davis’s series) about the trig concepts covered, but in the end, I think they might have hurt me, making me think I understood things when I really didn’t.

The frustration is real. At my age (62), learning something new is unfamiliar territory, and it’s easy to see why. We oldsters found out what we are good at and what we like long ago, and we have kept doing those things for decades, getting better and better at them through the years. We tend to forget how hard it is to learn, how much energy and commitment it takes, and most of all, how very frustrating and embarrassing it can be. In other words, learning something new from scratch, without any context to fit things into, is not only intellectually challenging but also emotionally draining. I suspect young people learn more easily not only because they are quicker and more resilient, but also because they don’t know any better. Because they don’t know yet what it feels like to have actually mastered something, not mastering or “getting” concepts doesn’t produce as much cognitive dissonance in them as it does in us oldsters.

Whatever my grade on the test this morning, and I don’t expect it to be good, I have to say that I don’t think studying more would have helped me, because I know I wasn’t studying the right things in the right ways. (That’s probably where that missed week of classes hurt me.) And although I was incredibly frustrated when I turned in the test, I feel less so now, a couple of hours later, because I realize that I don’t have to pass this class to obtain knowledge from it. In fact, I know I can take the class over next semester whatever grade I get this time, and that doing so is no great dishonor or waste of time. I have the luxury of taking my time with this, and although that’s really hard to remember when I’m in the throes of studying and test-taking, it’s the way all learning, in a perfect world, should be.

University Days–Redux

Photo by Rakicevic Nenad on Pexels.com

When I was teaching college English courses, my best students, the ones who really paid attention and were hungry for knowledge and ideas, would often come up to me after a class and say something like, “You brought up the French Revolution today while we were talking about William Wordsworth. This morning, in my history class, Professor X also talked about it. And yesterday, in Sociology, Professor Y mentioned it, too. Did you guys get together and coordinate your lectures for this week?”

Of course, the answer was always “no.” Most professors I know barely have time to prepare their own lectures, much less coordinate them along the lines of a master plan for illustrating Western Civilization. It was hard, however, to get the students to believe this; they really thought that since we all brought up the same themes in our classes, often in the same week, we must have done it on purpose. But the truth was simple, and it wasn’t magic or even serendipity. The students were just more aware than they had been before, and allusions that had passed by them unnoticed in earlier days were now noteworthy.

I’ve experienced something of this phenomenon myself in recent days, while reading Colin Tudge’s book The Tree and listening to Karl Popper’s The Open Society and Its Enemies–two books, one on natural science and the other on philosophy, that would seem to have few if any common themes. In this case, the subject both authors touched on was nomenclature and definitions. Previously, I would have never noticed this coincidence, but now I find myself in the same position as my former students, hyper-aware of the fact that even seemingly unrelated subjects can have common themes.

There’s a good reason why I am experiencing what my students did; I am now myself a student, so it makes sense that I’d see things through their eyes. All of which leads me to my main idea for this post: University Redux, or returning to college in later life. It’s an idea that I believe might just improve the lives of many people at this very strange point in our lives.

I happened upon the concept in this way: after five or so years of retirement, I realized that I had lost the sense of my ikigai–my purpose in life. I am not exactly sure how that happened. When I took early retirement at the end of the school year in 2015, I had grand ideas of throwing myself into writing and research projects. But somehow I lost the thread of what I was doing, and even more frightening, why I was doing it. The political climate during the past few years certainly didn’t help matters, either. And so I began to question what it was that I actually had to offer the wide world. I began to realize that the answer was very little indeed.

Terrified at some level, I clutched at the things that made me happy: gardening, pets, reading. But there was no unifying thread between these various pursuits, and I began to feel that I was just a dilettante, perhaps even a hedonist, chasing after little pleasures in life. Hedonism is fine for some people, but I’m more of a stoic myself, and so the cognitive dissonance arising from this lifestyle was difficult for me to handle. And then, after drifting around for three or four years, I discovered a solution.

A little background information first: I have a Ph.D. in English, and my dissertation was on the representation of female insanity in Victorian novels. I’ve published a small number of articles, but as a community college professor, I did not have the kind of academic career that rewarded research. (I should say I tried to throw myself into academic research as a means of finding my ikigai, to no avail. I wrote about that experience here.) As a professor, I taught freshman English, as well as survey courses, at a small, rural community college. Most of my adult life revolved around the academic calendar, which as a retiree usually left me feeling aimless, even bereft, when my old colleagues returned to campus in the fall, while I stayed at home or headed off on a trip.

A year and a half ago, however, I found my solution, and although I’ve had a few bumps in the road, I am generally satisfied with it. Armed with the knowledge that I was, intellectually at least, most fulfilled when I was a college student, I have simply sent myself back to college. Now, I don’t mean by this that I actually enrolled in a course of study at a university. I did, in fact, think about doing so, but it really made little sense. I don’t need another degree, certainly; besides, I live in an area that is too remote to attend classes. Yet I realized that if there was one thing I knew how to do, it was how to create a course. I also knew how to research. So, I convinced myself that living in the world of ideas, that cultivating the life of the mind, was a worthy pursuit in and of itself, and I gave myself permission to undertake my own course of study. I sent myself back to college without worrying how practical it was. I relied on my own knowledge and ability (Emerson would be proud!), as well as a certain degree of nosiness (“intellectual curiosity” is a nicer term), and I began to use my time in the pursuit of knowledge–knowing, of course, that any knowledge gained would have no value in the “real” world. It wouldn’t pay my rent, or gain me prestige, or produce anything remotely valuable in practical terms.

This last bit was the hardest part. I was raised to believe, as are most people in American society, that one must have practical skills, the proof of which is whether one can gain money by exercising them. If you study literature, you must be a teacher of some kind. If you play music, you must get paying gigs. If you like numbers, then you should consider engineering, accounting, or business. The rise of social media, where everyone is constantly sharing their successes (and academics are often the worst in this respect), makes it even more difficult to slip the bonds of materialism, to escape the all-consuming attention economy. My brainwashing by the economic and social order was very nearly complete: it was, in other words, quite hard for me to give myself permission to do something for the sake of the thing itself, with no ulterior motives. I had to give myself many stern lectures in an effort to recreate the mindset of my twenty-year-old naive self, saying for example that just reading Paradise Lost to know and understand it was enough; I didn’t have to parlay my reading and understanding into an article, a blog, or a work of fiction. (Full disclosure: having just written that, I will point out that I did indeed write a blog about Paradise Lost. You can’t win them all.) One additional but unplanned benefit of this odd program of study is that it fit in quite well with the year of Covid lockdown we’ve all experienced. Since I was already engaged in a purposeless pursuit, the enforced break in social life really didn’t affect me that much.

What does my course of study look like? Reading, mainly, although I know YouTube has many fine lectures to access. I read books on natural science (trying to fill a large gap produced during my first time at college), as well as history; this year, the topic has been the Franco-Prussian War and the Paris Commune. I study foreign languages on Duolingo (German, French, a bit of Spanish) while occasionally trying to read books in those languages. I have participated in a highly enjoyable two-person online reading group of The Iliad and The Odyssey (thanks, Anne!). Thanks to my recent discovery of Karl Popper, I foresee myself studying philosophy, perhaps beginning with Plato and Aristotle. I’ve taken FutureLearn classes on Ancient Rome, Coursera classes on The United States through Foreign Eyes, and several others. I’ve listened and re-listened to various In Our Time podcasts. I have taxed the local library with my requests for books from other network libraries, and I swear some of those books haven’t been checked out in a decade or more. To be honest, I don’t understand a good part of what I read, but this doesn’t bother me as it did the first time around. If I’ve learned one thing from serving on the local city council, it’s that you don’t have to understand everything you read, but you do have to read everything you’re given. Sometimes understanding comes much later, long after the book is returned–and that’s okay.

I’m not sure where this intellectual journey will lead, or if it will in fact lead anywhere. But I’m satisfied with it. I think I’ve chanced upon something important, something which society with its various pressures has very nearly strangled in me for the last thirty years: the unimpeded desire for knowledge, the childlike ability to search for answers just because, and the confidence to look for those answers freely, unattached to any hope of gain or prestige. It takes some getting used to, rather like a new diet or exercise program, but I’m pleased with the results at last, and I am enjoying my second dose of college life.

How the Study of Literature Could Save Democracy

Beowulf MS, picture from Wikipedia

Usually, I am not one to make grand claims for my discipline. There was a time, back when I was a young graduate student in the 1980s, when I would have; perhaps even more recently, I might have argued that understanding ideology through literary theory and criticism is essential to understanding current events and the conditions we live in. But I no longer believe that.

Perhaps in saying this publicly, I’m risking some sort of banishment from academia. Maybe I will have to undergo a ritual in which I am formally cashiered, like some kind of academic Alfred Dreyfus, although instead of having my sword broken in half and my military braids ripped to shreds, I will have my diploma yanked from my hands and trampled on the ground before my somber eyes. Yet unlike Dreyfus, I will have deserved such treatment, because I am in fact disloyal to my training: I don’t believe literary theory can save the world. I don’t think it’s necessary that we have more papers and books on esoteric subjects, nor do I think it’s realistic or useful for academics to participate in a market system in which the research they produce becomes a commodity in their quest for jobs, promotions, or grant opportunities. In this sense, I suppose I am indeed a traitor.

But recently I have realized, with the help of my friend and former student (thanks, Cari!), that literature classes are still important. In fact, I think studying literature can help save our way of life. You just have to look at it this way: it’s not the abstruse academic research that can save us, but rather the garden-variety study of literature that can prove essential to preserving democracy. Let me explain how.

I’ll begin, as any good scholar should, by pointing out the obvious. We are in a bad place in terms of political discourse–it doesn’t take a scholar to see that. Polarizing views have separated Americans into two discrete camps with very little chance of crossing the aisle to negotiate or compromise. Most people are unwilling to test their beliefs, for example, preferring to cling to them even in the face of contradictory evidence. As social psychologists Elliot Aronson and Carol Tavris point out in a recent article in The Atlantic, “human beings are deeply unwilling to change their minds. And when the facts clash with their preexisting convictions, some people would sooner jeopardize their health and everyone else’s than accept new information or admit to being wrong.” They use the term “cognitive dissonance,” the discomfort and disorientation we feel when holding two contradictory beliefs at once, to explain why it is so hard for people to change their ideas.

To those of us who study literature, the term “cognitive dissonance” may be new, but the concept certainly is not. F. Scott Fitzgerald writes, in an essay which is largely forgotten except for this sentence, “the test of a first-rate intelligence is the ability to hold two opposed ideas in the mind at the same time, and still retain the ability to function” (“The Crack-Up,” Esquire Magazine, February 1936). In addition, cognitive dissonance isn’t that far removed from an idea expressed by John Keats in a letter he wrote to his brothers back in 1817. He invents the term “Negative Capability” to describe the ability to remain in a liminal state of doubt and uncertainty without being driven to come to any conclusion and definitive belief. Negative capability, in other words, is the capacity to be flexible in our beliefs, to be capable of changing our minds.

I believe that the American public needs to develop negative capability, lots of it, and quickly, if we are to save our democracy.

But there’s a huge problem. Both Fitzgerald and Keats believe that this function is reserved only for geniuses. In their view, a person is born with this talent for tolerating cognitive dissonance: you either have it–in which case you are incredibly gifted–or you don’t. In contrast, Aronson and Tavris clearly believe it’s possible to develop a tolerance for cognitive dissonance: “Although it’s difficult, changing our minds is not impossible. The challenge is to find a way to live with uncertainty…” While their belief in our ability to tolerate cognitive dissonance and to learn from it is encouraging, it is sobering that they do not provide a clear path toward fostering this tolerance.

So here’s where the study of literature comes in. In a good English class, when we study a text, whether it’s To Kill a Mockingbird or Beowulf, students and teacher meet as more or less equals over the work of literature in an effort to find its meaning and its relevance. Certainly the teacher has more experience and knowledge, but this doesn’t–or shouldn’t–change the dynamic of the class: we are all partners in discovering what the text has to say in general, and to us, specifically. That is our task. In the course of this task, different ideas will be presented. Some interpretations will be rejected; some will be accepted. Some will be rejected, only to be later accepted, even after the space of years (see below for an example).

If we do it well, we will reach a point in the discussion where we consider several different suggestions and possibilities for interpretation. This is the moment during which we become experts in cognitive dissonance, as we relish interpretive uncertainty, examining each shiny new idea and interpretation with the delight of a child holding up gorgeously colored beads to the light. We may put a bead down, but it is only to take up another, different one–and we may well take up the discarded bead only to play with it some more.

The thing that makes the study of literature so important in this process is that it isn’t really all that important in the grand scheme of things. To my knowledge, no one has ever been shot for their interpretation of Hamlet; the preservation of life and limb does not hang on a precise explanation of Paradise Lost. If we use the study of literature as a classroom designed to increase our capacity for cognitive dissonance, in other words, we can dissipate the highly charged atmosphere that makes changing our minds so difficult. And once we get used to the process, when we know what it’s like to experience cognitive dissonance, it will be easier for us to tolerate it in other parts of our lives, even in the sphere of public policy and politics.

If I seem to be writing with conviction (no cognitive dissonance here!), it’s because I have often experienced this negative capability in real time. I will give just two examples. The first one occurred during a class on mystery fiction, when we were discussing the role of gossip in detective novels, which then devolved into a discussion on the ethics of gossip. The class disagreed violently about whether gossip could be seen as good or neutral, or whether it was always bad. A loud (and I mean loud!) discussion ensued, with such force that a janitor felt compelled to pop his head into the classroom–something that I had never seen happen either before or since then–to ask if everything was ok. While other teachers might have felt that they had lost control of the classroom, I, perversely, believe that this might have been my most successful teaching moment ever. That so many students felt safe enough to weigh in, to argue and debate passionately about something that had so little real importance suggested to me that we were exercising and developing new critical aptitudes. Some of us, I believe, changed our minds as a result of that discussion. At the very least, I think many of us saw the topic in a different way than we had to begin with. This, of course, is the result of experiencing cognitive dissonance.

My second example is similar. At the end of one very successful course on Ernest Hemingway, my class and I adjourned for the semester to meet at a local bar, at which we continued our discussion about The Sun Also Rises. My student Cari and I got into a very heated discussion about whether the novel could be seen as a pilgrimage story. Cari said it was; I vehemently disagreed. The argument was fierce and invigorating–so invigorating, as a matter of fact, that at one point a server came to inquire whether there was something wrong, and then a neighboring table began to take sides in the debate. (For the record, I live in Hemingway country, and everyone here has an opinion about him and his works.) Cari and I left the bar firmly ensconced in our own points of view, but a couple of years ago–some three years after the original argument occurred–I came to see it from Cari’s point of view, and I now agree with her that The Sun Also Rises can be seen as a sort of pilgrimage tale. It took a while, but I was able to change my mind.

It is this capacity to change one’s mind, I will argue, that is important, indeed, indispensable, for the democratic process to thrive.

In the end, it may well be that the chief contribution that good teachers of literature make to culture is this: we provide a safe and accessible place for people to learn what cognitive dissonance feels like, and in doing so, we can help them acquire a tolerance for it. This tolerance, in turn, leads to an increase in the ability to participate in civil discourse, which is itself the bedrock of democratic thought and process. In other words, you can invest in STEAM classes all you want, but if you really want to make people good citizens, do not forget about literature courses.

In view of this discovery of mine, I feel it’s my duty to host a noncredit literature class of sorts in the fall, a discussion-type newsletter that covers the great works of English literature–whatever that means–from Beowulf to the early Romantic period, in which discussion is paramount. If you’re interested or have suggestions, please let me know by commenting or messaging me, and I’ll do my best to keep you in the loop.

And in the meantime, keep your minds open! Cognitive dissonance, uncomfortable as it is, may just be what will keep democracy alive in the critical days to come.

Elegy for Eavan Boland, 1944-2020

The only modern poet I have ever understood is Eavan Boland.

If you recognize that sentence as an echo of Boland’s wonderful poem “The Pomegranate,” you might share my feelings for her work. Boland’s death will probably not get much attention outside of Ireland, but I feel it’s right for me to acknowledge it here, where I talk about the things that are important to me.

In a time of so many losses, perhaps it’s silly to focus on one death, yet I do it out of selfishness, for myself and for what this poet’s work has meant to me. First, a confession: I am not a poet, nor am I really a great reader of poems. As a professor of literature, I have studied poetry, but I feel much more comfortable with the works of Wordsworth, Arnold, Shakespeare, even (dare I say it?) Milton than with contemporary poetry. To be honest, despite my elaborate education, I really don’t understand contemporary poetry–so I must not really “get” it. I’m willing to accept that judgment; after all, there are a lot of things I do get, so it’s a kind of trade-off. I realize I’m not a Michael Jordan of literary studies, which is why I rarely comment on poetry that was written after, say, 1850. But I feel it’s only right to mention here my attraction to, and reverence for, Boland’s poems, one of which (“This Moment“) I used for years to teach poetic language to my freshman and sophomore college students.

I first noticed Boland’s poems in the mid-90s, when I was teaching full time as an adjunct professor, still hoping to make my mark–whatever that was supposed to be–on the world. I had subscribed to the New Yorker, back in the days when it was read for literary, not political, reasons. This was during a period when poets and writers who had submitted their work and not gotten it accepted for publication actually protested outside the offices of the magazine, stating that their work was just as bad as what was being published within the pages of the New Yorker and demanding equal time. (I thought about looking this story up on the internet, because, in an age of so much fake news, everything is easily verifiable, but forgive me–I decided not to. If the story about these outraged mediocre writers is not true, I don’t want to know about it. I love it and cling to it, and it does no one any harm, after all.)

I was very much aware of the opacity of much that was published in the New Yorker, and one evening after the children were in bed, having recently heard that story about the protesters, I shared it with my husband. To demonstrate how unreadable the stuff that was being published was, I grabbed a copy off our end table, thumbed through it until I found a poem, and started to read it out loud. After two or three lines, however, I stopped in mid-sentence. My husband said, “What? Why did you stop?” I looked up slowly, reluctant to pull my eyes away from the poem, and said, “It started to make sense to me. Actually, this is really good.”

I am not sure which poem of hers I was reading that evening. Perhaps it’s best that I don’t know, because it drives me to read so many of her poems, always searching for the Ur-poem, that first poem of hers that drove me to appreciate so much more of what she’s written. Boland’s poetry seems to me to explore the intersection of place and person, of history and modernity, in simple, sometimes stark, language. I love it for its depth, not for its breadth (sorry, Elizabeth Barrett Browning). I love the way it sinks its roots deep into the past, all the way back to myths and legends sometimes, yet still manages to retain a hold on the very real present.

Eavan Boland died yesterday, April 27, at the age of 75. You can read about her influence here, in an article by Fintan O’Toole of the Irish Times. Her poems can be found online at poets.org and on poetryfoundation.org.

Ecohats

Chief Petoskey sporting an Ecohat in Petoskey, Michigan.

 

We are facing an ecological emergency, and too little is being done to address the consequences of climate change. As individuals, our actions may seem inadequate. Yet every action, however small, can lead to something bigger. Change comes only as a result of collective will, and we can demonstrate that will by showing that we desire immediate political, social, and economic action in the face of global climate change.

Ecohats are not a solution, but they are a manifestation of will made public. Based on the pussyhat, a public display of support for women’s rights, Ecohats display support for immediate political action to address the need for systemic change to deal with climate change. They are easy to create; simply knit, crochet, or sew a hat in any shade of green to show your support for the people and organizations that are dedicated to addressing the issue of climate change, then wear it or display it with pride and dedication.

We can’t all be Greta Thunberg, but we can show our support for those people and organizations that are tirelessly working to address climate change, like  Citizens’ Climate Lobby, 350.org, Center for Biological Diversity, and others.

Please consider making, wearing, and displaying an Ecohat to show your support!

 

Ernest Hemingway is also wearing an Ecohat!

Six Rules for Reading (and Enjoying) Julius Caesar

I have always assumed that the best example of my argument that most people get Shakespeare plays all wrong would be Romeo and Juliet. But I have to admit I was mistaken. In fact, I think it is safe to posit that no other Shakespeare play is so maligned and misunderstood as Julius Caesar.

I think this is largely due to the way we teach the play in the United States. Of course, because we do teach the play in high school, Julius Caesar has always gotten tremendous exposure: almost everyone I’ve met has been forced to read the play during their high school career. In fact, I think it’s still on high school reading lists today. But that’s probably also exactly why it’s so misunderstood.

I’m not blaming high school teachers, because by and large they’re told to teach these plays without any adequate preparation. I suppose if anyone deserves blame, it’s the colleges that train teachers. But all blame aside, before I talk about what a great play it really is, and what a shame it is that most people summarily dismiss  Julius Caesar without ever really considering it, let’s look at why this has happened.

First of all, it goes without saying that making someone read a play is not a great way to get him or her to like it. Especially when that play is over 400 years old and written in (what seems to be) archaic language. But a still greater problem is that there is a tendency to use the play to teach Roman history, which is a serious mistake. (American high schools are not alone in this; Samuel Taylor Coleridge, for example, criticized the play for not being realistic in its portrayal of Roman politics back in the early 1800s.) In short, far too many people associate this play with a bunch of men showing a great deal of thigh or swathed in endless yards of material, flipping their togas around like an adolescent girl tosses her hair over her shoulder. It’s all too distracting, to say the least.

So, in order to set us back on the right track and get more people to read this fine play,  I’ve made a little list of rules to follow that will help my readers get the most enjoyment, emotional and intellectual, from the play.

Rule Number One: Forget about Roman history when you read this play. Forget about looking for anachronisms and mistakes on the part of Shakespeare’s use of history. Forget everything you know about tribunes, plebeians, Cicero, and the Festival of Lupercalia. The fact is, the history of the play hardly matters at all. Rather, the only thing that matters is that you know in the beginning moments that Caesar will die and that, whatever his motives and his character, Marcus Brutus will pay for his part in Caesar’s assassination with his own life and reputation.

Rule Number Two: Recognize that this is one of Shakespeare’s most suspenseful plays. Our foreknowledge of events in the play, far from making it predictable and boring, provides an element of suspense that should excite the audience. Here we can point to Alfred Hitchcock’s definition of suspense, in which he explains that it’s the fact that the audience knows there’s a bomb hidden under a table that makes the scene so fascinating to watch, that makes every sentence, every facial expression count with the audience. It’s the fact that we know Julius Caesar is going to die on the Ides of March that makes his refusal to follow the advice of the soothsayer, his wife Calpurnia, and Artemidorus so interesting. We become invested in all of his words and actions, just as our knowledge that Brutus is going to lose everything makes us become invested in him as a character as well. A good production of this play, then, would highlight the suspenseful nature within it, allowing the audience to react with an emotional response rather than mere intellectual curiosity.

Rule Number Three: Understand that this play is, like Coriolanus, highly critical of the Roman mob. Individuals from the mob may be quite witty, as in the opening scene, when a mere cobbler gets the better of one of the Roman Tribunes, but taken as a whole, the mob is easily swayed by rhetoric, highly materialistic, and downright vicious. (In one often-excluded scene–III.iii–a poet is on his way to Caesar’s funeral when he is accosted by the crowd, mistaken for one of the conspirators, and carried off to be torn to pieces.) It’s almost as if this representation of mob mentality–the Elizabethan equivalent of populism, if you will–is something that Shakespeare introduces in 1599 in Julius Caesar, only to return to it nine years later to explore in greater detail in Coriolanus.

Rule Number Four: Recognize that this play, like many of Shakespeare’s plays, is misnamed. It is not about Julius Caesar. It’s really all about Marcus Brutus, who is the tragic hero of the play. He is doomed from the outset, because (1) it is his patriotism and his love of the Roman Republic, not a desire for gain, that drives him to commit murder; (2) he becomes enamored of his own reputation and convinces himself that it is his duty to commit murder and to break the law; (3) he falls victim to this egotism and loses everything because of it. Audience members really shouldn’t give a hoot about Julius Caesar; he’s a jerk who gets pretty much what he deserves. But Brutus is a tragic hero with a tragic flaw, a character whose every step, much like Oedipus’s, takes him further and further into his own doom. The soliloquies Brutus speaks are similar to those in Macbeth, revealing a character that is not inherently bad but rather deficient in logic, self-awareness, and respect for others. In fact, in many ways, it’s interesting to look at Julius Caesar as a rough draft not only of Coriolanus but of Macbeth as well.

Rule Number Five: Appreciate the dark comedy in the play. Shakespeare plays with his audience from the outset, in the comic first scene between the workmen and the Roman Tribunes, but another great comedic scene is Act IV, scene iii, when Brutus and Cassius meet up before the big battle and end up in an argument that resembles nothing more than a couple of young boys squabbling, even descending into a “did not, did so” level. This scene would be hilarious if the stakes weren’t so high, and if we didn’t know that disaster was imminent.

Rule Number Six: Experience the play without preconceptions, without the baggage that undoubtedly is left over from your tenth-grade English class. Once you do this, you’ll realize that the play is timely. It explores some really pertinent questions, ones which societies have dealt with time and time again, and which we are dealing with at this very moment. For example, when is it permissible to commit a wrong in order for the greater good to benefit? (Surely Immanuel Kant would have something to say about this, along with Jeremy Bentham.) How secure is a republic when its citizens are poor thinkers who can be swayed by mere rhetoric and emotionalism instead of reason? What course of action should be taken when a megalomaniac takes over an entire nation, and no one has the guts to stop him through any legal or official means?

In the end, Brutus’s tragedy is that he immolates his personal, individual self in his public and civic responsibilities. Unfortunately, it is the inability to understand this sacrifice and the conflict it creates, not the play’s historical setting in a distant and hazy past, that has made it inaccessible for generations of American high school students. Too many decades have gone by since civic responsibility has been considered an important element in our education, with the sad but inevitable result that several generations of students can no longer understand the real tragedy in this play, which is certainly not the assassination of Julius Caesar.

But perhaps this is about to change. In the last few months, we’ve been witnessing a new generation teaching themselves about civic involvement, since no one will teach it to them. And as I consider the brave civic movement begun by the students from Marjory Stoneman Douglas High School, I am hopeful that from now on it’s just possible that reading Julius Caesar could become not a wasted module in an English class, but the single most important reading experience in a high-school student’s career.

Image from https://angiesdiary.com/featured/donald-trump-as-julius-caesar/

 

Teaching Behind the Lines

French Resistance Fighters putting up posters.  Image from “History in Photos” Blog (available here)

It’s been a year now since the election, and here I am, still fighting off a sense of futility and hopelessness about the future. During that time, the United States has pulled out of the Paris Accord in an astounding demonstration of willful ignorance about climate change, suffered a spate of horrific mass murders due to lax gun laws, and threatened nuclear war with North Korea. Suffice it to say that things are not going well.

But I should point out that the emphasis in my first sentence should be on the word “fighting,” because that’s what I’m doing these days: in my own small way, I’m waging a tiny war on some of the ignorance and egotism that seems to be ruling my country these days. Somewhere (I can’t find it anymore, and perhaps that’s just as well), the French novelist Léon Werth said that any action taken against tyranny, no matter how small, no matter how personal, helps to make things better. I’ve taken his words to heart, and I’m using this space to take stock of what I’ve done in the last year. I do this not to brag–far from it, because I know I’ve done far too little–but to remind myself that although I feel powerless too much of the time, I am not quite as powerless as I seem.

Let me begin, however, by saying what I haven’t done. I have not run for office. I did that in 2012, perhaps having had an inkling that things were not going well in my part of the country, but I was crushed by an unresponsive political system, apathy, and my own supreme unsuitability for the task. I am not ready to run for office again. In fact, I may never be ready to run again. I did write about my experience, however, and over the past year, I have encouraged other people, specifically women, to run for office. I’ve talked to a few activist groups about my experiences, and perhaps most important of all, I’ve donated to campaigns.

The thing I’ve done that merits any kind of discussion, however, is what I would call “resistance teaching”: going behind the lines of smug, self-satisfied ignorance, and using any tools I have to fight it. I still believe, naive as I am, that education can fight tyranny, injustice, and inequality. So I have engaged in a few activities that will, I hope, result in creating discussions, examining benighted attitudes, and opening up minds. I haven’t done anything too flamboyant, mind you–just a few actions that will hopefully develop into something more tangible in the months to come.

Here is my list:

  1. In spite of feeling gloomy about the future, I’ve continued with my writing, because I felt that even in difficult times, people should concentrate on making art. I self-published my second novel, and I wrote about it here, explaining why self-publishing can be an act of resistance in and of itself.
  2. I began to translate a novel about WW I, written by Léon Werth. I am now nearing my second revision of the translation. I have submitted a chapter of it to several fine magazines and received some nice rejection letters. I will be using my translation to present a short paper on WW I writing and Hemingway at the International Hemingway Conference in Paris this summer.
  3. I’ve traveled–quite a bit. I went to Italy, to Wales, to France, to Dallas, to Boston, and some other places that I can’t remember now. Traveling is important to open up barriers, intellectual as well as political. For example, in France I learned that while we Americans thought of Emmanuel Macron as a kind of savior for the French, he was viewed with some real skepticism and even fear by his electorate. Sure, he was better than Marine Le Pen–but he was still an unknown quantity, and most French people I met expressed some degree of hesitation about endorsing him.
  4. I directed a play for my community theatre group. Although it was hard and very time-consuming, I discovered that I really believe in the value of community theatre, where a group of individuals come together in a selfless (for the most part) effort to bring the words and ideas of a person long dead back to life. So what if audiences are tiny? It’s the work that matters, not the reception of it.
  5. I gave a talk at the C.S. Lewis Festival, which you can read here. It was fun and stimulating, and I remembered just how much I enjoy thinking and exploring literature and the ideas that shape it.

All of these things are fine, but I think the most important thing I’ve done in the past year is going back into the classroom again, this time as a substitute to help out some friends, but also to engage in what I think of as “resistance teaching.” As a substitute professor, as a lifelong learning instructor, I can engage students and encourage them to think without being bound by a syllabus or any other requirements. I can get behind the lines of bureaucratic structures and work to create an atmosphere of free discussion and intellectual exploration. It is small work, and it may not be very effective, but I have taken it on as my own work, my own idiosyncratic way of combating the heartless ignorance, the dangerous half-assed education that prevails in our society.

I have always loved the idea of Resistance Fighters. I just never thought I’d be one myself.

My Short, Tragic Career as an Independent Scholar

 


Several months ago, I had what seemed like a fantastic idea: now that I was retired from teaching English at a community college, I could engage in critical research, something I’d missed during those years when I taught five or more classes a semester. I had managed to write a couple of critical articles in the last few years of my tenure at a small, rural two-year college in Northern Michigan, but it was difficult, not only because of the heavy demands of teaching, but also because I had very limited access to scholarly resources. Indeed, it is largely due to very generous former students who had moved on to major research institutions that I was able to engage in any kind of scholarly research, a situation which may seem ironic to some readers, but which is really just closing the loop of teacher and student in a fitting and natural way.

And so last fall, on the suggestion of a former student, I decided to throw my hat in the ring and apply to  a scholarly conference on Dickens, and my proposal was chosen. In time, I wrote my paper (on Dickens and Music– specifically on two downtrodden characters who play the flute and clarinet in David Copperfield and Little Dorrit, respectively) and prepared for my part in the conference.

It had been close to 25 years since I had read a paper at a conference, and so I was understandably nervous. Back then, there was no internet to search for information about conference presentations, but now I was able to do my homework, and thus I found a piece of advice that made a lot of sense: remember, the article emphasized, that a conference paper is an opportunity to test out ideas, to play with them in the presence of others, and to learn how other scholars respond to them, rather than a place to read a paper, an article, or a section of a book out loud before a bored audience. Having taught public speaking for over a decade, I could see that this made a lot of sense: scholarly articles and papers are not adapted to oral presentations, since they are composed of complex ideas buttressed by a great many references to support their assertions. To read such a work to an audience seemed to me, once I reflected on it, a ridiculous proposition, and would surely bore not only the audience, but any self-respecting speaker as well.

I wrote my paper accordingly. I kept it under the fifteen-minute limit that the moderator practically begged the panelists to adhere to in a pre-conference email. I made sure I had amusing anecdotes and witty bon mots. I concocted a clever PowerPoint presentation to go with the paper, just in case my audience got bored with the ideas I was trying out. I triple-spaced my copy of the essay, and I–the queen of eye contact, as my former speech students can attest–I practiced it just enough to become familiar with my own words, but not so much that I would become complacent with them and confuse myself by ad-libbing too freely. In short, I arrived at the conference with a bit of nervousness, but with the feeling that I had prepared myself for the ordeal, and that my paper would meet with amused interest and perhaps even some admiration.

It was not exactly a disaster, but it was certainly not a success.

To be honest, I consider it a failure.

It wasn’t that the paper was bad. In fact, I was satisfied with the way I presented it. But my audience didn’t know what to do with my presentation. This might be because it was very short compared to all the other presentations (silly me, to think that academics would actually follow explicit directions!). Or it could be because it wasn’t quite as scholarly as the other papers. After all, my presentation hadn’t been published in a journal; it was, as C.S. Lewis might have called it, much more of a “supposal” than a fully-fledged argument. Perhaps as well there was something ironic in my stance, as if I somehow communicated my feeling that research in the humanities is a kind of glorified rabbit hunt that is fun while it lasts but that rarely leads to any tangible, life-changing moment of revelation.

Yet this is not to say that humanities research is useless. It isn’t. It develops and hones all sorts of wonderful talents that enrich the lives of those who engage in it and those who merely dip into it from time to time. I believe in the value of interpreting books and arguing about those interpretations; in fact, I believe that engaging in such discussions can draw human beings together as nothing else can, even at the very moments when we argue most fiercely about competing and contrasting interpretations. This is something, as Mark Slouka points out in his magnificent essay “Dehumanized,” that STEM fields cannot do, no matter how much administrators and government officials laud them, pandering to them with ever-increasing budgets at the expense of the humanities.

And this is, ultimately, why I left the conference depressed and disappointed. I had created, in the years since I’d left academia, an idealized image of it that was inclusive, one that recognized its own innate absurdity. In other words, sometime in the last two decades, I had recognized that research in the humanities was valuable not because it produced any particular thing, but because it produced a way of looking at the world we inhabit with a critical acuity that makes us better thinkers and ultimately better citizens. The world of research, for me, is simply a playground in which we all can exercise our critical and creative faculties. Yet the conference I attended seemed to be focused on research as object: indeed, as an object of exchange, a widget to be documented, tallied, and added to a spreadsheet that measures worth.

Perhaps it’s unfair of me to characterize it in this way. After all, most of the people attending the conference were, unlike me, still very much a part of an academic marketplace, one in which important decisions like tenure, admission to graduate programs, promotions, and departmental budgets are decided, at least in part, by things like conference attendance and presentations. It is unfair of me to judge them when I am no longer engaged in that particular game.

But the very fact that I am not in the game allows me to see it with some degree of clarity, and what I see is depressing. One cannot fight the dehumanization of academia, with its insistent mirroring of capitalism, by replicating that capitalism inside the ivory tower; one cannot expect the humanities to maintain any kind of serious effect on our culture when those charged with propagating the study of humanities are complicit in reducing humanities research to mere line items on a curriculum vitae or research-laden objects of exchange.

I can theorize no solution to this problem beyond inculcating a revolution of ideas within the academy in an effort to fight the now ubiquitous goal of bankrupting the study of arts and humanities, a sordid goal which now seems to characterize the age we live in. And I have no idea how to bring about such a revolution. But I do know this: I will return to my own study with the knowledge that even my small, inconsequential, and isolated critical inquiries are minute revolutions in and of themselves. As we say in English studies, it’s the journey that’s important, not the destination. And in the end, I feel confident that it will take far more than one awkward presentation at a conference to stop me from pursuing my own idiosyncratic path of research and inquiry into the literature I love.

On Becoming Professor Brulov

Image from www.aveleyman.com

There’s a part in one of my favorite Hitchcock movies, Spellbound (1945), in which Ingrid Bergman, a clinical psychologist, takes Gregory Peck, a man who has amnesia and may have committed a murder, to her former psychology professor’s house to hide out from the authorities. Professor Brulov is the epitome of a German academic: eccentric, kind, and highly intelligent, he is genuinely happy to see Ingrid Bergman, who, he says, was his best assistant. It’s a wonderful part of an interesting movie, but lately it’s taken on even greater significance for me.

I first watched the movie on television as a teenager, at which time I identified with Ingrid Bergman (of course I did–the movie is all about Freudian wish fulfillment, after all). Some years ago, as a middle-aged professor, I watched it again with my students when I taught a course on the films of Alfred Hitchcock, and I realized with a rather unpleasant shock that I had evolved without realizing it from the young, attractive, and inquisitive Dr. Constance Peterson into the aged, almost-but-not-quite-grumpy Professor Brulov. (In Mel Brooks’ hilarious spoof of Alfred Hitchcock’s movies, High Anxiety [1977], Professor Brulov is transformed into Professor Lilloman, which the protagonist mistakenly pronounces as “Professor Little Old Man.”) And, while it has taken me a few years to accept this transformation, I’m now fairly comfortable with my new, much less glamorous, role as mentor to my former students.

The reason is simple. Constance Petersons are a dime a dozen. The world is filled with beautiful young people making their mark on the world. But Brulov–he’s different. In fact, he’s quite special. Think of it this way: When Peterson is in trouble, she seeks him out, and Brulov helps her without asking any difficult questions, despite the fact that he knows she’s lying to him. He trusts her even more than she trusts him, which is touching, in a way. And so one thing that this very complex movie does is set up the idea of a mentor relationship between Brulov and his former student. It’s an interesting side angle to the movie that I never really noticed before.

And, now, in my retirement, I am learning to embrace this new Brulovian stage of life. I have had very few, if any, mentors in my own career, so while I’m not too proficient at it yet, I hope to grow into the role in the years to come. The way I see it, we need more Professor Brulovs in this world; we can’t all be Ingrid Bergman or Gregory Peck, after all. I’m happy that my students remember me with something other than aversion, after all, and so becoming Professor Brulov is, at least for now, quite enough for me.
