Spring has finally come to Northern Michigan, where I live. One might think that would make things easier, that creative juices would flow as freely as the sap in the trees and plants that are absorbing the sunshine. But unfortunately that’s not how it works. Spring is a dicey time here, and not just because of the mud left behind by the melting snow. (Another thing that’s left behind is shovel-loads of dog feces, which the receding snow banks offer up as yet another sacrifice to the disappearance of winter.) The truth is that when the weather clears and the outside world looks brighter, sometimes it’s disconcerting when your internal world hasn’t kept pace. It can be depressing, because it’s hard to kick yourself into gear to get things done, and in springtime, you have no excuse not to.
So when I saw that a local store was offering a poetry workshop during the month of April in honor of National Poetry Month, I signed up for it on a whim. I don’t know whether I will write any poems as a result of this workshop, but that’s not really the point. What I’d like to happen is for me to rekindle my creative impulses, and so far, though I’m still wrestling with SI (Springtime Inertia), I think I can detect the beginning of some movement towards more of a creative flow.
But the workshop has reminded me of an important question I’ve had over the last few years–one that may be unanswerable but still deserves to be asked:
What makes good writing?
It’s a question I’ve been pondering seriously, even though it might sound like I’m being flippant. Having taught literature throughout my professional career, I should be able to answer that question without too much trouble. For example, as a writing instructor, I’d say, “Good writing is clear, succinct, and precise. It shows consideration for the reader by adhering to the commonly accepted rules of grammar, spelling, and punctuation. It connects the ideas it presents in a way that is easy to read and understand.” I think that’s a good start for a college composition class, anyway.
But clearly this will not work for most creative writing. Poets, for example, often show little consideration for their readers. In fact, I’m not sure contemporary poets actually write with readers in mind; often they seem to be jotting down notes to themselves for later reading. Not that there is anything wrong with that at all–this is, after all, why I am interested in poetry at this point in my life. I’ve realized that there are certain subjects and ideas I want to explore that are better suited for poems than for short essays like this one, and I think it’s worth the time and effort to try to articulate them in poetic form.
However, let’s get back to the question: what does make good creative writing? I am having a hard time formulating an answer. As I get older, I seem to be suffering from the reverse of the Dunning-Kruger Effect: I am less sure of everything I think about, even questions whose answers I once felt certain of. But as far as good writing goes, I have come up with a provisional answer, and although I don’t find it very satisfying, I thought I’d try it out here.
I will begin by saying that the question itself is misguided. That’s because there is no such thing as good writing–only good reading. When we ask the question “what makes good writing?” we’re actually looking through the wrong end of the telescope. A good reader, I submit, is able to read almost anything and be enriched by the experience. A good reader will read any text, be it a poem, essay, novel, or piece of non-fiction, and find connections to other works. Of course, this is not to say there is no such thing as bad writing–I think we all know that it does exist–but that is a different issue. Seeing examples of bad writing will help us understand what not to do, but it won’t really help creative writers learn what to do to create good writing, so once again, I think it’s best to turn the question on its head and focus on what makes a good reader rather than what makes good writing.
After all, it has to be far easier to learn the skills required to be a good reader than to learn to be a good writer. And there are all sorts of advantages for the good reader–not only personal and professional, but social and political, as well. I think I’ll have to ponder on this one for a week or two, however, before I begin to identify how to create good readers and what makes good reading. For now, though, I’ll end with the suggestion that the world would surely be a better place if there were more good readers in it. I’ll go even further and add that maybe we’d all better get to work to see how we can do our part to create good, solid readers, because good readers make good citizens, and we can surely use a great many more good citizens in our world right now.
It’s been a busy summer for me, and I’m just getting into the swing of the school year. Some of my readers may recall that last year, for some strange reason, I decided to send myself back to school to acquire the math knowledge I never managed to master as a young adult. It’s been a struggle, but I haven’t given up yet, probably for three reasons: first, I’m a non-traditional (old) student, so I have a lot of experience being a student as well as a teacher; second, I have more patience than I did as a young person; and third, I have a lingering professional interest in how students learn. It occurred to me today, after a particularly frustrating experience which involved a trigonometry test and my concomitant inability to answer what seemed to be the most basic questions, that one thing math teachers could do is something we writing teachers have been doing for quite a few years now: asking our students to submit journals on their learning experiences. I actually think this is a somewhat brilliant idea, so I decided to test it out here, on my blog. From here on out, I guess I should consider this my first entry in my math learning log.
But before I get started, a brief warning. I am horrifically bad at keeping journals. I am probably even worse at journaling than I am at math, and that’s saying a lot. From the outset, I warn any reader of this math learning log that there will be huge lacunae here, because I often start a journal and then completely forget about it. Nevertheless, I am going to forge ahead with this journaling idea, because one of the reasons I started this whole math learning process in the first place was to see what happened when a person of normal intelligence but sparse math knowledge attempts to learn enough math to get to calculus. Is such a progression–from basic College Algebra to Calculus I–possible? Today, after my test this morning, I have serious doubts. But I have to set those doubts and my growing frustration aside, and remember to regard myself not as a frustrated and humbled learner who knows she should have been able to do better on this test, but as a subject in an experiment. In a sense, I’m like that scientist in a black-and-white film who decides to test out a vaccine, or an antivenom, on himself. I need to maintain a sense of calm detachment, even while knowing that the results of this test could be disastrous. And, getting back to the journaling idea, documenting my journey through writing may well be more instructive than documenting it through grades.
For the moment, I will not mention my other reason for taking math courses–the existential, ideological, philosophical, or, if you like, the religious reason for this foray into mathematical studies. However, I write about it here.
Now, on to the subject at hand: the trig test on basic functions. I thought I had mastered about 2/3 of the concepts for this test, but I was wrong. Even those concepts I felt sure of slipped away from me as I began the test, in exactly the same way the memory of a dream evaporates upon our waking. By the end of the test, working frantically against the clock (and I have to add here that although I’ve always been a relatively fast test-taker, today I worked up to the last second), I was both frustrated and ashamed. Yet, to be fair, my poor performance (I do expect to pass the test, but only because of partial credit and because I wrote down everything I thought could be pertinent to each problem) is due neither to laziness nor to indifference. I had to miss four classes because I was taking care of my grandson in a different city (okay, so that was a delightful experience, and I wouldn’t have missed it for the world, to be honest). I did go to the tutoring center when I got back, but after an hour or so there, I deluded myself into thinking that I actually understood what had been covered. I also watched about three hours total of various YouTube videos (Khan Academy, Organic Chemistry Tutor, Dennis Davis’s series) about the trig concepts covered, but in the end, I think they might have hurt me, making me think I understood things when I really didn’t.
The frustration is real. At my age (62), learning something new can be unfamiliar, and it’s easy to see why this is so. We oldsters found out what we are good at and what we like long ago, and we have kept doing those things for decades, getting better and better at them through the years. We tend to forget how hard it is to learn, how much energy and commitment it takes, and most of all, how very frustrating and embarrassing it can be. In other words, learning something new from scratch, without any context to fit things into, is not only intellectually challenging but also emotionally draining. I suspect young people learn more easily not only because they are quicker and more resilient, but also because they don’t know any better. Because they don’t know yet what it feels like to have actually mastered something, not mastering or “getting” concepts doesn’t produce as much cognitive dissonance in them as it does in us oldsters.
Whatever my grade on the test this morning, and I don’t expect it to be good, I have to say that I don’t think studying more would have helped me, because I know I wasn’t studying the right things in the right ways. (That’s probably where that missed week of classes hurt me.) And although I was incredibly frustrated when I turned in the test, I feel less so now, a couple of hours later, because I realize that I don’t have to pass this class to obtain knowledge from it. In fact, I know I can take the class over next semester whatever grade I get this time, and that doing so is no great dishonor or waste of time. I have the luxury of taking my time with this, and although that’s really hard to remember when I’m in the throes of studying and test-taking, it’s the way all learning, in a perfect world, should be.
When I was teaching college English courses, my best students, the ones who really paid attention and were hungry for knowledge and ideas, would often come up to me after a class and say something like, “You brought up the French Revolution today while we were talking about William Wordsworth. This morning, in my history class, Professor X also talked about it. And yesterday, in Sociology, Professor Y mentioned it, too. Did you guys get together and coordinate your lectures for this week?”
Of course, the answer was always “no.” Most professors I know barely have time to prepare their own lectures, much less coordinate them along the lines of a master plan for illustrating Western Civilization. It was hard, however, to get the students to believe this; they really thought that since we all brought up the same themes in our classes, often in the same week, we must have done it on purpose. But the truth was simple, and it wasn’t magic or even serendipity. The students were just more aware than they had been before, and allusions that had passed by them unnoticed in earlier days were now noteworthy.
I’ve experienced something of this phenomenon myself in recent days, while reading Colin Tudge’s book The Tree and listening to Karl Popper’s The Open Society and Its Enemies–two books, one on natural science and the other on philosophy, that would seem to have few if any common themes. In this case, the subject both authors touched on was nomenclature and definitions. Previously, I would have never noticed this coincidence, but now I find myself in the same position as my former students, hyper-aware of the fact that even seemingly unrelated subjects can have common themes.
There’s a good reason why I am experiencing what my students did; I am now myself a student, so it makes sense that I’d see things through their eyes. All of which leads me to my main idea for this post: University Redux, or returning to college in later life. It’s an idea that I believe might just improve the lives of many people at this very strange point in our lives.
I happened upon the concept in this way: after five or so years of retirement, I realized that I had lost the sense of my ikigai–my purpose in life. I am not exactly sure how that happened. When I took early retirement at the end of the school year in 2015, I had grand ideas of throwing myself into writing and research projects. But somehow I lost the thread of what I was doing, and even more frightening, why I was doing it. The political climate during the past few years certainly didn’t help matters, either. And so I began to question what it was that I actually had to offer the wide world. I began to realize that the answer was very little indeed.
Terrified at some level, I clutched at the things that made me happy: gardening, pets, reading. But there was no unifying thread between these various pursuits, and I began to feel that I was just a dilettante, perhaps even a hedonist, chasing after little pleasures in life. Hedonism is fine for some people, but I’m more of a stoic myself, and so the cognitive dissonance arising from this lifestyle was difficult for me to handle. And then, after drifting around for three or four years, I discovered a solution.
A little background information first: I have a Ph.D. in English, and my dissertation was on the representation of female insanity in Victorian novels. I’ve published a small number of articles, but as a community college professor, I did not have the kind of academic career that rewarded research. (I should say I tried to throw myself into academic research as a means of finding my ikigai, to no avail. I wrote about that experience here.) As a professor, I taught freshman English, as well as survey courses, at a small, rural community college. Most of my adult life revolved around the academic calendar, which as a retiree usually left me feeling aimless, even bereft, when my old colleagues returned to campus in the fall, while I stayed at home or headed off on a trip.
A year and a half ago, however, I found my solution, and although I’ve had a few bumps in the road, I am generally satisfied with it. Armed with the knowledge that I was, intellectually at least, most fulfilled when I was a college student, I have simply sent myself back to college. Now, I don’t mean by this that I actually enrolled in a course of study at a university. I did, in fact, think about doing so, but it really made little sense. I don’t need another degree, certainly; besides, I live in an area that is too remote to attend classes. Yet I realized that if there was one thing I knew how to do, it was how to create a course. I also knew how to research. So, I convinced myself that living in the world of ideas, that cultivating the life of the mind, was a worthy pursuit in and of itself, and I gave myself permission to undertake my own course of study. I sent myself back to college without worrying how practical it was. I relied on my own knowledge and ability (Emerson would be proud!), as well as a certain degree of nosiness (“intellectual curiosity” is a nicer term), and I began to use my time in the pursuit of knowledge–knowing, of course, that any knowledge gained would have no value in the “real” world. It wouldn’t pay my rent, or gain me prestige, or produce anything remotely valuable in practical terms.
This last bit was the hardest part. I was raised to believe, as are most people in American society, that one must have practical skills, the proof of which is whether one can gain money by exercising them. If you study literature, you must be a teacher of some kind. If you play music, you must get paying gigs. If you like numbers, then you should consider engineering, accounting, or business. The rise of social media, where everyone is constantly sharing their successes (and academics are often the worst in this respect), makes it even more difficult to slip the bonds of materialism, to escape the all-consuming attention economy. My brainwashing by the economic and social order was very nearly complete: it was, in other words, quite hard for me to give myself permission to do something for the sake of the thing itself, with no ulterior motives. I had to give myself many stern lectures in an effort to recreate the mindset of my twenty-year-old naive self, saying for example that just reading Paradise Lost to know and understand it was enough; I didn’t have to parlay my reading and understanding into an article, a blog, or a work of fiction. (Full disclosure: having just written that, I will point out that I did indeed write a blog about Paradise Lost. You can’t win them all.) One additional but unplanned benefit of this odd program of study is that it fit in quite well with the year of Covid lockdown we’ve all experienced. Since I was already engaged in a purposeless aim, the enforced break in social life really didn’t affect me that much.
What does my course of study look like? Reading, mainly, although I know YouTube has many fine lectures to access. I read books on natural science (trying to fill a large gap produced during my first time at college), as well as history; this year, the topic has been the Franco-Prussian War and the Paris Commune. I study foreign languages on Duolingo (German, French, a bit of Spanish) while occasionally trying to read books in those languages. I have participated in a highly enjoyable two-person online reading group of The Iliad and The Odyssey (thanks, Anne!) Thanks to my recent discovery of Karl Popper, I foresee myself studying philosophy, perhaps beginning with Plato and Aristotle. I’ve taken FutureLearn classes on Ancient Rome, Coursera classes on The United States through Foreign Eyes, and several others. I’ve listened and re-listened to various In Our Time podcasts. I have taxed the local library with my requests for books from other network libraries, and I swear some of those books haven’t been checked out in a decade or more. To be honest, I don’t understand a good part of what I read, but this doesn’t bother me as it used to do the first time around. If I’ve learned one thing from serving on the local city council, it’s that you don’t have to understand everything you read, but you do have to read everything you’re given. Sometimes understanding comes much later, long after the book is returned–and that’s okay.
I’m not sure where this intellectual journey will lead, or if it will in fact lead anywhere. But I’m satisfied with it. I think I’ve chanced upon something important, something which society with its various pressures has very nearly strangled in me for the last thirty years: the unimpeded desire for knowledge, the childlike ability to search for answers just because, and the confidence to look for those answers freely, unattached to any hope of gain or prestige. It takes some getting used to, rather like a new diet or exercise program, but I’m pleased with the results at last, and I am enjoying my second dose of college life.
The United States is a mess right now. Beset by a corrupt president and his corporate cronies, plagued by a — um — plague, Americans are experiencing an attack on democracy from within. So just how did we get to this point in history?
I’ve given it a bit of thought, and I’ve come up with a theory. Like many theories, it’s built on a certain amount of critical observation and a large degree of personal experience. Marry those things to each other, and you can often explain even the most puzzling enigmas. Here, then, is my stab at explaining how American society became so divisive that agreement on any political topic has become virtually impossible, leaving a vacuum so large and so empty that corruption and the will to power can ensure political victory.
I maintain that this ideological binarism in the United States is caused by two things: prejudice (racism has, in many ways, always determined our political reality) and a lack of critical thinking skills (how else could so many people fail to see Trump for what he really is and what he really represents?). Both of these problems result from poor education. For example, prejudice certainly exists in all societies, but the job of a proper education in a free society is to eradicate, or at least to combat, prejudice and flawed beliefs. Similarly, critical thinking skills, while amorphous and hard to define, can be acquired through years of education, whether by conducting experiments in chemistry lab or by explicating Shakespeare’s sonnets. It follows, then, that something must be radically wrong with our educational system for close to half of the population of the United States to be fooled into thinking that Donald Trump can actually be good for this country, much less for the world at large.
In short, there has always been a possibility that a monster like Trump would appear on the political scene. Education should have saved us from having to watch him for the last four years, and the last month in particular, as he tried to dismantle our democracy. Yet it didn’t. So the question we have to ask is this: Where does the failure in education lie?
The trendy answer would be that this failure is a feature, not a bug, in American education, which was always designed to mis-educate the population in order to make it more pliable, more willing to follow demagogues such as Trump. But I’m not satisfied with this answer. It’s too easy, and more important, it doesn’t help us get back on track by addressing the failure (if that’s even possible at this point). So I kept searching for an explanation.
I’ve come up with the following premises. First, the divisions in the country are caused by a lack of shared values–this much is clear. For nearly half the American people, Trump is the apotheosis of greedy egotism, a malignant narcissist who is willing to betray, even to destroy, his country in order to get what he wants, so that he can “win” at the system. For the other half, Trump is a breath of fresh air, a non-politician who was willing to stride into the morass of Washington in order to clean it up and set American business back on its feet. These two factions will never be able to agree–not on the subject of Trump, and very likely, not on any other subject of importance to Americans.
It follows that these two views are irreconcilable precisely because they reflect a dichotomy in values. Values are the intrinsic beliefs that an individual holds about what’s right and wrong; when those beliefs are shared by a large enough group, they become an ethical system. Ethics, the shared sense of right and wrong, seems to be important in a society; as we watch ours disintegrate, we can see that without a sense of ethics, society splinters into factions. Other countries teach ethics as a required subject in high school classes; in the United States, however, only philosophy majors in universities ever take classes on ethics. Most Americans, we might once have said, don’t need such classes, since they experience their ethics every day. If that ever was true, it certainly isn’t so any more.
Yet I would argue that Americans used to have an ethical belief system. We certainly didn’t live up to it, and it was flawed in many ways, but it did exist, and that’s very different from having no ethical system at all. It makes sense to postulate that some time back around the turn of the 21st century, ethics began to disappear from society. I’m not saying that people became unethical, but rather that ethics ceased to matter, and as it faded away, it ceased to exist as a kind of social glue that could hold Americans together.
I think I know how this happened, but be warned: my view is pretty far-fetched. Here goes. Back in the 1970s and 1980s, literary theory poached upon the realm of philosophy, resulting in a collection of theories that insisted a literary text could be read in any number of ways, and that no single reading of a text was the authoritative one. This kind of reading and interpretation amounted to an attack on the authority of the writer and the dominant ideology that produced him or her, as it destabilized the way texts were written, read, and understood. I now see that just as the text became destabilized with this new way of reading, so did everything else. In other words, if an English professor could argue that Shakespeare didn’t belong in the literary canon any longer, that all texts are equally valid and valuable (I’ve argued this myself at times), the result is an attack not only on authority (which was the intention), but also on communality, by which I mean society’s shared sense of what it values, whether it’s Hamlet or Gilligan’s Island. This splintering of values was exacerbated by the advent of cable television and internet music sources; no one was watching or listening to the same things any more, and it became increasingly harder to find any shared ideological place to begin discussions. In other words, the flip side of diversity and multiplicity–noble goals in and of themselves–is a dark one, and now, forty years on, we are witnessing the social danger inherent in dismantling not only the canon, but any system of judgment to assess its contents as well.
Here’s a personal illustration. A couple of years ago, I taught a college Shakespeare class, and on a whim I asked my students to help me define characters from Coriolanus using Dungeons and Dragons character alignment patterns. It was the kind of exercise that would have been a smashing success in my earlier teaching career, the very thing that garnered me three teaching awards within five years. But this time it didn’t work. No one was watching the same television shows, reading the same books, or remembering the same historical events, and so there was no way to come up with good examples that worked for the entire class to illustrate character types. I began to see then that a splintered society might be freeing, but at what cost if we had ceased to be able to communicate effectively?
It’s not a huge leap to get from that Shakespeare class to the fragmentation of a political ideology that leaves, in the wreckage it’s produced, the door wide open to oligarchy, kleptocracy, and fascism. There are doubtless many things to blame, but surely one of them is the kind of socially irresponsible literary theory that we played around with back in the 1980s. I distinctly remember one theorist saying something to the effect that no one has ever been shot for being a deconstructionist, and while that may be true, it is not to say that deconstructionist theory, or any kind of theory that regards its work as mere play, is safe for the society it inhabits. Indeed, we may well be witnessing how very dangerous unprincipled theoretical play can turn out to be, even decades after it has held sway.
Like many other people my age, I received a fairly narrow education in high school. I compounded the damage done by willingly narrowing down my fields of interest even further once I got to college, and by necessity, still more when I was in graduate school. It follows that now, as a retired professor of English entering my seventh decade on this planet, I can discourse endlessly about William Wordsworth, Sir Walter Scott, and the fact that the former gave the latter a gift of a Border Terrier dog named Pepper (honest!), but until recently, I couldn’t tell you a thing about how trees grow and what happens in a forest.
Had I become aware of this lacuna in my education earlier in my life, it probably would not have bothered me. Many people, perhaps most people, live with their own ignorance staring them in the face. I was no different; if I was ignorant in one topic, I had other areas of knowledge to compensate for it, right? Perhaps what is remarkable isn’t so much my ignorance, but rather my decision to remedy it, although this scenario must be repeated endlessly among human beings, in adults and children alike. One day, seemingly out of the blue, we decide we want to know more about something, and so we take it up, read about it, perhaps compulsively, until we educate ourselves out of our own ignorance.
The event which sparked my self-education is simple to identify: my husband and I bought a forest. Describing it in this way, however, makes me cringe with distaste. I hate using the term “bought,” which denotes a mere cash transaction. It’s better, I think, to say that I acquired a forest, becoming its guardian and its careful observer. By acquiring it and walking in it hundreds of times, I fell in love with it. And like any new lover, I wanted to know everything about it I could; so, when I wasn’t in the forest, or doing the endless quotidian tasks that make up the greater part of a person’s life, I set out on my journey to learn as much as I could absorb at this late date about trees, ecology, and conservation.
One of the first books I read was Aldo Leopold’s A Sand County Almanac, which is pretty much the Lyrical Ballads (for you Wordsworth lovers) of ecology writing. The idea of a Land Ethic was new to me, but I could see that it was as important, in its way, as the idea of negative capability was to Keats. I took two full pages of handwritten notes from Leopold’s book, which I won’t go into here, as it deserves its own blog post. For now, I’ll just say that several generations of conservationists and ecologists have been influenced by Leopold’s book; I was late to the party, as usual, and had never even heard of it–evidence of my flimsy natural science/ecology education.
Leopold’s book is important, but I have also been touched by a book I picked, at random, from the shelf of my local library: The Biography of a Tree, by James P. Jackson. It is just what it says it is: the life story of a white oak tree, from acorn to seedling, then to forest giant, and finally to a dead trunk rotting on the forest floor. This may not sound interesting, but Jackson pulls it off beautifully. The writing is careful and precise, while at the same time evocative. Jackson’s work, however, seems to be virtually unknown. (To prove this, I’ll just point out that his Amazon best-seller rating is even lower than that of my own two novels, and that’s saying something.) He seems never to have written anything else. In this age of instant contact, I found myself wanting to email him, or to follow him on Twitter to thank him for his painstaking work–but there’s no digital trace of him at all.
Alas, it is a sad and lonely business being a writer–as many of my readers know from first-hand experience. Pouring one’s heart and soul into a labor that will go unacknowledged is a risky business, and a thankless one, but at the same time it is a vitally important one. Can we take some slight solace in the fact that once published, a work may go unread for decades (The Biography of a Tree was published in 1979), only to spring to life, on cue, in a new reader’s hands, inspiring new thoughts, emotions, and passions decades after it was forgotten by an inhospitable public? Reading Jackson’s work, I realized that a writer’s life resembles that of the seventeen-year cicada, a recurring character in his book, a forest denizen that burrows itself below the soil for almost two decades, only to emerge into the sun for a mere month or two of life in the sunshine. The majority of its life is invisible, almost dormant–but it can accomplish so much during those short days it spends above the ground. Jackson’s book is the same: it may be inert and ineffective while it sits on the shelf of libraries, but once in the hands of an interested reader, what power it has! What influence! And though my appreciation of James P. Jackson comes too late to do him any good, at least it has done me worlds of good to have read, and appreciated, his work.
I’ll be honest: for a moment I thought about entitling this post “Reflections on Re-reading the Iliad,” but aside from sounding very dull, I will admit that I’m not sure I ever did read that pillar of Western Literature in college. Of course, like most other people, I’d heard of it. I’m old enough to have gotten my first and greatest dose of mythology–Greek, Roman, and a small bit of Norse myths–from Edith Hamilton’s Mythology almost fifty years ago, back when I was in high school.
To be honest, I’ve always wondered why American schools even bother to teach mythology. For a long time, I thought it was just to provide an introduction to the basis of Western culture, but then I realized, with a shock, that mythology in high-school English curricula actually had no point; rather, it was an oversight, a leftover from previous educational imperatives. Our insistence on teaching mythology to bored high school students, in other words, is something like having an appendix in our guts: there is no real purpose for it. While it once did have a function, it now simply dangles there without any reason for existing.
Here’s my version of why we have mythology in high school. It certainly isn’t for students to become acquainted with stories of Greek and Roman gods and goddesses. After all, these stories are brimming with violence and sex, and are totally unsuitable for young learners. How do you explain the rape of Leda by Zeus–in the shape of a swan, no less–to high school students? Yet this is where the Trojan War, and the Iliad, really begins, as William Butler Yeats reminds us in his masterful poem “Leda and the Swan.” No, the reason we teach such things is that they were once vehicles for learning Latin and Greek. All language learners know that it’s no fun simply to do exercise after exercise when you’re trying to acquire a second language; you want to get to stories and dialogues, no matter how puerile or simplistic. (Incidentally, the language-learning computer platform Duolingo has figured this out and now provides an entire block of lessons with short stories to keep its learners interested. It’s worked for me.) Since a truly educated person, from the Renaissance to the early twentieth century, needed to know at least some Latin and less Greek (as the poet Ben Jonson rated Shakespeare’s knowledge), schools were obsessed with drumming classical languages into recalcitrant students’ heads. What better way to get them to learn than to present them with violent, prurient tales of heroes and heroines? For generations, apparently, the scheme worked. But gradually the need and desire to showcase one’s Latin and Greek knowledge wore off, and these languages ceased to be taught in schools.
But the mythology remained. And thank goodness it did.
A few years ago, a friend of mine and I decided to read the then newly published translation of The Iliad by Caroline Alexander. We never got past the first few books then, but Covidtimes provided us a new opportunity, and we started over. I began by being less than impressed with the story, but I have to admit that now I am pretty much hooked. The world that it presents is violent and nasty, but there are some moments of real beauty, too.
Yet what has really caught my attention is that the world of the Iliad is totally random. Things happen for no reason, or for reasons well beyond the control of the humans involved. You may think you’re winning a battle, but then a god shows up, sometimes disguised as a human, sometimes in a fog, and things go to hell in a handbasket quickly, and suddenly you’re terrified and hiding by your ships wondering if you should push off for home. Events kaleidoscope by and you can’t do anything about them, because even if you do take action, it often has the opposite of the effect you intend.
In other words, life as represented in The Iliad is something like life in a pandemic. Covid seems to hit randomly, and to hurt randomly. We don’t know why some people are barely affected by the virus while others are struck down, killed or incapacitated by it. We don’t know how long the pandemic will last. We don’t know what steps the government will take to protect us from it. We are like the characters in the Iliad, taking action in good faith but knowing in our bones that anything can happen.
Nowhere is this brought out more poignantly than in a relatively insignificant scene in Book 8, which takes place in the middle of a raging battle. The Trojan Paris shoots an arrow at the Greek Nestor, which hits the seasoned warrior’s horse in the head, “where a horse’s forelock / grows on the skull, and where is most fatal” (lines 84-85). Then something truly odd happens; the narrative perspective changes and instead of watching sweeping actions–men swinging swords and throwing spears, horses stamping over bodies, chariots careening and crashing about–suddenly we are watching a single arrow as it plunges into a horse’s head. We watch, transfixed, as the arrow skewers the poor horse, who in his death agony “flung the horses with him into panic as he writhed around the arrow point” (line 87). We go from big action (battle), to smaller actions (arrow shooting), to an even smaller action (the arrow penetrating the horse’s brain). The center of focus has contracted to the tiny tip of an arrow, and we, like the horse itself, are flung around this arrow, orbiting it just as the earth orbits the sun. We have changed our perspective, from large heroic actions taken by men, to a single arrow around which a horse rotates. It’s as if we’re inhabiting a kaleidoscope, living on the inside of it, subject to its twists and turns at any moment. The effect on the reader is disorienting, just as it is meant to be, because it reinforces the sense that events in the story are random, uncontrollable, and largely unpredictable, while at the same time suggesting that mere perspective determines our allegiance and our ideology.
This is why I find reading The Iliad right now so very meaningful. This is a poem that was written ages before the Enlightenment values of logic, continuity, causality–in short, Reason–had been adopted in Western culture. These values are being tested right now in our daily lives, and reading this ancient epic reinforces the sense that values come and go, that worldviews shift and change, and that our sense of primacy is, and should be, rather fragile. If there is anything that Covid-19 has taught us, it is that, at least in the short run, we are all at the mercy of the gods, whoever and whatever those gods may be, and we must, like Odysseus, Agamemnon, Hector, and Achilles, simply get along as best we can in the face of a world we cannot control, even if we desperately want to believe we can.
As it happens, recent research suggests that the appendix does, in fact, have a function: to protect and nurture healthy bacteria until they are needed in the gut. Perhaps teaching mythology serves a similar purpose; perhaps, appendix-like, it preserves and protects various ideas, attitudes, and perspectives that, while outmoded and seemingly unnecessary in modern life, can provide us some kind of insight in difficult times. At any rate, reading The Iliad has certainly given me food for thought these past few weeks.
The debate over whether to open schools is revealing an important question that has lurked just below the surface for a generation–indeed, perhaps for as long as free public education has existed in the United States: what is the purpose of our schools? Is it to teach people crucial skills and allow them to acquire important knowledge, or is it rather to provide a holding tank, a safe and dependable place for a part of the population that cannot yet care for themselves?
Some teachers take umbrage at the thought that K-12 schools are used as childcare centers; they say that they are not babysitters, and that the push to open schools is an attempt to get the economy going again by providing workers with childcare that is not otherwise available to them. There is truth in this assertion. But universities, too, have been used for the last fifty years as childcare centers of a sort, places where a group of people is deposited under the guise of acquiring a higher education until they are ready to enter the workforce, or until the working world is convinced to let them in. Our educational institutions, in other words, have been, at least for the last fifty years, both places of learning and care facilities at the same time.
It’s best if we accept this dual role of educational institutions, rather than rail against it. A K-12 school can be both a place where education occurs and a place where parents can send their children for safe care (school shootings and pandemics aside). A university or college can be a place to teach important skill sets, including knowledge that is difficult to acquire on one’s own, as well as a place where young adults are sent while they wait their turn to enter a workforce that isn’t quite ready for them yet. This leads to the question of opening the schools: are they essential for our country? In the short term, the answer is a resounding “yes”: providing such a safe space is essential in order to run the economy we’ve grown used to, one in which financial necessity compels parents to scramble to find childcare, as well as one in which young adults require an expensive university education merely to snag an entry-level job in a field that becomes outmoded within years.
In this sense, teachers and professors are indeed essential workers; they are, in fact, babysitters. (Note that I do not say “mere” babysitters. The term itself is a demeaning one, indicating that a caregiver’s job is completely passive, but anyone who has ever been around young children knows this is far from the truth. I will leave that topic for a future post, however. At any rate, babysitting is at least as important a role in our society as being a university professor, perhaps much more so.) But at the same time that they are caregivers, teachers are also purveyors of knowledge and skills, and we need to keep both functions in mind as we think about the job they do.
I’ll be honest: I can see no clear solution as to whether schools should be opening up in a few short weeks. Sadly, we have completely squandered the time we bought back in March, when schools were summarily shut down in order to stem the spread of Covid-19. We did not stop the disease from spreading, which is bad, but what is even worse is that we completely failed to create a workable plan for re-opening schools and instead just held our breath, hoping that the pandemic would simply die down or fade away. It didn’t have to be this way; the complete lack of leadership at the federal level is to blame for this awful situation. During this time, other countries’ schools have created solutions that we can learn from, and we must study them closely to find our own, but here is one simple takeaway: flexibility is the key to fighting this pandemic. As Tomas Pueyo argued in an important article published in the early days of the pandemic, we need to shift between strict containment measures, including lockdowns, and loosened restrictions, again and again until Covid-19 becomes manageable. This demands that we act with flexibility, remaining responsive to the situation as it changes.
And here we find a heartbreaking irony: flexibility is precisely what is lacking in the educational institutions we have come to rely on for childcare. And this in turn is a direct result of the dual role of schools in our society and our unwillingness to recognize it. What matters in childcare, after all, is dependability; we need to know that our children have a safe place to go with someone watching over them whenever we need to be at work. But as far as education goes, flexibility is the most important thing. If one learning method doesn’t work, a good teacher always has a host of other methods to try out. Learning itself has to be flexible, because knowledge is acquired through a series of attempts, failures, and (hopefully) successes; a good education should always provide its students with the ability to be flexible. In other words, critical thinking, simply described, is the ability to see a problem in a variety of ways in order to solve it. Flexibility, elasticity, and adaptability are excellent things in education, however unwelcome they may have become in the working world (or the political world, for that matter). I would even argue that ignoring the role of flexibility in education has actually led to the demise of its effectiveness in our country, as we came to rely on testing and objective-chasing rather than more organic approaches to teaching, but that, too, I will have to leave for another post, or for another blogger.
My point here is simply this: it isn’t necessarily bad for education to serve as child (or young adult) care, but not recognizing and accommodating this dual nature of our educational institutions will lead us to make faulty, even disastrous, choices as we move forward to confront our new future.
This pandemic, awful as it is, may well have good consequences. One of them, I hope, is the bright light it shines, often harshly, on the institutions and traditions we’ve come to accept so blithely through the years. Though it may be painful in the beginning, we can work to make these institutions serve our society much better than they have in the past. But the first step, as always, is to see things as they are, and in this case, we must accept the idea that schools have been necessary in this country not only because they teach the skills and knowledge that citizens of a democracy must have, but also because they provide childcare to people who need to work and could not otherwise afford it. Let us look at the situation clearly, transparently, and earnestly: only then can we hope to meet the challenges that face us in this difficult and unprecedented time.
Usually, I am not one to make grand claims for my discipline. There was a time, back when I was a young graduate student in the 1980s, that I would have; perhaps even more recently, I might have argued that understanding ideology through literary theory and criticism is essential to understanding current events and the conditions we live in. But I no longer believe that.
Perhaps in saying this publicly, I’m risking some sort of banishment from academia. Maybe I will have to undergo a ritual in which I am formally cashiered, like some kind of academic Alfred Dreyfus, although instead of having my sword broken in half and my military braids ripped to shreds, I will have my diploma yanked from my hands and trampled on the ground before my somber eyes. Yet unlike Dreyfus, I will have deserved such treatment, because I am in fact disloyal to my training: I don’t believe literary theory can save the world. I don’t think it’s necessary that we have more papers and books on esoteric subjects, nor do I think it’s realistic or useful for academics to participate in a market system in which the research they produce becomes a commodity in their quest for jobs, promotions, or grant opportunities. In this sense, I suppose I am indeed a traitor.
But recently I have realized, with the help of my friend and former student (thanks, Cari!), that literature classes are still important. In fact, I think studying literature can help save our way of life. You just have to look at it this way: it’s not the abstruse academic research that can save us, but rather the garden-variety study of literature that can prove essential to preserving democracy. Let me explain how.
I’ll begin, as any good scholar should, by pointing out the obvious. We are in a bad place in terms of political discourse–it doesn’t take a scholar to see that. Polarizing views have separated Americans into two discrete camps with very little chance of crossing the aisle to negotiate or compromise. Most people are unwilling to test their beliefs, for example, preferring to cling to them even in the face of contradictory evidence. As social psychologists Elliot Aronson and Carol Tavris point out in a recent article in The Atlantic, “human beings are deeply unwilling to change their minds. And when the facts clash with their preexisting convictions, some people would sooner jeopardize their health and everyone else’s than accept new information or admit to being wrong.” They use the term “cognitive dissonance,” which means the sense of disorientation and even discomfort one feels when considering two opposing viewpoints, to explain why it is so hard for people to change their ideas.
To those of us who study literature, the term “cognitive dissonance” may be new, but the concept certainly is not. F. Scott Fitzgerald writes, in an essay which is largely forgotten except for this sentence, “the test of a first-rate intelligence is the ability to hold two opposed ideas in the mind at the same time, and still retain the ability to function” (“The Crack-Up,” Esquire Magazine, February 1936). In addition, cognitive dissonance isn’t that far removed from an idea expressed by John Keats in a letter he wrote to his brothers back in 1817. He invents the term “Negative Capability” to describe the ability to remain in a liminal state of doubt and uncertainty without being driven to come to any conclusion and definitive belief. Negative capability, in other words, is the capacity to be flexible in our beliefs, to be capable of changing our minds.
I believe that the American public needs to develop negative capability, lots of it, and quickly, if we are to save our democracy.
But there’s a huge problem. Both Fitzgerald and Keats believe that this function is reserved only for geniuses. In their view, a person is born with this talent for tolerating cognitive dissonance: you either have it–in which case you are incredibly gifted–or you don’t. In contrast, Aronson and Tavris clearly believe it’s possible to develop a tolerance for cognitive dissonance: “Although it’s difficult, changing our minds is not impossible. The challenge is to find a way to live with uncertainty…” While their belief in our ability to tolerate cognitive dissonance and to learn from it is encouraging, it is sobering that they do not provide a clear path toward fostering this tolerance.
So here’s where the study of literature comes in. In a good English class, when we study a text, whether it’s To Kill a Mockingbird or Beowulf, students and teacher meet as more or less equals over the work of literature in an effort to find its meaning and its relevance. Certainly the teacher has more experience and knowledge, but this doesn’t–or shouldn’t–change the dynamic of the class: we are all partners in discovering what the text has to say in general, and to us, specifically. That is our task. In the course of this task, different ideas will be presented. Some interpretations will be rejected; some will be accepted. Some will be rejected, only to be later accepted, even after the space of years (see below for an example).
If we do it well, we will reach a point in the discussion where we consider several different suggestions and possibilities for interpretation. This is the moment during which we become experts in cognitive dissonance, as we relish interpretive uncertainty, examining each shiny new idea and interpretation with the delight of a child holding up gorgeously colored beads to the light. We may put a bead down, but it is only to take up another, different one–and we may well take up the discarded bead only to play with it some more.
The thing that makes the study of literature so important in this process is that it isn’t really all that important in the grand scheme of things. To my knowledge, no one has ever been shot for their interpretation of Hamlet; the preservation of life and limb does not hang on a precise explanation of Paradise Lost. If we use the study of literature as a classroom designed to increase our capacity for cognitive dissonance, in other words, we can dissipate the highly charged atmosphere that makes changing our minds so difficult. And once we get used to the process, when we know what it’s like to experience cognitive dissonance, it will be easier for us to tolerate it in other parts of our lives, even in the sphere of public policy and politics.
If I seem to be writing with conviction (no cognitive dissonance here!), it’s because I have often experienced this negative capability in real time. I will give just two examples. The first one occurred during a class on mystery fiction, when we were discussing the role of gossip in detective novels, which then devolved into a discussion on the ethics of gossip. The class disagreed violently about whether gossip could be seen as good or neutral, or whether it was always bad. A loud (and I mean loud!) discussion ensued, with such force that a janitor felt compelled to pop his head into the classroom–something I have never seen happen before or since–to ask if everything was ok. While other teachers might have felt that they had lost control of the classroom, I, perversely, believe that this might have been my most successful teaching moment ever. That so many students felt safe enough to weigh in, to argue and debate passionately about something that had so little real importance suggested to me that we were exercising and developing new critical aptitudes. Some of us, I believe, changed our minds as a result of that discussion. At the very least, I think many of us saw the topic in a different way than we had to begin with. This, of course, is the result of experiencing cognitive dissonance.
My second example is similar. At the end of one very successful course on Ernest Hemingway, my class and I adjourned for the semester to meet at a local bar, at which we continued our discussion about The Sun Also Rises. My student Cari and I got into a very heated argument about whether the novel could be seen as a pilgrimage story. Cari said it was; I vehemently disagreed. The argument was fierce and invigorating–so invigorating, as a matter of fact, that at one point a server came to inquire whether there was something wrong, and then a neighboring table began to take sides in the debate. (For the record, I live in Hemingway country, and everyone here has an opinion about him and his works.) Cari and I left the bar firmly ensconced in our own points of view, but a couple of years ago–some three years after the original argument occurred–I came to see it from Cari’s point of view, and I now agree with her that The Sun Also Rises can be seen as a sort of pilgrimage tale. It took a while, but I was able to change my mind.
It is this capacity to change one’s mind, I will argue, that is important, indeed, indispensable, for the democratic process to thrive.
In the end, it may well be that the chief contribution that good teachers of literature make to culture is this: we provide a safe and accessible place for people to learn what cognitive dissonance feels like, and in doing so, we can help them acquire a tolerance for it. This tolerance, in turn, leads to an increase in the ability to participate in civil discourse, which is itself the bedrock of democratic thought and process. In other words, you can invest in STEAM classes all you want, but if you really want to make people good citizens, do not forget about literature courses.
In view of this discovery of mine, I feel it’s my duty to host a noncredit literature class of sorts in the fall: a newsletter, with discussion paramount, covering the great works of English literature–whatever that means–from Beowulf to the early Romantic period. If you’re interested or have suggestions, please let me know by commenting or messaging me, and I’ll do my best to keep you in the loop.
And in the meantime, keep your minds open! Cognitive dissonance, uncomfortable as it is, may just be what will keep democracy alive in the critical days to come.
The only modern poet I have ever understood is Eavan Boland.
If you recognize that sentence as an echo of Boland’s wonderful poem “The Pomegranate,” you might share my feelings for her work. Boland’s death will probably not get much attention outside of Ireland, but I feel it’s right for me to acknowledge it here, where I talk about the things that are important to me.
In a time of so many losses, perhaps it’s silly to focus on one death, yet I do it out of selfishness, for myself and for what this poet’s work has meant to me. First, a confession: I am not a poet, nor am I really a great reader of poems. As a professor of literature, I have studied poetry, but I feel much more comfortable with the works of Wordsworth, Arnold, Shakespeare, even (dare I say it?) Milton than with contemporary poetry. To be honest, despite my elaborate education, I really don’t understand contemporary poetry–so I must not really “get” it. I’m willing to accept that judgment; after all, there are a lot of things I do get, so it’s a kind of trade-off. I realize I’m not a Michael Jordan of literary studies, which is why I rarely comment on poetry that was written after, say, 1850. But I feel it’s only right to mention here my attraction to, and reverence for, Boland’s poems, one of which (“This Moment“) I used for years to teach poetic language to my freshman and sophomore college students.
I first noticed Boland’s poems in the mid-90s, when I was teaching full time as an adjunct professor, still hoping to make my mark–whatever that was supposed to be–on the world. I had subscribed to the New Yorker, back in the days when it was read for literary, not political, reasons. This was during a period when poets and writers who had submitted their work and not gotten it accepted for publication actually protested outside the offices of the magazine, stating that their work was just as bad as what was being published within the pages of the New Yorker and demanding equal time. (I thought about looking this story up on the internet, because, in an age of so much fake news, everything is easily verifiable, but forgive me–I decided not to. If the story about these outraged mediocre writers is not true, I don’t want to know about it. I love it and cling to it, and it does no one any harm, after all.)
I was very much aware of the opacity of much that was published in the New Yorker, and one evening after the children were in bed, having recently heard that story about the protesters, I shared it with my husband. To demonstrate how unreadable the stuff that was being published was, I grabbed a copy off our end table, thumbed through it until I found a poem, and started to read it out loud. After two or three lines, however, I stopped in mid-sentence. My husband said, “What? Why did you stop?” I looked up slowly, reluctant to pull my eyes away from the poem, and said, “It started to make sense to me. Actually, this is really good.”
I am not sure which poem of hers I was reading that evening. Perhaps it’s best that I don’t know, because not knowing drives me to read so many of her poems, always searching for the Ur-poem, that first poem that led me to appreciate so much more of what she’s written. Boland’s poetry seems to me to explore the intersection of place and person, of history and modernity, in simple, sometimes stark, language. I love it for its depth, not for its breadth (sorry, Elizabeth Barrett Browning). I love the way it sinks its roots deep into the past, all the way back to myths and legends sometimes, yet still manages to retain a hold on the very real present.
Eavan Boland died yesterday, April 27, at the age of 75. You can read about her influence here, in an article by Fintan O’Toole of the Irish Times. Her poems can be found online at poets.org and on poetryfoundation.org.
I want to go back to a day many years ago, when the world was normal and my kids were still at home. It was a weekday afternoon, and I was making chili for dinner, chopping up ingredients at the kitchen counter. My daughter, a high school student who was also taking classes at the local community college, breezed through the back door, walked through the kitchen, put her books down on the dining room table, and returned to the doorway to say, “Mom, the kids in my school are so stupid. I mean, they’re just so dumb that I get worked up about it. I actually think I’ve gone through the Stages of Grief about their stupidity.”
“What?” I had been dicing bell peppers, but I put down my knife and looked up at her. She had just come home from her college psychology class.
“Well, we were learning about Elisabeth Kübler-Ross’s theory about the stages of grief, and I realized that the kids I know are so annoying and stupid that I’ve gone through all those stages about them.”
I asked her to explain, and she went on. “So, the first stage is Denial. I start out thinking, ‘I cannot believe these people are so stupid. Maybe if I ignore them, I won’t have to deal with them at all. The extent of their stupidity actually scares me, so I’ll stay away.'”
I nodded and said, “Go on.”
“The next stage is Anger. I get angry at their stupidity, because they frustrate me, and they make me anxious. I’m just mad that they’re dumb and they don’t care about changing.”
I waited for her to continue.
“Okay, then comes Depression. I seriously get depressed about how stupid they are. I begin to think that they’ll never be anything but stupid, no matter how much I — or anyone else — tries to help them. It makes me sad that anyone can be alive and so dumb.”
By this point I had nothing to say. It’s always a little overwhelming the first time your child shares a truly interesting thought that you didn’t plant in their brain.
“That’s when I start Bargaining. I say to myself, ‘Oh, they may be stupid in this class. They may be stupid in all their classes, but maybe they’re good athletes. Yeah, they’re probably great at football or basketball or volleyball. They’re in band, so maybe that’s what they’re good at. See, they’re really stupid, but there are ways to compensate for that, aren’t there?’”
She paused a moment, then finished by saying, “But I always end up Accepting their stupidity. I just factor it into my plans, sometimes I even use it to get what I want, and then I move on to something else.”
She stood up, grabbed her books, and went upstairs to her room, leaving me staring after her. I had nothing to say in the face of such brilliance, but she didn’t even notice.
Every single thing she’d said made perfect sense, and I promised myself one day I would write about it.
And now, 15 years later, awake at the crack of dawn because I can’t stop thinking and fretting and worrying, I realize that we’re probably all going through the Stages of Grief about the Coronavirus, and I’ve finally made good on my promise.
P.S. If you’re looking for more stuff to read, check out my friend John’s blog: TomatoPlanet! at https://ininva.com/. John’s been doing this blogging stuff since way before it was cool, and he’s got some great stuff there.