A Modest Proposal, and an Example

Proposal:

I would like to use this platform to issue a call to action. I believe that contemporary culture has a desperate need, one which we can all work to address. For too long we have outsourced the intellectual and critical work that must be done to understand the world we live in. We’ve told ourselves that we cannot engage in really intellectual discussions, because such things lie exclusively in the province of the thinkers and researchers who live and work in the universities. We’ve allowed the creation of a huge silo, one which we never enter, in which important thinking and analysis occurs.

In my view, this is a big mistake.

Of course, we need universities and the scholars they produce and the research that they in turn produce. But that does not give us license to stop thinking ourselves, to stop considering the works and ideas that, whether we know it or not, affect our lives and help us make sense of the world. What I’m calling for, in other words and in the simplest terms, is the idea of the amateur intellectual.

It’s not such a crazy idea, when you think about it. I mean, we all know what an amateur detective is; half of mystery fiction wouldn’t exist without the likes of Miss Marple, Lord Peter Wimsey, Father Brown, and Jessica Fletcher. Why not have the same kind of category for thinkers, philosophers, literary critics, historians? Why must we delegate the important intellectual work to others, and not explore works and ideas on our own? One needn’t be a forensic expert to solve a mystery; likewise, one shouldn’t need an advanced degree to make sense of the ideas that swirl around us in this world we live in.

I offer an example below. It’s an idea I’ve had about Frankenstein. In my earlier days (read: before I retired), I’d probably have tried to create a more academic argument out of it, researching what other readers/scholars have said, scaffolding and buttressing my argument to make it airtight and worthy of an academic audience. But I’m done with all that now, and besides, I don’t think it’s right or necessary to let academics have all the fun. I also think it’s an abdication of our own intellectual responsibility, as I said above, to outsource or delegate all the work to a bunch of overworked and myopic academics. We’re all capable of this kind of intellectual play (I refuse to call it “work”), although it may take more practice for some of us to get to the point where we feel comfortable making judgments for ourselves and, going a step further, sharing them with others.

Example:

I will be honest: Frankenstein is not my favorite book. Far from it, and for a number of reasons. Of course, like everyone else in the English-speaking world, perhaps in the entire world, I acknowledge how important the themes of the novel have become, how the ideas of runaway science, intellectual hubris, and regret or guilt about the act of creation itself have become central metaphors of the world we live in. I get that these things are important. I can also see that Frankenstein might lay claim to being the first science fiction novel in English (although there are other claimants), and like everyone else, I am amazed that it was written by a 19-year-old woman, albeit one with illustrious intellects for parents. I love the fact (forgive me) that the self-important and perfidious Percy Shelley has been all but eclipsed by his young wife’s audacity in writing such an important work. I mean, every schoolchild today at least knows about Frankenstein; how many know “To a Skylark” or “Ode to the West Wind” or even “Ozymandias”?

All these things about the novel are important, and they are things I have long appreciated about it. But the novel itself is, to be frank, a mess. It looks back to the amorphous 18th-century English novel much more than it looks forward to the 19th-century novel we all associate with the very form of the novel itself. It’s episodic, for one thing. For another, it’s a hodgepodge in terms of its form. Starting out as a letter, it becomes the first-person narrative of Victor Frankenstein, then it somehow becomes the first-person narrative of the monster itself (never mind how the monster learned not only to speak and understand language without any kind of instruction but actually write as well), then returns to the doctor’s narrative. It is, to be brief, all over the place in terms of narrative structure.

Yet whenever I referred to the novel–I don’t think I ever had the temerity to actually put it on the syllabus of any of the classes I taught–I always did so with respect. As I said above, I understand the place it has come to occupy in our culture. It has become a central metaphor, much like the story of Darmok and Jalad at Tanagra in that Star Trek: The Next Generation episode (second episode of the fifth season) became to the Tamarians. But I would never argue that the novel itself was a stellar example of artistic creation.

Until yesterday, that is, when I had an epiphany.

I don’t know if Mary Wollstonecraft Shelley knew what she was doing, and it hardly matters if she did, but I can now see that the novel’s very structure, that patchwork, rag-tag narrative that bounces from one story to another without any proper cohesion, is exactly what Dr. Frankenstein’s monster is. Just as the monster is a wild assortment of body parts, taken from various villains and paupers whose cadavers Victor Frankenstein managed to appropriate, so the novel is a wild assortment of stories, sewn together into a loose and rather ugly novel that lacks the grace of any work by Jane Austen or Sir Walter Scott, Shelley’s contemporaries. In other words, Frankenstein’s very structure mirrors the monstrousness of its central character.

And now suddenly I no longer see the novel as one with an important message but a disastrous execution. Rather, I see it as just plain brilliant. Did Shelley see or know what she was doing? I doubt she did; I think it might have been good luck. But it doesn’t really matter. It’s an amazing achievement, whether intentional or not, and is, at the moment, my favorite aspect of the book, and a discovery so thrilling that it makes me wish I were still teaching so I could share it with my students.

And here’s where I return to my proposal: why on earth should we amateurs–those of us not connected to a university or research institution–deny ourselves the pleasure of a discovery like this? If we can read and think, we can do the work/engage in the play. And our lives will be so much richer for doing it.

Amateur intellectuals, unite! It’s time to put on our thinking caps and enter the fray! Or as I would rather put it, let’s enter the playground and begin to have some fun for ourselves. I look forward to seeing your stories of discovery and interpretation.

Austen and Ozu: A Comparison

One of the things that gets me through a long winter (one which hasn’t had much snow, or cold, but still lacks light and is therefore depressing), is a subscription to a streaming service called the Criterion Collection. It airs a variety of films, cycling some of them in and out month by month to add a bit of suspense to their line-up. Mostly, I use it to watch Hollywood films from the 1930s and ’40s, occasionally later, but this month subscribers have been treated to a collection of films directed by Yasujiro Ozu.

I never learned much about films until fairly recently. I mean, I knew I liked the earlier films of Alfred Hitchcock much better than his later ones, and that I’d always used film as a way to experience not only American culture but my own family history. But about fifteen years ago, when I got the approval to teach a college class on Hitchcock, I realized I should know something more than my own idiosyncratic likes and dislikes, and so I set to work educating myself on the history of film.

I did a haphazard job. That much is clear to me now, because while I watched films like L’Avventura and Bicycle Thieves, I missed the films of Yasujiro Ozu. This week, however, I’ve begun to remedy that situation, and I have to say that several days after watching Late Spring (1949), I am still thinking about the film. This morning, it came to me in a flash of inspiration that one of the reasons I love it so much is that the film bears a marked similarity to the work of Jane Austen.

I don’t mean to say that Ozu modeled the film on Austen’s work or her technique; far from it. He may not even have been aware of her existence. But it seems to me that Late Spring, with its very slim plot (a woman who lives with her aging father is urged to marry but doesn’t want to), its focus on manners, its subtle but striking attention to setting–all these things are very similar to any Jane Austen novel–let’s say, for convenience, Pride and Prejudice. There’s a quiet understatement at work in both works, which somehow demands an emotional and an intellectual investment on the part of the viewer/reader. In addition, Ozu, like Austen, focuses on very simple, quotidian events; nothing major or earth-shattering happens in either work. Rather, life as it is represented is made up of fairly unexceptional moments, for the most part, and only occasionally punctuated by significant and revealing scenes, so subtly drawn that their significance might initially escape unnoticed.

Watching the film left me feeling peaceful and satisfied. It evoked the sense of a world that is balanced and somehow justified in its simplicity. This, despite the fact that the world Ozu depicts is clearly putting itself back together after the horrors of World War II. I get the same feeling from reading Austen’s works. True, there are horrible things happening in the world she creates–the Napoleonic Wars, the festering slave trade in the Caribbean, her characters’ aristocratic lives that are built on the suffering of a hideously deprived underclass–and yet, while not specifically trying to hide such things, Austen’s focus, like Ozu’s, remains centered on the small, daily events which together make up the entirety of a human life. For some reason, I take comfort in this attention to the tiny, unexceptional details of the process of living. I suppose, in a way, it makes me appreciate my own, fairly boring (I’m not complaining!) life all the more.

I’d like to say that I’m the only person who has ever thought of comparing Jane Austen to Yasujiro Ozu, because I think it’s a fairly brilliant idea, but unfortunately, I’m not. A quick internet search unearthed this article, which is somewhat disappointing to the scholar in me. The downside is that my idea isn’t as original as I thought it was; the upside, however, is that there is strength in numbers, and someone apparently agrees with me, even if that agreement dates from ten years ago.

And now, I’m headed back to the television set to watch more of Ozu’s films. There are a lot of them. At this rate, I might be able to make it until springtime!

Please comment below if there’s anything you think I should know about Ozu, or Austen, for that matter.

A Confession

Recently this article appeared in The Atlantic, all about why Taylor Swift is worthy of being the subject of a class at Harvard University. I have to say, I’m not all that impressed by the argument author Stephanie Burt makes. It’s not that I don’t think Swift deserves critical attention; of course she does. Certainly more people know more about Taylor Swift than they do about Jonathan Swift (hint: Gulliver’s Travels and A Modest Proposal), and for that reason alone, she is worthy of interest. But Burt seems to think that she’s doing something avant-garde, busting through genres as well as expectations, in offering the class.

I beg to differ.

For decades, many of us teachers have been referring to popular culture in an effort to make past literary works come alive for our students. How many of us compared the Romantic poets to John Denver, with his “back to nature” themes? Or Byron to the Beatles, as the first popular superstar (well, second, since Wordsworth also had to deal with pilgrims landing on his doorstep)? If you’ve spent any time in the classroom at all, chances are you’ve made analogies like these–and whether they are apt or not (please don’t argue with me about mine, because I’m retired now and I really don’t care any more), they succeeded in getting students to look at old works of literature with curiosity and some slight degree of affinity.

What Burt has done, however, is the opposite: she’s using the old stuff to shed light on the new songs that Swift has recently created. As she points out, “I would not be teaching this course, either, if I could not bring in other works of art, from other genres and time periods, that will help my students better understand Swift and her oeuvre.” There is nothing wrong with this approach, either. Interpretation and criticism go both ways, and the very act of exploring one work can shed light on the meaning and structure of another, very different, one. What I object to here is that Burt seems to think she needs to defend a mode of teaching that has existed since the dawn of pedagogy itself, most likely. Perhaps she’s out to blow up the canon; that, too, however, has already been done many times over. I guess my take on the whole subject is that there is nothing new under the sun. Not even Taylor Swift.

But here’s a counterargument to the article, one which may or may not be valid. Does Swift really need interpretation? Isn’t it pretty easy for people to understand and appreciate her work? It’s likely that people need more help understanding and seeing the value in Shakespeare’s works, or Charlotte Bronte’s, or Virginia Woolf’s than Taylor Swift’s songs–works which, separated from us by time and circumstance and perhaps even language itself, remain alien to us, despite the fact that we know they had enormous influence on the writers who came after them.

Still, I’m all for teaching Taylor Swift’s works (as long as we can avoid the trap of criticism, making sure we do not “murder to dissect,” as Wordsworth put it). What really bothers me, I think, is that I appear to be one of the few people on the planet who don’t really “get” Taylor Swift. I don’t dislike her songs; I just am not forcibly struck by them, or powerfully moved by them, either musically or thematically. And this despite asking my 25-year-old offspring to perform exposure therapy on me using a Spotify playlist. I realize this is probably a defect in me, a matter of faulty taste or lack of experience of some kind. After all, I felt the same way about Hamilton.

Now that the secret is out, I give all my readers permission to stop reading here. I am so clearly out of step with the times, so serious a victim of what I call “century deprivation,” that nothing I say is viable or important. I accept that designation, with what I hope is humility, bafflement, and only a small amount of FOMO, or rather COMO (Certainty Of Missing Out). As proof of the fact that I am completely out of step, I offer the following things I do find endlessly amusing. If I could, I’d offer a class on these clips alone, working to introduce students to media in its infancy and the role of comic relief in the twentieth century.

I’d start with this little snippet from the BBC Archives, in which Lt. Cmdr. Thomas Woodrooffe, reporting on the Royal Review of naval ships in 1937, inadvertently revealed that he had drunk a few too many toasts with his former shipmates earlier in the afternoon, as evidenced by his incoherent but hilarious commentary.

Then I’d move to this wacky clip from the dawn of television broadcasting, an almost-lost skit from the Ernie Kovacs Show. I don’t know why I find this so funny, but perhaps it has to do with the music behind it. I will freely confess that this little jewel was instrumental in getting me through a good deal of the pandemic three years ago.

And finally, here’s the Vitameatavegimin episode from I Love Lucy. I would often use this to teach my public speaking students how to give a persuasive speech using Monroe’s Motivated Sequence, a five-step formula that was often used in advertisements. (The five steps are: grab your audience’s attention, set out a problem, solve the problem, visualize the solution, and urge your audience to action.) Some of my students had never seen a single episode of I Love Lucy, which I found mind-boggling. I’d wager that most of them probably still remember this episode much better than they do Monroe’s Motivated Sequence, which is fine with me.

To all the Swifties out there, I apologize for being tone deaf. I fully understand your pain and disappointment, and I share in it. The clips above are offered as nothing more than a kind of peace offering or compensation to make up for my failure.

As my grandmother used to say, there’s no accounting for taste!

Bah Humbug: Some Thoughts on A Christmas Carol

Illustration from Wikimedia Commons

I finally sat down yesterday and made myself read Dickens’s A Christmas Carol. As a scholar of Victorian literature, I should have read this story long ago, back when I was in graduate school, if not well before, but I don’t think I ever did. And really, why should I have? It’s not considered Dickens’s best work by Victorian scholars; in addition, it’s entered our culture so thoroughly, in so many forms, that it hardly seems necessary to read the original because we all know the story and characters so well. Like the story of Adam and Eve, we’ve imbibed so many versions of the original tale that we might not even recognize the original if we were, for some reason, to take it up and read it for ourselves. (Back when I taught English literature to community college students, I would make them read the part of Genesis that dealt with Adam and Eve’s exile from Eden before we tackled Milton’s Paradise Lost. The original was a tiny passage–just a few lines long–compared to Milton’s magnum opus, which is undoubtedly more familiar to us, at least in the way it presents the main story, than the original.) For most of my life, I have been content to ignore Dickens’s original story, perhaps thinking that watching the Mr. Magoo version was good enough.

So what prompted me to correct this defect in my reading at this late date? Simply this: I encountered an advertisement for an online course that promised to reveal A Christmas Carol as a story of Christian redemption, and I immediately bristled at what I thought was a misguided interpretation of the whole thing. Of course, you can find anything you want in anything you read: I was once an academic, so I can attest to this. But just because you can do something doesn’t mean you should do it. Presenting an entire course for the purpose of forcing this reading on Dickens’s tale seemed wrong to me, because I believed something like the opposite is more likely to be true. And so, to check my theory, I decided to read the original myself and that’s what brought me, a person who long ago tired of Christmas hoopla, to engage in that most Christmas-y of Christmas activities: reading A Christmas Carol.

It’s well worth the read, but I realize I’m probably not going to convince anyone to spend a couple of hours reading a story written 180 years ago. And also, some people–the fact astounds me–some people simply don’t like Dickens. But that’s no reason to go around saying his best-known work is something that it really isn’t. I’ll freely admit that Dickens wrote a story celebrating what he considered the spirit of Christmas: an antidote to the greed and lack of empathy that gripped much of Victorian London, a story designed to combat the misery produced by industrial capitalism. (Indeed, a mere two years later, Friedrich Engels would produce his seminal study, The Condition of the Working Class in England, focusing on Liverpool and Manchester instead of London.) This much is clear: Dickens intended to, and succeeded in, writing a powerful story that drew on the emotional appeal of Christmas.

So why do I refuse to consider the novella a Christian story? My argument is a simple one, and in fact I’d argue that the very popularity of the story (go ahead and try to determine how many recorded versions exist–I gave up, but not before I became distracted by one that must have taken place during COVID lockdown, in which surviving members of Dark Shadows read it through on a Zoom call) does much to prove that I am right.

So here’s my argument: Dickens witnessed the greed and heartlessness in the world around him. He recognized the need for a correction of sorts, and he determined that spreading the spirit of Christmas–an idea that he himself largely willed into existence–was one such creative measure to provide this correction. True, there’s a link from Christmas back to Christianity and Christ, but by the Victorian period, that link was growing ever more tenuous in an age riddled with religious doubt. Thanks to Dickens (with a bit of help from Prince Albert, who brought German Christmas traditions to England), by the end of the century, people who were not devout Christians, or not Christian at all, would be able to take part in Christmas festivities without feeling profoundly uncomfortable.

Dickens’s genius was that he recognized that the original Christmas story, the one celebrated in many Christmas carols (pa-rum-pa-pum-pum), was rapidly losing its cachet; it was no longer performing the function it needed to in order to make society more livable. Thus, genius that he was, he set out to create a new Christmas story, one for his time. He succeeded beyond even his wildest imagination. Readers caught hold of his story, which then entered into the culture and disseminated what he had called Christmas spirit–to wit, generosity, good cheer, lovingkindness–throughout a society corrupted by industrial capitalism, in order to administer a corrective, if only for a few days at a specific time of year. In other words, A Christmas Carol was created because the original Christmas story had begun to lose its hold on an England that was no longer uniform in its Christian belief (and perhaps never had been) and had therefore lost its power to influence society.

Was Dickens aware of how ambitious his project was? Almost certainly not. He was simply trying to create a compelling story that would capture his audience’s attention and sell lots of books. But that doesn’t mean he wasn’t on to something really big. Like J.R.R. Tolkien, he was setting out to create a mythology for the people of his time, since he recognized, at least at some level, that the one they had inherited had lost much of its power. But there’s an important difference between Tolkien’s project and Dickens’s: Tolkien was deliberately trying to create a new mythology for Britain and was aware of what it was he was aiming for. Dickens, I’d argue, was not. His new mythology was thus tacked onto the existing one, as a kind of appendix that would someday come to supplant, or at least threaten to eclipse, its predecessor.

So, in the end, A Christmas Carol may well have a Christian message, but if so, it’s a pretty wide definition of “Christian,” so wide as to be ultimately meaningless. Rather, its message is a critique of industrial capitalist society, subtle enough to co-exist with that society without causing too much friction. In writing it, Dickens created a new parable, actually replacing and not merely reinforcing the original Christmas story.

My takeaway from this? Stories are important. They influence the societies we live in. Our capacity to get caught up in them, to believe in them and their messages, has profound effects on societies, on culture, and ultimately, on the arc of human civilization. Sometimes, as with A Christmas Carol, a story comes along with such resonance that we are able to see, in real time as it were, how very important stories can be, and how some stories that were once powerful in their own time can be supplanted by others when they begin to lose their influence. In the end, it behoves us all to understand how stories work, and how they not only describe, but actually create, the world we live in.

On Reading

I think a lot about reading–and not just my reading, but reading in general and how fascinating and miraculous an activity it is. I think about how reading (and writing, of course) has been used to transmit ideas from one generation to another, encompassing thousands of years. In fact, on my spacier days, I’ve even considered writing/reading a kind of alternate existence. Certainly it’s common to think of reading as a way of time travel, or as a way of living a life completely different from one’s own, but this isn’t quite what I mean by “alternate existence.” I’m talking about writing/reading as an actual different life form.

Bear with me, because this view requires some athletic imaginative leaps, but I think it’s worth it, if only to defamiliarize ourselves with writing/reading as an everyday function and look at it in a new light. To make my view clear, I need to start with a crazy premise: writing is a life form in and of itself. Think of it like this: you’re in a Star Trek Universe, and you’ve met another life form that you can’t see or touch, but you know it’s there. You can see its effects on the physical environment. So naturally, you have to expand your notion of what a life form is in order to identify and describe this one.

Now that I’ve prepared the way, let me make my argument for writing as an alternative life form. When human beings began to write their thoughts and stories down, they took the first step towards creating a different form of life, one made of the intersection of thought and the squiggly lines we call writing, grafted onto paper in a process similar to how lichens came to be: a fungus and an alga got together and, through symbiosis, united to create a new, independent life form that has characteristics of both its predecessors. How is writing like that? Take human ideas and stories, graft them onto the remnants of trees, and you get something that didn’t exist before: books. And these books, as we all know, go on to have lives of their own. They influence human events long after their initial publication dates; they take on a kind of incorporeal existence that, despite its lack of physical substance, nevertheless exerts an effect not only on individual people, but on whole cultures. In a science fiction-type way, these creations could be seen as having some kind of life, although of course we’d have to define life differently to get there.

I’m not really arguing this, although I confess I do think it’s fascinating to delve into this way of looking at books and writing. I’m simply suggesting it as a thought experiment to help us see reading/writing as the kind of miracle it really is. For one thing, all the forms of long distance and long term communications we enjoy today are derivatives of this writing/reading model. This electronic blog you’re reading right now comes directly from papyrus sheets and Gutenberg. That TikTok video you watched and laughed at on Instagram this morning? Though admittedly transformed, it’s also the honest descendant of the first books compiled by Greeks, Romans, Babylonians. In fact, I would argue that human existence probably remained pretty static, with minimal changes from generation to generation, until the advent and spread of reading and writing, which allowed communication unfettered by place and time, and this in turn allowed men and women to improve upon the ideas and practices of the past, giving rise to steady change until we got to where we are today.

So why is this important? As I said, I think it’s an interesting thought experiment, but it’s more than that. When we talk about writing and reading these days, we tend to think about the publishing world: what’s getting published, what’s been published, who’s reading what, who’s writing what. The emphasis seems to be on the book, not the reader. I’d like more people to take the opportunity to look at the act of reading to see what a miracle it is, and by extension, how we can refine our reading skills to become better, more thoughtful readers. C.S. Lewis (who I’d argue is horribly undersold as the author of the Narnia series and low-grade Christian propaganda) suggested in a little book called An Experiment in Criticism that we divide readers into two categories: “Users” and “Consumers.” A user, if I remember correctly, is a reader who thoughtfully reads a book and considers it seriously. A consumer simply reads a book and sets it aside. It is the most common thing in the world for a user to re-read a book not once but several times, whereas a consumer will not re-read a book unless s/he has forgotten the plot. Users make use of their reading; consumers use their reading as mere escapism.

I don’t mean to bash reading as escapism; it has its time and its place and can be very useful, even healing. I’d rather focus instead on making all readers capable of both kinds of reading. In short, I’d argue for teaching reading not as it is taught now, as a basic skill required of all citizens, but rather as a higher level thinking skill, one which demands interaction with the text.

And here we return to the idea of the act of reading as a special miracle, one which somehow unites written thoughts from previous ages and long-dead writers, from disparate places and situations, with a reader who is enriched by experiencing different points of view, different ideas, and, to be succinct, completely different existences. Think of it this way: last week I read a book by Georges Simenon, a detective novel featuring Inspector Maigret, in French. I am not bilingual, not even fluent in French, although it’s true that I majored in it in college (about forty years ago!). But I could make my way through the novel and follow the plot for the most part. The situations and characters were conveyed to me in a language completely foreign to my own. That, I think, qualifies as a kind of miracle. When we read works in their original languages, when we slow down to enter the world of the text, whether it’s Beowulf or Candide, we engage with the text in a way that simply takes us out of our world into a completely different one. And when we return to our own world, we see with different eyes, think with a different mind. It is reading that makes this possible, since it provides us with a gift that for much of human existence has been absolutely unthinkable: the gift of transcending our own private existence.

Can we expect to make the most of such a gift with just the basic tools of reading? Shouldn’t we work harder to deserve this gift and enjoy it in all its glory?

Ah–but the question is how to do this. I have no doubt that if we start thinking about this and addressing it, we will come up with many different answers, some of which I hope to consider in future blogs. Until then, we can always improve with practice!

Happy Reading!

“A Lost Generation”

Hemingway and friends in Spain. Photo from Wikimedia

Gertrude Stein, that enigmatic and difficult writer, is credited with coining this phrase, which Ernest Hemingway famously used as an epigraph for The Sun Also Rises (1926). I can just imagine her saying to her young acolyte, “Ernie, my boy, you and your friends, you’re all just a lost generation,” then taking another sip of absinthe, brought to her on a silver tray by Alice B. Toklas, and changing the subject to talk about Ezra Pound’s latest poems. Somewhere, long ago, I read that Stein also told Hemingway, after having read his first novel, that journalism was not the same thing as literature. I don’t remember whether this hurt the young novelist’s feelings, but it just goes to show that Stein did not pull her punches.

I’ve sometimes wondered what she meant by that term, “a lost generation.” It came to define the entire post-WWI generation, to denote the feeling of hopelessness and desperation that lay just below the surface of the frenetic enjoyment of the Roaring Twenties. Sadly, it was this very generation, lost as it was, that would later see its own children off to yet another global war. Of course, Stein couldn’t have known that such a fate was in store when she first used the phrase. But what exactly did she mean by “lost”?

There are several ways of being lost, some of which I will now briefly explore. A person, or I suppose an entire generation, can lose their way, wandering from place to place. This involves the panic of not knowing where one is going, of growing more and more confused as time passes while one comes no closer to one’s destination–perhaps even farther from it, in fact. This is certainly one way of being lost.

But a thing can be lost, too, and maybe that’s what Stein meant: when one goes to look for something and it isn’t in the place it should be, it’s lost. Perhaps Hemingway’s generation was lost in this way; it defied expectations by not doing what it should be doing or being where it should be. In other words, it had simply disappeared from view.

There’s yet another connotation of “lost,” and this one is disturbing to consider, although I’ve often thought it likely that this is what Stein meant when she called Hemingway and his friends “a lost generation.” This sense of the word means, more or less, a lost cause. When something is lost, gone for good, one simply makes do without it. Stein could have thought that Hemingway’s generation was proverbially out to lunch, that they were lost without hope of rescue–AWOL, so to speak–and that nothing good or important could be expected of them. In this sense, they were worse than merely lost; they were lost with no hope of recovery.

That would have been a pretty harsh judgment on the part of Stein, and I have no real evidence to back me up on my theory. Yet it is definitely one of the meanings of “lost.” The idea of an entire generation being disposable–disappeared, in fact–is perhaps one of the cruelest things Stein could ever have said, yet I think she might just have been capable of it.

But it’s when I think of contemporary culture that the cruelty of the term really resonates with me. In fact, I’ve been thinking for a while that the term “lost generation” has become an especially fitting phrase these days, as I watch the struggles of the generation that includes my children and my students as they try to make meaningful lives for themselves. In short, all of the definitions of “lost” listed above could apply to people 40 and under today. They are hopping from university to job, and then from one underpaid job to another, never settling down to any kind of stability for a variety of reasons: the crushing burden of student loan debt, inflation, lack of affordable housing, Covid fallout–the number of problems besetting these young people seems infinite, in fact.

Politically, they are lost, too–this generation, which has so much to be angry about, seems to be AWOL from the political scene. They are the cell phone generation, so they don’t respond to polls (usually conducted on landlines), but much worse is that they are disengaged and seem to have little hope of changing a world that has been so patently unfair to them. They are a political black hole right now; when candidates go to look for them, to canvass for and rely on their votes, many of them are simply not to be found.

But what really horrifies me is the third possible definition of “lost”– as in a “lost cause.” I believe that previous generations, including mine, have essentially cannibalized this generation, selling them on the myth of education as the solution to their problems (it isn’t, unfortunately), or the way to achieve a solid job (not true, either), or the way to solve society’s ills (eye roll here). It grieves me to say this, as a parent as well as a former college professor, but higher education seems to have ensured that this generation will indeed be lost: a lost cause, a piece of collateral damage produced by greedy student loan corporations, ill-conceived government initiatives, ignorant parents, and–to my shame–college employees who sought to fill classes and to bolster enrollment figures in an effort to ensure that their jobs were secure. Because of this willful blindness, an entire generation has been relegated to the status of a lost cause, offered up on the altar of capitalism, jingoistic slogans, and complacent greed.

This is a tragedy. That an entire generation should have to scramble for jobs, housing, and most important of all, a meaningful life, constitutes a struggle that eclipses the anomie of Hemingway’s generation. And, like that first Lost Generation, this generation bears no blame for its condition, though it is often unfairly criticized for a lack of initiative and other sins against the accepted norms of society. As I said above, it’s my generation, and the one before it, that deserves the blame, because we are the ones who helped to enslave them, either willfully or by turning our eyes away from the situation.

I am not sure what, if anything, we can do about this horrible situation. I only know that the solution to all problems begins with acknowledging that the problem exists. Only after we explore the problem in detail, fully admitting our own culpability, can we hope to provide any kind of viable solution.

That seems like a platitude to me, unfortunately. It may be that there is no solution to this problem other than the kind of revolution predicted by Karl Marx 150 years ago: an attempt to throw off the shackles forged by an eminently unfair economic system. After all, an entire generation has been exploited, cruelly offered up as a vicious sacrifice to greed and complacency.

And if revolution is the only answer to the problems besetting this new Lost Generation, so be it. I know who I’ll be rooting for if it does in fact come to that.

Spring has finally come to Northern Michigan, where I live. One might think that would make things easier, that creative juices would flow as freely as the sap in the trees and plants that are absorbing the sunshine. But unfortunately that’s not how it works. Spring is a dicey time here, and not just because of the mud left behind by the melting of the snow. (Another thing that’s left behind is shovel-loads of dog feces, which the receding snow banks offer up as yet another sacrifice to the disappearance of winter.) The truth is that when the weather clears and the outside world looks brighter, sometimes it’s disconcerting when your internal world hasn’t kept pace. It can be depressing, because it’s hard to kick yourself into gear to get things done, and in springtime, you have no excuse not to.

So when I saw that a local store was offering a poetry workshop during the month of April in honor of National Poetry Month, I signed up for it on a whim. I don’t know whether I will write any poems as a result of this workshop, but that’s not really the point. What I’d like to happen is for me to rekindle my creative impulses, and so far, though I’m still wrestling with SI (Springtime Inertia), I think I can detect the beginning of some movement towards more of a creative flow.

But the workshop has reminded me of an important question I’ve had over the last few years–one that may be unanswerable but still deserves to be asked:

What makes good writing?

It’s a question I’ve been pondering seriously, even though it might sound like I’m being flippant. Having taught literature throughout my professional career, I should be able to answer that question without too much trouble. For example, as a writing instructor, I’d say, “Good writing is clear, succinct, and precise. It shows consideration for the reader by adhering to the commonly accepted rules of grammar, spelling, and punctuation. It connects the ideas it presents in a way that is easy to read and understand.” I think that’s a good start for a college composition class, anyway.

But clearly this will not work for most creative writing. Poets, for example, often show little consideration for their readers. In fact, I’m not sure contemporary poets actually write with readers in mind; often they seem to be jotting down notes to themselves for later reading. Not that there is anything wrong with that at all–this is, after all, why I am interested in poetry at this point in my life. I’ve realized that there are certain subjects and ideas I want to explore that are better suited for poems than for short essays like this one, and I think it’s worth the time and effort to try to articulate them in poetic form.

However, let’s get back to the question: what does make good creative writing? I am having a hard time formulating an answer. As I get older, I seem to be suffering from the reverse of the Dunning-Kruger Effect: I am less sure of everything I think about, even questions whose answers I once felt sure of. But as far as good writing goes, I have come up with a provisional answer, and although I don’t find it very satisfying, I thought I’d try it out here.

I will begin by saying that the question itself is misguided. That’s because there is no such thing as good writing–only good reading. When we ask the question “what makes good writing?” we’re actually looking through the wrong end of the telescope. A good reader, I submit, is able to read almost anything and be enriched by the experience. A good reader will read any text, be it a poem, essay, novel, or piece of non-fiction, and find connections to other works. Of course, this is not to say there is no such thing as bad writing–I think we all know that it does exist–but that is a different issue. Seeing examples of bad writing will help us understand what not to do, but it won’t really help creative writers learn what to do to create good writing, so once again, I think it’s best to turn the question on its head and focus on what makes a good reader rather than what makes good writing.

After all, it has to be far easier to learn the skills required to be a good reader than to learn to be a good writer. And there are all sorts of advantages for the good reader–not only personal and professional, but social and political, as well. I think I’ll have to ponder on this one for a week or two, however, before I begin to identify how to create good readers and what makes good reading. For now, though, I’ll end with the suggestion that the world would surely be a better place if there were more good readers in it. I’ll go even further and add that maybe we’d all better get to work to see how we can do our part to create good, solid readers, because good readers make good citizens, and we can surely use a great many more good citizens in our world right now.

Smith versus Shelley: A Tale of Two Poems

Yesterday, I co-led a poetry discussion group at one of the area retirement communities, something I’ve done for the last few years. It’s been a really interesting experience–there’s so much to learn and discuss about even mediocre poems, and I enjoy hearing the participants share their ideas about the poems, as well as the stories and memories these poems evoke.

I choose the poems at random, with very little rhyme (pardon the pun) or reason to my choice. One of the poems yesterday was Percy Bysshe Shelley’s “Ozymandias.” Yes, I proffered that old chestnut to the group, even though I’d read it thousands of times and have taught it in many classes. I just wanted another look at it, I guess, and it’s fun to do that with company. What I wasn’t expecting, however, was my co-leader bringing in another poem on the same exact topic, written at the same time.

It happens that Shelley had a friend, the prosaically named Horace Smith, and the two of them engaged in a sonnet writing contest, on the agreed-upon subject of Ancient Egypt and, presumably, Rameses II, also known as Ozymandias. We remember Shelley’s poem: every anthology of 19th-century British literature probably contains it. However, Smith’s sonnet is largely forgotten. In fact, I’ll offer a true confession here: despite having taught Brit lit for decades, I’d not heard of Smith’s version until a couple of days ago.

It turns out that Smith was himself an interesting fellow. He wrote poetry, but was not averse to making money, unlike his younger friend Shelley. Smith was a stock-broker, and made a good living, while also, according to Shelley, being very generous with it. He sounds like a generally good guy, to be honest, something which Shelley aspired to be, but was really not. For all intents and purposes, Shelley was a masterful poet but a real asshole on a personal level, and a bit of an idiot to boot. (What kind of a fool goes sailing in a boat that he didn’t know how to operate, in a storm, when he didn’t even know how to swim?) Smith knew how to make and keep friends as well as money, two things that Shelley was not very good at, by all accounts.

At any rate, I thought it might be interesting to compare the two poems. Of course, we assume Shelley’s poem will be better: it’s the one that is in every anthology of 19th-century British literature, after all, while I–with a Ph.D. in the subject, for whatever that’s worth–didn’t even know of the existence of Smith’s poem until a few days ago. But maybe, just maybe, there’s something valuable in the stockbroker’s poem that has been missed–and wouldn’t that make a fine story in and of itself?

So here are the two poems, first Shelley’s, and then Smith’s.

Ozymandias (Shelley)

I met a traveller from an antique land,
Who said—“Two vast and trunkless legs of stone
Stand in the desert. . . . Near them, on the sand,
Half sunk a shattered visage lies, whose frown,
And wrinkled lip, and sneer of cold command,
Tell that its sculptor well those passions read
Which yet survive, stamped on these lifeless things,
The hand that mocked them, and the heart that fed;
And on the pedestal, these words appear:
My name is Ozymandias, King of Kings;
Look on my Works, ye Mighty, and despair!
Nothing beside remains. Round the decay
Of that colossal Wreck, boundless and bare
The lone and level sands stretch far away.”

Ozymandias (Smith)

In Egypt’s sandy silence, all alone,
Stands a gigantic Leg, which far off throws
The only shadow that the Desert knows:—
“I am great OZYMANDIAS,” saith the stone,
“The King of Kings; this mighty City shows
The wonders of my hand.”— The City’s gone,—
Naught but the Leg remaining to disclose
The site of this forgotten Babylon.

We wonder — and some Hunter may express
Wonder like ours, when thro’ the wilderness
Where London stood, holding the Wolf in chace,
He meets some fragment huge, and stops to guess
What powerful but unrecorded race
Once dwelt in that annihilated place.

Now, I’d say Shelley definitely has the advantage in terms of poetic language, as well as the narrative situation. His words are sibilant and flowing, and it’s a stroke of genius to make the story come from not the speaker of the poem, but from a traveler from an antique land; it makes the scene seem even more authentic. The alliteration in the last two lines (“boundless” and “bare” as well as “lone” and “level”) is a deft touch as well.

I’d also say that Shelley’s choice of the half-shattered face is much better than Smith’s. There’s something much more poetic about a sneering face, even if it’s half a face, than a gigantic leg. There’s no way on earth Smith could have made a gigantic leg sound poetic, and that hampers the poetic feel of his sonnet, which is a bit of a shame.

Or is it?

Perhaps Smith wasn’t going for poetic feel here at all. In fact, I’d argue that he definitely wasn’t thinking along the same lines Shelley was. There are obvious similarities between the two poems. We still get the empty site, the desolation of the “forgotten Babylon” that powers so much of Shelley’s version, but it turns out that Smith is interested in something completely different. Where Shelley’s poem comments on the nature of arrogance, a human pride that ends in an ironic fall, Smith’s presents the reader with a different kind of irony. His version is much grander. In fact, it’s a cosmic irony that Smith is grappling with here, as the poem comments on the inevitable rise and fall of human civilization. What I find astounding is that in 1818, just as England was beginning its climb up to the pinnacle of world dominance for the next two centuries, Smith was able to imagine a time when the world he knew would be in tatters, with nothing remaining of the biggest city on earth, save as a hunting ground for the presumably savage descendants of stockbrokers like himself. Smith’s imagination was far more encompassing than Shelley’s, given this kind of projection into the far future.

All told, Shelley’s poem is probably the better one: it’s more quotable, after all, and no matter how much I love Smith’s message and projection into the future, he just doesn’t have the choice of words and rhythm that Shelley does. But need we really limit ourselves to just one of these poems, anyway? I’d say we’ve gleaned about as much as we can from Shelley’s “Ozymandias.” Perhaps ours is an age in which we can appreciate Smith’s vision of a far distant future. Empires rise and fall, waters ebb and flow, and civilizations come and go. Smith, with his Hunter coursing through what was once London, paints this idea just as well as Shelley does with his decayed Wreck. There’s room for both of these poems in our literary canon.

A Speech

I’ve been absent from this blog for the past few weeks, but it hasn’t all been basking in the glory of my math prowess. In fact, I barely had time to celebrate the fact that I had actually passed my College Algebra course when I came down with Covid, despite getting all recommended vaccines and being oh-so-careful. At any rate, I’m just about back to normal now, but Covid is not a walk in the park. The initial symptoms aren’t too bad–pretty much the same as the side effects from the vaccine–but the aftermath of fatigue, lethargy, and depression lasted for a few weeks. My takeaway is that it’s definitely worth taking all the precautions to avoid getting Covid, even the ones most of the rest of the world seems to have blithely abandoned.

At any rate, I emerged from my Covid quarantine a few weeks ago, just in time to address the local League of Women Voters unit at their annual meeting. My days of being a candidate are behind me, but that makes me all the more appreciative of the people who are still active and who are working to improve the political landscape of the United States. To be honest, I feel more than a little guilty at not joining in their efforts more actively, so the least I could do, I told myself, is to speak to them when they ask me to. Then I decided that my speech, such as it was, could make a good blog post, so here goes. The topic is, as the Belle of Amherst would call it, that “thing with feathers that perches in the soul.”

We face many problems in our world today: a lingering pandemic, mass shootings, long-standing prejudice, violence, partisan hatred… the list goes on and on. But perhaps the most serious one, because it affects so many others, is the decay of democracy in our country. I’ll come back to this in a moment, but for now, I just want to say that even though this is a humdinger of a problem, the message I’d like to give today is that there is good reason to hope for change, because change is always possible. Good things as well as bad things are happening in the world, and so optimism should not be banished from the range of emotions we feel as we confront our future. Hope, as author Rebecca Solnit points out in her book Hope in the Dark: Untold Histories, Wild Possibilities, is one of many viable responses to the dire problems that we face. We must not be afraid to hope. It’s easy to be attracted to pessimism when we are afraid to hope. Hope is frightening, because we realize that when we hope for the best, we might well be proven wrong when our hopes fail to materialize. And yet we must not be afraid of being proven wrong. Frankly, the world would be a much better place if we all were more willing to take a chance on being wrong. After all, fear of being proven wrong often prevents us from acting to make the changes we so desperately need and desire.

I also want to point out that the problems with democracy that we’re now experiencing should come as no surprise to us. There’s a kind of odd comfort in realizing that as far back as 1840, the French diplomat Alexis de Tocqueville identified some serious issues in democracy in his book Democracy in America. He pointed out that while popular sovereignty (or democracy) could work very well at the local level, where people find it easy to be well informed on issues and where power is limited, three problems lie in wait at the national level to mar the democratic experiment:

  1. Competent people are willing to leave politics in the hands of less competent people;
  2. People’s belief in the idea of innate equality could give them a false sense of their capabilities and a dangerous sense of omnipotence;
  3. Excessive individualism and the pursuit of material wealth could result in apathy.

I think it’s fair to say that we have seen all three things come to pass in recent years. And I, like many other people, have often been tempted to throw my hands up in disgust and divorce myself from the political realm. But, as Naomi Klein says in her book This Changes Everything, “If we are to have any hope of making the kind of civilizational leap required of this fateful decade, we will need to start believing, once again, that humanity is not hopelessly selfish and greedy — the image ceaselessly sold to us by everything from reality shows to neoclassical economics.”

But here’s the interesting thing: that picture of humanity as innately selfish and greedy is beginning to change. We’re beginning to realize that it’s an imperfect picture, one that was built on a misunderstanding, or at the very least on an overemphasis, of a Darwinian belief in the survival of the fittest. We need to offset this view of human nature with Peter Kropotkin’s view, as he presented it in his work Mutual Aid, of evolution depending as much on cooperation as on competition. Scientists and philosophers are now working on amending our view of nature to correct this faulty emphasis on competition; for example, biologists like Suzanne Simard (Finding the Mother Tree) have shown that natural systems are much more physically connected than previously thought, just as primatologist Frans de Waal (The Age of Empathy: Nature’s Lessons for a Kinder Society) has demonstrated that empathy and cooperation have contributed as much as, if not more than, competition to the survival of humanity through the ages.

So there is reason to hope for change, for a different perspective. What we need right now is enough hope, and determination, and endurance to get us through these rapidly changing times. We need to remember that while we ourselves might not be around to enjoy the things these changes will bring, our children will, and so will their children. And we need to be willing to lay the foundation for those changes right now.

Change is something that can be difficult to navigate. Back in 1952, Edna Ferber wrote a passage in her book Giant (so much better than the movie) in which the main character’s wise father talks to his daughter, who is troubled by all she’s seen and experienced in Texas:

“The world will [change]. It’s changing at a rate that takes my breath away. Everything has speeded up, like those terrific engines they’ve invented these past few years… Your [husband] won’t change, nor you, but your children will take another big step: enormous step, probably. Some call it revolution, but it’s evolution, really. Sometimes slow, sometimes fast. Horrible to be caught in it, helpless. But no matter how appalled you are by what you see…, you’re still interested, aren’t you?”

“Fascinated! But rebelling most of the time.”

“What could be more exciting! As long as you’re fascinated, and as long as you keep on fighting the things you think are wrong, you’re living. It isn’t the evil people in the world who do the most harm, it’s the sweet do-nothings that can destroy us. Dolce far niente–that’s the thing to avoid in this terrible and wonderful world….”

So first of all, we need to buckle in for a wild ride while true change has a chance to occur. But we also need to nurture a fierce belief in the possibility of this change actually happening. And for that, I’ll point to another writer who gives me the tools to hope: Rutger Bregman. In his book Utopia for Realists, he has this to say about the power of belief, which is closely linked to our ability to hope:

Those who swear by rationality, nuance, and compromise fail to grasp how ideas govern the world. A worldview is not a Lego set where a block is added here, removed there. It’s a fortress that is defended tooth and nail, with all possible reinforcements, until the pressure becomes so overpowering that the walls cave in.

If we want to change the world we live in, then, we need to apply that pressure constantly, relentlessly, until we begin to destroy those walls, that fortress of belief that prevents us from restoring the democratic values we believe in. As Bregman says, “if we want to change the world, we need to be unrealistic, unreasonable, and impossible.”

And yet, as important as it is to hope, to believe in this change, Robert Reich reminds us that “hope is not enough. In order for real change to occur, the locus of power in the system will have to change.” Nevertheless, Reich himself is hopeful about the future. In his book The System: Who Rigged It and How We Fix It, he explains why:

History shows that whenever we have stalled or slipped, the nation’s forward movement has depended on the active engagement and commitment of vast numbers of Americans who are morally outraged by how far our economy and our democracy have strayed from our ideal and are committed to move beyond outrage to real reform.

He goes on to remind us that we need to “be organized and energized, not just for a particular election but for an ongoing movement, not just for a particular policy but to reclaim democracy so an abundance of good policies are possible.” What we need, he argues, is “a common understanding of what it means to be a citizen with responsibilities for the greater good.” He ends the book with a rousing pep talk: “Your outrage and your commitment are needed once again.”

These are powerful words, and they are therapeutic in restoring a sense of hope. I can do little more than echo them. So I’ll just leave you with the following few thoughts.

For those of you engaged in the fight to restore our democratic values, I urge you to stay engaged, enraged, and determined to change the structure of American politics from the ground up.

On a personal note: Take care of yourself. Pace yourself. Do what you personally can, and don’t feel bad about what you cannot do. Don’t focus on the negative. And take time to remind yourself of the successes you’ve had, no matter how small. Be willing to celebrate and share them.

And most of all, have hope! We are all, in a variety of ways, fighting the good fight. And in this fight, hope may well be the most important weapon we have. In the words of the Welsh literary critic Raymond Williams:

To be truly radical is to make hope possible rather than despair convincing.

Thank you for your commitment to democracy. Keep up the good fight, and keep hoping for positive change.

Random Thoughts about TV Shows

A few random thoughts about television shows this morning, since the end of a long winter is in sight, and I have survived it largely by knitting, reading, and–you guessed it–watching television shows and movies.

On Mindless Murder: Why do detective shows always, without fail, focus on murder? Based on the detective shows I watch (admittedly, most of them are British), it seems that all cases in which police and private detectives alike are called in are murders. Hence the Cabot Cove paradox: a small town, Cabot Cove, Maine, has the highest murder rate in the world, because Jessica Fletcher lives there and she must solve a new murder every week. (Don’t get me wrong–I love Murder, She Wrote, but I think that if a detective is good at solving murder cases, she ought to be good at solving other kinds of cases as well.) What about the cases in which no murder has occurred? Much of a detective’s job, after all, involves sitting and watching people, trying to get evidence of adultery, or perhaps finding a missing person (who often, I would hope, turns out not to be murdered). Even Sherlock Holmes occasionally worked on cases that did not involve a murder of any kind. I would love to see a detective show that doesn’t focus exclusively on that most brutal of crimes. In fact, I find it deeply troubling that so much of our entertainment comes from postulated murder, as if the only way we can amuse ourselves is by imagining the ultimate violence done to another human being. If detective shows would only sprinkle some non-murderous episodes in with their usual fare, I think it would be more realistic, for one thing, as well as more humane, and it would do those of us who watch them a lot more good.

On Evil Collectives: Why is a collective always represented as something bad? Take Star Trek: Voyager. While I find the Star Trek series creative and thoughtful, the Borg (a hive-mind collective that forcibly assimilates/absorbs all entities it encounters) quickly becomes a predictable and hackneyed antagonist. Of course, someone had the brilliant idea of “rescuing” Seven of Nine and integrating her into Voyager’s crew–kudos to whoever came up with that one–but the problem remains that we seem to be unable to imagine a collective association of human beings as anything but profoundly threatening to creativity, kindness, and mutual aid. Perhaps this stems from our Western distrust of collective societies and our American horror of communism. Yet this cannot be only an American issue, since the Daleks–from the Doctor Who series–are also portrayed as an evil, voracious collective society. My question is this: is it possible to imagine a non-threatening collective, one that is humane and caring? Why is it that we never see such a collective portrayed on television or in films? If we could imagine one (and of course non-aggressive collective societies do indeed exist in nature, among bees, for example, and many other kinds of animals, so we needn’t go far for inspiration), perhaps we could aspire to replicate this kind of mutual aid society in our world.

On Emo SciFi: While I’m on the subject of science fiction, here’s a question that I’ve often pondered: Why are science fiction shows almost always dark? Of course, there’s a really easy answer to this question: it’s dark in outer space. I get that, but why is it that we can only imagine space travel as something in which disasters, emergencies, and threatening events occur? Wouldn’t it be more realistic to sprinkle some humor into the plot of a sci-fi show sometimes? I realize that we’re living in difficult times, as we move closer to tyranny and nuclear war threatens to erupt in Europe, but isn’t that itself a reason to provide entertainment that is uplifting and amusing as well as thoughtful? For that matter, why must “thoughtful” always mean “something dire is about to happen and the whole crew, or planet, or species could die”? I would very much like to see a science fiction show that occasionally has an episode focusing on disagreements between crewmates (because God knows that would happen on a long voyage–just ask any sailor who’s ever been on deployment), on equipment malfunctions, on anything but the mission ending in a fiery ball of disaster due to an out-of-control collective that is intent on committing murder.

In other words, it would be nice if someone out in TV Land got hold of a new blueprint for their plots instead of recycling the same old trite themes. But maybe that’s my own problem for expecting real creativity from an overburdened medium….

It’s pretty bad when one has to resort to doing math problems to get exposure to new ideas!