Category Archives: culture

The Ideological Work of Television and the Zombie Apocalypse

I have long argued that television programs, particularly situation comedies, perform an important piece of ideological work in our culture. Far from being pure entertainment, they introduce ideas that society may not want to confront. Of course, no one who can remember All in the Family or Murphy Brown will dispute this; but we may well be surprised to realize that television has always done this, even from its earliest days.

The two examples I have chosen to demonstrate this theory come from The Honeymooners (1955) and Bewitched (1964-1972). Back in the 1950s and ’60s, these sitcoms had to code their messages, making them available only to subtle and clever television viewers. In fact, the entire premise of both series rests on the implicit understanding that while women may have to kowtow to their husbands, they are in fact the brains in their marriages. After all, Samantha is presumably all-powerful, yet she chooses to remain with the awkward and pouty Darrin. Alice Kramden’s situation is less enviable–she is constrained by the 1950s dictum that proclaims women to be subservient to their husbands–but at the same time, she demonstrates to herself, to Ralph, and most importantly, to the audience, that she is in fact much more capable than Ralph and that he is head of the household only because society awards him this position.

Ideological work is hidden, or coded, in early sitcoms, but it’s still there. For example, in The Honeymooners, in Episode 4 (“A Woman’s Work is Never Done”), Alice decides to get a job after Ralph berates her for not being able to keep up with the housework; she tells him it’s easier to work outside the home than within it. Ralph ridicules the notion, but Alice succeeds quite well, and even earns enough money to hire a maid to carry out the household chores, a maid who turns out to be so efficient and sarcastic that Ralph begs Alice to quit and return to being a homemaker. The message here, years before either That Girl or The Mary Tyler Moore Show appeared on television, is that women can indeed be successful in the professional world. This message might have been too revolutionary to appear without coding, but it is delivered nonetheless through this subtle means.

Perhaps more interesting is Episode 7 of the first season of Bewitched (“The Witches Are Out”), in which Darrin’s work on an advertising campaign that features witches is critiqued by Samantha as being clichéd and, even worse, rife with prejudice. She takes to the streets to spearhead protests against the campaign, joining a picket line, clearly reflecting the actual protests that were taking place in 1964, when this episode first aired. Since it was too dangerous to talk openly about racial prejudice, the show used a fictional prejudice–against witches–that the viewers would still understand, though perhaps unconsciously.

Neither of these episodes was intentional in its ideological work: the writers of these early situation comedies merely reflected and refracted the social reality they observed. In other words, during the early years of television, shows didn’t consciously represent the women’s movement or the civil rights movement. They simply reflected and displaced the social trends that were present at the time of their creation and presented them in a non-threatening, palatable form for their viewers.

But from the mid-1970s onward, television became more outspoken, taking on a more direct role in society and growing much less afraid to stand on a soapbox. The velvet gloves came off, and we grappled openly with all sorts of issues, from bigotry (All in the Family) to homosexuality (Will and Grace). However, I believe that television still uses coded messages from time to time, and I think I’ve found an example of one genre that horrifies me, and not for its intended reason.

Since the mid-2000s, zombie-themed shows and books have proliferated. I first noticed a fascination with zombies among my students in about 2005, and I found it strange that a genre that had lain dormant for so long was coming back to life (pardon the pun, please). Since then, we’ve had World War Z, Pride and Prejudice and Zombies, and The Walking Dead. Ever the cultural analyst, I wondered what this preoccupation with zombie infestation might represent: just what kind of ideological work is it performing? At first, I thought it might indicate a fear of contagion, of a swift-moving and deadly pandemic. After all, we’ve seen, in the last twenty years, outbreaks of swine and bird flu, SARS, and Ebola. It would certainly make sense for a fear of virulent and lethal illness to express itself as a zombie invasion.

But recently it dawned on me that the imagined zombie invasion might represent something far worse: an invasion of migrants. And, before you dismiss this idea, let me pose a question: Is it possible that the populist rhetoric directed against immigrants is connected, through a subtle, ideological sleight-of-hand, to the rise of the zombie genre in film and television?

After all, so much about zombie plots resembles the imagined threat of uncontrolled immigration: the influx of great numbers of threatening beings who are completely foreign to our way of thinking, who are willing to fight for resources, who will not give up easily, who make us just like them–and who must be destroyed at any cost. I think it’s just possible, in other words, that the present social climate of suspicion, of protectionism, of hostility towards outsiders, has been fostered and cultivated by our ideological immersion in the genre of the zombie plot. Again, as with early television situation comedies, I don’t think this is an intentional linkage on the part of the writers; but intentional or not, the ideological work gets done, and suddenly we find our culture and civilization hostile to the very force that made us what we Americans are.

About ten years ago, I had a student who adored horror films and books. I asked him how he could stand to be made frightened by what he loved and spent so much time on. His answer haunts me today: “This isn’t what frightens me,” he said, pointing to a Lovecraft novel. “What frightens me is the day-to-day things, such as how I’m going to pay my rent.” In the same vein, I’ll end by asking this question: what if the really frightening thing about zombie shows isn’t what happens to their characters, but what happens to us when we watch them?


Anthony Trollope wants to know: Are you a Liberal or a Conservative?


Anthony Trollope. Image from Wikipedia

There’s a lot of ink being spilled right now about the failure of liberal democracies, and I am guilty of spilling some of it myself. But it might be helpful to go back and redefine the two terms that inform so much of our discussions and arguments these days.

What, exactly, is the difference between liberal and conservative thought?

I’m not satisfied with responses that point to contemporary political positions: they are too fraught with bias, and thus don’t yield a reliable answer. To arrive at such an answer, then, we will need to go back and define the terms themselves, to think about what it really means to be a liberal or a conservative.

And this proves quite tricky–so tricky, in fact, that although I first asked myself this question back in the 1980s, I have never been able to come up with a good answer. But thankfully, I don’t have to, because it turns out that Anthony Trollope provided an excellent answer back in 1876.

In his novel The Prime Minister, the Duke of Omnium, who is serving as the ineffective prime minister of Great Britain in a coalition government (and who fully realizes that nothing of consequence will be accomplished during his term of office), pauses to consider why people align with either the Liberal or the Conservative Party. In Chapter 68 (it is a very long novel), entitled “The Prime Minister’s Political Creed,” the duke questions his colleague Phineas Finn about why he is a liberal. (The duke, while obviously an aristocrat, is somewhat paradoxically a member of the Liberal Party.) In doing so, he reveals why he himself is a liberal:

“I began life with the misfortune of a ready-made political creed. There was a seat in the House for me when I was twenty-one. Nobody took the trouble to ask my opinions. It was a matter of course that I should be a Liberal…. It was a tradition of the family, and was as inseparable from it as any of the titles which [we] had inherited…”

But now, at the apex of his political career, when he realizes that he will soon have to resign as prime minister, the duke thinks about what makes him a liberal. He begins by explaining what he considers conservative thought: the idea that God has fashioned the world in a certain way, and it is up to man to maintain that structure. The liberal thinker, says the duke, works to improve the world in order to reach a millennium (which I take to mean a Utopian period of human existence) in which the social and political order is perfected. However, this millennium, he says, “is so distant that we need not even think of it as possible.” He goes on to tell Phineas, “You are a Liberal because you know that it is not all as it ought to be.”

I think there’s quite a lot to learn from this chapter, even though more than a century has passed since its publication. First of all, many of us begin our adult lives as liberals or conservatives simply because we have been handed those labels and told that they belong to us. Perhaps our parents were conservatives, so we identify as conservatives ourselves–or perhaps we go the other way, rebelling against our parents and their beliefs. But I think it would be better for us, like the Duke of Omnium, to stop and think about why we behave as we do, and why we believe the things we believe.

When you simplify the issue as much as possible (I realize the danger of simplistic analysis, but it is sometimes worth the risk), the difference between the liberal thinker and the conservative one, as Trollope’s novel portrays it, is this: the conservative view holds that things were better in the past and should be kept that way, while the liberal view holds that, however things were in the past, they are highly imperfect in the present and should be improved–and that although a state of human perfection is theoretically possible, it is light-years away, which is no reason to shirk the work involved in getting there.

In other words, the conservative view looks to the past, wanting to keep things as they are: stable, predictable, and functioning. After all, the past got us to the present, so it must work. The liberal view, in contrast, looks to the future, with a supreme confidence that improvement is possible in the human condition.

I endorse neither view at this point. I just want to posit a new way of looking at these terms to help open up a badly needed space for discussion.

… But I also want to say that Anthony Trollope totally rocks the Victorian novel.


The End of Democracy


Chief Petoskey might agree that democracy is a failure.

It may be my bad luck, and my generation’s bad luck, to be alive at a time when we are witnessing the limits of democracy. We’ve had a good run–over two hundred years now–but it may be time to call it a day and start over with some new form of government.

I suppose I am as patriotic as anyone. There are two times in my life when I felt tears well up in my eyes solely because of my pride in being an American. One was after a three-week trip to Iceland, Scotland, and England in 1996, when I returned with my young family to Houston Intercontinental Airport. Waiting in customs, I noticed a babble of languages, and looking around, I saw myself surrounded by people of color, dressed in a variety of ways, many with headscarves or turbans. At that time, it was easy to imagine that these people, if not Americans themselves, would be welcomed as visitors to the United States, or perhaps as potential citizens. That was enough to make me sentimental about the diversity of my country, to be thankful to live in a country that valued all people.

(I will pass over for now the very real possibility—indeed, the near certainty—that this was simply a fiction, even at that time. My belief, however, was real enough to draw tears of pride from my eyes, which of course I quickly wiped away.)

The second time I became emotional with pride in my country was in about 2003 or 2004, when, as a union member from the local community college, I stood in solidarity with nurses who were striking at the hospital. I was proud to do so—it is our right as Americans to stand and protest, as so many of us have recently found out. I was proud to be a citizen of a country that allows its citizens to congregate for such a purpose, despite the inconveniences that may be caused by it.

In the last couple of years, I’ve seen protests, but I haven’t taken part in them. I’ve supported them, but I have not been able to make myself participate in them. During the Women’s March, I stayed home, dissolved into a teary mess most of the day. But these were not tears of pride. Perhaps there was some pride mixed in, and admiration for the women who dedicated themselves to the cause, but there was also a feeling of profound despair at the need for such a march. It was the same thing with the March for Our Lives. What a beautiful expression of solidarity, but why should the people of this country need to march in order to protect our children, in order to stand up against an organization that should have no part in our electoral process, to protest the very electoral process that has been shown to be corrupt—not only because of foreign interference, but because of outrageously large campaign donations that fund and sway our elected officials?

Don’t get me wrong. To those of you who are participating in these movements, I want to say that I admire and love you for what you are doing. Yet I cannot help but feel that the need for such movements marks the decline of democracy, the end of this glorious experiment in civic rule that began over 200 years ago.

(Again, I will pass over the fact that this glorious experiment probably started, as so many others have, with the desire for personal gain on the part of the architects of the experiment.)

Democracy cannot work when it is corrupted by the desire for financial gain. It cannot work when the electorate is divided along the lines of hard-held, incontestable beliefs that brook no argument or discussion. It cannot work when our elected officials are, like the people who elect them, small-minded, fearful, and utterly dependent on large corporations who try to direct every facet of their lives and thoughts and are free to do so if they spend enough money on licit and illicit media campaigns.

Recently, retired Supreme Court Justice John Paul Stevens called for the repeal of the Second Amendment. It may in fact be time for such a step. But I fear it may be time for a more drastic step: to admit that our democracy, such as it is, has failed, and that it is time to go back to the drawing board to find a new, more equitable, more humane way of living together in this world that we have created for ourselves.


How I Became a Writer, part 1

I cannot remember a time in my life when I didn’t want to be a writer. Perhaps I didn’t care about writing back when I was too young to understand what being a writer meant–before I’d really learned to read, in those days when, as a young child, I read only the books  that were placed in my willing hands, those rhyming, oddly illustrated children’s books that were so common in the 1960s. It’s quite possible that back then I didn’t have a hankering  to join what I have come to consider the Great Conversation, that I was content to look and pass, not feeling compelled to offer something–some small tidbit at least–to the exchange of stories and ideas that has gone on for centuries now.

My first memory of reading was from the Little Bear books, which my father, an accountant, got by the cartload, since he worked for the publisher (I think?). I am confused about this, however. It’s just as likely that we had a surplus of these books lying around our house. I was the youngest of three children, after all, so it makes sense that children’s books would pile up, and that they would be handed off to me. I don’t remember actually learning to read, but I do remember the laughter that ensued when I tried to sound out “Chicago,” as well as having to struggle with the word “maybe,” which I pronounced incorrectly, with the accent on the “be” and not the “may.”

But these books certainly didn’t enchant me. That would have to wait for some years. In the meantime, I remember seeing a copy of Julius Caesar on our dining room table, its cover illustration of a lurid, bloody toga attracting more than a mere glance, and although I didn’t try to read Shakespeare’s misnamed tragedy, it couldn’t have been mere coincidence that I became enamored of the story of Caesar and Cleopatra, to such an extent that I would wrap myself in striped beach towels and stomp through our Brooklyn duplex declaring, in all seriousness, “I wish to be buried with Mark Antony.” My elaborately crafted Cleopatra-fantasy imploded, however, when I convinced my second-grade class to put on a short play about Cleopatra, Julius Caesar, and Mark Antony. (Is it possible that I wrote the play myself? That seems unlikely, but I cannot imagine that many age-suitable plays on that subject were available.) I was over the moon–until I got the news that I was to play Julius Caesar. And that was the end of that fantasy, much to the relief of my family.

The books that did grab my attention were a set of great books that my grandmother had bought for her two children back in the 1930s: all of Dickens, all of Twain, and some odds and ends, such as William M. Thackeray’s Vanity Fair, as well as a full set of encyclopedias. (I still have a few of the Dickens works, but most of the books were destroyed in a flooded warehouse back in the 1990s.) I am sure that my grandmother’s purchase was an investment in wishful thinking: I would swear an oath that neither my uncle nor my father ever read a word of these books. I am equally sure that I, feeling sorry for the books (which is something I still do–and explains why I sometimes check out books from the library that I have no interest in but will read because I think someone should pay them some attention), picked a few of them off the dusty shelf one summer and began to read them. I remember reading, and delighting in, Mark Twain’s Innocents Abroad long before I ever read Tom Sawyer or Huckleberry Finn.

As the youngest child in the family, I was frequently left to my own devices, and that was fine with me. But I think I might have been a little lonely, a little too strange for children my own age, and this was something that my parents wouldn’t have noticed, not back in the 1960s and ’70s, when there was less attention placed on the lives of children. So it’s natural that the books became my friends. When I visited my father after my parents got divorced (there was no joint custody back then, which was delightful for me, as it meant that I was able to stay with my father in NYC for a huge swath of the summer vacation), I started reading through the set of Dickens. In doing so, I found a whole new set of friends and family. Even today, when I open a Dickens novel–any Dickens novel–I feel like I am at a family reunion full of quirky, oddball relatives. It is a wonderful feeling.

This oddly rambling blog post is doing a fine job of explaining how I became a reader, but it is completely missing what I set out to do: explain how I became a writer. That, I can see now, will have to wait for another post.

 

 


Did Madame Defarge Knit Pussy Hats?


Sketch by Harry Furniss (1910).  Image scanned by Philip V. Allingham and located on the Victorian Web

Last night, in a shameful parody of democratic rule, the U.S. Senate passed a sweeping tax bill that undercuts the middle and lower classes, eviscerates health care, and attacks education–all while giving more money to the very entities that need it least: the wealthiest portion of the population and the corporations they control.

Instead of analyzing how this happened, or why the people we elect have sold us out to the people who keep them in office–their political campaign donors–I will make some grand generalizations here to shed light on how the United States has become what it is today, on December 2, 2017: a plutocracy.

Let’s start with history: in the late Bronze Age (1200 BC or so), the growth of powerful societies was carefully controlled by a simple custom. Simon Stoddart, Fellow of Magdalene College, Cambridge, speaking on In Our Time, a BBC Radio 4 podcast, explains that in these societies, “it was not permitted to become too powerful,” and if a king did attain too much wealth, he was expected to throw a huge feast to dispense his wealth, or even bury excess wealth in a hoard. Doing this would gain the king prestige and restore economic balance to the region, but it would also lead to some instability in succession, because great wealth could not be inherited. Yet the custom was apparently practiced by European Bronze Age societies as a levelling mechanism, to prevent one person or family from becoming too powerful.

Now why on earth would a powerful king consent to this kind of rule? The answer is simple: it was the custom of the time–he could not avoid doing so. And why was there such a custom? My guess is that early societies, living close to nature, observed  a natural balance in the world they lived in, and they knew that no good could ever come from upsetting this balance. Think of it this way: early societies observed first-hand what happened if there were too many lambs born in a certain year, or if too much rain fell on crops–or if one man became too powerful.

Human societies unlearned this lesson when they became less reliant on the natural environment they inhabited. By the early modern age (1500 or so), people were beginning to control nature to meet their needs. Transportation was easier, so if you depleted your farm’s soil, you might move to another one. You could drain bogs to make more arable soil on which to grow more crops and raise more livestock. You could even, as so many people were beginning to do, move to the city and try your hand at making a living completely divorced from the cycle of nature.

By the late 1700s, we see the beginnings of  the massive growth in urban areas that will characterize the world we live in today. It is no coincidence that we also see the rise of capitalism–as a philosophy and as a practice–at this time. And while the idea of capitalism is based on balance–the invisible hand adjusting the scales–it’s clear now that such an invisible hand was more wishful thinking than reality.

My point is that we have lost any notion of the need for balance in our political and economic systems. We have forgotten that when the very wealthy take more than they can possibly use, they leave other people in penury. Certainly the wealthy people have forgotten this law of nature and are simply grabbing all they can while the grabbing is good. The real problem is that too many people in the United States have identified with those wealthy people (how this has happened is interesting but must wait for another post), trusting that if the very wealthy are taken care of, they will be taken care of, too.

This tragically flawed logic is like thinking that in a shipwreck, if a powerful and athletic man manages to secure a place on a lifeboat, he will always make room for–in fact, he will always exert himself to save–the women, children, and less fortunate men who are still waving from the deck as it sinks below the waves. But here’s the problem: exerting oneself to save others demands a strong sense of either ethics or altruism, both of which seem to be lacking in the American upper class.

I don’t have any answers or solutions to offer. We live in dark and troubled times. To say that I despair for my country is not an exaggeration. Indeed, I never knew how much love I had for this country, this grand and imperfect experiment in democracy, until last year, when I witnessed what I think now might be the first chapter of  its descent into decay and destruction. Last night, while we were sleeping, we may well have seen the second one.

But I do know one thing: one way or another, balance will be re-established. It may be a somewhat orderly process, in which case we will see a great deal of corrective legislation coming from a newly elected Congress in the early months of 2019…. Or it may come after a long, destructive, and painful struggle, with lives lost and ruined in the process.

Either way, fasten your seatbelts, Americans. It’s going to be a bumpy night.


Teaching Behind the Lines


French Resistance Fighters putting up posters.  Image from “History in Photos” Blog (available here)

It’s been a year now since the election, and here I am, still fighting off a sense of futility and hopelessness about the future. During that time, the United States has pulled out of the Paris Accord in an astounding demonstration of willful ignorance about climate change, suffered a spate of horrific mass murders due to lax gun laws, and threatened nuclear war with North Korea. Suffice it to say that things are not going well.

But I should point out that the emphasis in my first sentence should be on the word “fighting,” because that’s what I’m doing these days: in my own small way, I’m waging a tiny war on some of the ignorance and egotism that seems to be ruling my country these days. Somewhere (I can’t find it anymore, and perhaps that’s just as well), the French novelist Léon Werth said that any action taken against tyranny, no matter how small, no matter how personal, helps to make things better. I’ve taken his words to heart, and I’m using this space to take stock of what I’ve done in the last year. I do this not to brag–far from it, because I know I’ve done far too little–but to remind myself that although I feel powerless too much of the time, I am not quite as powerless as I seem.

Let me begin, however, by saying what I haven’t done. I have not run for office. I did that in 2012, perhaps having had an inkling that things were not going well in my part of the country, but I was crushed by an unresponsive political system, apathy, and my own supreme unsuitability for the task. I am not ready to run for office again. In fact, I may never be ready to run again. I did write about my experience, however, and over the past year, I have encouraged other people, specifically women, to run for office. I’ve talked to a few activist groups about my experiences, and perhaps most important of all, I’ve donated to campaigns.

The thing I’ve done that merits any kind of discussion, however, is what I would call “resistance teaching”: going behind the lines of smug, self-satisfied ignorance, and using any tools I have to fight it. I still believe, naive as I am, that education can fight tyranny, injustice, and inequality. So I have engaged in a few activities that will, I hope, result in creating discussions, examining benighted attitudes, and opening up minds. I haven’t done anything too flamboyant, mind you–just a few actions that will hopefully develop into something more tangible in the months to come.

Here is my list:

  1. In spite of feeling gloomy about the future, I’ve continued with my writing, because I felt that even in difficult times, people should concentrate on making art. I self-published my second novel, and I wrote about it here, explaining why self-publishing can be an act of resistance in and of itself.
  2. I began to translate a novel about WW I, written by Léon Werth. I am now nearing my second revision of the translation. I have submitted a chapter of it to several fine magazines and received some nice rejection letters. I will be using my translation to present a short paper on WW I writing and Hemingway at the International Hemingway Conference in Paris this summer.
  3. I’ve traveled–quite a bit. I went to Italy, to Wales, to France, to Dallas, to Boston, and some other places that I can’t remember now. Traveling is important for breaking down barriers, intellectual as well as political. For example, in France I learned that while we Americans thought of Emmanuel Macron as a kind of savior for the French, he was viewed with some real skepticism and even fear by his electorate. Sure, he was better than Marine Le Pen–but he was still an unknown quantity, and most French people I met expressed some degree of hesitation about endorsing him.
  4. I directed a play for my community theatre group. Although it was hard and very time-consuming, I discovered that I really believe in the value of community theatre, where a group of individuals come together in a selfless (for the most part) effort to bring the words and ideas of a person long dead back to life. So what if audiences are tiny? It’s the work that matters, not the reception of it.
  5. I gave a talk at the C.S. Lewis Festival, which you can read here. It was fun and stimulating, and I remembered just how much I enjoy thinking and exploring literature and the ideas that shape it.

All of these things are fine, but I think the most important thing I’ve done in the past year is going back into the classroom again, this time as a substitute to help out some friends, but also to engage in what I think of as “resistance teaching.” As a substitute professor, as a lifelong learning instructor, I can engage students and encourage them to think without being bound by a syllabus or any other requirements. I can get behind the lines of bureaucratic structures and work to create an atmosphere of free discussion and intellectual exploration. It is small work, and it may not be very effective, but I have taken it on as my own work, my own idiosyncratic way of combating the heartless ignorance, the dangerous half-assed education that prevails in our society.

I have always loved the idea of Resistance Fighters. I just never thought I’d be one myself.


On the Relationship of Myth and Story


Image from the lotr.wiki.com

Please note: This is a very long post. It is based on a talk I gave yesterday (October 28, 2017) at the C.S. Lewis Festival in Petoskey, Michigan. Consider yourself warned!

 

The study of myth seems to me to take three different paths:

  • Anthropological / Archeological: the study of classical mythologies (Bulfinch’s Mythology, Edith Hamilton)
  • Religious / Transcendent: the spiritual meaning of myth (Karen Armstrong, Joseph Campbell, Sigmund Freud)
  • Structuralist: the study of the same structures that recur in myths (Northrop Frye, Joseph Campbell, Roland Barthes)

This is all interesting, but I would like to back up a moment. I feel like I’ve arrived at a dinner party and somehow missed the first two courses. I feel as if I might get some kind of mental indigestion if I don’t start over at the very beginning.

The fact is, I want to know something more fundamental about myth and its function.

  • I want to know what it is and how it works.
  • Specifically, I want to know what distinguishes myth from other forms of story-telling.

Because for me, Story-Telling is what distinguishes human beings, homo sapiens, from all other species on this planet, as far as we know.

  • Studies have shown that crows have memories
  • Studies have shown that chimpanzees use tools
  • Philosophers are now beginning to agree that animals do indeed have consciousness

But we—we should be known not as homo sapiens (wise man, the man who knows), but as homo narrans—the speaking man, the man who tells, who narrates—story-telling man.  Because it is clear to me that we humans communicate largely through story-telling, and this story-telling function, this tendency to rely on narration, is what makes us human.

I’m going to ask you to bear with me for a little while as I tease this out. I’d like to say that by the end of this essay, I’ll have some answers to the questions I posed (what is myth, and how does it work, and what is the difference between a really good story and a myth)—but I’m pretty sure I won’t. I may, however, ask some more questions that might eventually lead me to some answers.

So here goes. To begin with, a few people who weigh in on what myth is and what it does:

Roland Barthes, the French post-structuralist literary theorist, says that myth is a type of speech, a system of communication, a kind of message. In a way, Barthes and JRR Tolkien are not really different on this point, incredible as it is to think of Barthes and Tolkien agreeing on anything at all, much less something so important to each of them.

  • They are both incredibly passionate and devoted to the concept of language
  • Barthes, in his book Mythologies, which I have shamelessly cherry-picked for this essay, says that myth is not defined by the object of its message; what matters is the way in which it conveys that message.
  • He says that “the knowledge contained in a mythical concept is confused, made of yielding, shapeless associations” (119).
    • But this isn’t as bad as it sounds, because myths actually don’t need to be deciphered or interpreted.
    • While they may work with “Poor, incomplete images” (127), they actually do their work incredibly efficiently. Myth, he says, gives to its story “a natural and eternal justification…a clarity which is not that of an explanation but that of a statement of fact” (143).
    • Myth is a story in its simple, pure form. “It acts economically: it abolishes the complexity of human acts, it gives them the simplicity of essences…” (143).
  • You can see how this view of myth kind of works with the myth-building that Tolkien does in The Lord of the Rings, which operates with simple efficiency, and whose very images are incomplete to the point of needing clarification in Appendices and further books like The Silmarillion. Yet even without having read these appendices and other books, we grasp what Tolkien is getting at. We know what Middle-Earth is like, because the myth that Tolkien presents needs no deciphering, no real interpretation for us to grasp its significance.

Tolkien, I think we can all agree, was successful in creating a myth specifically for England, as Jane Chance and many other scholars have now shown to be his intention. But is it a novel? Some might argue it isn’t—myself included. In fact, what Tolkien created in The Lord of the Rings is less a myth (I would argue that we only use that term because Tolkien himself used it to describe his work and his object—think of the poem “Mythopoeia,” which he dedicated to C.S. Lewis) than it is a full-blown epic.

For my definition of epic versus novel, I’m going to my personal literary hero, Mikhail Bakhtin, a great thinker, a marvelous student of literature, a man who wrote with virtually no audience at all for many years because he was sent into internal exile in the Soviet Union. In his essay “Epic and the Novel,” Bakhtin attributes these characteristics to epic:

  1. It deals with an absolute past, where there is little resemblance to the present;
  2. It is invested with national tradition, not personal experience, arousing something like piety;
  3. There is an absolute, unbridgeable distance between the created world of epic and the real world.

The novel, says Bakhtin, is quite the opposite. It is new, changing, and it constantly “comes into contact with the spontaneity of the inconclusive present; this is what keeps the genre from congealing. The novelist is drawn toward everything that is not yet completed” (27).

I think the three characteristics of epic described by Bakhtin do in fact match up nicely with The Lord of the Rings: absolute past, national tradition, distance between the actual and the created world. But here’s another thing about epic as described by Bakhtin: “The epic world knows only a single and unified world view, obligatory and indubitably true for heroes as well as for authors and audiences” (35).  It would be hard, indeed impossible, to imagine The Lord of the Rings told from a different point of view. We need that distant narrator, who becomes more distant as the book goes on. As an example, imagine The Lord of the Rings told from Saruman’s point of view, or from Gollum’s. Or even from Bilbo or Frodo’s point of view. Impossible! Of course, we share some of the point of view of various characters at various points in the narrative (I’m thinking specifically of Sam’s point of view during the Cirith Ungol episode), but it couldn’t be sustained for the whole of the trilogy.

The interesting thing here is that in The Lord of the Rings, Tolkien took the novel form and invested it with epic. And I think we can say that against all odds, he was successful. On the other hand, C.S. Lewis, in his last book Till We Have Faces, took a myth (the story of Cupid and Psyche), which is certainly more closely related to epic than it is to novel, and turned it into a successful novel. This isn’t the time and place to talk about Till We Have Faces, although I hope someday that we can come together in the C.S. Lewis Festival to do that very thing, but I couldn’t help mentioning this, because it’s striking that Lewis and Tolkien, while they clearly fed off each other intellectually and creatively, started from opposite ends in writing their greatest creative works, as they did in so many other things. It’s almost amazing that you can love both of them at the same time, but of course you can. It’s the easiest thing in the world to do.

But I’m losing the thread of my questions here. What is myth? Can we actually have modern myths? Can someone actually set out with the intention of creating a myth? And can a mythic work spontaneously just happen? Another question needs to be posed here: if this long book, which is probably classified in every bookstore and library as a novel, touches on myth but is really an epic, can a novel, as we know it, become a myth? This forces us to tighten up our definition of what a myth is and asks us to think about what myth does.

Karen Armstrong, I think, would say yes to all of these questions. In her book A Short History of Myth, Armstrong follows the trajectory of myths through time and argues that the advent of printing and widespread literacy changed how we perceive and how we receive myth. These developments changed myth’s object and its function—and ultimately, they changed the very essence of myth.

Armstrong points out that myths and novels have similarities:

  • They are both meditative
  • They can both be transformative
  • They both take a person into another world for a significant period of time
  • They both suspend our disbelief
  • They break the barriers of time and space
  • They both teach compassion

Inspired by Armstrong and by Bakhtin, I’m going to go out on a limb here and make a stab at answering my questions. And I’ll start by defining a modern myth as a super-story of a kind: a novel (or a film, because let’s open this up to different kinds of story-telling) that exerts its power on a significant number of people. These stories then provide, in film professor and writer Stuart Voytilla’s words, “the guiding images of our lives.”

In short, a modern myth has these characteristics:

  1. It belongs to a certain place and time. Like epic, it is rooted in a time and a place. It might not be far removed from the actual, but it cannot be reached from the actual.
  2. It unites a group of readers, often a generation of readers, by presenting an important image that they recognize.
  3. It unites a group of readers by fostering a similar reaction among them.
  4. It contains identifiable elements that are meaningful to its readers/viewers. Among these might be important messages (“the little guy can win after all,” “there’s no place like home,” “the American Dream has become a nightmare”).

In other words, a mythic story can be made intentionally, as Star Wars was by George Lucas after he considered the work of Joseph Campbell; or it can happen accidentally. Surely every writer dreams of writing a mythic novel—the Great American novel—but it’s more or less an accident. The Adventures of Huckleberry Finn was a mythic novel of America, until it was displaced by To Kill a Mockingbird. And I would note here that having your novel go mythic (as we might term it—it is, in a way, like “going viral,” except mythic stories tend to last longer than viral ones) is not really such a good thing after all. Look at Harper Lee—one mythic novel, and that was the end of her artistic output—as far as we know. A mythic novel might just be the last thing a great writer ever writes.

Anyway, back to our subject: a modern myth gets adopted rather than created. Great myths are not made; they become. So let’s think of a few mythic novels and films and see how they line up with my four characteristics:

  1. Frankenstein
  2. Star Wars
  3. The Wizard of Oz
  4. The Great Gatsby or Death of a Salesman—take your pick.
  5. Casablanca
  6. The Case of Local Myths—family or friend myths, references you might make to certain films or novels that only a small number of people might understand. A case in point would be the re-enactments of The Rocky Horror Picture Show that take place each year around Halloween.

In essence, my answer, such as it is, to the questions I posed earlier comes down to this:

Modern myths are important stories that unite their readers or viewers with similar emotional and intellectual reactions. Modern mythology works by presenting recognizable and significant images that unite the people who read or view them. As for what distinguishes modern myths from other forms of story-telling, what tips a “normal” novel or film over into the realm of “mythic”—I don’t have an answer for this. I only have a couple of vague, unformed theories. One of my theories is this: Could one difference between myth and the novel (“mere” story-telling as such) be that myth allows the reader/listener to stay inside the story, while the novel pushes the reader back out, to return to the actual world, however reluctantly?

And let’s not forget what Karen Armstrong wrote about myth: “It has been writers and artists, rather than religious leaders, who have stepped into the vacuum [created by the loss of religious certainty and despair created by modernism] and attempted to reacquaint us with the mythological wisdom of the past” (138). Armstrong’s closing sentence is perhaps the most important one in the book: “If professional religious leaders cannot instruct us in mythical lore, our artists and creative writers can perhaps step into this priestly role and bring fresh insight to our lost and damaged world” (149). With this in mind, perhaps it’s time to go and read some more, and find more myths that can help us repair and restore ourselves, our faith in our culture, and in doing so, the world itself.

 


On Directing a Play


Richard Chamberlain and Eileen Atkins in a television production of Christopher Fry’s The Lady’s Not For Burning  (1974).

From early August until now, I have been lucky enough to be involved with a community theatre’s production of The Lady’s Not For Burning. I am, at least in name, the director of the production, despite having very little experience in acting. I rose through the distaff side of theatre productions, having started out as a fairly excellent audience member, then graduating to backstage functions such as handling props and set changes, and finally taking the plunge and directing a play myself.

The best thing about directing a play is that you can, for once in your life, make people experience a piece of literature that you think is worthwhile. As an English professor, I spent most of my professional life begging my students to read things like Keats, Eliot (George, not T.S.!), and Dickens–and being soundly ignored most of the time. But now, I can be satisfied that some 100 people, perhaps more if audiences pick up during this, our last week of performances, will be introduced to this play. (I am, of course, counting the actors, set crew, sound crew, and producers in that 100 people.) I have to admit I feel pretty good about making people aware of this play, even if they aren’t as enthusiastic about it as I am.

I picked The Lady’s Not For Burning for several reasons, which I will explain below. But like most everything else in my retired life, I encountered it in the first place through random serendipity. When Margaret Thatcher died several years ago, the news media played and re-played a snippet of what was perhaps her most famous speech, in which she declared, referring to her refusal to reverse her economic policies, “The lady’s not for turning.” This made me curious about the dramatic work she was referring to in her clever word-play, and so I checked it out from the library and read it, surprising myself by actually liking it…a lot. I told myself at that time that if I ever got the chance to make a new generation of readers aware of it, I would take that chance.

The Lady’s Not For Burning was written in 1948 by English poet and playwright Christopher Fry. Delightfully absurd, it deals with the theme of existential despair, ultimately defeating it through a blend of physical and conversational humor, but most of all, through the power of love. Set in the Middle Ages, the play shows us, from its opening moments, Thomas Mendip, a recently discharged soldier who has seen too much of battlefields and human misery, as he tries to get himself hanged in an effort to end a life he can no longer bear to live. Yet it is his misfortune to have arrived in Cool Clary, a dysfunctional village that is in the midst of a witch-hunt. Within a short time of his arrival, a young woman (Jennet Jourdemayne) appears, trying with all her might to convince the town elders that she is not guilty of witchcraft. Unlike Thomas, she has gotten into the habit of living, and she is not inclined to give it up so easily. The rest of the play follows the fortunes of these two people, one who wants to end his life and the other who desperately wants to live, two individuals caught up in a world whose vicissitudes they cannot fully understand, all against a backdrop of hilariously ineffective and hare-brained villagers.

As I mentioned above, I found The Lady’s Not For Burning delightfully funny when I first read it, but I have come to know the play a great deal better over the last few months, as I watched the cast of hard-working amateur actors spend hour after hour memorizing lines, getting thrown about on stage, and strutting about in strange clothing. I have learned a great deal along the way, but two things stand out. First, I know now that the play is even funnier than I first thought it was. But the second thing I learned is that it also exhibits a deep sadness that seems to fit the times we live in. After all, the world is all too often not a pretty place, as Thomas readily tells us. In fact, it’s frequently a downright ugly place. However, it is possible to find beauty, and humor, and love, upon this imperfect planet we inhabit, and I believe that if we have a duty in this life, it is to find and celebrate such things in the midst of suffering and death. In the end, it is the relatively minor character Nicholas Hebble who utters the words that embody the crucial message of the play: “The best thing we can do is to make wherever we’re lost in / Look as much like home as we can.” These lines are echoed by Thomas Mendip at the very end of the play, when he offers to help Jennet Jourdemayne find her way home, though neither one of them has any idea where on earth that home could be.

In a way, I feel that the actors, stage crew, producers, and I have also been trying to find our way home, to a definitive view of the play that is several months in the making. We may have gotten lost, but we have kept each other company, and we can be satisfied that we have done our best, I think. I will be glad when the play is over and I have my life back again, as I’m sure all members of the cast and crew will be, but I will also always be grateful for an opportunity to work closely, not only with a great group of people, but also with this overlooked piece of literature–to be able to study it, understand it, and appreciate it in a way that I could never have done without getting involved in an actual stage production.

 


On Self-Publishing and Why I Do It


Let me get one thing straight right from the get-go: I know self-publishing is not the same thing as publishing one’s work through a legitimate, acknowledged publishing company. I also know that self-publishing is looked down upon by the established writing community and by most readers. In fact, for the most part I agree with this estimation. After all, I spent much of last year writing freelance book reviews for Kirkus Reviews, so I know what’s being published by indie authors: some of it is ok, but much more of it is not very good at all.

Knowing this, then, why would I settle for publishing my novels on Amazon and CreateSpace? This is a tricky question, and I have thought about it a great deal. Whenever anyone introduces me as an author, I am quick to point out that I am, in fact, just a self-published author, which is very different from being a commercially published writer. (And if at any time I am liable to forget this important fact, there are enough bookstores in my area that will remind me of it, stating that they don’t carry self-published books.) When I meet other writers who are looking for agents, I do my best to encourage them, largely by sharing with them the only strategy I know: Be patient, and persist in sending your queries out.

So why, since I know all this, do I resort to self-publishing my work? I’ve boiled it down to four main reasons.

First of all, I self-publish because I am not invested in becoming a commercially successful writer. I write what I want, when I want, and when I decide my work is complete, I submit it to an electronic platform that makes it into a book, which I can then share with family and friends and anyone else who cares to read it. In other words, for me writing is not a means by which to create a career, celebrity, or extra income. I have long ago given up the fantasy of being interviewed by Terry Gross on Fresh Air; my fantasies are more mundane these days.

Second, I do not need to be a commercial writer, with a ready-made marketing machine to sell my books, because I am not hoping to make any money from them. Rather, I look upon writing as a hobby, just as I look upon my interest in Dickens, Hardy, and the Brontës as a hobby. I am helped here by having spent many years engaged in academic research, a world in which publications may win their authors momentary notice, but certainly not any money, unless one happens to sell out to the lure of literary celebrity, as Stephen Greenblatt has. I have a few publications out in the academic world, but no celebrity and certainly no money to show for them–and I am totally fine with that. In my creative writing, I am lucky enough to have a hobby that satisfies me and costs me relatively little–far less, in fact, than joining a golf or tennis club would cost.

The third reason that I self-publish my work is that I actually enjoy doing so. There are some aspects of publication that have surprised me. For example, I have found that I really enjoy working with a great graphic designer (thanks, Laura!) to develop the cover of my novels. It is an extension of the creative process that is closely related to my work but something that I could never do myself, and this makes me all the more grateful and fascinated as I watch the cover come to life and do its own crucial part to draw readers into the world I have created.

As a retired writing professor, I realize how important revision and proofreading are, and to be honest, this is the only part of self-publishing that gives me pause, because I worry about niggling little errors that evade my editorial eye. But for the most part, I am old enough now to have confidence in my writing. Plus, the beauty of self-publishing is that it is electronic: if there are errors (and there are always errors, even in mainstream published books), I can fix them as soon as a kind reader points them out. So I suppose the fourth reason to self-publish lies in the fact that it is so very easy to do it these days.

These are four good reasons for me to self-publish, but the most important reason is that I apparently love to write, and self-publishing allows me to do this without worrying about submitting the same piece of work over and over again to agents and publishers, stalling out my creativity. While at the Brontë Parsonage Museum this past summer, I picked up a card that expresses how I feel about it, a quote from Charlotte Brontë: “I’m just going to write because I cannot help it.” (It is a testament to my literary nerdiness that I happen to know that this quotation comes from Brontë’s Roe Head Journal, but strangely enough, before I encountered it on a greeting card I never realized that it applied to me as well as to Brontë.) In my idiosyncratic view, self-publishing allows the reader to decide whether a novel is worth reading, rather than punting that responsibility over to an overworked and market-fixated literary agent or editorial assistant. I am willing to trust that reader’s judgment, even if it means I will never sell many books.

And so today, as I am releasing my second self-published novel (Betony Lodge, available on Amazon and CreateSpace–and this is my last attempt at marketing it here on my blog), I am fully aware of the stigma of self-publishing, but I realize that what’s right for other writers may not be right for me. Today, then, I am taking my courage into my own hands and pushing that key to make the book go live.

And tonight I will be making my own champagne toast: here’s to living in the 21st century,  when digital publishing makes authors of us all!


A Reader’s Dilemma: On Books and Racism


Image from Wikipedia

What does one do when one is reading a book that is entertaining but turns out to be blatantly racist? Does one stop and refuse to read it? Does one relegate it to the status of those books which, as Dorothy Parker famously said, deserve not to be set aside lightly, but thrown with great force? I pose this question as an ethical problem, not merely as a matter of taste.

 


Wikipedia image

The book in question is Mr. Standfast, by John Buchan, a man more famous as the author of The Thirty-Nine Steps, which was made into a movie by the young Alfred Hitchcock some twenty years after its publication. John Buchan was a career diplomat who served in South Africa in the aftermath of the Boer War and as an intelligence officer in WW I. He is perhaps most famous, however, for becoming the Governor General of Canada in 1935, and by most accounts, he did a good job, as evidenced by his declaration, as Doug Saunders reports in this article, that Canada’s strength as a nation depends on its cultural diversity.

 

As one of his first public acts, Buchan created the Governor General’s Literary Awards, among whose later recipients number Michael Ondaatje, Alice Munro, Margaret Atwood, Yann Martel, and Rohinton Mistry. By most accounts, then, Buchan was a fairly good guy, a champion for diversity and the arts, and a pretty good story-teller. So what am I to feel and to think when the first-person narrator of Mr. Standfast expresses open derision and contempt for conscientious objectors? Or when I read passages in which he makes fun of certain characters’ profound desire for peace, for an end to the debilitating war that has cut short hundreds of thousands of lives, an end to a war that has robbed not one but several generations of their hopes and dreams? What am I to feel when I see another passage which documents a disgusting contempt for the budding African movement towards self-determination, lines so replete with a complacent sense of superiority that I hesitate even to bring myself to quote them here? Such lines are particularly offensive to me because I have been reading Njabulo Ndebele’s excellent book, Fools and Other Stories, which offers a compelling view of life in Soweto, South Africa. So it infuriates me when Hannay, the narrator of Buchan’s novel, reports on “a great buck nigger who had a lot to say about ‘Africa for the Africans.’ I had a few words with him in Sesutu afterwards, and rather spoiled his visit.” It is significant, I’d argue, that the narrator offers neither the Sesutu words themselves, nor a translation, nor even a summary of them, an omission that renders his boastful declaration of a logical victory over the African speaker both empty and bombastic.

And yet I don’t think the answer to my anger and dismay about this is to throw Mr. Standfast across the room. Or perhaps it is to do just that, but then to go and pick it up again, after my temper has cooled, and go on reading it. Certainly my enjoyment of the novel will be less than if I had not encountered such ugly things in the narrative. After all, I would like Stevie Smith’s poems much better if I hadn’t come across baldly antisemitic sloganism in her Novel on Yellow Paper. (As it is, I like Smith well enough to have named one of my cats after her.) Rather, I think the lesson to be learned here is that racism comes in many forms; that, in all probability, it resides in every single human being. Furthermore, we must remember that we cannot eradicate racism by simply looking the other way, by trying to ignore its presence–either in our heroes or in ourselves.

Only by confronting racism dead on, by calling it by its true name without trying to excuse it, can we quash it when it rises up, as it will continue to do for the next few generations at least. At the same time, we cannot afford to dissimulate, as the Introduction to my edition of Mr. Standfast (Wordsworth Classics, 1994) does when it attempts to excuse Buchan: “Some of the language and many of the attitudes find little favour today,” the anonymous editor explains, “and have prompted some commentators to label Buchan with a number of those epithets that are fashionable among the historically illiterate. It should be remembered that Buchan was a high Tory politician, and also that the views he expresses are relatively liberal for his time.” No–this isn’t good enough. Let us admit once and for all that in this book at least, Buchan wrote as if he were a deplorable racist.

But let us also admit that the novel in question is a mere snapshot of him taken in 1919, and that it is unfair to judge the entirety of his life by this snapshot. It’s possible that he changed by the 1930s; but even if he didn’t, we can learn from his example. We can look at his works and see how very far we’ve come, and we can, without dissimulation or censorship, confront his racism for what it is: we can critique it, we can condemn it–and then we can move past it. If we choose not to, if we set the book down, then we miss out on the experience of reading it, and I’d say that would be a victory for racism, because it would shut down our capacity to explore human experience in its great variety.

I am not willing to foreclose on  such experiences. I believe that I am strong enough as a reader, indeed, as a person, to encounter racism in literature and to move past it so that I can gather a more complete picture of human culture, not as it should have been, but as it was. Distasteful as it can be to read reflections of ugliness, we must continue to do so if we want to try to understand our past and to control both our present and our future.

For me, it’s simply a matter of honesty. And so I will grit my teeth, shake my head, and continue reading Mr. Standfast. If it’s a good book, or even a spectacularly bad one, I may even write about it here in a few weeks.


Buchan and Hitchcock, image from the Hitchcock Zone Wiki
