Category Archives: Criticism

An Unexpected Masterpiece: Felix Holt


Turducken–Image from Wikipedia

 

One of the joys of being a retired English professor is that you never really leave your work behind: you just leave all the parts of it that aren’t that much fun. This means that while I don’t have to grade papers or go to pointless committee meetings, I still get to do what inspired me to go to graduate school in the first place: read.

And I do read–a lot. I read all sorts of things, but of course my favorite thing to read is (guess what) Victorian novels. I have taken a lot of pleasure in re-reading the Victorian novels that I studied in depth, like David Copperfield and Jane Eyre, but there is a special sort of pleasure in discovering a new favorite novel. It’s like finding a new star hidden in a constellation you’ve looked at for years, or in a more mundane manner of speaking, like finding that lost sock that went missing in the last load of wash you did.

My lost sock is, to mix metaphors, a turducken of a Victorian novel. We all know that George Eliot is probably the most brilliant of the Victorian novelists; if we didn’t, we have Virginia Woolf declaring, in her autocratic way, that Middlemarch is “one of the few English novels written for grown-up people” (The Common Reader, “George Eliot”). We also have New Yorker writer Rebecca Mead’s take on the novel in her book My Life in Middlemarch, which apparently qualified her to write a nice essay on the book in the magazine.

But then there are the Eliot works that are too seldom read these days: Romola, Scenes of Clerical Life, Daniel Deronda. And who among us has actually read Felix Holt, the Radical? I confess that I have had a copy of it on my bookshelf since 1999, yet I never opened it until last week. I am thankful that I did, because I now think it’s one of the best Victorian novels I’ve read, despite the fact that, according to Wikipedia, it is one of the least popular of Eliot’s novels–so unpopular, in fact, that although it would make a fantastic mini-series (PBS or BBC, are you listening?), the last time it was adapted for film was in 1915. Yes, 1915.

What makes the novel so wonderful is not just a handful of fantastic, eminently quotable statements, although the book does contain a couple of real gems. Here is one, from the Introduction:

“Posterity may be shot, like a bullet through a tube, by atmospheric pressure from Winchester to Newcastle: that is a fine result to have among our hopes; but the slow old-fashioned way of getting from one end of our country to the other is the better thing to have in the memory. The tube-journey can never lend much to picture and narrative; it is as barren as the exclamatory O!”

Eliot goes on to explain that it is a slow, surface journey that allows the traveler to see and experience the varieties of life, not a quick, subterranean (we might add “aerial” here) journey. How prescient Eliot must have been to have seen what would happen to travel in the next generations, to have understood the way in which “getting there” is no longer fun or important. She makes us understand that the saying “it’s the journey that counts, not the destination” refers only to some kind of moral or experiential journey, and sadly, no longer a real, actual one.

Here is another famous quote, from Chapter 3, in which Eliot displays a remarkable sensitivity to social life:

“…there is no private life which has not been determined by a wider public life, from the time when the primeval milkmaid had to wander with the wanderings of her clan, because the cow she milked was one of a herd that had made the pastures bare.”

The insight revealed in this novel is remarkable, but these selected quotations are not the chief strengths of Felix Holt. What is absolutely amazing to me is that in this one book Eliot combines a variety of different Victorian novels and still manages to create an incredibly good story, one which pulls you back to it day after day because you cannot wait to find out how the characters will respond to the events they become caught up in.

Here’s a simple way of putting it: In George Eliot’s Felix Holt, the Radical (ironically, we are halfway through the book before we realize that Felix Holt is no Radical), we find a turducken of a novel, one which combines and recombines aspects of several different subgenres of the 19th-century novel, fitting many novels, miraculously, into one organic whole. For example, we see the re-education of Esther Lyon, in a Mansfield Park (Jane Austen) narrative; we have the political machinations that are redolent of Anthony Trollope’s Palliser novels; we have the emphasis on hidden secrets and parentage, on madness and eccentricity that Dickens loved to play with; we are treated to a look at a kind of Orientalism, which is worthy of Wilkie Collins; and we have a legal plot about long-hidden heirs and family trusts that blends both Trollope and Dickens with Thomas Hardy. And at the center of it all, we find a difficult love story, starring Esther Lyon and Felix Holt, who are clearly borrowed from some of Sir Walter Scott’s best romances.

With all these things going on, you’d think this would be a mess of a novel, but Eliot is a master craftsman, and she manages to create a wonderful story from these disparate threads, replete with excellent character depictions and some memorable scenes. In short, this is a fine novel, probably just as good as Middlemarch, and quite a bit shorter. It deserves to be read. I certainly wish I hadn’t waited almost 20 years to read it, but I’m very glad I finally have.

So go out and get a copy and read it. Or, if you want, you can always wait for the BBC Miniseries to air. Julian Fellowes or Emma Thompson, it’s time for you to get to work on the script!



Filed under Criticism, Historical Fiction, Literature, Reading, Victorian Novel

Six Rules for Reading (and Enjoying) Julius Caesar

I have always assumed that the best example of my argument that most people get Shakespeare plays all wrong would be Romeo and Juliet. But I have to admit I was mistaken. In fact, I think it is safe to posit that no other Shakespeare play is so maligned and misunderstood as Julius Caesar.

I think this is largely due to the way we teach the play in the United States. Of course, because we do teach the play in high school, Julius Caesar has always gotten tremendous exposure: almost everyone I’ve met has been forced to read the play during their high school career. In fact, I think it’s still on high school reading lists today. But that’s probably also exactly why it’s so misunderstood.

I’m not blaming high school teachers, because by and large they’re told to teach these plays without any adequate preparation. I suppose if anyone deserves blame, it’s the colleges that train teachers. But all blame aside, before I talk about what a great play it really is, and what a shame it is that most people summarily dismiss  Julius Caesar without ever really considering it, let’s look at why this has happened.

First of all, it goes without saying that making someone read a play is not a great way to get him or her to like it. Especially when that play is over 400 years old and written in (what seems to be) archaic language. But a still greater problem is that there is a tendency to use the play to teach Roman history, which is a serious mistake. (American high schools are not alone in this; Samuel Taylor Coleridge, for example, criticized the play for not being realistic in its portrayal of Roman politics back in the early 1800s.) In short, far too many people associate this play with a bunch of men showing a great deal of thigh or swathed in endless yards of material, flipping their togas around like an adolescent girl tosses her hair over her shoulder. It’s all too distracting, to say the least.

So, in order to set us back on the right track and get more people to read this fine play,  I’ve made a little list of rules to follow that will help my readers get the most enjoyment, emotional and intellectual, from the play.

Rule Number One: Forget about Roman history when you read this play. Forget about looking for anachronisms and mistakes in Shakespeare’s use of history. Forget everything you know about tribunes, plebeians, Cicero, and the Festival of Lupercalia. The fact is, the history of the play hardly matters at all. Rather, the only thing that matters is that you know in the beginning moments that Caesar will die and that, whatever his motives and his character, Marcus Brutus will pay for his part in Caesar’s assassination with his own life and reputation.

Rule Number Two: Recognize that this is one of Shakespeare’s most suspenseful plays. Our foreknowledge of events in the play, far from making it predictable and boring, provides an element of suspense that should excite the audience. Here we can point to Alfred Hitchcock’s definition of suspense, in which he explains that it’s the fact that the audience knows there’s a bomb hidden under a table that makes the scene so fascinating to watch, that makes every sentence, every facial expression count with the audience. It’s the fact that we know Julius Caesar is going to die on the Ides of March that makes his refusal to follow the advice of the soothsayer, his wife Calpurnia, and Artemidorus so interesting. We become invested in all of his words and actions, just as our knowledge that Brutus is going to lose everything makes us become invested in him as a character as well. A good production of this play, then, would highlight the suspenseful nature within it, allowing the audience to react with an emotional response rather than mere intellectual curiosity.

Rule Number Three: Understand that this play is, like Coriolanus, highly critical of the Roman mob. Individuals from the mob may be quite witty, as in the opening scene, when a mere cobbler gets the better of one of the Roman Tribunes, but taken as a whole, the mob is easily swayed by rhetoric, highly materialistic, and downright vicious. (In one often-excluded scene–III.iii–a poet is on his way to Caesar’s funeral when he is accosted by the crowd, mistaken for one of the conspirators, and carried off to be torn to pieces.) It’s almost as if this representation of mob mentality–the Elizabethan equivalent of populism, if you will–is something that Shakespeare introduces in 1599 in Julius Caesar, only to return to it nine years later to explore in greater detail in Coriolanus.

Rule Number Four: Recognize that this play, like many of Shakespeare’s plays, is misnamed. It is not about Julius Caesar. It’s really all about Marcus Brutus, who is the tragic hero of the play. He is doomed from the outset, because (1) it is his patriotism and his love of the Roman Republic, not a desire for gain, that drives him to commit murder; (2) he becomes enamored of his own reputation and convinces himself that it is his duty to commit murder and to break the law; (3) he falls victim to this egotism and loses everything because of it. Audience members really shouldn’t give a hoot about Julius Caesar; he’s a jerk who gets pretty much what he deserves. But Brutus is a tragic hero with a tragic flaw, a character whose every step, much like Oedipus’s, takes him further and further into his own doom. The soliloquies Brutus speaks are similar to those in Macbeth, revealing a character that is not inherently bad but rather deficient in logic, self-awareness, and respect for others. In fact, in many ways, it’s interesting to look at Julius Caesar as a rough draft not only of Coriolanus but of Macbeth as well.

Rule Number Five: Appreciate the dark comedy in the play. Shakespeare plays with his audience from the outset, in the comic first scene between the workmen and the Roman Tribunes, but another great comedic scene is Act IV, scene iii, when Brutus and Cassius meet up before the big battle and end up in an argument that resembles nothing more than a couple of young boys squabbling, even descending into a “did not, did so” level. This scene would be hilarious if the stakes weren’t so high, and if we didn’t know that disaster was imminent.

Rule Number Six: Experience the play without preconceptions, without the baggage that undoubtedly is left over from your tenth-grade English class. Once you do this, you’ll realize that the play is timely. It explores some really pertinent questions, ones which societies have dealt with time and time again, and which we are dealing with at this very moment. For example, when is it permissible to commit a wrong in order for the greater good to benefit? (Surely Immanuel Kant would have something to say about this, along with Jeremy Bentham.) How secure is a republic when its citizens are poor thinkers who can be swayed by mere rhetoric and emotionalism instead of reason? What course of action should be taken when a megalomaniac takes over an entire nation, and no one has the guts to stop him through any legal or official means?

In the end, Brutus’s tragedy is that he immolates his personal, individual self in his public and civic responsibilities. Unfortunately, it is the inability to understand this sacrifice and the conflict it creates, not the play’s historical setting in a distant and hazy past, that has made it inaccessible for generations of American high school students. Too many decades have gone by since civic responsibility has been considered an important element in our education, with the sad but inevitable result that several generations of students can no longer understand the real tragedy in this play, which is certainly not the assassination of Julius Caesar.

But perhaps this is about to change. In the last few months, we’ve been witnessing a new generation teaching themselves about civic involvement, since no one will teach it to them. And as I consider the brave civic movement begun by the students from Marjory Stoneman Douglas High School, I am hopeful that from now on it’s just possible that reading Julius Caesar could become not a wasted module in an English class, but the single most important reading experience in a high-school student’s career.

 


Filed under Criticism, culture, Education, Literature, Politics, Reading, Shakespeare, Teaching

The Ideological Work of Television and the Zombie Apocalypse

I have long argued that television programs, particularly situation comedies, perform an important piece of ideological work in our culture. Far from being pure entertainment, they introduce ideas that society may not want to confront. Of course, no one who can remember All in the Family or Murphy Brown will dispute this; but we may well be surprised to realize that television has always done this, even from its earliest days.

The two examples I have chosen to demonstrate this theory come from The Honeymooners (1955) and Bewitched (1964-1972). Back in the 1950s and ’60s, these sitcoms had to code their messages, making them available only to subtle and clever television viewers. In fact, the entire premise of both series rests on the implicit understanding that while women may have to kow-tow to their husbands, they are in fact the brains in their marriages. After all, Samantha is presumably all-powerful, yet she chooses to remain with the awkward and pouty Darrin. Alice Kramden’s situation is less enviable–she is constrained by the 1950s dictum that proclaims women to be subservient to their husbands–but at the same time, she demonstrates to herself, to Ralph, and most importantly, to the audience, that she is in fact much more capable than Ralph and that he is head of the household only because society awards him this position.

Ideological work is hidden, or coded, in early sitcoms, but it’s still there. For example, in The Honeymooners, in Episode 4 (“A Woman’s Work Is Never Done”), Alice decides to get a job after Ralph berates her for not being able to keep up with the housework, telling him that it is easier to work outside the home than within it. Ralph ridicules the notion, but Alice succeeds quite well, and even earns enough money to hire a maid to carry out the household chores, a maid who turns out to be so efficient and sarcastic that Ralph begs Alice to quit and return to being a homemaker. The message here, years before either That Girl or The Mary Tyler Moore Show appeared on television, is that women can indeed be successful in the professional world. This message might have been too revolutionary to appear without coding, but it is delivered nonetheless through this subtle means.

Perhaps more interesting is Episode 7 of the first season of Bewitched (“The Witches Are Out”), in which Darrin’s work on an advertising campaign that features witches is critiqued by Samantha as being clichéd and, even worse, rife with prejudice. She takes to the streets to spearhead protests against the campaign, joining a picket line, clearly reflecting the actual protests that were taking place in 1964, when this episode first aired. Since it was too dangerous to talk openly about racial prejudice, the show used a fictional prejudice–against witches–that the viewers would still understand, though perhaps unconsciously.

Neither of these episodes was intentional about its ideological work: the writers of these early situation comedies merely reflected and refracted the social reality they observed. In other words, during the early years of television, shows didn’t consciously represent the women’s movement or the civil rights movement. They simply reflected and displaced the social trends that were present at the time of their creation and presented them in a non-threatening, palatable form for their viewers.

But by the mid-1970s and beyond, television changed and became more outspoken, taking on a more direct role in society, and at the same time becoming much less afraid to stand on a soap-box. The velvet gloves came off, and we grappled openly with all sorts of issues, from bigotry (All in the Family) to homosexuality (Will and Grace). However, I believe that television still uses coded messages from time to time, and I think I’ve found an example of one genre that horrifies me, and not for its intended reason.

Since the mid 2000s, zombie-themed shows and books have proliferated. I first noticed a fascination with zombies among my students in about 2005, and I found it strange that a genre that had lain dormant for so long was coming back to life (pardon the pun, please). Since then, we’ve had World War Z, Pride and Prejudice and Zombies, and The Walking Dead. Ever the cultural analyst, I wondered what this preoccupation with zombie infestation might represent: just what kind of ideological work is it performing? At first, I thought it might indicate a fear of contagion, of a swift-moving and deadly pandemic. After all, we’ve seen, in the last twenty years, outbreaks of swine and bird flu, SARS, and Ebola. It would certainly make sense for a fear of virulent and lethal illness to express itself as a zombie invasion.

But recently it dawned on me that the imagined zombie invasion might represent something far worse: an invasion of migrants. And, before you dismiss this idea, let me pose a question: Is it possible that the populist rhetoric directed against immigrants is connected, through a subtle, ideological sleight-of-hand, to the rise of the zombie genre in film and television?

After all, so much of the zombie plot resembles the imagined threat of uncontrolled immigration: the influx of great numbers of threatening beings who are completely foreign to our way of thinking, who are willing to fight for resources, who will not give up easily, who make us just like them–and who must be destroyed at any cost. I think it’s just possible, in other words, that the present social climate of suspicion, of protectionism, of hostility towards outsiders, has been fostered and cultivated by our ideological immersion in the genre of the zombie plot. Again, as with early television situation comedies, I don’t think this is an intentional linkage on the part of the writers; but intentional or not, the ideological work gets done, and suddenly we find our culture and civilization hostile to the very force that made us what we Americans are.

About ten years ago, I had a student who adored horror films and books. I asked him how he could stand to be made frightened by what he loved and spent so much time on. His answer haunts me today: “This isn’t what frightens me,” he said, pointing to a Lovecraft novel. “What frightens me is the day-to-day things, such as how I’m going to pay my rent.” In the same vein, I’ll end by asking this question: what if the really frightening thing about zombie shows isn’t what happens to their characters, but what happens to us when we watch them?


Filed under Criticism, culture, Films, Politics, Television, The Arts, Uncategorized

The End of Democracy


Chief Petoskey might agree that democracy is a failure.

It may be my bad luck, and my generation’s bad luck, to be alive at a time when we are witnessing the limits of democracy. We’ve had a good run–over two hundred years now–but it may be time to call it a day and start over with some new form of government.

I suppose I am as patriotic as anyone. There are two times in my life when I felt tears well up in my eyes solely because of my pride in being an American. One was after a three-week trip to Iceland, Scotland, and England in 1996, when I returned with my young family to Houston Intercontinental Airport. Waiting in customs, I noticed a babble of languages, and looking around, I saw myself surrounded by people of color, dressed in a variety of ways, many with headscarves or turbans. At that time, it was easy to imagine that these people, if not Americans themselves, would be welcomed as visitors to the United States, or perhaps as potential citizens. That was enough to make me sentimental about the diversity of my country, to be thankful to live in a country that valued all people.

(I will pass over for now the very real possibility—indeed, the near certainty—that this was simply a fiction, even at that time. My belief, however, was real enough to draw tears of pride from my eyes, which of course I quickly wiped away.)

The second time I became emotional with pride in my country was in about 2003 or 2004, when, as a union member from the local community college, I stood in solidarity with nurses who were striking at the hospital. I was proud to do so—it is our right as Americans to stand and protest, as so many of us have recently found out. I was proud to be a citizen of a country that allows its citizens to congregate for such a purpose, despite the inconveniences that may be caused by it.

In the last couple of years, I’ve seen protests, but I haven’t taken part in them. I’ve supported them, but I have not been able to make myself participate in them. During the Women’s March, I stayed home, dissolved into a teary mess most of the day. But these were not tears of pride. Perhaps there was some pride mixed in, and admiration for the women who dedicated themselves to the cause, but there was also a feeling of profound despair at the need for such a march. It was the same thing with the March for Our Lives. What a beautiful expression of solidarity, but why should the people of this country need to march in order to protect our children, in order to stand up against an organization that should have no part in our electoral process, to protest the very electoral process that has been shown to be corrupt—not only because of foreign interference, but because of outrageously large campaign donations that fund and sway our elected officials?

Don’t get me wrong. To those of you who are participating in these movements, I want to say that I admire and love you for what you are doing. Yet I cannot help but feel that the need for such movements marks the decline of democracy, the end of this glorious experiment in civic rule that began over 200 years ago.

(Again, I will pass over the fact that this glorious experiment probably started, as so many others have, with the desire for personal gain on the part of the architects of the experiment.)

Democracy cannot work when it is corrupted by the desire for financial gain. It cannot work when the electorate is divided along the lines of hard-held, incontestable beliefs that brook no argument or discussion. It cannot work when our elected officials are, like the people who elect them, small-minded, fearful, and utterly dependent on large corporations who try to direct every facet of their lives and thoughts and are free to do so if they spend enough money on licit and illicit media campaigns.

Recently, retired Supreme Court Justice John Paul Stevens called for the repeal of the Second Amendment. It may in fact be time for such a step. But I fear it may be time for a more drastic step: to admit that our democracy, such as it is, has failed, and that it is time to go back to the drawing board to find a new, more equitable, more humane way of living together in this world that we have created for ourselves.


Filed under Criticism, culture, Politics

On the Relationship of Myth and Story


Image from the lotr.wiki.com

Please note: This is a very long post. It is based on a talk I gave yesterday (October 28, 2017) at the C.S. Lewis Festival in Petoskey, Michigan. Consider yourself warned!

 

The study of myth seems to me to take three different paths:

  • Anthropological / Archeological: the study of classical mythologies (Bulfinch’s Mythology, Edith Hamilton)
  • Religious / Transcendent: the spiritual meaning of myth (Karen Armstrong, Joseph Campbell, Sigmund Freud)
  • Structuralist: the study of structures that recur across myths (Northrop Frye, Joseph Campbell, Roland Barthes)

This is all interesting, but I would like to back up a moment. I feel like I’ve arrived at a dinner party, and that somehow I missed the first two courses. I feel as if I might get some kind of mental indigestion if I don’t start over at the very beginning.

The fact is, I want to know something more fundamental about myth and its function.

  • I want to know what it is and how it works.
  • Specifically, I want to know what distinguishes myth from other forms of story-telling.

Because for me, Story-Telling is what distinguishes human beings, homo sapiens, from all other species on this planet, as far as we know.

  • Studies have shown that crows have memories
  • Studies have shown that chimpanzees use tools
  • Philosophers are now beginning to agree that animals do indeed have consciousness

But we—we should be known not as homo sapiens (wise man, the man who knows), but as homo narrans—the speaking man, the man who tells, who narrates—story-telling man.  Because it is clear to me that we humans communicate largely through story-telling, and this story-telling function, this tendency to rely on narration, is what makes us human.

I’m going to ask you to bear with me for a little while as I tease this out. I’d like to say that by the end of this essay, I’ll have some answers to the questions I posed (what is myth, and how does it work, and what is the difference between a really good story and a myth)—but I’m pretty sure I won’t. I may, however, ask some more questions that might eventually lead me to some answers.

So here goes. To begin with, a few people who weigh in on what myth is and what it does:

Roland Barthes, the French post-structuralist literary theorist, says that myth is a type of speech, a system of communication, a kind of message. In a way, Barthes and JRR Tolkien are not really different on this point, incredible as it is to think of Barthes and Tolkien agreeing on anything at all, much less something so important to each of them.

  • They are both incredibly passionate and devoted to the concept of language
  • Barthes, in his book Mythologies, which I have shamelessly cherry-picked for this essay, says that the object of a myth’s message is not really important; it is the way in which the myth conveys that message that matters.
  • He says that “the knowledge contained in a mythical concept is confused, made of yielding, shapeless associations” (119).
    • But this isn’t as bad as it sounds, because myths actually don’t need to be deciphered or interpreted.
    • While they may work with “Poor, incomplete images” (127), they actually do their work incredibly efficiently. Myth, he says, gives to its story “a natural and eternal justification…a clarity which is not that of an explanation but that of a statement of fact” (143).
    • Myth is a story in its simple, pure form. “It acts economically: it abolishes the complexity of human acts, it gives them the simplicity of essences…” (143).
  • You can see how this view of myth kind of works with the myth-building that Tolkien does in The Lord of the Rings, which works with simple efficiency, whose very images are incomplete to the point of needing clarification in Appendices and further books like the Silmarillion. Yet even without having read these appendices and other books, we grasp what Tolkien is getting at. We know what Middle-Earth is like, because the myth that Tolkien presents needs no deciphering, no real interpretation for us to grasp its significance.

Tolkien, I think we can all agree, was successful in creating a myth specifically for England, as Jane Chance and many other scholars have now shown to be his intention. But is it a novel? Some might argue it isn’t—myself included. In fact, what Tolkien created in The Lord of the Rings is less a myth (I would argue that we only use that term because Tolkien himself used it to describe his work and his object—think of the poem “Mythopoeia,” which he dedicated to C.S. Lewis) than it is a full-blown epic.

For my definition of epic versus novel, I’m going to my personal literary hero, Mikhail Bakhtin, a great thinker, a marvelous student of literature, a man who wrote with virtually no audience at all for many years because he was sent into internal exile in the Soviet Union. In his essay “Epic and the Novel,” Bakhtin attributes these characteristics to epic:

  1. It deals with an absolute past, where there is little resemblance to the present;
  2. It is invested with national tradition, not personal experience, arousing something like piety;
  3. There is an absolute, unbridgeable distance between the created world of epic and the real world.

The novel, says Bakhtin, is quite the opposite. It is new, changing, and it constantly “comes into contact with the spontaneity of the inconclusive present; this is what keeps the genre from congealing. The novelist is drawn toward everything that is not yet completed” (27).

I think the three characteristics of epic described by Bakhtin do in fact match up nicely with The Lord of the Rings: absolute past, national tradition, distance between the actual and the created world. But here’s another thing about epic as described by Bakhtin: “The epic world knows only a single and unified world view, obligatory and indubitably true for heroes as well as for authors and audiences” (35).  It would be hard, indeed impossible, to imagine The Lord of the Rings told from a different point of view. We need that distant narrator, who becomes more distant as the book goes on. As an example, imagine The Lord of the Rings told from Saruman’s point of view, or from Gollum’s. Or even from Bilbo or Frodo’s point of view. Impossible! Of course, we share some of the point of view of various characters at various points in the narrative (I’m thinking specifically of Sam’s point of view during the Cirith Ungol episode), but it couldn’t be sustained for the whole of the trilogy.

The interesting thing here is that in The Lord of the Rings, Tolkien took the novel form and invested it with epic. And I think we can say that against all odds, he was successful. On the other hand, C.S. Lewis, in his last novel, Till We Have Faces, took a myth (the story of Cupid and Psyche), which is certainly more closely related to epic than it is to novel, and turned it into a successful novel. This isn’t the time and place to talk about Till We Have Faces, although I hope someday that we can come together in the C.S. Lewis Festival to do that very thing, but I couldn’t help mentioning this, because it’s striking that Lewis and Tolkien, while they clearly fed off each other intellectually and creatively, started from opposite ends in writing their greatest creative works, as they did in so many other things. It’s almost amazing that you can love both of them at the same time, but of course you can. It’s the easiest thing in the world to do.

But I’m losing the thread of my questions here. What is myth? Can we actually have modern myths? Can someone actually set out with the intention of creating a myth? And can a mythic work spontaneously just happen? Another question needs to be posed here: if this long book, which is probably classified in every bookstore and library as a novel, touches on myth but is really an epic, can a novel, as we know it, become a myth? This forces us to tighten up our definition of what a myth is and asks us to think about what myth does.

Karen Armstrong, I think, would say yes to all three of these questions. In her book A Short History of Myth, Armstrong follows the trajectory of myths through time and argues that the advent of printing and widespread literacy changed how we perceive and how we receive myth. These developments changed myth’s object and its function—and ultimately, they changed the very essence of myth.

Armstrong points out that myths and novels have similarities:

  • They are both meditative
  • They can both be transformative
  • They both take a person into another world for a significant period of time
  • They both suspend our disbelief
  • They break the barriers of time and space
  • They both teach compassion

Inspired by Armstrong and by Bakhtin, I’m going to go out on a limb here and make a stab at answering my questions. And I’ll start by defining a modern myth as a super-story of a kind: a novel (or a film, because let’s open this up to different kinds of story-telling) that exerts its power on a significant number of people. These stories then provide, in film professor and writer Stuart Voytilla’s words, “the guiding images of our lives.”

In short, a modern myth has these characteristics:

  1. It belongs to a certain place and time. Like epic, it is rooted in a time and a place. It might not be far removed from the actual, but it cannot be reached from the actual.
  2. It unites a group of readers, often a generation of readers, by presenting an important image that they recognize.
  3. It unites a group of readers by fostering a similar reaction among them.
  4. It contains identifiable elements that are meaningful to its readers/viewers. Among these might be important messages (“the little guy can win after all,” “there’s no place like home,” “the American Dream has become a nightmare”).

In other words, a mythic story can be made intentionally, as Star Wars was by George Lucas after he considered the work of Joseph Campbell; or it can happen accidentally. Surely every writer dreams of writing a mythic novel—the Great American novel—but it’s more or less an accident. The Adventures of Huckleberry Finn was a mythic novel of America, until it was displaced by To Kill a Mockingbird. And I would note here that having your novel go mythic (as we might term it—it is, in a way, like “going viral,” except mythic stories tend to last longer than viral ones) is not really such a good thing after all. Look at Harper Lee—one mythic novel, and that was the end of her artistic output—as far as we know. A mythic novel might just be the last thing a great writer ever writes.

Anyway, back to our subject: a modern myth gets adopted rather than created. Great myths are not made; they become. So let’s think of a few mythic novels and see how they line up with my four characteristics:

  1. Frankenstein
  2. Star Wars
  3. The Wizard of Oz
  4. The Great Gatsby or Death of a Salesman—take your pick.
  5. Casablanca
  6. The Case of Local Myths—family or friend myths, references you might make to certain films or novels that only a small number of people might understand. A case in point would be the re-enactments of The Rocky Horror Picture Show that take place each year around Halloween.

In essence, my answer, such as it is, to the questions I posed earlier comes down to this:

Modern myths are important stories that unite their readers or viewers with similar emotional and intellectual reactions. Modern mythology works by presenting recognizable and significant images that unite the people who read or view them. As for what distinguishes modern myths from other forms of story-telling, what tips a “normal” novel or film over into the realm of “mythic”—I don’t have an answer for this. I only have a couple of vague, unformed theories. One of my theories is this: Could one difference between myth and the novel (“mere” story-telling as such) be that myth allows the reader/listener to stay inside the story, while the novel pushes the reader back out, to return to the actual world, however reluctantly?

And let’s not forget what Karen Armstrong wrote about myth: “It has been writers and artists, rather than religious leaders, who have stepped into the vacuum [created by the loss of religious certainty and despair created by modernism] and attempted to reacquaint us with the mythological wisdom of the past” (138). Armstrong’s closing sentence is perhaps the most important one in the book: “If professional religious leaders cannot instruct us in mythical lore, our artists and creative writers can perhaps step into this priestly role and bring fresh insight to our lost and damaged world” (149). With this in mind, perhaps it’s time to go and read some more, and find more myths that can help us repair and restore ourselves, our faith in our culture, and in doing so, the world itself.

 


Filed under Criticism, culture, Films, Historical Fiction, Literary theory, Literature, Reading, Writing

Three Things I’ve Learned from Kazuo Ishiguro


Image from the New York Times (October 5, 2017)

 

I had actually planned this post a couple of days before my favorite living writer, Kazuo Ishiguro, won the Nobel Prize in Literature (announced on October 5th). So, along with the satisfaction and sense of vindication I felt when I woke up last Thursday morning and discovered that he’d been awarded the Prize, I also felt a sense of chagrin at being late in making this post. After all, I could have gone on record about Ishiguro’s talent days before the Nobel committee made its announcement. Still, better late than never, so I will offer my belated post now, and explain the three most important things I’ve learned from Ishiguro over the years.

The most important thing I’ve learned from Kazuo Ishiguro is this: great writing often goes unnoticed by readers. (This point, of course, is now somewhat diluted by the fact that Ishiguro has indeed won acclaim for his work, but I think it deserves to be made all the same.) I remember reading Never Let Me Go about eight years ago and being gob-smacked by its subtle narrative brilliance and its emotional resonance. And yet I’ve met many readers of the book who, while affected by the narrative, seemed unimpressed by Ishiguro’s writerly achievement. It’s almost embarrassing that my reaction to the novel was so different than other people’s. Could I have gotten it wrong, somehow? Was it possible that Never Let Me Go really wasn’t the masterpiece I thought it was? While I considered this, I never once really believed I had made a mistake in my estimation: it is a tremendous book. The fact that few other people see it as such does not change my view of it. It simply means that I see something in it that other people don’t. Hence my first object lesson from reading Ishiguro: genius isn’t always obvious to the mass of readers out there. Perhaps it just isn’t that noticeable with so many other distracting claims for our attention.

The second thing I’ve learned from Ishiguro also stems from Never Let Me Go: genre doesn’t matter. When you really think about it, categorizing a work based on its plot is a silly thing to do, and yet we are firmly locked into that prison of categorization, since almost all bookstores and libraries, as well as readers, demand that every work fit into a narrow slot. I commend Ishiguro for defying the convention of genre, incorporating elements from both science fiction and fantasy into realist narratives. In my view, the sooner we break the shackles of genre, the better. Good, responsible readers should never restrict themselves to a certain genre any more than good, imaginative writers should. A certain amount of artistic anarchy is always a good thing, releasing creative juices and livening things up.

And finally, the third thing I’ve learned is this: a good writer does not hit the bull’s eye every time he or she writes. The Remains of the Day and Never Let Me Go are truly wonderful books. An Artist of the Floating World is promising, but not nearly as good as Ishiguro’s later works.  The Buried Giant, I’d argue, is a failure–but it is a magnificent failure, one whose flaws emanate from the very nature of the narrative itself, and thus it transcends its own inability to tell a coherent story. I’ve learned from this that a writer should never be afraid to fail, because failing in one way might be succeeding in another, less obvious, way. This is as good a place as any other to admit that I have never been able to get through The Unconsoled. And as for When We Were Orphans–well, the less said about that disaster of a book, perhaps the better. I can’t imagine what Ishiguro was thinking there–but I will certainly defend his right to fail. And I am thankful that even a writer with such talent as Ishiguro does, from time to time, fail–and fail big. It certainly gives the rest of us hope that while we fail, we can still aspire to success.

I will close by saying that I am grateful to Kazuo Ishiguro for the wonderful books he’s written. If you haven’t read any of them, you should–and not just because some panel gave him an award. But I am just as grateful to him for the three important lessons he has taught me about the nature of writing.

 


Filed under Criticism, Literature, Reading, The Arts, Writing

On Self-Publishing and Why I Do It


Let me get one thing straight right from the get-go: I know self-publishing is not the same thing as publishing one’s work through a legitimate, acknowledged publishing company. I also know that self-publishing is looked down upon by the established writing community and by most readers. In fact, for the most part I agree with this estimation. After all, I spent much of last year writing freelance book reviews for Kirkus Reviews, so I know what’s being published by indie authors: some of it is ok, but much more of it is not very good at all.

Knowing this, then, why would I settle for publishing my novels on Amazon and CreateSpace? This is a tricky question, and I have thought about it a great deal. Whenever anyone introduces me as an author, I am quick to point out that I am, in fact, just a self-published author, which is very different from a commercial writer. (And if at any time I am liable to forget this important fact, there are enough bookstores in my area that will remind me of it, stating that they don’t carry self-published books.) When I meet other writers who are looking for agents, I do my best to encourage them, largely by sharing with them the only strategy I know: Be patient, and persist in sending your queries out.

So why, since I know all this, do I resort to self-publishing my work? I’ve boiled it down to four main reasons.

First of all, I self-publish because I am not invested in becoming a commercially successful writer. I write what I want, when I want, and when I decide my work is complete, I submit it to an electronic platform that makes it into a book, which I can then share with family and friends and anyone else who cares to read it. In other words, for me writing is not a means by which to create a career, celebrity, or extra income. I have long ago given up the fantasy of being interviewed by Terry Gross on Fresh Air; my fantasies are more mundane these days.

Second, I do not need to be a commercial writer, with a ready-made marketing machine to sell my books, because I am not hoping to make any money from them. Rather, I look upon writing as a hobby, just as I look upon my interest in Dickens, Hardy, and the Brontes as a hobby. I am helped here by having spent many years engaged in academic research, a world in which publications may win their authors momentary notice, but certainly not any money, unless one happens to sell out to the lure of literary celebrity, as Stephen Greenblatt has. I have a few publications out in the academic world, but no celebrity and certainly no money to show for them–and I am totally fine with that. In my creative writing, I am lucky enough to have a hobby that satisfies me and costs me relatively little–far less, in fact, than joining a golf or tennis club would cost.

The third reason that I self-publish my work is that I actually enjoy doing so. There are some aspects of publication that have surprised me. For example, I have found that I really enjoy working with a great graphic designer (thanks, Laura!) to develop the cover of my novels. It is an extension of the creative process that is closely related to my work but something that I could never do myself, and this makes me all the more grateful and fascinated as I watch the cover come to life and do its own crucial part to draw readers into the world I have created.

As a retired writing professor, I realize how important revision and proofreading are, and to be honest, this is the only part of self-publishing that gives me pause, because I worry about niggling little errors that evade my editorial eye. But for the most part, I am old enough now to have confidence in my writing. Plus, the beauty of self-publishing is that it is electronic: if there are errors (and there are always errors, even in mainstream published books), I can fix them as soon as a kind reader points them out. So I suppose the fourth reason to self-publish lies in the fact that it is so very easy to do it these days.

These are four good reasons for me to self-publish, but the most important reason is that I apparently love to write, and self-publishing allows me to do this without worrying about submitting the same piece of work over and over again to agents and publishers, stalling out my creativity. While at the Bronte Parsonage Museum this past summer, I picked up a card that expresses how I feel about it, a quote from Charlotte Brontë: “I’m just going to write because I cannot help it.” (It is a testament to my literary nerdiness that I happen to know that this quotation comes from Brontë’s Roe Head Journal, but strangely enough, before I encountered it on a greeting card I never realized that it applied to myself as well as to Brontë.) In my idiosyncratic view, self-publishing allows the reader to decide whether a novel is worth reading, rather than punting that responsibility over to an overworked and market-fixated literary agent or editorial assistant. I am willing to trust that reader’s judgment, even if it means I will never sell many books.

And so today, as I am releasing my second self-published novel (Betony Lodge, available on Amazon and CreateSpace–and this is my last attempt at marketing it here on my blog), I am fully aware of the stigma of self-publishing, but I realize that what’s right for other writers may not be right for me. Today, then, I am taking my courage into my own hands and pushing that key to make the book go live.

And tonight I will be making my own champagne toast: here’s to living in the 21st century,  when digital publishing makes authors of us all!


Filed under Criticism, culture, Literature, Publishing, Reading, Retirement, self-publishing, Uncategorized, Writing

My Short, Tragic Career as an Independent Scholar

 


Several months ago, I had what seemed like a fantastic idea: now that I was retired from teaching English at a community college, I could engage in critical research, something I’d missed during those years when I taught five or more classes a semester. I had managed to write a couple of critical articles in the last few years of my tenure at a small, rural two-year college in Northern Michigan, but it was difficult, not only because of the heavy demands of teaching, but also because I had very limited access to scholarly resources. Indeed, it is largely due to very generous former students who had moved on to major research institutions that I was able to engage in any kind of scholarly research, a situation which may seem ironic to some readers, but which is really just closing the loop of teacher and student in a fitting and natural way.

And so last fall, at the suggestion of a former student, I decided to throw my hat in the ring and apply to a scholarly conference on Dickens, and my proposal was chosen. In time, I wrote my paper (on Dickens and music–specifically, on two downtrodden characters who play the flute and clarinet in David Copperfield and Little Dorrit, respectively) and prepared for my part in the conference.

It had been close to 25 years since I had read a paper at a conference, and so I was understandably nervous. Back then, there was no internet to search for information about conference presentations, but now I was able to do my homework, and thus I found a piece of advice that made a lot of sense: remember, the article emphasized, that a conference paper is an opportunity to test out ideas, to play with them in the presence of others, and to learn how other scholars respond to them, rather than a place to read a paper, an article, or a section of a book out loud before a bored audience. Having taught public speaking for over a decade, I could see that this made a lot of sense: scholarly articles and papers are not adapted to oral presentations, since they are composed of complex ideas buttressed by a great many references to support their assertions. To read such a work to an audience seemed to me, once I reflected on it, a ridiculous proposition, and would surely bore not only the audience, but any self-respecting speaker as well.

I wrote my paper accordingly. I kept it under the fifteen-minute limit that the moderator practically begged the panelists to adhere to in a pre-conference email. I made sure I had amusing anecdotes and witty bon mots. I concocted a clever PowerPoint presentation to go with the paper, just in case my audience got bored with the ideas I was trying out. I triple-spaced my copy of the essay, and I–the queen of eye contact, as my former speech students can attest–I practiced it just enough to become familiar with my own words, but not so much that I would become complacent with them and confuse myself by ad-libbing too freely. In short, I arrived at the conference with a bit of nervousness, but with the feeling that I had prepared myself for the ordeal, and that my paper would meet with amused interest and perhaps even some admiration.

It was not exactly a disaster, but it was certainly not a success.

To be honest, I consider it a failure.

It wasn’t that the paper was bad. In fact, I was satisfied with the way I presented it. But my audience didn’t know what to do with my presentation. This might be because it was very short compared to all the other presentations (silly me, to think that academics would actually follow explicit directions!). Or it could be because it wasn’t quite as scholarly as the other papers. After all, my presentation hadn’t been published in a journal; it was, as C.S. Lewis might have called it, much more of a “supposal” than a fully-fledged argument. Perhaps as well there was something ironic in my stance, as if I somehow communicated my feeling that research in the humanities is a kind of glorified rabbit hunt that is fun while it lasts but that rarely leads to any tangible, life-changing moment of revelation.

Yet this is not to say that humanities research is useless. It isn’t. It develops and hones all sorts of wonderful talents that enrich the lives of those who engage in it and those who merely dip into it from time to time. I believe in the value of interpreting books and arguing about those interpretations; in fact, I believe that engaging in such discussions can draw human beings together as nothing else can, even at the very moments when we argue most fiercely about competing and contrasting interpretations. This is something, as Mark Slouka points out in his magnificent essay “Dehumanized,” that STEM fields cannot do, no matter how much administrators and government officials laud them, pandering to them with ever-increasing budgets at the expense of the humanities.

And this is, ultimately, why I left the conference depressed and disappointed. I had created, in the years since I’d left academia, an idealized image of it that was inclusive, one that recognized its own innate absurdity. In other words, sometime in the last two decades, I had recognized that research in the humanities was valuable not because it produced any particular thing, but because it produced a way of looking at the world we inhabit with a critical acuity that makes us better thinkers and ultimately better citizens. The world of research, for me, is simply a playground in which we all can exercise our critical and creative faculties. Yet the conference I attended seemed to be focused on research as object: indeed, as an object of exchange, a widget to be documented, tallied, and added to a spreadsheet that measures worth.

Perhaps it’s unfair of me to characterize it in this way. After all, most of the people attending the conference were, unlike me, still very much a part of an academic marketplace, one in which important decisions like tenure, admission to graduate programs, promotions, and departmental budgets are made, at least in part, on the basis of things like conference attendance and presentations. It is unfair of me to judge them when I am no longer engaged in that particular game.

But the very fact that I am not in the game allows me to see it with some degree of clarity, and what I see is depressing. One cannot fight the dehumanization of academia, with its insistent mirroring of capitalism, by replicating that capitalism inside the ivory tower; one cannot expect the humanities to maintain any kind of serious effect on our culture when those charged with propagating the study of humanities are complicit in reducing humanities research to mere line items on a curriculum vitae or research-laden objects of exchange.

I can theorize no solution to this problem beyond inculcating a revolution of ideas within the academy in an effort to fight the now ubiquitous goal of bankrupting the study of arts and humanities, a sordid goal which now seems to characterize the age we live in. And I have no idea how to bring about such a revolution. But I do know this: I will return to my own study with the knowledge that even my small, inconsequential, and isolated critical inquiries are minute revolutions in and of themselves. As we say in English studies, it’s the journey that’s important, not the destination. And in the end, I feel confident that it will take far more than one awkward presentation at a conference to stop me from pursuing my own idiosyncratic path of research and inquiry into the literature I love.


Filed under Careers, clarinet, Criticism, Education, Literary theory, Literature, Music, Reading, Retirement, Teaching, The Arts, Writing

Border Country


 

I have fairly sloppy reading habits these days, moving randomly from one book to the next, choosing them for the slightest of reasons. A couple of weeks ago, I was in Wales, and I stopped in a bookstore. This bookstore was not in Hay-on-Wye, which is noted for its bookstores and its annual literary festival; frankly, I found Hay-on-Wye to be too commercial and couldn’t get out of there soon enough. Rather, it was a small bookstore in Crickhowell, in South Wales, which, it turns out, was a place that Tolkien visited on a holiday as a young adult and whose influence can be found in The Hobbit.

Whenever I go into a bookstore, I feel obligated to purchase something. For me, it’s like getting a table in a restaurant: you wouldn’t go in at all if you didn’t mean to buy something. And, because I was in Wales, and because the bookstore had a wonderful collection of Welsh books written in English, I picked up a novel by Raymond Williams called Border Country. I chose it because I am a retired English professor and am familiar with some of Williams’s critical work. I was hoping it would be a good book, because I always root for scholars who write fiction, being one myself.

I will simply say here that Border Country surpassed any hope I had that it would be an interesting book to read on vacation. It really is a fine novel, a beautiful and thoughtful narrative in which Welsh village life is depicted against the background of labor struggles, the clash of generations, and the difficulty involved in leaving one’s home and then returning to it.

Williams creates a subtle story with a strong narrative pull, largely because of the lively, interesting characters he presents. The protagonist is a professor of economics who lives in London with his wife and two sons; he must return to the Welsh border country, however, because his father has had a stroke. But “border country” also refers to the space that Matthew Price (called “Will” back in his hometown of Glynmawr) occupies within his own world: neither fully in the cosmopolitan world of London intellectuals (we get only a glimpse of his life there) nor in the village of his birth, Matthew is caught between worlds and a strange, palpable dysphoria ensues.

Yet the novel does not dwell on this unease. Rather, it provides flashbacks to an earlier time, when Matthew’s father Harry first arrives in Glynmawr to work as a railway signalman with his young wife Ellen, and in doing so it recounts the struggles involved in making a life in that beautiful and rugged country. The novel, true to its form (and no one would know that form better than Williams, who was a literary scholar of the highest merit), presents a varied and beautiful mix of narratives, woven together so subtly and with such artistry that the reader moves effortlessly between them.

I am new to Welsh literature, but I have learned this from Border Country: reading Welsh novels means reading about the Welsh landscape, with its rough yet welcoming mountains, where life is difficult but well worth living. Williams manages to get that feeling across to the reader in his simple, almost elegiac tone. The threads of the story keep us turning the pages, but the message of the book will stay with us long after we finish reading.

This is a novel that deserves to be read. It is both a pleasure and a pain to say that: a pleasure to discover a hidden gem, and a pain to realize that this gem has been obscured by newer, less deserving but flashier novels, and has only been revealed by the undisciplined, random choice of a reader strolling into a bookstore looking for something to read while on holiday in Wales. So I’m doing my part to gain it the readership it deserves by saying here: get this book and read it. You will be glad that you did.

Border Country, Raymond Williams

Parthian, Library of Wales, 2017

 

 



Filed under Criticism, Historical Fiction, Literature, Politics, Reading, Travel, Writing

New Feature: Book Reviews

The title is a misnomer of sorts: most contemporary book reviews, I’ve noticed, are little more than marketing ploys designed to get you to buy the book they’re reviewing. If the reviewer is quite brave, the review might actually critique the book, but the point remains the same: to weigh in on a book that has grabbed, or wants to grab, the attention of a large body of readers.

That is not my goal in writing book reviews.

Am I alone in bewailing and bemoaning the lost art of reading? Certainly not. Yet I am advocating here a certain kind of reading, a way of reading which demands thoughtful yet emotional responses to a book. This kind of reading and critiquing is not systematic, like a college paper; it is not formulaic and profit-generating, like a Kirkus book review; and it is certainly not aimed at gaining a readership for a book, or for this blog, either, for that matter. I am simply modeling the behavior I would like to see in other readers. I want to log my emotional and intellectual responses to certain books, to join or create a critical discussion about the works I’m reading. Some of these works will be current, but many more will be older. As I used to tell my literature students, I specialize in works written by long-dead people. Long mesmerized by the works from the nineteenth century and before, I have, one might say, a severe case of century deprivation.

But today I am starting with a book by Susan Sontag, The Volcano Lover: A Romance. Published in 1992, it is a historical novel set in Naples, Italy, at the end of the eighteenth century, focusing on Sir William Hamilton and his second wife Emma, destined to become the mistress of Horatio Nelson.

The_Volcano_Lover_(Sontag_novel)

Image from Wikipedia

Let me say that I have never read many of Sontag’s essays, and now I feel I don’t really have to, because this book seems in many ways much more an essay than a novel. There’s a good story in the lives of Sir William, Lady Hamilton, and Lord Nelson, but Sontag pushes this story into the background, eclipsing it by allowing her narrator’s cynical distance to diminish the reader’s ability to connect with the characters and events portrayed in the novel. Sontag gets in the way of the story a great deal too much. Egotism has no place in the act of telling a story; unfortunately, this lesson is something many writers are slow to learn, and indeed, some writers never learn it at all.

The true protagonist of the novel emerges only in the last eight pages. Sontag has had her revenge on the prurient reader who has picked up this novel only to delve into the lurid details of one of the most famous threesomes in British history. She pulls out a minor character, one who has received only the most fleeting mention, and gives her some of the best scenes to narrate. By playing hide-and-seek games with her story in this way, Sontag regrettably implodes her own narrative.

In the end, Sontag is much too clever a story-teller, and this hurts her novel–irreparably, in my view. There is one sentence in the novel that I think is worth remembering, however. Speaking of Sir William long after her own death (yes, Sontag does this, time-hopping with impunity), his first wife describes him in a single-sentence paragraph: “Talking with him was like talking with someone on a horse” (376). That’s a clever description, and I will give Sontag her due by calling attention to it.

In the end, though, I am left feeling frustrated and annoyed by The Volcano Lover. I have no idea how it can be construed as a romance, just as I have no idea why this novel, with its sly undercurrent of critical attitudes–towards the characters, the actions, and perhaps even the very nature of novel-writing–should hold a reader’s attention. Sontag’s work, described on the jacket as “a book of prismatic formal ingenuity, rich in speculative and imaginative inventiveness and alive with delicious humor,” is in reality a self-absorbed narrative, filled with annoying commentary, strained attempts at originality, and a smug disregard for its readers’ desire to like the book they’re reading.


Filed under Criticism, Historical Fiction, History, Literary theory, Literature, Publishing, Reading, The Arts, Writing