Charles Baxter’s dysfunctional narratives

Charles Baxter

What if I told you that there’s been a sea-change in American storytelling over the past half-century? Not merely a change in subject matter, but that the fundamental nature of American narratives radically shifted? Would you believe me?

Now, what if I told you that a writer twenty-five years ago described these “new” stories, and even predicted they would become the dominant mode in our future? Would you believe that?

In 1997, Charles Baxter published Burning Down the House, a collection of essays on the state of American literature. It opens with “Dysfunctional Narratives: or, ‘Mistakes were Made,’” a blistering piece of criticism that not only detailed the kinds of stories he was reading back then, but predicted the types of stories we read and tell each other today.

Baxter appropriated the term “dysfunctional narrative” from poet C. K. Williams, but he expounded and expanded upon it so much, it’s fair to say he’s made the term his own. He borrowed a working definition of dysfunctional narratives from poet Marilynne Robinson, who described this modern mode of writing as a “mean little myth:”

One is born and in passage through childhood suffers some grave harm. Subsequent good fortune is meaningless because of the injury, while subsequent misfortune is highly significant as the consequence of this injury. The work of one’s life is to discover and name the harm one has suffered.

Baxter adds that the source of this injury “can never be expunged.” As for the ultimate meaning of these stories: “The injury is the meaning.”

To claim this mode of writing has become the dominant one in American culture demands proof, or at least some supporting evidence. Baxter lists examples, such as Richard Nixon’s passive-voice gloss over the Watergate cover-up (“mistakes were made”), Jane Smiley’s A Thousand Acres, and conspiracy theories, among others.

“Dysfunctional Narratives” doesn’t succeed by tallying a score, however. Rather, it describes a type of story that sounds all-too-familiar to modern ears:

Reading begins to be understood as a form of personal therapy or political action. In such an atmosphere, already moralized stories are more comforting than stories in which characters are making complex or unwitting mistakes.

Don’t merely consider Baxter’s descriptions in terms of books. News stories, the social media posts scrolling up your daily feed, even the way your best friend goes into how their boss has slighted them at work—all constitute narratives, small or large. Dysfunctional narratives read as if the storyteller’s thumb is heavy on the moral scale—they feel rigged.

It does seem curious that in contemporary America—a place of considerable good fortune and privilege—one of the most favored narrative modes from high to low has to do with disavowals, passivity, and the disarmed protagonist.

(I could go on quoting Baxter’s essay—he’s a quotable essayist—but you should go out and read all of Burning Down the House instead. It’s that good.)

Dysfunctional narratives are a literature of avoidance, a strategic weaving of talking points and selective omissions to block counter-criticism. If that sounds like so much political maneuvering, that’s because it is.

“Mistakes were made”

Let’s start with what dysfunctional narratives are not: They’re not merely stories about dysfunction, as in dysfunctional families, or learning dysfunctions. Yes, a dysfunctional narrative may feature such topics, but that is not what makes it dysfunctional. The term describes how the story is told: the strategies and choices the author has made to tell their story.

Baxter points to Richard Nixon’s “mistakes were made” as the kernel for the dysfunctional narrative in modern America. (He calls Nixon “the spiritual godfather of the contemporary disavowal movement.”) He also holds up conspiracy theories as prototypes:

No one really knows who’s responsible for [the JFK assassination]. One of the signs of a dysfunctional narrative is that we cannot leave it behind, and we cannot put it to rest, because it does not, finally, give us the explanations we need to enclose it. We don’t know who the agent of action is. We don’t even know why it was done.

Recall the tagline for The X-Files, a TV show about the investigation of conspiracy theories: “The truth is out there.” In other words, the show’s stories can’t provide the truth—it’s elsewhere.

More memorably—and more controversially—Baxter also turns his gaze upon Jane Smiley’s A Thousand Acres, which features the use of recovered memories (“not so much out of Zola as Geraldo”) and grows into “an account of conspiracy and memory, sorrow and depression, in which several of the major characters are acting out rather than acting, and doing their best to find someone to blame.”

In a similar vein, a nearly-dysfunctional story would be The Prince of Tides by Pat Conroy. It centers on a family man who, via therapy, digs through memories of a childhood trauma which has paralyzed him emotionally as an adult. He gradually heals, and goes on to repair his relationship with his family. Notably, his elderly father does not remember abusing him years earlier, leaving one wound unhealed.

Another example would be Nathanael West’s A Cool Million, which follows a clueless naif on a cross-American journey as he’s swindled, robbed, mugged, and framed. By the end, the inventory of body parts he’s lost is like counting the change in your pocket. It might be forgiven as a satire of the American dream, but A Cool Million remains a heavy-handed tale.

This leads to another point: A dysfunctional narrative is not necessarily a poorly told one. The dysfunction is not in the quality of the telling, but something more innate.

Examples of more topical dysfunctional narratives include the story of Aziz Ansari’s first-date accuser. The complaints of just about any politician or pundit who claims they’ve been victimized or deplatformed by their opponents are dysfunctional as well. In almost every case, the stories feature a faultless, passive protagonist being traumatized by the more powerful or the abstract.

There’s one more point about dysfunctional narratives worth making: The problem is not that dysfunctional narratives exist. The problem is the sheer volume of them in our culture, the sense that we’re being flooded—overwhelmed, even—by their numbers. That’s what seems to concern Baxter. It certainly concerns me.

A literature of avoidance

In his essay Ur-Fascism, Umberto Eco offers this diagram:

one    two    three    four
 a      b      c        d
 b      c      d        e
 c      d      e        f

Each column represents a political group or ideology, all distinct, yet possessing many common traits. (Think of different flavors of Communism, or various factions within a political party.) Groups one and two have traits b and c in common, groups two and four have trait d in common, and so on.

Eco points out that “owing to the uninterrupted series of decreasing similarities between one and four, there remains, by a sort of illusory transitivity, a family resemblance between four and one,” even though they do not share any traits. The traits form a chain—there is a common “smell” between the political groups.
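Eco’s chain of overlapping traits can be made concrete with a small sketch. This is purely illustrative (the group names and trait letters come from the diagram above, not from Eco’s essay itself): modeling each group as a set of traits shows how adjacent groups overlap while the two ends of the chain share nothing.

```python
# Illustrative sketch of Eco's diagram: each political group is a set of traits.
groups = {
    "one":   {"a", "b", "c"},
    "two":   {"b", "c", "d"},
    "three": {"c", "d", "e"},
    "four":  {"d", "e", "f"},
}

# Adjacent groups in the chain share traits...
assert groups["one"] & groups["two"] == {"b", "c"}
assert groups["two"] & groups["three"] == {"c", "d"}

# ...but the ends of the chain share no traits at all,
# despite the "family resemblance" running through the middle.
assert groups["one"] & groups["four"] == set()
```

The set intersections capture Eco’s “illusory transitivity”: no single trait runs from one to four, yet every link in the chain overlaps with its neighbor.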

Not all dysfunctional narratives are exactly alike, or share the exact same traits as the rest, but they do have a common “smell.” Even if a 9/11 conspiracy theory seems utterly unlike A Cool Million, they both may be dysfunctional.

"Burning Down the House" by Charles Baxter

Likewise, among the traits that follow, a story that doesn’t include all of them doesn’t necessarily “avoid dysfunction.” Rather, dysfunctional narratives are built by the storyteller selecting the bricks they need to buttress their message:

  • A disarmed protagonist
  • An absent antagonist
  • Minimal secondary characters
  • An authorial thumb on the scale
  • “Pre-moralized”
  • A vaporous conclusion
  • Authorial infallibility and restricted interpretations

The most common trait of the dysfunctional narrative is a faultless, passive main character. Baxter calls this the “disarmed protagonist.” Baxter differentiates between “I” stories (“the protagonist makes certain decisions and takes some responsibility for them”) and “me” stories (“the protagonists…are central characters to whom things happen”). Dysfunctional narratives are the “me” stories.

And the errors these “me” characters make—if any—are forgivable, understandable, or forced upon them by dire circumstances. Compare this to the mistakes the people around them make—monstrous, unpardonable sins:

…characters [in stories] are not often permitted to make interesting and intelligent mistakes and then to acknowledge them. The whole idea of the “intelligent mistake,” the importance of the mistake made on impulse, has gone out the window. Or, if fictional characters do make such mistakes, they’re judged immediately and without appeal.

Power dynamics are a cornerstone of all narratives, but one “smell” of the dysfunctional variety is an extraordinary tilting of power against the main character. The system, or even the world, is allied against the protagonist. Close reads of these narratives reveal an authorial thumb on the story’s moral scale, an intuition that the situation has been soured a bit too much in the service of making a point. This scale-tipping may be achieved many ways, but often it requires a surgical omission of detail.

Hence how often in dysfunctional narratives the antagonist is absent. A crime in a dysfunctional novel doesn’t require a criminal. All it needs, in Robinson’s words, is for the main character to have endured some great wrong: “The work of one’s life is to discover and name the harm one has suffered.”

Poet Marilynne Robinson

Name the harm, not the perpetrator. Why not the perpetrator? Because often there’s no person to name. The harm is a trauma or a memory. The perpetrator may have disappeared long ago, or died, or have utterly forgotten the wrongs they inflicted (as the father does in Prince of Tides). The malefactor may be an abstraction, like capitalism or sexism. But naming an abstraction as the villain does not name anything. It’s like naming narcissism as the cause of an airliner crash. This is by design. Abstractions and missing antagonists don’t have a voice. Even Satan gets to plead his case in Paradise Lost.

No ending is reached in a dysfunctional narrative, because there’s only a trauma, or a memory, or an abstraction to work against. These injuries never heal. Memories may fade, but the past is concrete. By telling the story, the trauma is now recorded and notarized like a deed. “There’s the typical story in which no one is responsible for anything,” Baxter complained in 2012. “Shit happens, that’s all. It’s all about fate, or something. I hate stories like that.” These stories trail off at the end, employing imagery like setting suns or echoes fading off to signify a story that will never conclude.

The most surface criticism of these narratives is that we, the readers, sense we’re being talked down to by the author. “In the absence of any clear moral vision, we get moralizing instead,” Baxter writes. A dysfunctional narrative dog-whistles its morality, and those who cannot decode the whistle are faulted for it. The stories are pre-moralized: The reader is expected to understand beforehand the entirety of the story’s moral universe. For a reader to admit otherwise, or to argue an alternate interpretation, is to risk personal embarrassment or confrontation from those who will not brook dissent.

And making the reader uncomfortable is often the outright goal of the dysfunctional narrative. The writer is the presumed authority; the reader, the presumed student. It’s a retrograde posture, a nagging echo from a lesser-democratic time. (When I read A Brief History of Time, I was most certainly the student—but Hawking admirably never made me feel that way.) Dysfunctional narratives are often combative with the reader; they do not acknowledge the reader’s right to negotiate or question the message. With dysfunctional narratives, it’s difficult to discern if the writer is telling a story or digging a moat around their main character.

“What we have instead is not exactly drama and not exactly therapy,” Baxter writes. “No one is in a position to judge.” A dysfunctional narrative portrays a world with few to no alternatives. A functional narrative explores alternatives. (This is what I mean when I write of fiction as an experiment.)

This is why so many dysfunctional narratives are aligned to the writer’s biography—who can claim to be a better authority on your life, after all? But the moment a reader reads a story, its protagonist is no longer the author’s sole property. The character is now a shared construct. Their decisions may be questioned (hence the passive nature of the protagonists—inaction avoids such judgments). If the author introduces secondary characters, they can’t claim similar authority over them—every additional character is one more attack vector for criticism, a chipping away of absolute authority over the story itself. That’s what happened to sensitivity reader Kosoko Jackson in 2019, whose debut novel was pulped due to questions over his secondary characters.

Of all the traits listed—from the disarmed protagonist to the vaporous conclusion—the trait I find the “smelliest” is authorial infallibility and restricted interpretation. That’s why I used weasel language when I called Prince of Tides “nearly-dysfunctional:” The book is most certainly open to interpretation and questioning. In contrast, questioning a conspiracy theory could get you labeled an unwitting dupe, a useful idiot, or worse.

A Cambrian explosion

What Baxter doesn’t explore fully is why we’ve had this Cambrian explosion of dysfunctional narratives. He speculates on a couple of possibilities, such as their coming down to us from our political leadership (like Moses carrying down the stone tablets), or arising as a byproduct of consumerism. I find myself at my most skeptical when his essay stumbles down these side roads.

When Baxter claims these stories arose out of “groups in our time [feeling] confused or powerless…in such a consumerist climate, the perplexed and unhappy don’t know what their lives are telling them,” it seems he is offering a dysfunctional narrative to explain the existence of dysfunctional narratives. He also claims these dysfunctional stories are produced by people of “irregular employment and mounting debts.” I strongly doubt this as well. In my experience, such people are not the dominant producers of these narratives. Rather, they are the people who turn to stories for escape and uplift…the very comforts dysfunctional narratives cannot provide, and are not intended to provide.

Rather than point the finger at dead presidents or capitalism, I’m more inclined to ascribe the shift to a handful of changes in our culture.

The term “The Program Era” comes from a book by the same name detailing the postwar rise and influence of creative writing programs in the United States. This democratization of creative writing programs was not as democratic as once hoped, but it still led to a sharp increase in the numbers of people writing fiction. Most of those students were drawn from America’s upwardly-striving classes. And, as part of the workshop method used in these programs, it also led to a rise in those people having to sit quietly and listen to their peers criticize their stories, sometimes demolishing them. (Charles Baxter was a creative writing professor and the head of a prominent writing program in the Midwest. Many of his examples in Burning Down the House come from manuscripts he read as an instructor.)

With the expansion of writing programs came a rise in aspiring writers scratching around for powerful subject matter. Topics like trauma and abuse are lodestones when seeking supercharged dramatic stakes. Naturally, these writers also drew from personal biography for easy access to subject matter.

Another reason related to the Program Era is the heavy-handed emphasis on character-driven fiction over plot-driven fiction. I explore this theory here.

Another reason is staring back at you: The World Wide Web has empowered the masses to tell their stories to a global audience. This has created a dynamic where everyone can be a reader, a writer, and a critic, and all at the same time.

The natural next step in the evolution of the above is for storytellers to strategize how best to defend their work—to remove any fault in the story’s armor, to buttress it with rearguards and fortifications. (This is different from working hard to produce a high-quality experience, which, in my view, is a better use of time.) And there’s been a shift in why we tell stories: Not necessarily to entertain or enrich, but as an act of therapy or grievance, or to collect “allies” in a climate where you’re either with me or against me. Inaction in fiction has come to be praised as a literary virtue. Stories with characters who take matters into their own hands often are derided as genre fiction.

Pick up a university literary magazine and read it from cover to cover. The “smell” of dysfunctional narratives is awfully similar to the smell of social media jeremiads.

These are not the kind of stories I want to read, but it’s becoming increasingly difficult to distance myself from them. Writers should strive to offer more than a list of grievances, or to perform acts of score-settling. If it’s too much to ask stories to explain, then certainly we can expect them to connect dots. Even if the main character does not grow by the last page, we should grow by then, if only a little.

Will we finally see Neuromancer on the screen?

See also “One year later: Will we finally see Neuromancer on the screen?”

Neuromancer (Brazilian edition)

The Illuminerdi (via) reports Apple TV+ is tooling up to produce a streaming adaptation of William Gibson’s cyberpunk masterpiece Neuromancer. The big question Illuminerdi concerns itself with is which actor will play protagonist Case, a drug-abusing hacker hired to pull off a virtual heist in cyberspace.

The story buries the lede. The truly big news is that Neuromancer has a reasonable chance of being adapted to the screen. Apple TV+ may not be the leading force in streaming entertainment today, but it’s established a track record of producing high-quality material and taking some risks along the way. I know I sound like the eternal fanboy when I say this, but, “This time it might be real.”

Neuromancer is a brilliant novel, one of my favorites, and by my lights, the book that rearranged science fiction. Just as Raymond Chandler did not invent the hard-boiled detective novel, William Gibson did not invent cyberpunk. But both authors took the bricklaying done by themselves and earlier writers, pulled it all together, and buffed the final result to a chrome-like sheen. There’s science fiction before Neuromancer, and there’s science fiction after Neuromancer.

Hence Neuromancer on film has been a hot topic among science fiction fans since the book was first published in 1984. Every few years over the subsequent decades, news would percolate up that a movie adaptation was in the works, only for the organizers to lose interest, fail to find funding, or simply not get the green light. The Wikipedia section on Neuromancer’s numerous aborted film adaptations doesn’t do justice to its rocky history. Fake movie trailers have been sewn together; fan-made movie posters have been photoshopped. The rumors, anticipation, and disappointments surrounding the film’s production are legion. (My response to hearing of this latest adaptation attempt: “I’ll believe it when I see it.”)

There were several sidelights along the road to this moment, starting with Johnny Mnemonic in 1996. At first glance, it appeared the perfect aperitif for Neuromancer fans: Mnemonic was an adaptation of a Gibson short story set in the same story universe. The film landed flat, though, and is pretty grating to watch. (Some call it a cult classic—I can’t tell if they’re being ironic or not). Keanu Reeves turned in a cold performance (which he claims was intentional) within a confounding and bizarrely campy narrative. Some say Mnemonic was underfunded. Gibson said it was overfunded. Even if the studio execs were clueless in their meddling—not a stretch to imagine—I still think postmodernist director Robert Longo was simply in over his head.

(That said, I’ve not seen the new re-edit Johnny Mnemonic: In Black and White, so I’ll reserve judgment whether the film is irredeemable. I admit: The stills look damn promising.)

Movie still from Johnny Mnemonic: In Black and White

It took The Matrix (1999) to give hungry cyberpunks the cinematic meal they were waiting for. There are so many parallels between it and Neuromancer, you can’t help but think the writing/directing Wachowskis owe Gibson a pitcher of beer (if not a brewery). But Darren Aronofsky (Pi, Requiem for a Dream) was on to something when, after viewing the film, he claimed “Cyberpunk? Done.” By using up Neuromancer’s best devices, as well as every philosophical question explored by Philip K. Dick, the Wachowskis came close to shutting the door on the most interesting development in genre fiction since the 1930s. The banality and repetitiousness of the next three Matrix films—including 2021’s Resurrections, which I held a sliver of hope for—only seemed to cement Aronofsky’s point.

(Cyberpunk’s heyday in the 1990s has passed, but neo-cyberpunk lives. The new breed exists where a worldwide computer network is no longer an imagined future, but a concrete element of the story’s past.)


I’m perennially suspicious of Hollywood adapting books to the screen, especially science fiction. Too often screenwriters will ditch the most memorable and gripping parts of the source material to slide in Tinseltown’s tired narrative shorthand. Amazon’s The Man in the High Castle leaps to mind. I’ve not seen the recent adaptation of Foundation, but at least one reviewer thinks Asimov’s classic hasn’t actually been adapted. Still, Illuminerdi reports William Gibson is signed on as an executive producer for Neuromancer. That gives me a touch more confidence in the direction of the project.

But only a touch. In 2015, I wrote how Hollywood has abandoned “‘tight, gapless screenwriting’ to scripts focused on world-building, sequels, expansion, rebooting.” That was written at a time when superhero franchises were claiming greater real estate at the cineplexes, and Hollywood had finished converting Tolkien’s charming tale about wee folk into an eight-hour epic-action trilogy. Cinema houses still ruled back then, but like a sneeze coming on, theater owners knew a violent upheaval was imminent. Today, streaming services are the premier way to deliver movies to eager audiences. And that’s what worries me the most.

Milla Jovovich as Molly Millions in Neuromancer (fan-made movie poster)

My dread is not that this cyberpunk classic will be adapted to television instead of the silver screen—it’s to see it adapted to a medium that expects seasons and episodes. As with High Castle and Foundation, the streaming services love season-long episodic television: All the better for binge-watching.

Episodic television ushers in the narrative shorthand that Neuromancer absolutely does not need: every hour ending on a contrived cliffhanger; the sexual tension of when-will-they-hook-up; the let-down of the couple separating (complete with the trite break-up language of television: “I need some space” or, “This is going too fast”); and so on.

As Rob Bricken noted in his review of Foundation, which was serialized for Apple TV+:

Even if you’re coming in without having read a page of Asimov, you’ll still notice the drawn-out plots that go nowhere, the padding, and the weird choices the show has the characters make to keep the plot from moving forward. Cheap, nonsensical melodrama fills the series…The show also wants to have pew-pew laser battles and ship fights and spacewalk mishaps and junk, none of which offer anything you haven’t seen before, and are usually used to just run out the clock anyway.

He makes this sharp observation:

Then there’s the show’s terror that people might not make certain connections, so it shows something, has the character comment on it to themself, and then maybe throws in a flashback to someone saying something relevant even if it was said three minutes prior.

This comes from television writing 101: “Tell them what they’re going to see, show it to them, and then tell them what they saw.” If that sounds like how to organize a PowerPoint presentation, you’re right. It’s also why television writing in 2022 remains hard-wired to the narrative structures of I Love Lucy.

Just as Gibson’s console jockeys rewired systems to hijack signal broadcasts and repurposed wet-tech to bore holes through firewalls, let’s hope modern streaming technology is bent to Neuromancer’s whims, and not vice-versa.


Addendum: One of the criticisms I’ve received, here and elsewhere, is that Neuromancer cannot properly be condensed into a two-hour movie, hence a series is a better fit for its adaptation.

I agree a multi-part show is appropriate for Neuromancer’s intricate story line. I loathe condensing Neuromancer into a ninety-minute film almost as much as I loathe seeing Neuromancer: Season Two on my TV screen. However, when I originally wrote the above post, I kept fishing around for a good example of a multi-episode streaming series (for illustrative purposes), and failed to locate one.

This morning I recalled The People v. O. J. Simpson: American Crime Story (which started life on FX and moved to Netflix). Its miniseries format would work well for Neuromancer. Each segment builds the story and develops characters toward a conclusion, like chapters in a novel. There’s a beginning, a middle, and a door-closing end.

My gripe is that Apple TV+ may attempt to “episodize” Neuromancer, making it more like a soap opera or a recurring show than a single story told a chapter at a time. This is what happened to Man in the High Castle—which was more “inspired by” than a retelling of the source material—and what appears to have happened to Foundation.

Follow-up: “One year later: Will we finally see Neuromancer on the screen?”

Why I Wrote “A Man Named Baskerville”

See the “Twenty Writers, Twenty Books” home page
for more information on this series.


A Man Named Baskerville by Jim Nelson

[Note: The following is adapted and compressed from the afterword to A Man Named Baskerville. It reveals some details from the book. It also contains spoilers to the book it was inspired by, Arthur Conan Doyle’s The Hound of the Baskervilles.]

Years ago, while traveling Japan via its Shinkansen bullet train, I found myself without a book to read. An ebook reader I’d installed on my phone came with a free sample to whet the reader’s appetite. That book was Arthur Conan Doyle’s The Adventures of Sherlock Holmes, a collection of the earliest Holmes short stories. (I explore this incident in greater detail in my 2016 post “Sherlock by Train.”)

The collection stands as a record of a remarkably creative streak. So remarkable, if Doyle were to have stopped writing after its publication, we would still be talking about his literary creation and storytelling prowess. The titles of the stories within are as familiar as the books of the Bible: “A Scandal in Bohemia,” “The Red-Headed League,” “The Man with the Twisted Lip,” “The Adventure of the Speckled Band.” Perhaps the only missing short story title of comparable renown is “The Adventure of Silver Blaze,” published in The Memoirs of Sherlock Holmes a mere two years later. In toto, they represent the height of Doyle’s powers and inventiveness.

None of this inspired me to write A Man Named Baskerville. As exciting and inventive as a great Sherlock Holmes story can be, never have I entertained the question that has dogged countless other producers of Doyle homages and pastiches: Could I write my own Sherlock Holmes story? Honestly, the thought has never crossed my mind.

After consuming the first collection in a rush of reading, I used the opportunity of a brief train stop and some free wireless Internet access to download more Sherlock Holmes books for our continued journey. I had read a little of Doyle’s work before, and never found much interest in it. The stories were too Victorian for my tastes, too concerned with Empire and upright decency and British morality. My California upbringing, and the plain-speaking tastes I inherited from my parents, led me to the hardboiled school of Chandler, Hammett, and Cain. Nathanael West’s grotesqueries and William Gibson’s cyberpunks are a better fit for me than Holmes’ Irregulars.

On that train ride, my interest in Sherlock Holmes kindled. Holmes may not have walked Chandler’s mean streets, but he did present a more compelling moral force than I’d sensed before. As with the hardboiled school, Holmes time and again must balance his own sense of justice against the British legal system’s notion of the same. Doyle wrote for an audience who would understand those boundaries implicitly. A hundred and ten years later, I viewed Holmes’ sense of justice through a different lens. This came to a head when my reading reached The Hound of the Baskervilles.

The book was first serialized in 1901, ten years after that auspicious run of early short stories. Doyle had killed off Holmes in “The Final Problem” (1893) hoping to rid himself of the literary creation upstaging all his other work. An appalled public demanded more stories featuring Holmes, and publishers increasingly pressured Doyle to satisfy the market’s cravings.

Inspired by a trip to Devon and its local folklore of wisht hellhounds roaming the countryside at night, Doyle produced The Hound of the Baskervilles. To avoid what we today call “continuity problems,” he retroactively dated its events to October 1888, three years before the publication of his earliest stories. This places the story square in the middle of the Autumn of Terror, when a serial killer dubbed Saucy Jack terrified London, while, across the Atlantic, the Empire of Brazil was warily beginning its dissolution.

One overlooked quality of Doyle’s writing is that his knack for concise storytelling in the short form executes equally brilliantly in the longer form. I’ve seen adept short story writers get fouled up when they attempt to tackle the novel. The pacing and breathing cadences that permit a runner to win the 100-meter dash do not sustain when attempting a marathon. Yet Doyle’s economical style holds up with Hound, making for dazzling quick cuts between crucial scenes, and exposition that does not lead the reader to impatiently flip ahead. Doyle had a gift for paring down prose to its vital emotional and informational elements without stripping it of that uniquely English sense of mood and atmosphere. One also sees in Hound Doyle’s assiduous control of pacing. The early chapters draw out their eerie scenes, while the closing chapters barrel headlong toward the conclusion. The movement becomes so breathless at the end, it takes pure inference on the part of the reader to detect scene changes.

Readers either love or hate this no-nonsense approach to storytelling. Either way, the final output of his opus on the moors is consistent with this quality, and obviously has held the public’s interest for well over a century.

None of this inspired me to write this book, either. I grew to admire Doyle’s writing while traveling by bullet train, but I never craved to imitate it. The first fourteen chapters of The Hound of the Baskervilles served to reaffirm my growing estimation of the man’s talents, but not to pick up a pen.

What did inspire me to write A Man Named Baskerville? The fifteenth and final chapter of the book it derives from.

All detective mysteries deal in sleight-of-hand. Keeping the perpetrator out of the narrative limelight until the moment the solution is announced is a tried-and-true technique for maintaining the element of surprise. In response, savvy readers have learned to guess whodunnit by evaluating how much “screen time” the author gives the suspects. The most obvious suspect is never culpable. The suspect we’ve read the least about is quite often guilty up to their eyeballs.

First edition cover of The Hound of the Baskervilles by Arthur Conan Doyle
First edition of The Hound of the Baskervilles

And that’s pretty much the case in The Hound of the Baskervilles. The perpetrator is one we hear precious little about, an absentminded collector of butterflies and moths named Jack Stapleton who lives with his sister (the nineteenth-century equivalent of rooming in your parents’ basement, apparently). He’s not the least elaborated-upon character in the book, but he is portrayed as far removed from the crimes and the curse of the Baskervilles. When Holmes and Watson finally suspect his guilt, Doyle spends no time speculating on his motivations, preferring to keep the story moving at a brisk clip.

Doyle knew the reader would eventually demand to know why Stapleton posed under an assumed identity to murder his uncle in such a contrived way, and then attempt the same on his cousin. To sew things up, in Chapter 15 Watson calls on Holmes to explain the background of Jack Stapleton. Holmes launches into fourteen pages of exposition, a matter-of-fact recounting of the life of the man born Rodger Baskerville, from the New World to Devonshire, England.

Much detail is omitted, of course, but Holmes’ reckoning of Rodger’s life is a far fuller biography than I think any reader expected. After all, Holmes could have simply stated, “He was raised abroad and returned to England to kill his uncle and claim his estate.” Yes, that could be worded more artfully, but Doyle stretched himself to fill in the blanks.

I don’t know why Doyle felt the need to so thoroughly detail Rodger Baskerville’s life. I’m not sure anyone does. In my research for A Man Named Baskerville, I never located a definitive answer to the question. Perhaps in Doyle’s papers, or in a complete treatise on his life and work, an answer may be found. Perhaps it was a modernist faith in the triumph of reason—all things must be explained that can be explained—that led Doyle to stretch himself, much as he uses many pages to lay out the backstory in A Study in Scarlet and some of his short stories.

What I do know is, reading those seemingly superfluous fourteen pages of Rodger’s life struck me as a kind of boggy sinkhole in the tale. It felt as though Arthur Conan Doyle had wanted to write two books, Rodger’s life story and The Hound of the Baskervilles. Unable or unwilling to write the first, he wrote the latter and included a précis of the former in the final chapter.

Fascinated, I made copious notes of Holmes’ reckoning of Rodger’s life. Later, I transferred and organized them on my computer. A bell tinkled in my mind, a Pavlovian reaction all writers develop: Is there a novel here? I let the idea stew. Holmes’ reckoning might appear a rich vein to mine, but once I started digging, it might yield little more than a couple of small gems.

And how would readers react to Rodger as a main character? Yes, everyone says they like stories about villains—but too often those so-called villains are more like lovable rogues or bad boys with a soft spot. Was I trying to humanize Rodger Baskerville? That’s exactly what a novel does: It humanizes. Would it be a Victorian “Sympathy for the Devil”?

Maybe, I thought, I should just write the damn thing and see what comes out of the keyboard.

I made a private agreement with myself: I would not write yet another pastiche of Sherlock Holmes, of which there are plenty to pass around. The book would be told in Rodger’s voice and not in imitation of Doyle’s Watson. Of course, that didn’t excuse me from the challenges of writing a historical novel, which include diction, grammar, tone of voice, colloquialisms, and historical accuracy. Nor could I write such a book without featuring Holmes and Watson at some point.

Mostly, though, my doubts centered on originality. Surely someone had executed this idea in the years since the publication of Doyle’s book. Yet Internet searches turned up nothing of the sort.

It became a secret too juicy to keep to myself: In the final chapter of The Hound of the Baskervilles, Arthur Conan Doyle embedded a working outline for a novel—a rousing novel, in my estimation—that had been overlooked for over a century. It took me five years to set aside my private doubts and write it.

Yes, it was exhilarating to liberally borrow from a master’s synopsis and expand it into this novel. No, having said synopsis to work from did not make my job any easier.

When I planned A Man Named Baskerville, I could not see how a man with Rodger’s background would fail to bring to Dartmoor one or more Central or South American dialects alongside his impeccable upper-class English accent. He would also bring with him a rich and varied New World culture as his starting point of reference.

Once in England, around his neck would be the weight of several albatrosses: His father’s suspicious exile; his “ethnic” upbringing and foreign tongue; his lack of secure income; his marriage to a dusky woman most un-Anglo-Saxon. Only his upper-crust accent would save him. It would work in the British Isles like a charge card with no spending limit. After all, he didn’t merely fool the English into thinking he was one of them; he fooled them into thinking he was better than most of them.

Freud’s narcissism of small differences is an underappreciated observation of the continuing human condition. As long as people lift themselves up by cataloging their differences with outsiders, there will always be Rodger Baskervilles walking among us.

That’s why I wrote A Man Named Baskerville.

Twenty Years Later: B. R. Myers, A Reader’s Manifesto

See the “Twenty Writers, Twenty Books” home page for more information on this series.


Twenty years ago this month, The Atlantic published a critical essay on the then-current state of American prose. As dry and dusty as that topic sounds—doubly so when published by an august New England monthly—the essay improbably became a cultural sensation, triggering op-eds in international newspapers, vitriolic letters to the editor, and screechy denunciations from professional reviewers. Suddenly readers everywhere were debating—of all things—the modern novel.

Writer B. R. Myers unexpectedly touched a raw nerve in an America that was better-read than the literati believed possible. “A Reader’s Manifesto” dissected without mercy the work of such literary lights as Don DeLillo, Annie Proulx, Cormac McCarthy, Paul Auster, and David Guterson. Myers didn’t merely criticize their prose on terms of its grammar and diction. He attacked these writers on grounds of pretentiousness, and accused the literary establishment of abetting their ascendancy.

Charged stuff, but still very inside baseball. To rouse an impassioned response from readers over books like White Noise and Snow Falling on Cedars was a remarkable moment in American culture. It’s all the more notable a moment considering some of the above authors’ books satirize the inanity of American culture.

Looking back, it seems dream-like for a critical examination of literary novels to ignite such a furor. I can’t imagine such a thing happening today. Then again, it seemed equally unimaginable twenty years ago.

History of Manifesto

Fed up with fawning reviews of works like Timbuktu and All the Pretty Horses, Myers first wrote his manifesto in 1999. In careful, reasoned prose punctuated with wit and scathing humor, he roasted passages from prize-winning books—passages literary reviewers had praised as examples of masterful writing. Using tried-and-true close-reading techniques, he punctured these writers’ obtuse and repetitive language to reveal prose that was turgid, meaningless, and pretentious.

Myers was convinced no magazine or newspaper would publish his critique. He was an unknown in the literary world; a near-anonymous monograph on the quality of modern literary prose hardly promises to fly off bookstore shelves.

So Myers did what many writers would do in later years: He self-published his manifesto on Amazon. He titled it Gorgons in the Pool: The Trouble with Contemporary “Literary” Prose after a particularly choice passage in a Cormac McCarthy novel. “Nothing happened,” he later wrote. “I went online and ordered three copies for myself; they were the only ones ever sold.”

One of the copies he mailed out wound up in the hands of an Atlantic editor, who offered to publish rather than review it. The Atlantic demanded severe cuts and revisions, and the version published in the magazine comes off nastier than he’d intended. He also had the gut-wrenching task of waving off the Times Literary Supplement from publishing a review of Gorgons, as he’d already signed a contract with The Atlantic. (“As someone said to me the other day, ‘How do you know [Times Literary Supplement] wasn’t going to tear you apart?'” he later remarked. “I suppose everything worked out for the best.”) Bad timing would develop into a cadence for Manifesto.

Gorgons in the Pool by B. R. Myers

The Atlantic article, tucked away deep inside the July/August double issue, improbably made Myers an overnight name among contemporary lit readers and writers. His outsider status only buffed his credentials as a hard-nosed reviewer. Even his use of first initials added a mysterious air to his origins. Although he received praise from many quarters, it mostly came from readers and (interestingly) journalists, a profession notorious for attracting writers shut out of the book publishing world.

While the literati initially ignored the essay, the drumbeat of support from readers for Myers’ basic thesis—modern lit is pretentious—soon couldn’t be denied. Much of the early criticism directed back at Myers originated from book reviewers, book supplement editors, and literary novelists. Some of it was quite vitriolic, outraged that anyone could suggest the writers he’d selected weren’t unassailable geniuses. Many exuded an air of befuddled annoyance: How could anyone give Myers or his thesis an ounce of credence? A few were outright smug about it, as though their coy refutations slammed the door on Myers and put an end to the dreary affair once and for all.

It didn’t work. The rebuttals only stoked increased support for Myers from readers around the world. The back-and-forth debate raged online and, as a mark of the times, across letters-to-the-editor pages, which printed point and counterpoint letters. This simply did not happen, even in a time when most people had their news delivered to them via bicycle.

Frustrated, the literary professional class took up what is today recognized as a surefire stratagem for shutting down an Internet debate: They doxxed him.

Not exactly—while The New York Times Book Review didn’t print Myers’ phone number and street address, they did see fit to delve into his past for anything incriminating (much like the Twitterati today will dumpster-dive people’s feeds to dig up embarrassing tweets from eight years ago). Demonstrating the ethics of a tabloid reporter, editor Judith Shulevitz dished to her readers that Myers was a foreigner (he’s not) who lived in New Mexico (i.e., not New York City) and was at that moment preparing to spend a year in Seoul “teaching North Korean literature to the South Koreans.” (Myers’ response: “I would probably have described my job in a way less calculated to evoke the phrase ‘selling ice to the eskimos.'”)

Shulevitz wrote Myers “is not just a man without a stake in the literary establishment. He is foreign to it in every way.” His manifesto could have

proved that a critic needs nothing more than taste to make a case. Does Myers’s essay do all this? It does not, because Myers doesn’t have a sure grasp of the world he’s attacking.

Most of the denunciations of Manifesto are steeped in this kind of haughty condescension, and it served Myers well.

(I should add that I’m uncomfortable throwing around the phrase “literary establishment” as a catch-all for a wide and disjointed segment. Yet Shulevitz seemed comfortable acknowledging its existence in 2001, so I’ll assume it existed then and exists today.)

Manifesto continued to be a lodestone of bad timing. The Times‘ nativist pillorying of Myers was published on September 9, 2001. Two days later, the Times—and the rest of the world—was focused on a very different subject. The literary debate Myers had sparked that summer ground to a halt.

The history of Manifesto could easily have ended with the attacks on the World Trade Center, if not for events which nudged a little harder on the snowball Myers had started rolling in 1999.

First was Oprah selecting Jonathan Franzen’s The Corrections for her book club. To get an idea of how close this shaved against Myers’ Manifesto—and his continued game of footsie with bad timing—the same edition of the New York Times Book Review that exposed Myers as a Korean-teaching foreigner also included a glowing review of The Corrections laden with an irony of Oedipal proportions: The reviewer gives a winking approval that the book contains “just enough novel-of-paranoia touches so Oprah won’t assign it and ruin Franzen’s street cred.” In fact, Oprah was set to announce The Corrections as her next book club pick four days later (only to postpone it due to 9/11). When Franzen bristled that Oprah was attempting to smarten up her book club by associating it with the “high-art literary tradition,” a new literary controversy erupted to displace Manifesto.

Although the imbroglio between Oprah and Franzen is better framed as tabloid-level tit-for-tat, Manifesto played a minor role. Online commenters pointed out that Myers’ gripes about the literary establishment sneering down at the reading public were playing out before the nation’s eyes. Gone was his critics’ suggestion that, on this point, Myers was tilting at windmills.

The second event was Melville House publishing A Reader’s Manifesto: An Attack on the Growing Pretentiousness in American Literary Prose in 2002 (one of the first two books produced by the then-fledgling publisher). This full-length treatment gave Myers the opportunity to restore much of what was cut from Gorgons in the Pool when it was adapted for The Atlantic. It’s this edition I’ve based this review on.

The backward glance

The Atlantic Monthly, July/August 2001
The Atlantic Monthly, July/August 2001.

I vividly recall reading “Manifesto” in the summer of 2001. I’d written my first novel and was discovering the ego-melting process called “finding a literary agent.” Over the prior years I had enrolled in evening and weekend creative writing courses around the Bay Area, where many of the books Myers laid judgment upon were held up as exemplary models. At the time I was also a member of a weekly “writers’ reading group.” A member of the group handed me a Xerox of The Atlantic essay along with a half-joking warning not to take anything this Myers guy had to say too seriously.

I wound up taking B. R. Myers quite seriously. I had never read anything like “A Reader’s Manifesto.” Rereading Myers’ book for this post, I still marvel over his concision and convictions. It can be read in a single sitting, and unless you’re a grump, it will keep you engaged from start to finish. Myers understands well the game he’s taken up: He can’t poke a stick at others’ bad prose if his own prose is lacking. His manifesto is meticulous, refreshing, lively, and enlightening, as seen here when he trains his gimlet eye on McCarthy’s All the Pretty Horses:

As a fan of movie westerns I refuse to quibble with the myth that a rugged landscape can bestow an epic significance on the lives of its inhabitants. But as Conrad understood better than Melville, the novel is a fundamentally irreverent form; it tolerates epic language only when used with a selective touch. To record with the same majesty every aspect of a cowboy’s life, from a knife-fight to his lunchtime burrito, is to create what can only be described as kitsch.

Not only is this arguable, there’s a lot packed in there to argue with: I find this to be a positive.

Or here, where he’s analyzing David Guterson’s output:

…a slow tempo is as vital to his pseudo-lyrical effects as a fast one is to Proulx’s. What would otherwise be sprightly sentences are turned into mournful shuffles through the use of tautology. “Anything I said was a blunder, a faux pas,” “a clash of sound, discordant,” “She could see that he was angry, that he was holding it in, not exposing his rage,” “Wyman was gay, a homosexual,” and so on.

This level of tight engagement with the work at hand shows this is well above the usual culture-war crap that’s saturated our nation’s dialogue for decades now.

Some of his lines of attack are novel. Performing a close and scathing read of Annie Proulx’s self-approving dedication in Close Range (“my strangled, work-driven ways”) is the kind of antic you’d expect of the University Wits or Alexander Pope. His oft-quoted rejoinder to an exchange between Oprah and Toni Morrison is his most acidic and least endearing: “Sorry, my dear Toni, but it’s actually called bad writing.” (Less oft-quoted is his explanation: “Great prose isn’t always easy but it’s always lucid; no one of Oprah’s intelligence ever had to puzzle over what Joseph Conrad was trying to say in a particular sentence.”)

Regardless of what you might have read elsewhere, the boilerplate attacks on Myers don’t stand up to scrutiny. Supposedly he values plot over form; he disdains “difficult” books; he cherry-picked bad passages from the books he attacks; he selected writers who’d gone out of fashion; or the confounding claim that he’s a humorless cur prone to sarcasm and snide shots. Having read his book at least four times now, I say none of these complaints hold water. (Sarcasm may be the lowest form of wit, but it’s not humorless.) I’m not saying there’s no room for criticizing Manifesto, only that dismissing Myers without engaging his points is not fruitful.

And there’s plenty in Manifesto for writers to take away. Rather than being satisfied with throwing spitballs at modern lit, he contrasts prose he finds vapid with prose that stands up. Myers will forever get grief for quoting Louis L’Amour’s Hondo with approval, but the passage he includes is a model of clean, effective writing that succeeds in characterizing the protagonist with the deftness of a parable. Myers makes the point several times that the prose he’s complaining about could have been written with less-pompous English, and takes a few stabs at editing it as proof. He’s engaged with the texts under the gun, a marked difference from his critics who sniff down on him (and, it seems, cannot be bothered to quote and refute his specific claims).

My take-away from Manifesto for writers is, don’t produce affected writing, produce affecting writing: Language that stirs the reader and shines a light rather than obscures. Good editing requires close reads of your prose, and questioning what every word is doing in a sentence. Ditch the idea that affecting prose is “easy” and affected prose is “difficult,” an avant-garde pose. One critic complained “‘prose,’ for [Myers], equals syntax plus diction, and is expected to denote, rather than to evoke.” I think he expects it to do both.

Revolt of the reading public

The significance of Myers’ Manifesto is not the perverse thrill of taking down sacred cows like McCarthy and DeLillo, but how eerily it presaged the next twenty years in American publishing. The circuitous route Myers followed from Gorgons in the Pool to The Atlantic Monthly to Melville House is a once-in-a-generation aberration, but the elements of getting such a critique out of the word processor and into the hands of readers ring awfully familiar today.

When I read in 2002 of Myers self-publishing Gorgons on Amazon, I was floored: I had no idea such an opportunity was available to mere mortals. It was a bona fide light-bulb moment, the first time I pondered the possibility of making an end-run around the New York City publishers and selling my work directly to readers. Ten years later, not only was Amazon still open to self-publishing, the company was rapidly tooling up to make publishing your own e-book as easy as clicking a mouse button.

Less obvious today, but notable in 2001, was Myers praising Amazon user reviews (of the books he was criticizing, not of his own overlooked Gorgons). Before Manifesto, any reference in the popular media to Amazon’s user reviews was bound to be dismissive or sardonic. Back then, cultural commentators considered putting opinion-making into the hands of readers as ludicrous as a truck driver penning a starred Michelin review. (Don’t forget, there were still people in 2001 arguing the Internet was a passing fad—that it was faster to drive to the bookstore and buy a book than for Amazon to deliver it, ergo Amazon’s days were numbered.) Myers didn’t merely approve of Amazon user reviews; he used them as evidence that readers can and do understand difficult literature. I believe this is the first time I saw anyone in the cultural sphere do so.

Self-publishing; “average people” versus the experts; the power of reader reviews; the pseudo-doxxing Myers was subjected to; online discussion boards keeping the debate alive; and vitriolic denunciations from on high. All that’s missing is a hashtag and some Bitcoin changing hands, and the dust-up around Manifesto would sound like any number of social media episodes we’ve seen in recent years.

Martin Gurri’s The Revolt of the Public deserves mention here. Although I’ve not read it, I have read plenty of reviews and analyses, simply because this 2014 book is claimed to have predicted the rise of Donald Trump, Brexit, cancel culture, the Capitol Hill attacks, QAnon, #MeToo, and more. (It too was self-published on Amazon.)

Gurri’s thesis is that the Internet is destabilizing public respect for institutional authority and, in due course, undermining the authorities’ control over social and political narratives. The expert class, once considered the final word, now must defend itself from an increasingly skeptical public.

It seems to me that the narratives being disrupted by digital communications may not merely be political narratives but also traditional ones—the narratives offered by the literary novel, and the narratives sold to the public by the literary expert class. Not only are big-name authors being treated with skepticism by the general public, so are the stories they’re proffering as significant both in terms of literary heft and their cultural insights. Look no further than the controversy surrounding last year’s American Dirt by Jeanine Cummins for an example of voices from below shouting up at the ensconced above, or the backlash suffered by Sarah Dessen after shaming a critical reader.

The disruption to the literary world even extends to novelists’ fawning reviewers. There is less distinction here than would first appear: Literary novels are often reviewed by other literary novelists. This incestuousness would be a scandal in other fields. “Imagine what would happen if the Big Three were allowed to review each other’s cars in Consumer Reports,” Myers noted in an interview. “They’d save the bad reviews for outsiders like the Japanese.”

A before-and-after example of the Internet’s effect on the publishing world is Lorenzo Carcaterra’s Sleepers (1995) and James Frey’s A Million Little Pieces (2003). Both were mega-bestsellers whose publication dates bookend the Internet’s ascension in daily life. Both were published as memoirs, and both had their factual accuracy challenged. The mass media reported the controversy around Sleepers by copy-and-pasting publisher press releases and quoting book agents. A Million Little Pieces was put under the Internet’s collective magnifying glass thanks to an investigation by the amateur web site The Smoking Gun.

This people-powered exposé became a nightmare for James Frey, and his reputation never recovered. Editions of A Million Little Pieces (another Oprah book club pick!) now include a publisher’s note warning of “certain embellishments” and “invented” details: “The reader should not consider this book anything other than a work of literature.”

Carcaterra largely escaped unscathed in 1995 thanks to the controversy being framed by the media as a publishing industry squabble. Sleepers remains sold as memoir. (Funnily enough, it’s also listed under Amazon’s “Hoaxes & Deceptions” category.) Carcaterra’s luck can be measured in years. If Sleepers had been a bestselling memoir in 2005, the Internet would have torn it to shreds.

“Leaders can’t stand at the top of pyramids anymore and talk down to people,” Martin Gurri writes. “The digital revolution flattened everything.” I say A Reader’s Manifesto was the initial deflating puncture of the literary world’s cozy status quo.

Engendered reputations

In the conclusion of Manifesto, Myers writes:

I don’t believe anything I write will have much effect on these writers’ careers. The public will give them no more thought in twenty years than it gives, say, Norman Rush today, but that will have nothing to do with me, and everything to do with what engendered their reputations in the first place.

(If you’re wondering who Norman Rush is, I confess I had to look him up myself.)

Some of the rebuttals directed at Myers in 2001 claimed a few of these authors were already “on their way out,” although each critic seemed to formulate a different list of who remained relevant and who was exiting stage left. I’m tempted to produce a list of the writers whose work Myers criticized to see where their reputations stand today. I won’t do that; any reader so inclined could make such a list on their own.

I will point out that some of Myers’ subjects have settled into a comfortable life of teaching, penning the occasional pop culture piece, and a general resting upon laurels. Myers makes a couple of pointed barbs about The Old Man and the Sea, but at least Hemingway was still throwing left hooks at the end of his life.

(When Myers’ critics claim that literary book awards and glowing reviews in highbrow magazines are meaningless, or that Myers ignored genre fiction’s own system of awards and reviews, they’re overlooking the enduring social capital of “literary significance.” A science-fiction writer receiving big-time accolades in 2001 is not going to be, in 2021, a tenured professor traveling the writer’s retreat circuit as a featured speaker and penning fluffy think pieces for Harper’s. The self-propelling feedback loop that is the literary world should not be discounted.)

Note that Myers leaves unsaid what exactly engendered these authors’ reputations in the first place. The optimist in me thinks he’s referring to the evanescence of their writing postures—live by the sword, die by the sword.

The pessimist in me suspects what really engendered their reputations is a resilient, enabling literary class which eagerly maintains its country-club exclusivity while professing commitments to diversity. Even in the face of a massive shift to digital publishing, and the concomitant explosion of voices now available via e-books and print-on-demand, the literary establishment remains a closed shop. Its reviewers walk hand-in-hand with big publishers, who in turn regularly ink seven-figure publishing deals and expect a return on said investment. Positive reviews in well-placed periodicals are an important component of any publishing marketing plan. (The podcast “Personal Rejection Letter” explored this question in 2017, along with a retrospective of Myers’ Manifesto.)

In other words, the authors Myers put under the microscope may or may not be relevant twenty years later, but the system that held them aloft remains alive and strong. The Internet has kneecapped it some—the literary establishment is less commanding than it once was—but it’s still humming along.

Could Myers have at least shifted the conversation? I say he did. While Jonathan Franzen’s 1996 “Perchance to Dream” (retitled “Why Bother?”) and Tom Wolfe’s 1989 “Stalking the Billion-Footed Beast” are both considered modern literary manifestos of great import, it’s plain to me that Myers’ Manifesto has shown far more staying power in the public’s and writers’ consciousness. Even in a 2010 critical response to B. R. Myers’ review of Franzen’s Freedom, the comments section swings back and forth on the significance of Manifesto, with the most recent comment arriving in 2016. There are YouTube videos produced as late as last year going over the debate Myers ignited twenty years ago.

Meanwhile, in creative writing courses across America, mentioning Myers’ name will still earn an eye-roll and a dramatic sigh from the instructor, wordlessly asking when this guy will just go away.

A publisher’s note worth reflecting upon

Anthony Boucher

Years ago I discovered on Forest Books‘ sidewalk cart an unassuming hardback with an unassuming title, Great American Detective Stories, edited by legendary Bay Area writer Anthony Boucher and published in June 1945.

Curious what stories made the cut, I expected the usual names and the usual reprinted titles. I did see the usual names—Raymond Chandler, Dashiell Hammett, and Cornell Woolrich being of the most interest to me—but was surprised by Boucher’s story selections. He included one of the three Sam Spade short stories Hammett wrote for easy money (which were not widely reprinted until recently). The Chandler selection was a novella I’d never heard of before, “No Crime in the Mountains”, which appears to have been the nucleus (Chandler would say “cannibalized”) for The Lady in the Lake.

Over the years I’ve dipped into this collection on occasion, and discovered it to be a fine snapshot of late-World War II popular American writing.

But it was the book’s front matter that gave me pause, specifically the publisher’s note:

This book is manufactured in compliance with the War Production Board’s ruling for conserving paper. … Thinner and smaller books will not only save paper, plate metal and man power, but will make more books available to the reading public.

The reader’s understanding of this wartime problem will enable the publisher to cooperate more fully with our Government.

It’s a fine-print reminder of how the cost of war once burdened everyone’s daily life, keeping the price of war, both human and economic, constantly in view.

Printed above the note:

“Books are weapons in the war of ideas.”

– President Roosevelt

They still are, but we’ve allowed our attention to wander, and it’s to our detriment.

Kurt Vonnegut on story structure and punctuation

Kurt Vonnegut

Previously I wrote on Kurt Vonnegut’s considerable body of interviews, especially his comments on story shape and fiction as a series of experiments.

One fascinating (Vonnegut-esque?) tidbit in his interviews was an offhand moment in a 1971 profile by Richard Todd (New York Times Magazine):

The class began in a surprising way. Vonnegut remarked that last time they had been talking about form, and he walked to the blackboard and drew there a question mark, an exclamation point and a period. He said these bits of punctuation were the outline of a three act play or story:

? ! .

A student asked if the end might be “Dot, dot, dot.” Vonnegut agreed.

? ! …

So maybe this is a gimmicky or silly way to describe story structure, but I’m game to play along.

I’ve written about organizing structure to motivate my fiction, so this little lesson in punctuation caught my attention. The way I organize my thinking, the three acts of story (really, four) look something like this:

  • Act 1: Setup
  • Act 2A: Complication
  • Act 2B: Confrontation
  • Act 3: Resolution

This list comes from my reading of Syd Field’s books on film structure, which I’ve modified (slightly) for the purposes of writing fiction, especially novels.

Syd Field

I agree with Vonnegut that most stories, if not all, open with question marks. Even if I've read a book two dozen times—and there are books about which I can make that claim—the pleasure of the opening chapters is the illusion that I do not know what is coming. (I would say this is related in spirit to Coleridge's willing suspension of disbelief.) There are numerous, sometimes playful, ways to pose those questions when a story opens, but those questions are almost always there. Rarely, if ever, does an interesting story open with all the questions answered and the main characters in possession of all the facts. Jim Thompson said there was only one type of story: "Things are not what they seem." That is another way of saying stories open with question marks.

Vonnegut's exclamation point jibes with what I've labeled 2A, Complication. Exclamation points do not have to be action scenes or cliffhangers. Sometimes a quiet revelation or admission can turn a story on its head and rearrange how we see the main character and their situation. I hold a pet theory that the art of storytelling lies in reversals (perhaps I'll post about that some day). The exclamation point is one such reversal for the characters: a well-kept secret revealed, a surprising discovery, a fortune amassed, a fortune lost, and so on.

Vonnegut's three punctuation marks (and most of his story shapes) imply three acts. I wondered if my idea of a fourth act, Confrontation, could fit into his punctuation-as-story-structure.

Confrontation, I think, could be expressed as an em-dash. Tension draws taut in the confrontation phase of a story. More than any other part of a story, confrontation is where the reader should be asking herself “Wait—what happens next?” In contrast, final acts are generally not “What next?” but rather “How will it end?”

With an em-dash, then, Vonnegut’s story structure could be punctuated like this:

? ! — …

Which seems about right to me.

From Chimpan-A to Chimpanzee: The Swiftian genius of Planet of the Apes

English translation of La Planète des singes (1963), Pierre Boulle.

A guilty pleasure of mine is Planet of the Apes. I'm speaking of the original movie—not the 1970s sequels, not the remakes, not the one with the Golden Gate Bridge, and not the one everyone hates with Mark Wahlberg. I am a fan of the 1968 film that launched them all, the Charlton Heston vehicle that spawned over fifty years of sequels, reboots, "re-imaginings," TV shows, and comic books. The others have little draw for me. It's the original I return to time and again.

The first Planet of the Apes is campy, riveting, preachy, and provocative—Franklin J. Schaffner's sci-fi classic is the very definition of middle-brow entertainment, in that it pleases the senses while challenging the mind. The film has only grown on me over the years. I've come to appreciate its complexities and contradictions, even as its flaws have become more apparent as well.

Although I find it difficult to believe any person in the industrialized world today is not a little familiar with Planet of the Apes and its shocking ending, let me open by saying: Spoilers follow.

Seriously: If you’ve not watched the original 1968 film, do not continue reading. The ending is simply that stunning. I hate to think I would spoil it for anyone.

Background of the Apes

It’s sometimes said Planet of the Apes is the rare instance of the film being better than the book. I’m not so certain, although I’m convinced the film’s impact is the stronger of the two. A better description, I think, is that the novel and film are different approaches to the same source material, much as the gospels of Matthew and Luke are thought to draw upon common material from an earlier source known to scholars as Q.

The original novel—Pierre Boulle's epistolary La Planète des singes—concerns a space traveler named Ulysse who finds himself stranded on a planet circling the distant star Betelgeuse. There he discovers a modern ape civilization that enjoys all the trappings of 20th-century man. The simians smoke tobacco, drive cars, shop for clothes, take walks in parks. The humans on this planet are mute, savage, and hunted for trophy. To the apes the narrator is a freak, this human who can talk and reason and claims to have fallen from the sky.

The Q Document for Boulle's lean novel would seem to be Jonathan Swift's Gulliver's Travels, in particular Gulliver's final voyage to the land of the Houyhnhnms. Swift's race of intelligent, speaking horses administer an orderly and rational society on their island. Like Boulle's apes, the Houyhnhnms are plagued by the Yahoos, mute and savage humans who rape and kill with abandon. The Houyhnhnms' shocking final solution for this plague is what today we would call genocide: their Assembly votes to exterminate the Yahoos. The apes on Boulle's planet seem to be building toward a similar resolution.

In Bright Eyes, Ape City: Examining the Planet of the Apes Mythos, a recent critical examination of all things Ape, contributor Stephen R. Bissette reveals a number of antecedents to Boulle's slim novel of apes running a world. In particular, he focuses on the popular 1904 French short story "Le Gorilloide et Autres Contes de L'Avenir." This story and its author, Edmond Haraucourt, were so popular in the first half of the 20th century that Bissette notes "it is impossible to read 'Le Gorilloide' for the first time and not be rocked by the realization that Pierre Boulle had to have read this story, or at least heard of it. … This is the Holy Grail for Apes devotees." Furthermore, Bissette catalogs an entire corpus of ape-world stories predating Boulle's novel, from science-fiction shorts to comic books.

These earlier ape-world stories would seem to dilute the originality of Boulle's novel. Yet judging from Bissette's capsule summaries, none of these earlier authors had the same ambitions as Boulle, leaving La Planète des singes sui generis at the intersection of science fiction and "social fantasy," as Boulle himself called it.

While I can't offer any direct literary evidence that Boulle was working in Swift's mode of satire, it seems rather obvious he was. The other stories cataloged by Bissette make much of apes running a planet; Boulle made much of a man being treated as an animal because he is an animal, which corresponds neatly with Swift's dark worldview.

Like Gulliver, Boulle's protagonist Ulysse is a wide-eyed narrator brimming with wonder and curiosity. The common device of animals with human-like qualities allows Swift and Boulle to hold a polished mirror up to man's violence, avarice, and ignorance, to interrogate our supposed civilized rise over the beasts, and to show our vices for what they are. Ulysse and Gulliver adapt to the animals' ways, even going so far as to adopt their language, culture, and mannerisms. Both animal societies are generally better-run than man's, although Boulle and Swift are smart enough to give their respective animal races blind spots. For Boulle's apes, it's unquestioned faith; for Swift's horses, excessive pride.

“I shot an arrow into the air”

Rod Serling, 1959.

Shortly after publication of Boulle's novel, Rod Serling of Twilight Zone fame was tapped to write a big-screen adaptation. Serling kept the device of an astronaut landing on a planet run by apes—how could he not?—as well as Ulysse's crueler treatment at the hands of the apes. He also carried forward the main ape characters: the scientists Zira and Cornelius, and Dr. Zaius, the authoritarian keeper of ape law. Beyond that, Serling jettisoned most of Boulle's novel in favor of darker, more cynical territory.

Because the movie's credits list Michael Wilson before Serling, there was a long-running question of how much of the movie emerged from the Twilight Zone creator's pen. Wilson's silver-screen credits include such monuments as Lawrence of Arabia, the adaptation of Boulle's The Bridge on the River Kwai, and A Place in the Sun. Some wondered if Serling's contributions had been overplayed compared to those of this seasoned Hollywood writer.

A 1998 issue of Creative Screenwriting settled the question by examining Serling's early scripts. They show that Serling definitely shaped the novel into what the movie finally resembled, including his idea for the stunning final moment, one of the most shocking endings in Hollywood history.

Serling also carried forward Boulle's concept of apes living in a modern world much like our own, an idea abandoned before production began. The final movie's apes inhabit mud huts on the edge of a desert. They make do with horseback travel, wooden carts, and dirt roads. This change was apparently due to studio budget restrictions, and Michael Wilson's revisions accommodated the limitation. Wilson or another writer (probably John T. Kelley) also "punched up" Serling's dialogue, yielding several groan-inducing aphorisms ("I never met an ape I didn't like," "human see, human do"). Later drafts also introduced the teenage ape Lucius, who preposterously carpet-bombs the last act of the film with anachronistic counterculture lingo. Leaving these "improvements" on the cutting-room floor would have done much to polish the final film.

Still, the artistic compromise of a pre-industrial ape society fortuitously contributed to, rather than detracted from, Serling's vision. In the film, the apes maintain an austere, Puritanical social structure. They live close to the field, close to the well, and close to their scripture. Zira and Cornelius's claim of discovering a man who can talk flies in the face not merely of scientific thinking (as in Boulle's novel) but of religious orthodoxy, adding an Inherit the Wind subtext that supercharges the movie's stakes.

Planet of the Apes: Visionaries by Dana Gould & Chad Lewis adapts Rod Serling’s original screenplay. The cover illustrates the original concept of apes living in a modern rather than agrarian world.

As with the original novel, Gulliver's Travels seems to be a source for the film, and not merely the device of man trapped in an animal society. Swift's misanthropy is far stronger in Serling's vision than Boulle's. When the Houyhnhnms vote to destroy the Yahoos, it's not a stretch to think Swift is advocating the destruction of the actual human race. This level of misanthropy is nowhere to be found in Boulle's work; Serling's script is marinated in it. Serling's nicotine-addled gaze and his penchant for purple dialogue mean the script affords more time for damning philosophizing than Boulle's work does.

The vehicle for Serling's misanthropy is Taylor (Charlton Heston), a red-blooded, cigar-chomping astronaut who replaces Boulle's wide-eyed French explorer. Unlike Ulysse and Gulliver, Taylor never lowers himself to the apes' level—or, perhaps, he never rises to their stature. Considering the European backgrounds of Swift and Boulle, perhaps the choice of a hardheaded "cowboy" astronaut is indicative of Serling's American-ness. Or perhaps Serling was making a statement about the hubris of his countrymen. America was deep in the Vietnam War by this point, and American exceptionalism was being questioned from all sides.

“I leave the 20th century with no regrets”

Although some critics knock the movie as pretentious action-adventure fare, the film’s sensitive opening doesn’t line up with such dismissive claims. Staring out a spaceship’s viewport at an expanse of stars, astronaut Taylor notes that, due to Einsteinian relativity, hundreds of years have passed on Earth although the crew has only been traveling for six months of ship time. Speaking into a black box recorder, the cynical Taylor announces “I leave the 20th century with no regrets.”

Then he admits:

“Seen from out here, everything seems different. Time bends. Space is boundless. It squashes a man’s ego. I feel lonely.”

Before slipping into extended stasis for the final leg of their journey, Taylor offers the listener of his voice recorder—whoever that may be—his final thoughts:

“Does man, that marvel of the universe, that glorious paradox that sent me to the stars, still make war against his brother? Keep his neighbor’s children starving?”

Serling’s script opens not with high action or tense drama. It doesn’t even open with the inciting incident, a bang to set the story into motion. Taylor’s admission before falling into hypersleep sounds like a confession from a man not in the habit of making confessions.

Charlton Heston, from a promotional still for Planet of the Apes (1968)

After the opening credits, the crew awakens to discover they’ve crash-landed on a desolate planet. Taylor’s vigorous misanthropy immediately fills the screen. He taunts the others for holding any faith in the survival of America or even civilization. He declares mankind all-but-extinct, and with the only female crew member dead, its destruction now seems assured. He boasts how satisfied he is to leave Earth behind, and wonders if the remaining crew will last a week on this daunting new planet.

It’s Taylor’s private admission to the black box—”It squashes a man’s ego. I feel lonely”—that reveals his confident cynicism is to some degree a facade. Not coincidentally, packed inside those two sentences is a concise foreshadowing of the film’s conclusion.

“I’ve always feared man”

Taylor, one of the last examples of "that marvel of the universe, that glorious paradox," is hunted and captured by the apes. He's subjected daily to humiliations fit for a laboratory animal: stored in a cage, rewarded with food, washed down with a hose (a movie visual sickeningly similar to tactics police used on peaceful civil rights protestors in the 1960s), and given a mate to encourage breeding. The threat of castration and lobotomization looms in the background (the latter a trendy topic in a decade preoccupied with the treatment of mental illness).

Charlton Heston turns in a rather physical performance as he’s stripped naked, manacled, chained, gagged, and paraded through streets on a leash. Due to a throat injury Taylor literally has no voice against his captors. When he regains his voice—the campy “Take your stinkin’ paws off me, you damn dirty ape!”—he discovers he still has no say in this society of apes, where he’s regarded as an ignorant freak of nature.

Taylor's hatred of man isn't challenged by the apes or ape society; it's reinforced. As with the lands Gulliver travels to, evidence of man's failings abounds on this supposedly backwards planet. The apes don't kill each other. They seem to have no war or famine or deprivation to speak of. While not perfect, they seem to have built a more egalitarian society than the world Taylor left behind. And yet Taylor can't help but see himself as their equal. Soon his human ego leads him to believe he's their superior. This shift is so subtle and smooth the viewer doesn't even sense it.

Serling is a cruel god to his creation. It's not ironic enough for misanthropic Taylor to entertain notions of equality with the apes. Serling forces Taylor to defend mankind as intelligent and rational while standing naked and unwashed before a tribunal of jeering, dismissive apes. When Taylor visits an archeological dig and proves man ruled the planet before the apes, his defense seems all the more credible. Headstrong Taylor makes much hay of his victory.

Consider the film's opening once more. In his voice recorder, Taylor's questions—"Does man still make war on his brother? Keep his neighbor's children starving?"—are obviously not questions at all. Taylor believes man is incapable of changing his basic nature. After crash-landing on the ape planet, Taylor mocks the notion that mankind might have survived the five-thousand-year span. His attitude is one of good riddance: "I leave the 20th century with no regrets." Yet by the third act, this man is sneering down on the apes and declaring man was better, stronger, and most importantly, first.

With Taylor's hubris reaching a crescendo, he twists this admission out of Dr. Zaius:

“I’ve always feared man. From the evidence, his wisdom must have walked hand-in-hand with his idiocy. His emotions must rule his brain. He must be a warlike creature who gives battle to everything around him. Even himself.”

At this point, Taylor has won. He's gained his freedom as well as an admission of man's primacy from the apes' top authority. If the movie ended with Taylor riding his horse into the horizon, perfectly free, it would have been a tidy, if unsatisfying, ending.

Discovering the Statue of Liberty is the door slamming shut on Taylor. Cowering on the beach, naked as Adam, he realizes:

“I’m back. I’m home. All the time it was—”

Taylor’s character arc is one of the cruelest, most severe punishments I’ve ever seen in film or literature. Gulliver leaves the island of the horse Houyhnhnms unable to stand the sight or smell of other humans, but his change is orders of magnitude less than Taylor’s crushing defeat at the foot of the Statue of Liberty. Taylor’s misanthropy has been scooped out of his black heart and fed back to him in a dog’s bowl.

In the first act, one of the surviving astronauts attempts to plant a tiny American flag in the dirt of their newly-discovered planet. Taylor roars in laughter at the absurd futility. In the final scene, another American symbol in ruins towers over him, with Taylor on the beach like a meager, limp flag planted in the sand. He’s the object of amusement now, and his climb from misanthropist to philanthropist has collapsed and crushed him whole.