Not rethinking realism, as in rethinking philosophy’s single, objective reality, hard as rocks and nails. No, I mean rethinking realism in the sense of questioning the elevation of literary realism over the many other forms of fiction.
Realism has long been the go-to form in literature for telling a story a certain way. An entire literary style—Naturalism—sprang from the sense that Romanticism had gone too far and produced a literature divorced from the world as commonly experienced. The pendulum later swung the other direction, and for a time realistic literature was derided as bourgeois and reactionary. Since World War II, with the rise of creative writing programs and a reinvigorated enforcement of upper-class distinctions, kitchen-table realism has returned to the pinnacle of literary loftiness in America.
So it’s funny to me that realism is also so important in popular entertainment. Nowhere is this truer than in television, which is obsessed with depicting reality—from “you are there”-style news reporting to game shows branded as “reality TV.” When the writers of TV’s M*A*S*H killed off Col. Henry Blake in a season finale, they were inundated with letters from outraged viewers. The Emmy award-winning writing team’s response was, “Well, that’s reality.” American auteur Robert Altman famously ends Nashville with an out-of-the-blue assassination of a central character. Why? Because, he explained, that’s reality.
It’s not that these plot points are faulty or wrong-headed. My complaint is that the excuse—“It’s reality”—is a lazy defense of artistic choices. Writers should cop to their decisions rather than take the passive route and say reality made the choice for them. Writers should ask themselves if a “realistic” moment is adding to, or subtracting from, the story.
Anyone who’s attended a creative writing class, workshop, or MFA program is familiar with the high ground presumed by realism. The trendy term is “psychologically realistic fiction.” In writing programs, names like Raymond Carver, Amy Hempel, Tobias Wolff, and Tim O’Brien are tossed out as the zenith of American writing. Students are explicitly encouraged to emulate them, and their importance is implicitly signaled by their repeated presence in syllabi and required-reading lists. (I’ve read “The Things They Carried” at least eight times over the course of decades of writing groups and classes.) These authors are lionized for many reasons, but importantly, they all wrote about reality.
(There are two exceptions worth mentioning: One is magical realism, although its high regard in writing programs is tied up with identity politics. The other is Borges, whom I jokingly refer to as science-fiction for MFA students. It must be noted that both exceptions originate from outside the United States. Kafka, incidentally, is read and praised in writing programs as well, but not in such a way as to encourage emulation—I suspect my instructors liked the idea of Kafka more than Kafka’s output.)
Look at how so much literary fiction operates. Protagonists tend to be thoughtful, rational, and deliberative—often, they exhibit little to no affect. Characters in opposition tend to be boorish, thoughtless, and emotional. Dialogue is either flat and unadorned, or snappy, like the patter of a stand-up comic. Scenes unfold as one character utters a brief line, followed by paragraphs of rumination; the other character responds, followed by more paragraphs of rumination.
The prose might be good—it might even be inspired—but is this realism? Going through contemporary literary magazines, reading one story after another, I’m not sure you’ll find much psychological realism, at least not in the sense of psychiatry’s DSM-5.
Genre fiction is not immune either. Too often connoisseurs of hard-boiled detective fiction and tough-guy novels claim their favorite authors are superior because of their attention to realism. Raymond Chandler’s “The Simple Art of Murder” is wonderful and insightful criticism, but at its heart is a trashing of the classic British mystery because “fiction in any form has always intended to be realistic.” It’s one of the few arguments in the essay that I question.
Janet Burroway wrote, “Sometimes reality doesn’t make for good fiction.” It’s a tough lesson to learn, and one that even seasoned writers fail to grasp.
After all, there is no widely accepted maxim stating that the primary purpose of story is to reproduce reality. Fiction is supposed to be an expression of a writer’s inner state, not a dry report of the who, what, where, and when. Besides, why do we need to reproduce reality with such fidelity? We’re soaking in it. If you want reality, put down your phone or step away from your computer screen. You have returned to reality, effortlessly.
In a writing class I attended, one of the students was a fan of horror, particularly H. P. Lovecraft and Robert Chambers’ The King in Yellow. At an end-of-semester presentation before the class, he expressed frustration at the hard-realism reading list we’d been given, and of the months of instruction requiring him to write in similar form. “Reading about reality is like reading about your job on your day off,” he told us. There’s something to that.
Story creates a transcendence within the reader. This transcendence defies reality while mimicking it—reality is Play-Doh in the hands of an adept writer. From hard realism to squishy-soft fantasy and everything in-between, great writing takes me to another place and time, a chance to live another person’s life. Books are “portable dreamweavers.”
Wearily, I started Blythe’s essay expecting more of the same, and, lo, found it: Computers and the Internet, he contends, have done much to destroy literary fiction. By this point, I’d be surprised if any writer pursuing such a thesis bothered fortifying the argument with examples or statistics. Blythe does not disappoint on that count: Other than some “c’mon, look around, you know what I’m saying,” the argument is made sans corroborative evidence. Of course the Internet has wrecked American literature. Why bother denying it?
It’s telling, then, that Blythe opens with the usual barrage of accusations about digital distractions—”Can you read anything at all from start to finish, i.e. an essay or a short story, without your mind being sliced apart by some digital switchblade?”—and then, to prove how things used to be so much better way back when, he segues to life as an Esquire editor in the 1980s and 90s:
[Rust Hills] and I would occasionally drink two or three Negronis at lunch, sometimes at the New York Delicatessen on 57th Street, and talk about the writers and novels and short stories we loved (and hated). … Then he and I would happily weave our way back to the office at 1790 Broadway, plop down in our cubicles and make enthusiastic phone calls to writers and agents, our voices probably a little louder than usual.
The jokes about fiction editors at a national magazine choosing stories to publish after a three-cocktail lunch write themselves, so I won’t bother. (Although I should, since, early in my writing career, I had high hopes of placing a short story with a publication like Esquire. Perhaps I should have mailed a bottle of Bombay with each of my submissions.)
The dichotomy Blythe illustrates is telling: The hellish “after” is the mob writing Amazon user reviews and him not knowing how to turn off iPhone notifications; the blissful “before” is editorial cocktail lunches and not having to give a rat’s ass what anyone else thinks.
One counterpoint to Blythe’s thesis: The 1980s had plenty of distractions, including the now-obvious inability to silence your telephone without taking it off the hook. Another counterpoint: If you want to drink Negronis and argue literature over Reubens, well, you can do that today too. A third counterpoint: A short story printed in the pages of Esquire was sandwiched between glossy full-color ads for sports cars, tobacco, and liquor—most featuring leggy models in evening gowns or swimsuits. Distractions abounded, even before the Internet.
But none of these are what Blythe is really talking about. What he bemoans is the diffusion of editorial power over the past twenty years.
Blythe throws a curveball—a predictable curveball—after his reminiscences about Negronis and schmears. Sure, computers are to blame for everything, but the real crime is that computers now permit readers to make their opinions on fiction known:
Writers and writing tend to be voted upon by readers, who inflict economic power (buy or kill the novel!) rather than deeply examining work the way passionate critics once did in newspapers and magazines. Their “likes” and “dislikes” make for massive rejoinders rather than critical insight. It’s actually a kind of bland politics, as if books and stories are to be elected or defeated. Everyone is apparently a numerical critic now, though not necessarily an astute one.
I don’t actually believe Blythe has done a thorough job surveying the digital landscape to consider the assortment and quality of reader reviews out there. There are, in fact, plenty of readers penning worthy critical insight about fiction. Just as there are many great writers out there who deserve wider audiences, there are also critical readers who should be trumpeted farther afield.
Setting that aside, I still happily defend readers content to register a simple up/down vote as their estimation of a book. Not every expression of having read a book demands an in-depth 8,000-word essay on the plight of the modern Citizen of the World.
Rather, I believe Blythe—as with so many others in the literary establishment—cannot accept that readers might have any opinion about fiction worth expressing. The world was so much easier when editors at glossy magazines issued the final word on what constituted good fiction and what was a dud. See also a book I’m certain Blythe detests, A Reader’s Manifesto, which tears apart—almost point by point—Blythe’s gripes.
When B. R. Myers’ Manifesto was published twenty years ago, a major criticism of it was that Myers was tilting at windmills—that the literary establishment was not as snobbish and elitist as he described. Yet here Blythe is practically copping to the charges.
Thus the inanity of his complaint that today’s readers hold the power to “inflict economic power” when, apparently, such power should reside solely with critics and magazine editors. I don’t even want to argue this point; it’s a retrograde understanding of how the world should work. This is why golden age thinking is so pernicious—since things used to be this way, it must have been the best way. Except when it wasn’t.
Of course the world was easier for the editors of national slicks fifty years ago, just as life used to be good for book publishers, major news broadcasters, and the rest of the national media. It was also deeply unsatisfying if one were not standing near the top of those heaps. It does not take much scratching in the dirt to understand the motivations of the counterculture and punk movements in producing their own criticism. The only other option back then was to bow to the opinions of a klatch of New York City editors and critics whose ascendancy was even more opaque than that of the bishops of the Holy See.
That said, it’s good to see a former Esquire editor praise the fiction output of magazines that, not so long ago, editors at that level were expected to sneer down upon: Publications such as Redbook, McCall’s, Analog, and Asimov’s Science Fiction all get an approving nod from Blythe.
But to cling to the assertion that in mid-century America “short fiction was a viable business, for publishers and writers alike” is golden age-ism at its worst. Sure, a few writers could make a go of it, but in this case the exceptions do not prove the rule. The vast sea of short story writers in America had to settle for—and continue to settle for—being published in obscure literary magazines and paid in free copies.
No less than Arthur Miller opined that the golden age of American theater arced in his own lifetime. Pianist Bill Evans remarked he was blessed to have experienced the tail end of jazz’s golden age in America before rock ‘n’ roll sucked all the oxygen out of the room. Neither of those artistic golden ages perished because of the Internet.
What caused them to die? That’s complicated, sure, but their demise—or, at least, rapid descent—was preceded by a turn toward the avant-garde. Which is to say, it became fashionable for jazz and theater to distance themselves from their audiences under the guise of moving the art forward. The only moving that happened, though, was the audience heading for the exits.
Blythe then turns his attention to a third gripe in his meandering essay. Without a shred of evidence, he argues that the digital revolution of the last twenty-five years metastasized into a cultural Puritanism in today’s publishing world:
Perhaps because of online mass condemnations, there’s simply too much of an ethical demand in fiction from fearful editors and “sensitivity readers,” whose sensitivity is not unlike that of children raised in religious families… Too many authors and editors fear that they might write or publish something that to them, at least, is unknowingly “wrong,” narratives that will reveal their ethical ignorance, much to their shame. It’s as if etiquette has become ethics, and blasphemy a sin of secularity.
I cannot deny that there appears to be a correlation between the rise of the Internet in our daily lives and the shift over the last decade to cancel or ban “problematic” literature. What I fail to see is how pop-up alerts or a proliferation of Wi-Fi hot spots is to blame for this situation.
If Blythe were to peer backwards once more to his golden age of gin-soaked lunches, he would recall a nascent cultural phenomenon called “political correctness.” P.C. was the Ur-movement to today’s sensitivity readers and skittish editors. Social media whipped political correctness’s protestations into a hot froth of virtuous umbrage—a video game of one-upmanship in political consciousness, where high scores are tallied in likes and follower counts. Using social media as leverage to block books from publication was the logical next step. But blaming computers for this situation is like blaming neutrons for the atom bomb.
After a dozen paragraphs of shaking my head at Blythe’s litany of complaints, I was pleasantly surprised to find myself in agreement with him:
The power of literary fiction—good literary fiction, anyway—does not come from moral rectitude. … Good literature investigates morality. It stares unrelentingly at the behavior of its characters without requiring righteousness.
At the risk of broken-record syndrome, I’ll repeat my claim that Charles Baxter’s “Dysfunctional Narratives” (penned twenty-five years ago, near the beginning of the Internet revolution) quietly predicted the situation Blythe is griping about today. Back then, Baxter noticed the earliest stirrings of a type of fiction where “characters are not often permitted to make intelligent and interesting mistakes and then to acknowledge them. … If fictional characters do make such mistakes, they’re judged immediately and without appeal.” He noted that reading had begun “to be understood as a form of personal therapy or political action,” and that this type of fiction was “pre-moralized.”
Unlike Blythe, Baxter did not fret that literary fiction would perish. Baxter was a creative writing instructor at a thriving Midwestern MFA program. He knew damn well that writing literary fiction was a growth industry, and in no danger of extinction. What concerned him was how much of this fiction was (and is) “me” fiction, that is, centered around passive protagonists suffering through some wrong. He noticed a dearth of “I” fiction with active protagonists who make decisions and face consequences.
As Blythe writes:
Too many publishers and editors these days seem to regard themselves as secular priests, dictating right and wrong, as opposed to focusing on the allure of the mystifying and the excitement of uncertainty. Ethics and aesthetics appear in this era to be intentionally merged, as if their respective “good” is identical.
If Blythe is going to roll his eyes at the glut of reader-led cancellations and moralizing editors, perhaps he could consider another glut in the literary world: the flood of the literary memoir, with its “searing” psychic wounds placed under the microscope and its inevitably featherweight closing epiphany. These testaments of self-actualization may be shelved under nonfiction, but they are decidedly fictional in construction. In the literary world, stories of imagination and projection have been superseded by stories of repurposed memory, whose critical defense is, invariably, “But this really happened.”
It was not always so. Memoir was once synonymous with popular fiction. Autobiography was reserved for celebrities such as Howard Cosell and Shirley MacLaine, or a controversial individual who found themselves in the nation’s spotlight for a brief moment. It was not treated as a high art form, and was perceived in some quarters as self-indulgent. No more.
There remains an audience for great fiction. Readers know when they’re being talked down to. They know the difference between a clueless author being crass and a thoughtful author being brutally honest. They also know the difference between a ripping yarn and a pre-moralized story they’re “supposed” to read, like eating one’s vegetables.
The death of literary fiction—especially the short story—will not be due to iPhone notifications and social media cancellations. Perhaps the problem Blythe senses is the loss of a mission to nurture and promote great fiction. The literary world has turned inward and grown insular. Its priorities are so skewed, I’ve witnessed literary writers question if fiction can even be judged or critiqued. The worsening relationship of class to literary fiction should not be overlooked, either.
If Blythe laments the fate of Asimov’s Science Fiction, perhaps he should check out the thriving Clarkesworld. Substacks devoted to short fiction regularly deliver new work to thousands of readers. I don’t know if these publications’ editors are gulping down Negronis during their daily Zoom meetings—but as long as they’re putting out quality fiction that challenges and questions and enlightens, maybe that doesn’t matter, and never did.
What if I told you that there’s been a sea-change in American storytelling over the past half-century? Not merely a change in subject matter, but that the fundamental nature of American narratives radically shifted? Would you believe me?
Now, what if I told you that a writer twenty-five years ago described these “new” stories, and even predicted they would become the dominant mode in our future? Would you believe that?
In 1997, Charles Baxter published Burning Down the House, a collection of essays on the state of American literature. It opens with “Dysfunctional Narratives: or, ‘Mistakes were Made,’” a blistering piece of criticism that not only detailed the kinds of stories he was reading back then, but predicted the types of stories we read and tell each other today.
Baxter appropriated the term “dysfunctional narrative” from poet C. K. Williams, but he expounded and expanded upon it so much, it’s fair to say he’s made the term his own. He borrowed a working definition of dysfunctional narratives from poet Marilynne Robinson, who described this modern mode of writing as a “mean little myth:”
One is born and in passage through childhood suffers some grave harm. Subsequent good fortune is meaningless because of the injury, while subsequent misfortune is highly significant as the consequence of this injury. The work of one’s life is to discover and name the harm one has suffered.
Baxter adds that the source of this injury “can never be expunged.” As for the ultimate meaning of these stories: “The injury is the meaning.”
To claim this mode of writing has become the dominant one in American culture demands proof, or at least some supporting evidence. Baxter lists examples, such as Richard Nixon’s passive-voice gloss over the Watergate cover-up (“mistakes were made”), Jane Smiley’s A Thousand Acres, and conspiracy theories, among others.
“Dysfunctional Narratives” doesn’t succeed by tallying a score, however. Rather, it describes a type of story that sounds all-too-familiar to modern ears:
Reading begins to be understood as a form of personal therapy or political action. In such an atmosphere, already moralized stories are more comforting than stories in which characters are making complex or unwitting mistakes.
Don’t merely consider Baxter’s descriptions in terms of books. News stories, the social media posts scrolling up your daily feed, even the way your best friend goes on about how their boss slighted them at work—all constitute narratives, small or large. Dysfunctional narratives read as if the storyteller’s thumb is heavy on the moral scale—they feel rigged.
It does seem curious that in contemporary America—a place of considerable good fortune and privilege—one of the most favored narrative modes from high to low has to do with disavowals, passivity, and the disarmed protagonist.
(I could go on quoting Baxter’s essay—he’s a quotable essayist—but you should go out and read all of Burning Down the House instead. It’s that good.)
Dysfunctional narratives are a literature of avoidance, a strategic weaving of talking points and selective omissions to block counter-criticism. If that sounds like so much political maneuvering, that’s because it is.
“Mistakes were made”
Let’s start with what dysfunctional narratives are not: They’re not merely stories about dysfunction, as in dysfunctional families or learning dysfunctions. Yes, a dysfunctional narrative may feature such topics, but that is not what makes it dysfunctional. The term describes how the story is told: the strategies and choices the author has made in telling it.
Baxter points to Richard Nixon’s “mistakes were made” as the kernel for the dysfunctional narrative in modern America. (He calls Nixon “the spiritual godfather of the contemporary disavowal movement.”) He also holds up conspiracy theories as prototypes:
No one really knows who’s responsible for [the JFK assassination]. One of the signs of a dysfunctional narrative is that we cannot leave it behind, and we cannot put it to rest, because it does not, finally, give us the explanations we need to enclose it. We don’t know who the agent of action is. We don’t even know why it was done.
Recall the tagline for The X-Files, a TV show about the investigation of conspiracy theories: “The truth is out there.” In other words, the show’s stories can’t provide the truth—it’s elsewhere.
More memorably—and more controversially—Baxter also turns his gaze upon Jane Smiley’s A Thousand Acres, which features the use of recovered memories (“not so much out of Zola as Geraldo“) and grows into “an account of conspiracy and memory, sorrow and depression, in which several of the major characters are acting out rather than acting, and doing their best to find someone to blame.”
In a similar vein, a nearly-dysfunctional story would be The Prince of Tides by Pat Conroy. It centers on a family man who, via therapy, digs through memories of a childhood trauma which has paralyzed him emotionally as an adult. He gradually heals, and goes on to repair his relationship with his family. Notably, his elderly father does not remember abusing him years earlier, leaving one wound unhealed.
Another example would be Nathanael West‘s A Cool Million, which follows a clueless naif on a cross-American journey as he’s swindled, robbed, mugged, and framed. By the end, the inventory of body parts he’s lost is like counting the change in your pocket. It might be forgiven as a satire of the American dream, but A Cool Million remains a heavy-handed tale.
This leads to another point: A dysfunctional narrative is not necessarily a poorly told one. The dysfunction is not in the quality of the telling, but something more innate.
A more topical example of a dysfunctional narrative is the story of Aziz Ansari’s first-date accuser. The complaints of just about any politician or pundit who claims to have been victimized or deplatformed by their opponents are dysfunctional, too. In almost every case, the stories feature a faultless, passive protagonist being traumatized by the more powerful or the abstract.
There’s one more point about dysfunctional narratives worth making: The problem is not that dysfunctional narratives exist. The problem is the sheer volume of them in our culture, the sense that we’re being flooded—overwhelmed, even—by their numbers. That’s what seems to concern Baxter. It certainly concerns me.
A literature of avoidance
In his essay Ur-Fascism, Umberto Eco offers this diagram:
one     two     three     four
abc     bcd     cde       def
Each column represents a political group or ideology, all distinct, yet possessing many common traits. (Think of different flavors of Communism, or various factions within a political party.) Groups one and two have traits b and c in common, groups two and four have trait d in common, and so on.
Eco points out that “owing to the uninterrupted series of decreasing similarities between one and four, there remains, by a sort of illusory transitivity, a family resemblance between four and one,” even though they do not share any traits. The traits form a chain—there is a common “smell” between the political groups.
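For readers who think better in sets than in diagrams, here is a minimal sketch in Python (my own illustration, not anything from Eco’s essay) of how a chain of overlapping traits produces a family resemblance between groups that share nothing directly:

groups = {
    "one":   {"a", "b", "c"},
    "two":   {"b", "c", "d"},
    "three": {"c", "d", "e"},
    "four":  {"d", "e", "f"},
}

names = list(groups)  # insertion order: one, two, three, four
for left, right in zip(names, names[1:]):
    # every adjacent pair of groups shares at least one trait
    print(left, "and", right, "share:", sorted(groups[left] & groups[right]))

# the endpoints share nothing, yet they are linked through the chain
print("one and four share:", sorted(groups["one"] & groups["four"]))

Each adjacent pair prints a shared trait or two, while “one” and “four” print an empty list. The resemblance between the endpoints exists only through the intermediaries, which is Eco’s “illusory transitivity,” and the same logic behind the common “smell” of dysfunctional narratives.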
Not all dysfunctional narratives are exactly alike, or have the exact traits as the rest, but they do have a common “smell.” Even if a 9/11 conspiracy theory seems utterly unlike A Cool Million, they both may be dysfunctional.
Likewise, a story that lacks some of the traits that follow does not thereby “avoid dysfunction.” Rather, dysfunctional narratives are built by the storyteller selecting the bricks they need to buttress their message:
A disarmed protagonist
An absent antagonist
Minimal secondary characters
An authorial thumb on the scale
“Pre-moralized”
A vaporous conclusion
Authorial infallibility and restricted interpretations
The most common trait of the dysfunctional narrative is a faultless, passive main character. Baxter calls this the “disarmed protagonist.” Baxter differentiates between “I” stories (“the protagonist makes certain decisions and takes some responsibility for them”) and “me” stories (“the protagonists…are central characters to whom things happen”). Dysfunctional narratives are the “me” stories.
And the errors these “me” characters make—if any—are forgivable, understandable, or forced upon them by dire circumstances. Compare this to the mistakes the people around them make—monstrous, unpardonable sins:
…characters [in stories] are not often permitted to make interesting and intelligent mistakes and then to acknowledge them. The whole idea of the “intelligent mistake,” the importance of the mistake made on impulse, has gone out the window. Or, if fictional characters do make such mistakes, they’re judged immediately and without appeal.
Power dynamics are a cornerstone of all narratives, but one “smell” of the dysfunctional variety is an extraordinary tilting of power against the main character. The system, or even the world, is allied against the protagonist. Close reads of these narratives reveal an authorial thumb on the story’s moral scale, an intuition that the situation has been soured a bit too much in the service of making a point. This scale-tipping may be achieved in many ways, but often it requires a surgical omission of detail.
Hence the frequent absence of the antagonist in dysfunctional narratives. A crime in a dysfunctional novel doesn’t require a criminal. All it needs, in Robinson’s words, is for the main character to have endured some great wrong: “The work of one’s life is to discover and name the harm one has suffered.”
Name the harm, not the perpetrator. Why not the perpetrator? Because often there’s no person to name. The harm is a trauma or a memory. The perpetrator may have disappeared long ago, or died, or have utterly forgotten the wrongs they inflicted (as the father does in Prince of Tides). The malefactor may be an abstraction, like capitalism or sexism. But naming an abstraction as the villain does not name anything. It’s like naming narcissism as the cause of an airliner crash. This is by design. Abstractions and missing antagonists don’t have a voice. Even Satan gets to plead his case in Paradise Lost.
No ending is reached in a dysfunctional narrative, because there’s only a trauma, or a memory, or an abstraction to work against. These injuries never heal. Memories may fade, but the past is concrete. By telling the story, the trauma is now recorded and notarized like a deed. “There’s the typical story in which no one is responsible for anything,” Baxter complained in 2012. “Shit happens, that’s all. It’s all about fate, or something. I hate stories like that.” These stories trail off at the end, employing imagery like setting suns or echoes fading off to signify a story that will never conclude.
The most surface criticism of these narratives is that we, the readers, sense we’re being talked down to by the author. “In the absence of any clear moral vision, we get moralizing instead,” Baxter writes. A dysfunctional narrative dog-whistles its morality, and those who cannot decode the whistle are faulted for it. The stories are pre-moralized: The reader is expected to understand beforehand the entirety of the story’s moral universe. For a reader to admit otherwise, or to argue an alternate interpretation, is to risk personal embarrassment or confrontation from those who will not brook dissent.
And making the reader uncomfortable is often the outright goal of the dysfunctional narrative. The writer is the presumed authority; the reader, the presumed student. It’s a retrograde posture, a nagging echo from a less democratic time. (When I read A Brief History of Time, I was most certainly the student—but Hawking admirably never made me feel that way.) Dysfunctional narratives are often combative with the reader; they do not acknowledge the reader’s right to negotiate or question the message. With dysfunctional narratives, it’s difficult to discern if the writer is telling a story or digging a moat around their main character.
“What we have instead is not exactly drama and not exactly therapy,” Baxter writes. “No one is in a position to judge.” A dysfunctional narrative portrays a world with few to no alternatives. A functional narrative explores alternatives. (This is what I mean when I write of fiction as an experiment.)
This is why so many dysfunctional narratives are aligned with the writer’s biography—who can claim to be a better authority on your life, after all? But the moment a reader reads a story, its protagonist is no longer the author’s sole property. The character is now a shared construct. Their decisions may be questioned (hence the passive nature of the protagonists—inaction avoids such judgements). If the author introduces secondary characters, they can’t claim similar authority over them—every additional character is one more attack vector for criticism, a chipping away of absolute authority over the story itself. That’s what happened to sensitivity reader Kosoko Jackson in 2019, whose debut novel was pulped due to questions over his secondary characters.
Of all the traits listed—from the disarmed protagonist to the vaporous conclusion—the trait I find the “smelliest” is authorial infallibility and restricted interpretation. That’s why I used weasel language when I called Prince of Tides “nearly-dysfunctional:” The book is most certainly open to interpretation and questioning. In contrast, questioning a conspiracy theory could get you labeled an unwitting dupe, a useful idiot, or worse.
A Cambrian explosion
What Baxter doesn’t explore fully is why we’ve had this Cambrian explosion of dysfunctional narratives. He speculates on a couple of possibilities, such as the stories coming down to us from our political leadership (like Moses carrying down the stone tablets), or arising as a byproduct of consumerism. I find myself at my most skeptical when his essay stumbles down these side roads.
When Baxter claims these stories arose out of “groups in our time [feeling] confused or powerless…in such a consumerist climate, the perplexed and unhappy don’t know what their lives are telling them,” it seems Baxter is offering a dysfunctional narrative to explain the existence of dysfunctional narratives. He claims these dysfunctional stories are produced by people of “irregular employment and mounting debts.” I strongly doubt this as well. In my experience, such people are not the dominant producers of these narratives. Rather, they are the people who turn to stories for escape and uplift…the very comforts dysfunctional narratives cannot provide, and are not intended to provide.
Rather than point the finger at dead presidents or capitalism, I’m more inclined to ascribe the shift to a handful of changes in our culture.
The term “The Program Era” comes from a book of the same name detailing the postwar rise and influence of creative writing programs in the United States. This democratization of creative writing was not as democratic as once hoped, but it still led to a sharp increase in the number of people writing fiction. Most of those students were drawn from America’s upwardly-striving classes. And, as part of the workshop method used in these programs, it also led to a rise in those people having to sit quietly and listen to their peers criticize their stories, sometimes demolishing them. (Charles Baxter was a creative writing professor and the head of a prominent writing program in the Midwest. Many of his examples in Burning Down the House come from manuscripts he read as an instructor.)
With the expansion of writing programs came a rise in aspiring writers scratching around for powerful subject matter. Topics like trauma and abuse are lodestones when seeking supercharged dramatic stakes. Naturally, these writers also drew from personal biography for easy access to subject matter.
Another reason related to the Program Era is the heavy-handed emphasis on character-driven fiction over plot-driven fiction. I explore this theory here.
Another reason is staring back at you: The World Wide Web has empowered the masses to tell their stories to a global audience. This has created a dynamic where everyone can be a reader, a writer, and a critic, all at the same time.
The natural next step in the evolution of the above is for storytellers to strategize how best to defend their work—to remove any flaw in the story’s armor, to buttress it with rearguards and fortifications. (This is different from working hard to produce a high-quality experience, which, in my view, is a better use of time.) And there’s been a shift in why we tell stories: Not necessarily to entertain or enrich, but as an act of therapy or grievance, or to collect “allies” in a climate where you’re either with me or against me. Inaction in fiction has come to be praised as a literary virtue. Stories with characters who take matters into their own hands are often derided as genre fiction.
Pick up a university literary magazine and read it from cover to cover. The “smell” of dysfunctional narratives is awfully similar to the smell of social media jeremiads.
These are not the kind of stories I want to read, but it’s becoming increasingly difficult to distance myself from them. Writers should strive to offer more than a list of grievances or acts of score-settling. If it’s too much to ask stories to explain, then certainly we can expect them to connect dots. Even if the main character does not grow by the last page, we should grow by then, if only a little.
Mystery and Suspense Magazine has published my article “Sherlock Holmes: The enduring allure of history’s greatest detective” on their web site. In it, I explore the traditional reasons why critics and fans think the Baker Street detective remains popular—even immortal—to this day, and offer in return my own thoughts on the subject:
What is the enduring appeal of this shape-shifting character? Doyle gives no indication that Holmes is particularly attractive or magnetic in personality. He can be cold, abrasive, and downright rude in moments. One cannot help but feel Dr. Watson is a man with a remarkable reservoir of patience. How many Sherlock Holmes adventures open with the detective challenging Watson to discern the history of a person from nothing more than a walking stick, a battered hat, or a muddied shoe? Watson entertains Holmes and his games of deduction, but always as the lesser, never as the equal. (In my experience, medical doctors are not the sorts of people who take well to being talked down to.) Why would such a man continue to fascinate and entertain well into the 21st century?
My thinking on Doyle and his creation has shifted over the past few years due to a renewed appreciation for the Sherlock Holmes canon and, of course, writing a reinterpretation of his classic The Hound of the Baskervilles.
Update, 7 Jul 2022: I’ve taken a fair amount of heat for the sin of admitting I’d not read Sadly, Porn before producing the following post. Note that I did read the Amazon sample before writing what follows, which covers roughly the first 10% of the book; I don’t count that as a full read, and didn’t want to quibble about that point when I first published this. Scott Alexander’s review quotes substantially from the book as well, but again, another quibble.
What my detractors don’t seem to get is that this post spends the bulk of its energies examining W. H. Auden’s “West’s Disease” and not Edward Teach’s book. The post originated as a comment to Scott Alexander’s follow-up to his review, but as my comment grew and became more involved, I decided to publish it here, on my blog.
As such, this post should be framed as “If Astral Codex Ten and Resident Contrarian are correct about this one point in Sadly, Porn, it relates to West’s Disease in this way…”
But, of course, it’s up to the reader to carry the logic from there, and not simply dunk on me and walk off with LOLs.
I am now reading Sadly, Porn. For the record, I’ve read nothing so far that changes my mind on any of my thoughts below. If anything, it’s only cementing my position.
Allow me to state this up-front: I’ve not read Edward Teach’s Sadly, Porn. Scott Alexander of Astral Codex Ten (ACX) has, though, and in response wrote a rather lengthy and discursive review, as well as a follow-up on the comments it elicited. At this moment, most of my understanding of Sadly, Porn comes from these sources (which I freely admit is an imperfect substitute for reading the book).
From what I’ve gathered, Sadly, Porn is a meandering and intentionally obscure treatise (diatribe?), grounded in psychoanalytics, which purports to explain—among other things—the ways people lie to themselves. Released in December 2021, the Kindle edition clocks in at over 1,100 pages, brimming with extended discourses on topics you might think plucked from the air, such as a ten-page examination of Shel Silverstein’s The Giving Tree. It’s also larded with David Foster Wallace-esque footnotes and Heartbreaking Work of Staggering Genius-style exhortations directed at the reader. The author opens with a thirty-page erotica story which, he later claims, is only included to scare off readers. (An odd strategy, since there are a multitude of writers producing such fiction for a lucrative living.) Really, to get a good idea of the book’s scope, read the ACX review.
What piqued my interest was ACX taking a stab at boiling down Sadly, Porn to its core thesis:
Psychologically healthy people have desires. Sometimes they fantasize about these desires, and sometimes they act upon them. You’ve probably never met anyone like this.
Psychologically unhealthy people, e.g., you and everyone you know, don’t have desires, at least not in the normal sense. Wanting things is scary and might obligate you to act toward getting the thing lest you look like a coward. But your action might fail, and then you would be the sort of low-status loser who tries something and fails at it.
Again, from what I’ve gathered, Edward Teach believes that social status is the chief (or even sole) motivator of human behavior. (Or, perhaps he doesn’t; ACX makes it clear the book is too cagey to state its arguments plainly.)
Teach certainly paints us all as loathsome meat-bags of pettiness. Yet there’s something familiar about his observations that makes it difficult to reject his assertions. In a time when social media has devised a multitude of ways to score our social standing (via follower counts, likes, retweets, and so on), and in a culture endlessly promoting concepts like self-actualization and fame, his claims about the primacy of status-seeking have substance.
Now compare Teach’s accounting of Man’s damnable condition with W. H. Auden’s analysis of Nathanael West’s novels, where he first describes “West’s Disease”:
This is a disease of consciousness which renders it incapable of converting wishes into desires. … All wishes, whatever their apparent content, have the same and unvarying meaning: “I refuse to be what I am.” [But the sufferer] cannot desire anything, for the present state of the self is the ground of every desire, and that is precisely what the wisher rejects. [Emphasis mine.]
To simplify Auden’s distinction: A wish is the simple act of imagining oneself as a different person, or in a different situation; a desire is imagining how one might convert one’s current self into a different person or situation. A wish is wanting to be thin; a desire is vowing to join a gym and work out every day (even if one doesn’t act on it). West’s Disease is the inability to transform the one into the other, leading to inaction, loathing, and rage.
The finest examples of West’s Disease may be found in The Day of the Locust, Nathanael West’s most well-known novel. It’s a brilliant and acidic look at 1930s Hollywood, as witnessed by a motley group of misfits well-distanced from Tinseltown’s glamour, money, and success. “Hollywood’s success as a dream factory is predicated on knowing our wishes and actualizing them on the silver screen,” I wrote two years ago. “That’s why Hollywood appears a tantalizing cure for West’s Disease.”
Auden’s diagnosis that a person with West’s Disease “cannot desire anything” echoes another summation of Sadly, Porn from blogger Resident Contrarian:
[Teach asserts] we in general are incapable of action; we don’t want to act but also can’t act, and we rely on a nebulous “them” to put us on a track towards having to do it. … we want a situation where we don’t have to take an action, but where an action is demanded of us by circumstance.
I don’t think it’s a coincidence that RC’s example (“you don’t want to talk to the pretty girl; you want her to trip so you have to catch her”) sounds like a stock scene in a Hollywood romantic comedy. And I do equate wishing, in Auden’s terms, with Teach’s idea that we crave some externality to actualize our wishes for us. Teach is perhaps exploring Auden’s wish mechanism a bit more fully, but it looks to me as though Auden in 1962 struck upon the same vein of thinking that Teach is attempting to communicate in 2022.
West’s Disease is what paralyzes the misfits in The Day of the Locust. These Hollywood outsiders witness the fruits of Hollywood’s money and glamour being distributed to others, never themselves. They want success, but success is supposed to come to them, not vice-versa. Faye Greener, the only character who can claim to have a film career in front of the camera, complains “the reason she wasn’t a star was because she didn’t have the right clothes.” (There’s a similar shrugging passivity in McCoy’s They Shoot Horses, Don’t They?)
But Auden is less absolutist than Teach. It’s West’s Disease, after all: It only afflicts certain individuals, whereas Teach finds it to be widespread. (Perhaps Teach is right, though. Perhaps West’s Disease is contagious and has spread virulently since 1962. Or since 1939, when Locust was published.)
Auden also does not pin West’s Disease on a natural state of the human psyche, but on modernity:
There have, no doubt, always been cases of West’s Disease, but the chances of infection in a democratic and mechanized society like our own are much greater than in the more static and poorer societies.
When, for most people, their work, their company, even their marriages, were determined, not by personal choice or ability, but by the class into which they were born, the individual was less tempted to develop a personal grudge against Fate; his fate was not his own but that of everyone around him.
But the greater the equality of opportunity in a society becomes, the more obvious becomes the inequality of the talent and character among individuals, and the more bitter and personal it must be to fail. [Again, emphasis mine.]
This jibes with one of my intuitions as I read ACX’s review: That Teach’s near-universality of status-seeking in the human psyche is more likely the result of (or greatly amplified by) recent trends in technology and social organization. Auden wrote the above when notions like meritocracy were ripe in the air and corporate ladders were being erected sky-high. Today, social media and tabloid-esque journalism are king, can show you the numbers to prove it, and have disjointed our culture in unexpected ways.
What’s more, 21st-century American popular media doesn’t merely make “inequalities of talent and character” obvious; our celebrity-obsessed culture revels in and celebrates them. As Budd Schulberg wrote about status climbing: “It will survive as long as money and prestige and power are ends in themselves, running wild, unharnessed from usefulness.”
The Day of the Locust opens describing those with West’s Disease as those who “loitered on the corners or stood with their backs to the shop windows and stared at everyone who passed. … They had come to California to die.” In the final chapter, they rise up in revolt, and Los Angeles burns. Auden saw West’s Disease as damaging not merely to the individual, but to the society around them.
Teach seems to treat West’s Disease as an intellectual, and perhaps masculine, failing. (Apparently cuckoldry is a running theme throughout Sadly, Porn.) The book adopts a scolding and sneering tone toward the reader, implicating them as weak and blithe to this delusion of false desires and status envy.
Twenty years ago this month, The Atlantic published a critical essay on the then-current state of American prose. As dry and dusty a topic as that sounds—doubly so when published by an august New England monthly—the essay improbably became a cultural sensation, triggering op-eds in international newspapers, vitriolic letters-to-the-editor, and screechy denunciations from professional reviewers. Suddenly readers everywhere were debating—of all things—the modern novel.
Writer B. R. Myers unexpectedly touched a raw nerve in an America that was better-read than the literati believed possible. “A Reader’s Manifesto” dissected without mercy the work of such literary lights as Don DeLillo, Annie Proulx, Cormac McCarthy, Paul Auster, and David Guterson. Myers didn’t merely criticize their prose on terms of its grammar and diction. He attacked these writers on grounds of pretentiousness, and accused the literary establishment of abetting their ascendancy.
Charged stuff, but still very inside baseball. To rouse an impassioned response from readers over books like White Noise and Snow Falling on Cedars was a remarkable moment in American culture. It’s all the more notable a moment considering some of the above authors’ books satirize the inanity of American culture.
Looking back, it seems dream-like for a critical examination of literary novels to ignite such a furor. I can’t imagine such a thing happening today. Then again, it seemed equally unimaginable twenty years ago.
History of Manifesto
Fed up with fawning reviews of works like Timbuktu and All the Pretty Horses, Myers first wrote his manifesto in 1999. In careful, reasoned prose punctuated with wit and scathing humor, he roasted passages from prize-winning books—passages that had been held up by literary reviewers as examples of masterful writing. Using tried-and-true close-reading techniques, he punctured these writers’ obtuse and repetitive language to reveal their prose as turgid, meaningless, and pretentious.
Myers was convinced no magazine or newspaper would publish his critique. He was an unknown in the literary world; a near-anonymous monograph on the quality of modern literary prose hardly promises to fly off bookstore shelves.
So Myers did what many writers would do in later years: He self-published his manifesto on Amazon. He titled it Gorgons in the Pool: The Trouble with Contemporary “Literary” Prose after a particularly choice passage in a Cormac McCarthy novel. “Nothing happened,” he later wrote. “I went online and ordered three copies for myself; they were the only ones ever sold.”
One of the copies he mailed out wound up in the hands of an Atlantic editor, who offered to publish rather than review it. The Atlantic demanded severe cuts and revisions, and the version published in the magazine comes off nastier than he’d intended. He also had the gut-wrenching task of waving off the Times Literary Supplement from publishing a review of Gorgons, as he’d already signed a contract with The Atlantic. (“As someone said to me the other day, ‘How do you know [Times Literary Supplement] wasn’t going to tear you apart?'” he later remarked. “I suppose everything worked out for the best.”) Bad timing would develop into a cadence for Manifesto.
The Atlantic article, tucked away deep inside the July/August double issue, improbably made Myers a brand name overnight among contemporary-lit readers and writers. His outsider status only buffed his credentials as a hard-nosed reviewer. Even his use of first initials added a mysterious air to his origins. Although he received praise from many quarters, it came mostly from readers and (interestingly) journalists, a profession notorious for attracting writers shut out of the book publishing world.
Although the literati initially ignored the essay, the drumbeat of support from readers for Myers’ basic thesis—modern lit is pretentious—soon couldn’t be denied. Much of the early criticism directed back at Myers originated from book reviewers, book supplement editors, and literary novelists. Some of it was quite vitriolic, outraged that anyone could suggest the writers he selected weren’t unassailable geniuses. Many exuded an air of befuddled annoyance: How could anyone give Myers or his thesis an ounce of credence? A few were outright smug about it, as though their refutations slammed the door on Myers and put an end to the dreary affair once and for all.
It didn’t work. The rebuttals only stoked increased support for Myers from readers around the world. The back-and-forth debate raged online and, as a mark of the times, across letters-to-the-editor pages, which printed point and counterpoint letters for weeks. This simply did not happen, even in a time when most people had their news delivered to them via bicycle.
Frustrated, the literary professional class took up what is today recognized as a surefire stratagem for shutting down an Internet debate: They doxxed him.
Not exactly—while The New York Times Book Review didn’t print Myers’ phone number and street address, they did see fit to delve into his past for anything incriminating (much like the Twitterati today will dumpster-dive people’s feeds to dig up embarrassing tweets from eight years ago). Demonstrating the ethics of a tabloid reporter, editor Judith Shulevitz dished to her readers that Myers was a foreigner (he’s not) who lived in New Mexico (i.e., not New York City) and was at that moment preparing to spend a year in Seoul “teaching North Korean literature to the South Koreans.” (Myers’ response: “I would probably have described my job in a way less calculated to evoke the phrase ‘selling ice to the eskimos.'”)
Shulevitz wrote Myers “is not just a man without a stake in the literary establishment. He is foreign to it in every way.” His manifesto could have
proved that a critic needs nothing more than taste to make a case. Does Myers’s essay do all this? It does not, because Myers doesn’t have a sure grasp of the world he’s attacking.
Most of the denunciations of Manifesto were steeped in this kind of haughty condescension, and it served Myers well.
(I should add that I’m uncomfortable throwing around the phrase “literary establishment” as a catch-all for a wide and disjointed segment. Yet Shulevitz seemed comfortable acknowledging its existence in 2001, so I’ll assume it existed then and exists today.)
Manifesto continued to be a lodestone of bad timing. The Times‘ nativist pillorying of Myers was published on September 9, 2001. Two days later, the Times—and the rest of the world—was focused on a very different subject. The literary debate Myers had sparked that summer ground to a halt.
The history of Manifesto could easily have ended with the attacks on the World Trade Center, if not for events which nudged a little harder on the snowball Myers had started rolling in 1999.
First was Oprah selecting Jonathan Franzen’s The Corrections for her book club. To get an idea of how close this shaved against Myers’ Manifesto—and his continued game of footsie with bad timing—the same edition of the New York Times Book Review that exposed Myers as a Korean-teaching foreigner also included a glowing review of The Corrections laden with an irony of Oedipal proportions: The reviewer gives a winking approval that the book contains “just enough novel-of-paranoia touches so Oprah won’t assign it and ruin Franzen’s street cred.” Actually, Oprah was set to announce The Corrections as her next book club pick four days later (only to postpone it due to 9/11). When Franzen bristled that Oprah was attempting to smarten up her book club by associating it with the “high-art literary tradition,” a new literary controversy erupted to displace Manifesto.
Although the imbroglio between Oprah and Franzen is better framed as tabloid-level tit-for-tat, Manifesto played a minor role. Online commenters made the point that Myers’ gripes about the literary establishment sneering down on the reading public were playing out before the nation’s eyes. Gone was his critics’ suggestion that, on this point, Myers was jousting with windmills.
The second event was Melville House publishing A Reader’s Manifesto: An Attack on the Growing Pretentiousness in American Literary Prose in 2002 (one of the first two books produced by the then-fledgling publisher). This full-length treatment gave Myers the opportunity to restore much of what was lost from Gorgons in the Pool when it was adapted for The Atlantic. It’s this edition I’ve based this review on.
The backward glance
I vividly recall reading “Manifesto” in the summer of 2001. I’d written my first novel and was discovering the ego-melting process called “finding a literary agent.” Over the prior years I had enrolled in evening and weekend creative writing courses around the Bay Area, where many of the books Myers laid judgment upon were held up as exemplars. Also at the time, I was a member of a weekly “writers’ reading group.” A member of the group handed me a Xerox of The Atlantic essay along with a half-joking warning not to take anything this Myers guy had to say too seriously.
I wound up taking B. R. Myers quite seriously. I had never read anything like “A Reader’s Manifesto.” Rereading Myers’ book for this post, I still marvel over his concision and convictions. It can be read in a single sitting, and unless you’re a grump, it will keep you engaged from start to finish. Myers understands well the game he’s taken up: He can’t poke a stick at others’ bad prose if his own prose is lacking. His manifesto is meticulous, refreshing, lively, and enlightening, as seen here when he trains his gimlet eye on McCarthy’s All the Pretty Horses:
As a fan of movie westerns I refuse to quibble with the myth that a rugged landscape can bestow an epic significance on the lives of its inhabitants. But as Conrad understood better than Melville, the novel is a fundamentally irreverent form; it tolerates epic language only when used with a selective touch. To record with the same majesty every aspect of a cowboy’s life, from a knife-fight to his lunchtime burrito, is to create what can only be described as kitsch.
Not only is this arguable, there’s a lot packed in there to argue with. I count that as a positive.
Or here, where he’s analyzing David Guterson’s output:
…a slow tempo is as vital to his pseudo-lyrical effects as a fast one is to Proulx’s. What would otherwise be sprightly sentences are turned into mournful shuffles through the use of tautology. “Anything I said was a blunder, a faux pas,” “a clash of sound, discordant,” “She could see that he was angry, that he was holding it in, not exposing his rage,” “Wyman was gay, a homosexual,” and so on.
This level of tight engagement with the work at hand places Manifesto well above the usual culture-war crap that’s saturated our nation’s dialogue for decades now.
Some of his lines of attack are novel. Performing a close and scathing read of Annie Proulx’s self-approving dedication in Close Range (“my strangled, work-driven ways”) is the kind of antic you’d expect of the University Wits or Alexander Pope. His oft-quoted rejoinder to an exchange between Oprah and Toni Morrison is his most acidic and least endearing: “Sorry, my dear Toni, but it’s actually called bad writing.” (Less oft-quoted is his explanation: “Great prose isn’t always easy but it’s always lucid; no one of Oprah’s intelligence ever had to puzzle over what Joseph Conrad was trying to say in a particular sentence.”)
Regardless of what you might have read elsewhere, the boilerplate attacks on Myers don’t stand up to scrutiny. Supposedly he values plot over form; he disdains “difficult” books; he cherry-picked bad passages from the books he attacks; he selected writers who’d gone out of fashion; or, most confounding of all, he’s a humorless cur prone to sarcasm and snide shots. Having read his book at least four times now, I say none of these complaints hold water. (Sarcasm may be the lowest form of wit, but it’s not humorless.) I’m not saying there’s no room for criticizing Manifesto, only that dismissing Myers without engaging his points is not fruitful.
And there’s plenty in Manifesto for writers to take away. Rather than being satisfied with throwing spitballs at modern lit, he contrasts prose he finds vapid with prose that stands up. Myers will forever get grief for quoting Louis L’Amour’s Hondo with approval, but the passage he includes is a model of clean, effective writing that succeeds in characterizing the protagonist with the deftness of a parable. Myers makes the point several times that the prose he’s complaining about could have been written in less pompous English, and takes a few stabs at editing it as proof. He’s engaged with the texts under the gun, a marked difference from his critics, who sniff down at him (and, it seems, cannot be bothered to quote and refute his specific claims).
My takeaway from Manifesto for writers is this: don’t produce affected writing, produce affecting writing, language that stirs the reader and illuminates rather than obscures. Good editing requires close reads of your prose and questioning what every word is doing in a sentence. Ditch the idea that affecting prose is “easy” and affected prose is “difficult,” an avant-garde pose. One critic complained that “‘prose,’ for [Myers], equals syntax plus diction, and is expected to denote, rather than to evoke.” I think he expects it to do both.
Revolt of the reading public
The significance of Myers’ Manifesto is not the perverse thrill of taking down sacred cows like McCarthy and DeLillo, but how eerily it presaged the next twenty years in American publishing. The circuitous route Myers followed from Gorgons in the Pool to The Atlantic Monthly to Melville House is a once-in-a-generation aberration, but the elements of getting such a critique out of the word processor and into the hands of readers ring awfully familiar today.
When I read in 2002 of Myers self-publishing Gorgons on Amazon, I was floored: I had no idea such an opportunity was available to mere mortals. It was a bona fide light-bulb moment, the first time I pondered the possibility of making an end-run around the New York City publishers and selling my work directly to readers. Ten years later, not only was Amazon still open to self-publishing, the company was rapidly tooling up to make publishing your own e-book as easy as clicking a mouse button.
Less obvious today, but notable in 2001, was Myers praising Amazon user reviews (of the books Myers was criticizing, not his own overlooked Gorgons). Before Manifesto, any reference in the popular media to Amazon’s user reviews was bound to be dismissive or sardonic. Back then, cultural commentators thought putting opinion-making into the hands of readers was as ludicrous as a truck driver penning a starred Michelin review. (Don’t forget, there were still people in 2001 arguing the Internet was a passing fad—that it was faster to drive to the bookstore and buy a book than for Amazon to deliver it, ergo Amazon’s days were numbered.) Myers didn’t merely approve of Amazon user reviews, he used them as evidence that readers can and do understand difficult literature. I believe this is the first time I saw anyone in the cultural sphere do this.
Self-publishing; “average people” versus the experts; the power of reader reviews; the pseudo-doxxing Myers was subjected to; online discussion boards keeping the debate alive; and vitriolic denunciations from on high. All that’s missing is a hashtag and some Bitcoin changing hands, and the dust-up around Manifesto would sound like any number of social media episodes we’ve seen in recent years.
Martin Gurri’s The Revolt of the Public deserves mention here. Although I’ve not read it, I have read plenty of reviews and analyses, simply because this 2014 book is claimed to have predicted the rise of Donald Trump, Brexit, cancel culture, the Capitol Hill attacks, QAnon, #MeToo, and more. (It too was self-published on Amazon.)
Gurri’s thesis is that the Internet is destabilizing public respect for institutional authority and, in due course, undermining the authorities’ control over social and political narratives. The expert class, once considered the final word, now must defend itself from an increasingly skeptical public.
It seems to me that the narratives being disrupted by digital communications may not merely be political narratives but also traditional ones—the narratives offered by the literary novel, and the narratives sold to the public by the literary expert class. Not only are big-name authors being treated with skepticism by the general public, so are the stories they proffer as significant, both in literary heft and cultural insight. Look no further than the controversy surrounding last year’s American Dirt by Jeanine Cummins for an example of voices from below shouting up at the ensconced above, or the backlash suffered by Sarah Dessen after shaming a critical reader.
The disruption to the literary world even extends to novelists’ fawning reviewers. There is less distinction here than would first appear: Literary novels are often reviewed by other literary novelists. This incestuousness would be a scandal in other fields. “Imagine what would happen if the Big Three were allowed to review each other’s cars in Consumer Reports,” Myers noted in an interview. “They’d save the bad reviews for outsiders like the Japanese.”
A before-and-after example of the Internet’s effect on the publishing world is Lorenzo Carcaterra’s Sleepers (1995) and James Frey’s A Million Little Pieces (2003). Both were mega-bestsellers whose publication dates bookend the Internet’s ascension in daily life. Both were published as memoirs, and both had their factual accuracy challenged. The mass media reported the controversy around Sleepers by copy-and-pasting publisher press releases and quoting book agents. A Million Little Pieces was put under the Internet’s collective magnifying glass thanks to an investigation by the amateur web site The Smoking Gun.
This people-powered exposé became a nightmare for James Frey, and his reputation never recovered. Editions of A Million Little Pieces (another Oprah book club pick!) now include a publisher’s note warning of “certain embellishments” and “invented” details: “The reader should not consider this book anything other than a work of literature.”
Carcaterra largely escaped unscathed in 1995 thanks to the controversy being framed by the media as a publishing industry squabble. Sleepers remains sold as memoir. (Funnily enough, it’s also listed under Amazon’s “Hoaxes & Deceptions” category.) Carcaterra’s luck can be measured in years. If Sleepers had been a bestselling memoir in 2005, the Internet would have torn it to shreds.
“Leaders can’t stand at the top of pyramids anymore and talk down to people,” Martin Gurri writes. “The digital revolution flattened everything.” I say A Reader’s Manifesto was the initial deflating puncture of the literary world’s cozy status quo.
Engendered reputations
In the conclusion of Manifesto, Myers writes:
I don’t believe anything I write will have much effect on these writers’ careers. The public will give them no more thought in twenty years than it gives, say, Norman Rush today, but that will have nothing to do with me, and everything to do with what engendered their reputations in the first place.
(If you’re wondering who Norman Rush is, I confess I had to look him up myself.)
Some of the rebuttals directed at Myers in 2001 claimed a few of these authors were already “on their way out,” although each critic seemed to formulate a different list of who remained relevant and who was exiting stage left. I’m tempted to produce a list of the writers whose work Myers criticized to see where their reputations stand today. I won’t do that; any reader so inclined could make such a list on their own.
I will point out that some of Myers’ subjects have sunk into a comfortable life of teaching, penning the occasional pop-culture piece, and generally resting on their laurels. Myers aims a couple of pointed barbs at The Old Man and the Sea, but at least Hemingway was still throwing left hooks at the end of his life.
(When Myers’ critics claim that literary book awards and glowing reviews in highbrow magazines are meaningless, or that Myers ignored genre fiction’s own system of awards and reviews, they’re overlooking the enduring social capital of “literary significance.” A science-fiction writer receiving big-time accolades in 2001 is not going to be, in 2021, a tenured professor traveling the writer’s retreat circuit as a featured speaker and penning fluffy think pieces for Harper’s. The self-propelling feedback loop that is the literary world should not be discounted.)
Note that Myers leaves unsaid what exactly engendered these authors’ reputations in the first place. The optimist in me thinks he’s referring to the evanescence of their writing postures—live by the sword, die by the sword.
The pessimist in me suspects what really engendered their reputations is a resilient enabling literary class which eagerly maintains its country-club exclusivity while claiming commitments to diversity. Even in the face of a massive shift to digital publishing, and the concomitant explosion of voices now available via e-books and print-on-demand, the literary establishment remains a closed shop. Its reviewers walk hand-in-hand with big publishers, who in turn regularly ink seven-figure publishing deals and expect a return on said investment. Positive reviews in well-placed periodicals are an important component of any publishing marketing plan. (The podcast “Personal Rejection Letter” explored this question in 2017, along with a retrospective of Myers’ Manifesto.)
In other words, the authors Myers put under the microscope may or may not be relevant twenty years later, but the system that held them aloft remains alive and strong. The Internet has kneecapped it some—the literary establishment is less commanding than it once was—but it’s still humming along.
Could Myers have at least shifted the conversation? I say he did. While Jonathan Franzen’s 1996 “Perchance to Dream” (re-titled “Why Bother?”) and Tom Wolfe’s 1989 “Stalking the Billion-Footed Beast” are both considered modern literary manifestos of great import, it’s plain to me that Myers’ Manifesto has shown far more staying power in the public’s and writers’ consciousness. Even in a 2010 critical response to B. R. Myers’ review of Franzen’s Freedom, the comments section swings back and forth on the significance of Myers’ Manifesto, with the most recent comment coming in 2016. There are YouTube videos produced as late as last year going over the debate Myers ignited twenty years ago.
Meanwhile, in creative writing courses across America, mentioning Myers’ name will still earn an eye-roll and a dramatic sigh from the instructor, wordlessly asking when this guy will just go away.
In Picked-Up Pieces, John Updike expounds on his personal rules for reviewing books. I’m quoting them here to remind myself of this hard-won wisdom as well as to share it with others:
Try to understand what the author wished to do, and do not blame him for not achieving what he did not attempt.
Give enough direct quotation—at least one extended passage—of the book’s prose so the review’s reader can form his own impression, can get his own taste.
Confirm your description of the book with quotation from the book, if only phrase-long, rather than proceeding by fuzzy précis.
Go easy on plot summary, and do not give away the ending.…
If the book is judged deficient, cite a successful example along the same lines, from the author’s oeuvre or elsewhere. Try to understand the failure. Sure it’s his and not yours?
To these concrete five might be added a vaguer sixth, having to do with maintaining a chemical purity in the reaction between product and appraiser. Do not accept for review a book you are predisposed to dislike, or committed by friendship to like. Do not imagine yourself a caretaker of any tradition, an enforcer of any party standards, a warrior in any ideological battle, a corrections officer of any kind. … Review the book, not the reputation.
I write reviews on this blog. Have I followed Updike’s commandments? Most are familiar to me in a hazy, internalized way, even if I’ve never formalized book-reviewing rules of my own. I confess to giving away the ending of at least one book (a mystery novel, no less). And I could do better at quoting the source material in my reviews.
We live in a time of constant media negotiation. We’re no longer merely consumeristic readers; with the Internet we’ve all become critics. Film review sites allow moviegoers to pan movies, even before they’re released. We’re saturated with media and we’re saturated with criticism too. Updike’s rules may serve a newfound purpose: a way for us to judge criticism rather than accept it uncritically.