Last year I wrote about dysfunctional narratives, a type of story that Charles Baxter first identified in the 1990s and which seems even more prevalent today. He quoted a description of this type of narrative by the novelist and essayist Marilynne Robinson, who called it a “mean little myth”:
One is born and in passage through childhood suffers some grave harm. Subsequent good fortune is meaningless because of the injury, while subsequent misfortune is highly significant as the consequence of this injury. The work of one’s life is to discover and name the harm one has suffered.
In my post, I wrote about a “Cambrian explosion” of dysfunctional narratives in our culture since the 1990s, this sense that we’re being overwhelmed by them. They’re in our magazines and books, in our cinema, in our newspapers, and on social media. “Reading begins to be understood as a form of personal therapy or political action,” Baxter wrote, and his observation seems as acute today as it did back then.
Last year I offered a few explanations for what energized this explosion. Recently I thought of another reason to add to the list. It’s a concept repeated endlessly in creative writing classes and how-to guides on writing fiction, namely, character-driven fiction versus plot-driven fiction. Respectable authors are supposed to write character-driven fiction and to eschew plot-driven fiction, which is largely associated with genre fiction.
When I first heard this edict of character versus plot, I accepted it as sage wisdom, and sought to follow it closely. Over the years, I kept hearing it from instructors and successful writers, especially writers of so-called literary fiction. I heard it so much, I began to question it. What exactly is character? What is plot?
I began to pose these questions to my peers. Their response usually sounded like this:
“‘Character’ is all the things that make a character unique. ‘Plot’ is the stuff that happens in a story.” A character-driven story is supposedly rich with humanizing details, while a plot-driven piece is a fluffy story where “a lot of stuff happens.”
Aristotle is not the final word on literary analysis, but his opinions on how a story succeeds or fails are far more nuanced than what many of my peers and instructors in creative writing programs could offer.
Aristotle defines character as a set of human traits imitated in the text. Traits could be run-of-the-mill personality markers, such as a character who is studious or arrogant, or complex and contradictory, like Hamlet’s brooding and questioning nature. Before modern times, playwrights often used traits associated with the four humors to define characters in a play.
For Aristotle, plot is the series of decisions a character makes that propels the story forward. These decisions generally take two forms: The character speaks, or the character acts. In line with the saying “actions speak louder than words,” Aristotle holds that a character’s actions are more significant, and more revealing, than the words they mouth.
When one of the salesmen in Glengarry Glen Ross announces he’s going to close a big sale that night, and then crosses the street to have a cocktail, his actions reveal the hollowness of his words. Both decisions (speaking and acting) are also plot. Plot proves what character traits merely suggest.1
In other words, plot is not “stuff that happens.” (Note the phrasing, as though plot elements are forced upon the characters rather than chosen by them.) Rather, plot is a sequence of decisions made—and readers are very interested in a character’s decisions.
To be fair, inaction by a character is a kind of decision. Certainly there’s room for stories about characters who ponder a great deal and do little about it. In successful fiction, though, the final effect of inaction is almost always ironic. (Two good examples are Richard Ford’s “Rock Springs” and Thurber’s “The Secret Life of Walter Mitty.”) The problem is when inaction in literary fiction is treated as sublime.
The inaccurate, watered-down definition of plot-driven fiction—“a story where a lot of stuff happens”—has led to contemporary American literature’s fascination with flabby, low-energy narratives. I’ve met authors proud that the characters in their stories don’t do anything—never get off the couch, never pick up the phone, never make a decision of any consequence. Literary fiction has come to regard passivity as a virtue and action as a vice. A writer crafting a character who takes matters into their own hands risks having their work classified as genre fiction.
For decades now, creative writing programs have been pushing an aesthetic emphasizing character traits over character decisions. It’s frustrating to watch, year after year, the primacy of character-driven fiction getting pushed on young writers, with too many of them accepting the mantra without further consideration.
And this is why I think the Cambrian explosion of dysfunctional narratives is tied to this obsession with character-driven fiction. Passivity and inactivity are keystones of Baxter’s dysfunctional narratives. In his essay, he notes the trend toward “me” stories (“the protagonists…are central characters to whom things happen”) over “I” stories (“the protagonist makes certain decisions and takes some responsibility for them”).
This is why I’m wary of character-driven writers who do not permit their protagonists to make mistakes, instead strategically devising stories where they make no mistakes, and are therefore blameless. No wonder plot—that is, decision-making—is being eschewed, when this is the kind of story being upheld and praised.
Aristotle’s Poetics is obviously far more complicated than my three-paragraph summary, but the gist described here holds. ↩︎
Art critic Ben Davis writes:

A little more than a week ago, I wrote a review of an art show by the artist and TikTok sensation Devon Rodriguez, best known for live drawing subway riders. He is, by some measures, the most famous artist in the world, with many millions of social media followers. He did not like the review.
It went up on a Friday. On Saturday morning, I woke up to a tidal wave of anger from Rodriguez on Instagram, tagging me across scores of posts. Hundreds of his followers went on the attack.
Davis gives a more nuanced and thoughtful analysis of his hellish situation than should be expected from someone who received death threats over, of all things, a review of an art show. He reasons:
the only way I can understand Rodriguez’s incredibly thin-skinned reaction to my article is that he has managed to rise to this status of apex visibility without any kind of critical writing about him at all. It’s all just been feel-good profiles, so that the first critical word feels like a huge crisis. That’s a relatively new kind of situation for an artist to be in…
In the past, artists had to pass through the gatekeepers of museums and art galleries before becoming well-known to the public. Even Basquiat had to break through the establishment before securing his place in the art world. In today’s digital world, it’s possible, even desirable, to hurdle over the gatekeepers and go straight to the masses with one’s output.
A similar dynamic is at play in the world of publishing, as I’ve written about a few times. This desire to stand above criticism is, in my mind, the root motivation for dysfunctional narratives. The tenor of the attacks Ben Davis withstood sounds much like the way dysfunctional narratives are defended, such as the Rodriguez fan who snapped at Davis, “What if he was your son??”
Davis links this reaction to the notion of “parasocial relationships,” that is, “the imaginary, one-sided friendships people develop with celebrities and influencers in their heads.” This cuts to the “transitive logic” I wrote about in 2019, regarding an all-too-similar event involving Sarah Dessen and her followers, when they attacked a college student who posted a relatively innocuous criticism of Dessen’s work: “The logic magnified an innocuous criticism of a single YA author to an attack on all YA fiction and its readers. Thus, the logic went, if you’re a reader of YA fiction, it’s a personal attack on you.”
“Parasocial relationships” is the best term I’ve seen to describe how Dessen’s followers rose up and hounded the college student offline. Much of the outrage seemed rooted in the feeling that Dessen was not merely a YA author, but their friend. And why not? These new, online super-authors are
not merely authors, they’re brands. Many of these YA authors have crafted an online persona of a confidant and sympathizing mentor. You don’t merely read their books, you hear from them everyday. You see their vacation photos and learn about their pets. You share their ups and downs in the real world.
Wikipedia says the term parasocial interaction was coined in 1956, no doubt inspired in part by the rise of television in the United States. The researchers described such relationships as existing prior to mass media, with people emotionally bonding to gods, supernatural spirits, or saints. These are telling examples.
It requires no divination skills to predict these social media brouhahas will continue so long as artists and writers can organically grow their followings. Certainly I don’t see these kerfuffles as justification for returning to the pre-digital way, where editors and publishers decided over Negroni lunches who got published and who got to languish. But being thin-skinned to criticism, and using one’s followers to “cancel” the critic, is a bad choice no matter how you look at it.
As Davis predicts:
If there’s no criticism of [Rodriguez’s art], here’s what I think will happen: All the marketing companies and PR people looking to piggyback on Rodriguez’s popularity will stuff his feed with more and more cringe celebrity content and half-baked promo ideas until his social-media presence is bled dry of whatever charm it has.
Wearily, I started Blythe’s essay expecting more of the same, and lo, found it: computers and the Internet, he contends, have done much to destroy literary fiction. By this point, I’m surprised any writer pursuing such a thesis would bother fortifying their argument with examples or statistics. Blythe does not disappoint on that count: other than some “c’mon, look around, you know what I’m saying,” the argument is made sans corroborating evidence. Of course the Internet has wrecked American literature. Why bother denying it?
It’s telling, then, that Blythe opens with the usual barrage of accusations about digital distractions—”Can you read anything at all from start to finish, i.e. an essay or a short story, without your mind being sliced apart by some digital switchblade?”—and then, to prove how things used to be so much better way back when, he segues to life as an Esquire editor in the 1980s and 90s:
[Rust Hills] and I would occasionally drink two or three Negronis at lunch, sometimes at the New York Delicatessen on 57th Street, and talk about the writers and novels and short stories we loved (and hated). … Then he and I would happily weave our way back to the office at 1790 Broadway, plop down in our cubicles and make enthusiastic phone calls to writers and agents, our voices probably a little louder than usual.
The jokes about fiction editors at a national magazine choosing stories to publish after a three-cocktail lunch write themselves, so I won’t bother. (Although I should, since, early in my writing career, I had high hopes of placing a short story with a publication like Esquire. Perhaps I should have mailed a bottle of Bombay with each of my submissions.)
The dichotomy Blythe illustrates is telling: The hellish “after” is the mob writing Amazon user reviews and him not knowing how to turn off iPhone notifications; the blissful “before” is editorial cocktail lunches and not having to give a rat’s ass what anyone else thinks.
One counterpoint to Blythe’s thesis: The 1980s had plenty of distractions, including the now-obvious inability to silence your telephone without taking it off the hook. Another counterpoint: If you want to drink Negronis and argue literature over Reubens, well, you can do that today too. A third counterpoint: A short story printed in the pages of Esquire was sandwiched between glossy full-color ads for sports cars, tobacco, and liquor—most featuring leggy models in evening gowns or swimsuits. Distractions abounded, even before the Internet.
But none of these are what Blythe is really talking about. What he bemoans is the diffusion of editorial power over the past twenty years.
Blythe throws a curveball—a predictable curveball—after his reminiscences about Negronis and schmears. Sure, computers are to blame for everything, but the real crime is that computers now permit readers to make their opinions on fiction known:
Writers and writing tend to be voted upon by readers, who inflict economic power (buy or kill the novel!) rather than deeply examining work the way passionate critics once did in newspapers and magazines. Their “likes” and “dislikes” make for massive rejoinders rather than critical insight. It’s actually a kind of bland politics, as if books and stories are to be elected or defeated. Everyone is apparently a numerical critic now, though not necessarily an astute one.
I don’t actually believe Blythe has done a thorough job surveying the digital landscape to consider the assortment and quality of reader reviews out there. There are, in fact, plenty of readers penning worthy critical insight on fiction. Just as there are many great writers who deserve wider audiences, there are also critical readers who should be trumpeted farther afield.
Setting that aside, I still happily defend readers content to note a simple up/down vote as their estimation of a book. Not every expression of having read a book demands an in-depth, 8,000-word essay on the plight of the modern Citizen of the World.
Rather, I believe Blythe—as with so many others in the literary establishment—cannot accept that readers could have any worthwhile, expressible opinion about fiction. The world was so much easier when editors at glossy magazines issued the final word on what constituted good fiction and what was a dud. See also a book I’m certain Blythe detests, A Reader’s Manifesto, which tears apart—almost point by point—Blythe’s gripes.
When B. R. Myers’ Manifesto was published twenty years ago, a major criticism of it was that Myers was tilting at windmills—that the literary establishment was not as snobbish and elitist as he described. Yet here Blythe is, practically copping to the charges.
Thus the inanity of his complaining that today’s readers hold the power to “inflict economic power” when, apparently, such power should reside solely with critics and magazine editors. I don’t even want to argue the point; it’s a retrograde understanding of how the world should work. This is why golden-age thinking is so pernicious: because things used to be this way, it must have been the best way. Except when it’s not.
Of course the world was easier for the editors of national slicks fifty years ago, just as life used to be good for book publishers, major news broadcasters, and the rest of the national media. It was also deeply unsatisfying if one were not standing near the top of those heaps. It does not take much scratching in the dirt to understand the motivations of the counterculture and punk movements in producing their own criticism. The only other option back then was to bow to the opinions of a klatch of New York City editors and critics whose ascendancy was even more opaque than that of the bishops of the Holy See.
That said, it’s good to see a former Esquire editor praise the fiction output of magazines that, not so long ago, editors at that level were expected to sneer at: publications such as Redbook, McCall’s, Analog, and Asimov’s Science Fiction all get an approving nod from Blythe.
But to cling to the assertion that in mid-century America “short fiction was a viable business, for publishers and writers alike” is golden age-ism at its worst. Sure, a few writers could make a go of it, but in this case the exceptions do not prove the rule. The vast sea of short story writers in America had to settle for—and continue to settle for—being published in obscure literary magazines and paid in free copies.
No less than Arthur Miller opined that the golden age of American theater arced in his own lifetime. Pianist Bill Evans remarked he was blessed to have experienced the tail end of jazz’s golden age in America before rock ‘n’ roll sucked all the oxygen out of the room. Neither of those artistic golden ages perished because of the Internet.
What caused them to die? That’s complicated, sure, but their demise—or, at least, rapid descent—was preceded by a turn toward the avant-garde. Which is to say, it became fashionable for jazz and theater to distance themselves from their audience under the guise of moving the art forward. The only moving that happened, though, was the audience heading for the exits.
Blythe then turns his attention to a third gripe in his meandering essay. Without a shred of evidence, he argues that the digital revolution of the last twenty-five years metastasized into a cultural Puritanism in today’s publishing world:
Perhaps because of online mass condemnations, there’s simply too much of an ethical demand in fiction from fearful editors and “sensitivity readers,” whose sensitivity is not unlike that of children raised in religious families… Too many authors and editors fear that they might write or publish something that to them, at least, is unknowingly “wrong,” narratives that will reveal their ethical ignorance, much to their shame. It’s as if etiquette has become ethics, and blasphemy a sin of secularity.
I cannot deny that there appears to be a correlation between the rise of the Internet in our daily lives and the shift over the last decade to cancel or ban “problematic” literature. What I fail to see is how pop-up alerts or a proliferation of Wi-Fi hot spots is to blame for this situation.
If Blythe were to peer backwards once more to his golden age of gin-soaked lunches, he would recall a nascent cultural phenomenon called “political correctness.” P.C. was the Ur-movement to today’s sensitivity readers and skittish editors. Social media whipped political correctness’ protestations into a hot froth of virtuous umbrage—a video game of one-upmanship in political consciousness, where high scores are tallied with likes and follower counts. Using social media as leverage to block books from publication was the logical next step. But blaming computers for this situation is like blaming neutrons for the atom bomb.
After a dozen paragraphs of shaking my head at Blythe’s litany of complaints, I was pleasantly surprised to find myself in agreement with him:
The power of literary fiction—good literary fiction, anyway—does not come from moral rectitude. … Good literature investigates morality. It stares unrelentingly at the behavior of its characters without requiring righteousness.
At the risk of broken-record syndrome, I’ll repeat my claim that Charles Baxter’s “Dysfunctional Narratives” (penned twenty-five years ago, near the beginning of the Internet revolution) quietly predicted the situation Blythe is griping about today. Back then, Baxter noticed the earliest stirrings of a type of fiction where “characters are not often permitted to make intelligent and interesting mistakes and then to acknowledge them. … If fictional characters do make such mistakes, they’re judged immediately and without appeal.” He noted that reading had begun “to be understood as a form of personal therapy or political action,” and that this type of fiction was “pre-moralized.”
Unlike Blythe, Baxter did not fret that literary fiction would perish. Baxter was a creative writing instructor at a thriving Midwestern MFA program. He knew damn well that writing literary fiction was a growth industry, and in no danger of extinction. What concerned him was how much of this fiction was (and is) “me” fiction, that is, centered around passive protagonists suffering through some wrong. He noticed a dearth of “I” fiction with active protagonists who make decisions and face consequences.
As Blythe writes:
Too many publishers and editors these days seem to regard themselves as secular priests, dictating right and wrong, as opposed to focusing on the allure of the mystifying and the excitement of uncertainty. Ethics and aesthetics appear in this era to be intentionally merged, as if their respective “good” is identical.
If Blythe is going to roll his eyes at the glut of reader-led cancellations and moralizing editors, perhaps he could consider another glut in the literary world: the flood of literary memoir, with its “searing” psychic wounds placed under the microscope, and its inevitably featherweight closing epiphany. These testaments of self-actualization may be shelved under nonfiction, but they are decidedly fictional in construction. In the literary world, stories of imagination and projection have been superseded by stories of repurposed memory, whose critical defense is, invariably, “But this really happened.”
It was not always so. Memoir was once synonymous with popular fiction. Autobiography was reserved for celebrities such as Howard Cosell and Shirley MacLaine, or a controversial individual who found themselves in the nation’s spotlight for a brief moment. It was not treated as a high art form, and was perceived in some quarters as self-indulgent. No more.
There remains an audience for great fiction. Readers know when they’re being talked down to. They know the difference between a clueless author being crass and a thoughtful author being brutally honest. They also know the difference between a ripping yarn and a pre-moralized story they’re “supposed” to read, like eating one’s vegetables.
The death of literary fiction—especially the short story—will not be due to iPhone notifications and social media cancellations. Perhaps the problem Blythe senses is the loss of a mission to nurture and promote great fiction. The literary world has turned inward and grown insular. Its priorities are so skewed, I’ve witnessed literary writers question if fiction can even be judged or critiqued. The worsening relationship of class to literary fiction should not be overlooked, either.
If Blythe laments the fate of Asimov’s Science Fiction, perhaps he should check out the thriving Clarkesworld. Substack newsletters devoted to short fiction regularly deliver work to thousands of readers. I don’t know whether these publications’ editors are gulping down Negronis during their daily Zoom meetings—but as long as they’re putting out quality fiction that challenges and questions and enlightens, maybe that doesn’t matter, and never did.
Kat Rosenfield at Unherd claims she knows why men are no longer wild: “Our sense of adventure died with Chris McCandless.”
I last wrote about the mythology around Chris McCandless and Jon Krakauer’s Into the Wild in 2015. Rosenfield’s article motivated me to survey the situation once more.
The meat of Rosenfield’s argument lies within this claim:
McCandless’s story became the object of fascination — and not long after that, backlash. His life was either an inspiring example of indomitable American spirit or a nauseating waste of privilege and opportunity; his death was either a tragic accident or an idiotic, avoidable bit of foolishness.
My motivation for this post started right there, in the above assertion. Compare it to what I wrote in 2015:
If this framing—reckless versus romantic—sounds wearily familiar, it’s because the debate over McCandless’s death has become nothing more than a flash-point in a broader argument we’ve had in America since he and I were born…
McCandless’ life has been converted into a proxy for this country’s culture wars, a string of battles where no one—no one—raises the white flag. … I’m unable to see how this situation honors or respects Christopher McCandless’ life.
Eight years later, Chris McCandless is still serving as a proxy for whatever culture war debate is on our collective brains at the moment. His life and death, cleansed and romanticized, provide a mythic framework to hang any number of ideological flags: anti-materialism, anti-woke, anti-technology, anti-Americanism, anti-capitalism, and more.
This is vital to recognize. Culture war ushers in an inevitable and putty-like transitive logic: Criticism of Krakauer is interpreted as an attack on McCandless; criticism of McCandless is read as an attack on his values; criticism of McCandless’ fans is an attack on progressivism; and so on.
So, yeah, I have and will criticize Krakauer, the actions of some of McCandless’ legions of fans, and the politicization of Into the Wild. I’ll even criticize some of McCandless’ choices. I’m trusting you to recognize that doesn’t make me a “Chris hater.”
“He’s emerged as a hero”
From there, Rosenfield takes what has become a standard approach in modern rhetoric: She claims her reasonable and clear-eyed position is the one under attack, even if she has trouble locating the hordes mounting said attack:
The lumping-together of McCandless with Thoreau was inevitable, and not just because the latter was a major inspiration for the former: here was an expression of the timeless desire to take these icons of male self-sufficiency down a peg. Today, the mention of either man tends to elicit a snarl — but the bulk of the anger is saved for McCandless, fuelled by a contemporary media ecosystem that keeps finding new ways to tell his story. [Emphasis mine.]
I cannot recall anyone “snarling” over Thoreau, who remains required reading in American high school and university curricula. He may have been dismissed in his time as a fraud or a crank, but I’m pretty sure Thoreau’s preeminence in American culture and letters is secure.
As for McCandless, Into the Wild has been translated into thirty languages and has remained in print since it was first published in 1996. It was made into a major motion picture by Sean Penn. It’s on high school and college reading lists across the nation, and was selected by Slate as one of the best nonfiction books of the past twenty-five years. A website dedicated to Chris McCandless (christophermccandless.info) is still going strong, hosting papers written by young people affected by McCandless’ life story, a memorial foundation, a documentary, and more. The PBS special “Return to the Wild” promised to “probe the mystery that still lies at the heart of a story that has become part of the American literary canon and compels so many to this day.”
Yet Rosenfield somehow concludes McCandless and his life story are under a brutal and withering assault. If the debate could be weighed on a scale like produce-aisle apples, I’m certain we’d find any snarling criticism of Chris McCandless and Jon Krakauer far outweighed by McCandless’ legions of fans and sympathetic media sources.
It doesn’t help that Rosenfield’s adoring portrayal of McCandless as the kind of masculinity we need more of in our society is framed just as she outlined at the top of her article: “His life was either an inspiring example of indomitable American spirit or a nauseating waste of privilege and opportunity.” There are no alternatives, apparently.
“But lately,” she writes, “the controversy surrounding McCandless as a mythological figure is no longer an accompaniment to the story; it is the story.”
I don’t know that the “lately” part is true. The controversy around McCandless began almost immediately after Krakauer’s story was published in Outside magazine. Its editors reported they’d never received as much mail about a single story, before or since. The controversy became the focus of all subsequent accounts of Chris McCandless for the same reason Twitter melted down over the color of a dress: people couldn’t believe any sane person would disagree with their interpretation—and if you did, there was something wrong with you.
Here’s a good question: Why is McCandless a mythological figure? He was a human being, “full of vim and vigor, a complicated young man of effusive talents, predictable weaknesses, and eccentric foibles,” as I wrote in 2015. We can valorize a man, we can valorize his values. Why valorize his avoidable death?
Much of the time I agree with the “he had a death wish” camp because I don’t know how else to reconcile what we know of his ordeal. Now and then I venture into the “what a dumbshit” territory, tempered by brief alliances with the “he was just another romantic boy on an all-American quest” partisans. Mostly I’m puzzled by the way he’s emerged as a hero, a kind of privileged-yet-strangely-dissatisfied-with-his-existence hero.
I don’t agree with the “death wish” angle, nor do I think he was a “dumbshit” for entering the Alaskan interior. (I would say “grievously unprepared.”) “An all-American quest” ticks a few of my personal checkboxes, as does “I’m puzzled by the way he’s emerged as a hero.”
The diary
In an aside, Rosenfield complains that McCandless’ sister’s memoir The Wild Truth (2014) has been weaponized: “It was received less as additional context to his story than a debunking of it: McCandless wasn’t a latter-day adventurer, he was a spoiled trust-fund kid with daddy issues.” (It would have helped if Rosenfield could have linked to an actual “debunking.” The single link she provided goes to a perfunctory summary of the memoir by USA Today.)
Carine McCandless does offer eye-opening “additional context” to Chris’ story. The real issue her memoir introduces is that Krakauer agreed to withhold these key details from Into the Wild at the request of the family. By doing so, Krakauer created a hole in his narrative and in our understanding of McCandless’ motivations. The Wild Truth was not weaponized for a mass “gotcha” campaign, or if it was, the campaign made not a dent in the beatification of Chris McCandless. Rather, as with the evolution of the poisonous seeds (explained below), its details were smoothed over by sympathetic media sources as completing and supporting Krakauer’s story, further buttressing the McCandless hagiography.
Chris’ parents, for their part, rejected the memoir outright. “After a brief review of its contents and intention, we concluded that this fictionalized writing has absolutely nothing to do with our beloved son, Chris, or his character,” they wrote. “The whole unfortunate event in Chris’s life 22 years ago is about Chris and his dreams—not a spiteful, hyped up, attention-getting story about his family.”
There was another narrative hole, though, that is more substantial and of far more interest to people like me: The contents of the journal McCandless kept while in the Alaska wilderness. Initially only Krakauer had access to the diary, which he used while writing Into the Wild.
When the journal was finally released, it amounted to
approximately 430 words, 130 numbers, nine asterisks and a handful of symbols. Other than this, all Krakauer had to go on was several rolls of film found with the young man’s body and a rambling, cliche-filled, 103-word diatribe carved into plywood in which McCandless claimed to be “Alexander Supertramp” off on a “climatic battle to kill the false being within and victoriously conclude the spiritual pilgrimage.”
Craig Medred of the Anchorage Daily News has much more to say about “the fiction that is Jon Krakauer’s Into the Wild.” That Krakauer reconstructed McCandless’ last weeks in minute detail from such sparse documentation should be a flare in the sky to anyone who still believes the label “nonfiction” means something.
“It is as if the late writer Ernest Hemingway found a 430-word journal written by Nick Adams containing the words ‘railroad,’ ‘fish,’ ‘forest fire,’ ‘camp’ and a few others,” Medred writes, “and from that wrote ‘Big Two-Hearted River’ as the true story of Adams’ biggest fishing adventure.”
If that sounds like hyperbole, then reread the final three chapters of Into the Wild and reckon them against the source material Krakauer was drawing from. Here’s a portion of the journal, from the McCandless memorial site:
Day 2: Fall through the ice day. Day 4: Magic bus day. Day 9: Weakness. Day 10: Snowed in. Day 13: Porcupine day…. Day 14: Misery. Day 31: Move bus. Grey bird. Ash bird. Squirrel. Gourmet duck! Day 43: MOOSE! Day 48: Maggots already. Smoking appears ineffective. Don’t know, looks like disaster. I now wish I had never shot the moose. One of the greatest tragedies of my life. Day 68: Beaver Dam. Disaster. Day 69: Rained in, river looks impossible. Lonely, Scared. Day 74: Terminal man. Faster. Day 78: Missed wolf. Ate potato seeds and many berries coming. Day 94: Woodpecker. Fog. Extremely weak. Fault of potato seed. Much trouble just to stand up. Starving. Great jeopardy. Day 100: Death looms as serious threat, too weak to walk out, have literally become trapped in wild—no game. Day 101-103: [No written entries, just the days listed.] Day 104: Missed bear! Day 105: Five squirrel. Caribou. Day 107: Beautiful berries. Day 108-113: [Days were marked only with slashes.]
McCandless notes on day 69, six weeks before his death: “Rained in, river looks impossible.” (I assume he means the river between him and civilization looks impossible to cross.) On day 100, he realizes “have literally become trapped in wild.” Earlier he writes, “Lonely, Scared.” Earlier still, he writes, “Weakness.”
“Male self-sufficiency”
McCandless entered the Alaskan wilderness packing ten pounds of rice, a .22-caliber rifle, ammunition, a camera, and a selection of books. Although his exact date of death is unknown, he appears to have survived only 113 days, or about sixteen weeks. All evidence indicates he leaned heavily on a single food source, the seeds of a wild plant.
Rosenfield takes a crack at those who knock McCandless as unable to distinguish a moose from a caribou (“He could, actually”), although that’s beside the point. The real failure was that he bagged a beast but lacked the skills to preserve it. After mere days he lost the carcass to maggots. Properly preserved, the meat and organs could have fed him for months, providing him with vital protein and fat. He wrote that the waste was “one of the greatest tragedies of my life,” one of the few lucid and complete entries in his journal. Other than the wild seeds, he appears to have had no success in securing an additional food source.
The usual rejoinder to these failures is that the seeds he foraged were a good source of nutrition but, due to understandable circumstances, McCandless was poisoned by them, or some substance growing on them.
The seeds are, without a doubt, the most frustrating aspect of the entire affair. It’s oft-reported that Krakauer required three tries over the years to explain the puzzle of the poisonous seeds. As I wrote in 2015, the count was actually four, and is now closer to five, counting a modified fifth explanation in a peer-reviewed 2015 article, which he also discussed in a 2015 New Yorker article.
Diana Saverin of Outside magazine has a good summation of the history of the questions surrounding the seeds. (This article is also the first and only time I’ve seen a major media source acknowledge that the debate over McCandless’ legacy may be more than a Manichaean battle of “Chris supporters” vs. “Chris haters”: “[Some] readers don’t dismiss McCandless’ intention—spending time in the wilderness—as invalid or stupid. Rather, they reject his endeavor because of the consequence it led to: his death.” While not exactly my position, at least there’s an acknowledgment of a spectrum.)
With Krakauer’s later explanations for the seeds came sympathetic media outlets announcing he’d “solved” the mystery once and for all. NPR has declared the case closed on a couple of occasions. Salon originally titled their 2013 article “Chris McCandless’ death wasn’t his fault” before changing it to the blander “Into the Wild’s twist ending”. (The original title is still there in the page’s URL.)
The Salon switcheroo neatly encapsulates the stakes in play: By declaring McCandless was not culpable for his own death, the lessons and morality people wish to attach to McCandless’ life are preserved. Krakauer’s first explanation paints McCandless as fallible, and perhaps even liable for his own death (Chris mistook poisonous seeds for edible ones). The later explanations (a mold or bacterium growing on edible seeds) reassure the faithful that McCandless’ death was understandable and unavoidable.
The various poisonous seed theories led to disputes between Krakauer and biologists. Even after the dust settled, the best the scientific minds could declare was “it is possible” the seeds were poisonous and “contributed” to McCandless’ death. That is not the indisputable evidence that some sources reported.
I’ll say it here, just as I said in 2015: I do not think McCandless was an idiot. I do not think he was reckless. He was far better prepared to enter the Alaska interior than the vast majority of his admirers who’ve made similar attempts—but he was not prepared enough. I suspect he realized his mistake only when he could no longer trek out of the area (day 69, “river looks impossible”). Living off the grid in places like the American Southwest, he did well; in a truly remote and brutal location, his resourcefulness was not enough.
But even if rock-solid evidence arrived showing McCandless was poisoned by the potato seeds and nothing more, that does not prove his death was unavoidable. His survival in Alaska hinged on a single food source. He suffered from a single point of failure, and when that point failed, he was doomed. That is not “self-sufficiency.”
The cult
Rosenfield again:
On the 15th anniversary of McCandless’s death, Men’s Journal published a story titled ‘The Cult of Chris McCandless’, an examination of the young man’s legacy in and around the wilderness in which he perished. One gets the sense that there’s still little sympathy amongst Alaskans for McCandless’s death, and the quotes from locals range from pitying to contemptuous.
It’s true: A magazine wrote an article quoting some Alaskans’ contempt for Chris McCandless.
But it’s bewildering that Rosenfield could read the Men’s Journal story and come away with nothing more than the Alaskan angle. It’s titled “The Cult of Chris McCandless” for a reason. The bulk of the article concerns the number of people—in particular, young men—inspired by McCandless to enter the wilderness and make a go of it themselves. They are almost always far less prepared for the ordeal than McCandless was when he entered Denali National Park.
I question the choice of the word “cult,” but confess I cannot offer a better alternative. The obsession with McCandless has made him a kind of secular saint, and the location of his death has become a pilgrimage site. TripAdvisor offers several pages on how to reach it; Google Maps still marks the spot where Bus 142 stood before the Alaska Army National Guard removed it to discourage further sightseers. Authorities are routinely called in to rescue lost and stranded hikers. Deaths continue to occur. (Tellingly, Rosenfield mentions the reasons for the bus’ removal without pondering the implications of people losing their lives in the name of “authenticity.”)
If McCandless’ story truly inspires people to learn self-sufficiency—if it leads them to pause and hone the skills necessary to survive in the wilderness—I can only applaud them for making the vision a reality. But when the inspired believe self-sufficiency is simply a matter of good intentions and a canteen of Evian, there’s a problem.
Compare the evolution of McCandless’ story—the beatification, the successive theories on the seeds, the guarded interpretation of his diary—to Charles Baxter’s observation of a proliferation of “dysfunctional narratives” in America:
Reading begins to be understood as a form of personal therapy or political action. In such an atmosphere, already moralized stories are more comforting than stories in which characters are making complex or unwitting mistakes.
That sounds an awful lot like what happened to Chris McCandless’ story over the span of thirty years.
The politics
Krakauer:
A lot of people came away from reading Into the Wild without grasping why Chris did what he did. Lacking explicit facts, they concluded that he was merely self-absorbed, unforgivably cruel to his parents, mentally ill, suicidal, and/or witless.
Rosenfield picks up where Krakauer leaves off…and makes a serious detour:
The guy who hunts his own food, chops his own wood, and builds his own home, is a suspicious character: a little too trad, a little too in-your-face masculine, probably a Trump voter. And the guy a step beyond that, the one who doesn’t just paint outside the lines but wants to buck the system entirely? There’s something really wrong with him. He’s no pioneer; he’s a misanthrope, a deadbeat, an incel. … We’re afraid of men like this, and we’re afraid of the people who admire them.
This characterization is off-the-rails.
Without exception, criticisms of McCandless as an irresponsible privileged twerp are coded right-wing. The type Rosenfield describes sounds more like a standard-issue take-down of libertarians and hard-right Republicans (“a misanthrope, a deadbeat, an incel”). Those take-downs inevitably come from sources coded as left-wing—the same sources who trumpet McCandless as a modern icon (Salon, NPR, etc.). These sources will question the myth of rugged individualism in American history—and then, with no apparent introspection, hold up Chris McCandless’ rugged individualism as an example to follow.
If anything, the animus toward Chris McCandless is a mirror-image of the one Rosenfield describes. Critics like to portray him as a coastal elite, a hipster from a privileged enclave who foolishly launched a narcissistic quest for authenticity, and certainly not as a Trump voter. I’ve never heard anyone describe themselves as “fearful” of McCandless’ admirers. “Idiots” is the terse word one acquaintance used when I brought up the subject.
Recall Sherry Simpson: “Mostly I’m puzzled by the way he’s emerged as a hero, a kind of privileged-yet-strangely-dissatisfied-with-his-existence hero.”
“Gather, cook, and eat”
If you still think of me as a “Chris hater,” in return I ask for your opinion of other individualists who forsook modernity and escaped to the wild.
There’s Alastair Bland, the student I wrote about eight years ago. The similarities between Bland and McCandless are remarkable: Both were anthropology majors who believed hunter-gatherer societies were freer and enjoyed more leisure time than agricultural/industrial ones. Both expressed a sharp disdain for modern consumerism and materialism. Bland did not penetrate the Alaska interior, but he did live off the land in and around U.C. Santa Barbara in 2002. Bland found people around him cheering him on:
They marveled at how great [his experiment] was and exclaimed that they would some day try to do something similar. They thought it was a good thing to boycott the American market and a shame more people didn’t appreciate nature’s bounty the way I did.
Like McCandless, Bland wound up concentrating on a single food source—tree figs—which left him bleeding from the mouth and nauseous. His days were spent scrounging for his next meal. He dreamed of climbing trees and eating figs. His life became “gather, cook, and eat.”
Just as McCandless attempted to flee the Alaska interior sooner than planned, Bland too quit his experiment early:
Even now I don’t believe what I did was very constructive. It was a memorable time in my life, to be sure, and it was a good thing to have tried. But to carry on like that forever would have been, for me, social suicide.
There’s Timothy Treadwell, who, like McCandless, found a spiritual refuge in the Alaskan wild. He lived there for thirteen seasons among the coastal brown bears, both alone and with his girlfriend. Like McCandless, he came from a well-to-do family, and was athletic and gifted. After some failures as an aspiring actor and a bout with alcoholism, he turned his life around. He grew famous for spending time close to the bears in Alaska, daring to approach them to gain their trust. He was immortalized in Werner Herzog’s Grizzly Man.
In October 2003, Treadwell and his girlfriend were attacked and killed by a bear. Treadwell’s running camera captured the audio of the attack. It was the first and only incident of a bear killing a person in the history of Katmai National Park.
John Rogers of Katmai Coastal Bears Tour writes of “The Myth of Timothy Treadwell,” although this myth never took on the heroic proportions of McCandless’. While Rogers says, “Timothy Treadwell was not the foolhardy person the media portrays him to be,” he does not acquit him of culpability in his own death, either.
There’s Christopher Thomas Knight, the recluse who lived twenty-seven years in isolation in the Maine woods. In a bit of philosophizing that could have come from McCandless, he said that by living alone “I lost my identity. There was no audience, no one to perform for … To put it romantically, I was completely free.”
But Knight survived by habitually breaking into nearby cabins. He was accused of committing over 1,000 burglaries across a quarter-century, pilfering goods and supplies for his own survival.
Note that McCandless has also been accused of breaking-and-entering by Craig Medred:
Three cabins — two privately owned and one a property of the National Park Service — were broken into while McCandless was at the bus. It had never happened before. It has not happened since.
There’s Robert Bogucki, raised in Malibu and a student at Georgetown University. As a young man, he began to question materialism and capitalism. He traveled to Australia to walk solo across its interior desert.
He entered the desert carrying a week’s worth of food and 26 liters of water. When his supplies ran out, he began digging for moisture and cutting himself to drink his own blood. His absence sparked what was then the largest and most expensive manhunt in Australia’s history. After forty-three days, he spelled out “HELP” with rocks and was rescued by a search helicopter. Bogucki “lost more than 30kg [66 pounds] from his 86kg [189 pounds] and it took him a full year to regain his previous strength and stamina.” (McCandless’ corpse weighed 66 pounds—roughly half of Bogucki’s final weight—when he was discovered.)
Why did Bogucki do it? To see God. He desired to model Jesus’ forty days in the desert. He claimed God spoke to him and directed him to water sources. Where McCandless packed in a book on Alaskan horticulture, Bogucki carried a Bible.
Why are these “self-sufficient males” not idolized by the legions of McCandless followers? Why don’t we praise their “sense of adventure”?
Perhaps it’s due to the faulty optics of each story: Bland’s admission of failure in a soft Southern California beach town; Bogucki’s distasteful Bible-thumping; Knight’s “self-reliance” revealed as a reliance on others; Treadwell’s violent attack recorded on tape, supplying an unromantic record of nature’s grim realities.
Are optics really what makes McCandless different? Doesn’t such a cynical and relativist view smack face-first into McCandless’ values of authenticity—honesty with others, and honesty with one’s self?
It’s taken me nearly a year to write this post. I gave up twice. Researching and writing this has been exhausting. Why spend so much time and energy?
As I wrote in 2015:
Jon Krakauer introduced me to a vivid and lucid life, one that will stay with me for years.
That life has been flattened into an icon, propagated as a cult of personality, and used to buttress petty political divisions. In the least I must register my protest.
What if I told you that there’s been a sea-change in American storytelling over the past half-century? Not merely a change in subject matter, but that the fundamental nature of American narratives radically shifted? Would you believe me?
Now, what if I told you that a writer twenty-five years ago described these “new” stories, and even predicted they would become the dominant mode in our future? Would you believe that?
In 1997, Charles Baxter published Burning Down the House, a collection of essays on the state of American literature. It opens with “Dysfunctional Narratives: or, ‘Mistakes were Made,’” a blistering piece of criticism that not only detailed the kinds of stories he was reading back then, but predicted the types of stories we read and tell each other today.
Baxter appropriated the term “dysfunctional narrative” from poet C. K. Williams, but he expounded and expanded upon it so much, it’s fair to say he’s made the term his own. He borrowed a working definition of dysfunctional narratives from poet Marilynne Robinson, who described this modern mode of writing as a “mean little myth”:
One is born and in passage through childhood suffers some grave harm. Subsequent good fortune is meaningless because of the injury, while subsequent misfortune is highly significant as the consequence of this injury. The work of one’s life is to discover and name the harm one has suffered.
Baxter adds that the source of this injury “can never be expunged.” As for the ultimate meaning of these stories: “The injury is the meaning.”
To claim this mode of writing has become the dominant one in American culture demands proof, or at least some supporting evidence. Baxter lists examples, such as Richard Nixon’s passive-voice gloss over the Watergate cover-up (“mistakes were made”), Jane Smiley’s A Thousand Acres, and conspiracy theories, among others.
“Dysfunctional Narratives” doesn’t succeed by tallying a score, however. Rather, it describes a type of story that sounds all-too-familiar to modern ears:
Reading begins to be understood as a form of personal therapy or political action. In such an atmosphere, already moralized stories are more comforting than stories in which characters are making complex or unwitting mistakes.
Don’t merely consider Baxter’s descriptions in terms of books. News stories, the social media posts scrolling up your daily feed, even the way your best friend goes into how their boss has slighted them at work—all constitute narratives, small or large. Dysfunctional narratives read as if the storyteller’s thumb is heavy on the moral scale—they feel rigged.
It does seem curious that in contemporary America—a place of considerable good fortune and privilege—one of the most favored narrative modes from high to low has to do with disavowals, passivity, and the disarmed protagonist.
(I could go on quoting Baxter’s essay—he’s a quotable essayist—but you should go out and read all of Burning Down the House instead. It’s that good.)
Dysfunctional narratives are a literature of avoidance, a strategic weaving of talking points and selective omissions to block counter-criticism. If that sounds like so much political maneuvering, that’s because it is.
“Mistakes were made”
Let’s start with what dysfunctional narratives are not: They’re not merely stories about dysfunction, as in dysfunctional families or learning dysfunctions. Yes, a dysfunctional narrative may feature such topics, but that is not what makes it dysfunctional. The term describes how the story is told, the strategies and choices the author made in telling it.
Baxter points to Richard Nixon’s “mistakes were made” as the kernel for the dysfunctional narrative in modern America. (He calls Nixon “the spiritual godfather of the contemporary disavowal movement.”) He also holds up conspiracy theories as prototypes:
No one really knows who’s responsible for [the JFK assassination]. One of the signs of a dysfunctional narrative is that we cannot leave it behind, and we cannot put it to rest, because it does not, finally, give us the explanations we need to enclose it. We don’t know who the agent of action is. We don’t even know why it was done.
Recall the tagline for The X-Files, a TV show about the investigation of conspiracy theories: “The truth is out there.” In other words, the show’s stories can’t provide the truth—it’s elsewhere.
More memorably—and more controversially—Baxter also turns his gaze upon Jane Smiley’s A Thousand Acres, which features the use of recovered memories (“not so much out of Zola as Geraldo”) and grows into “an account of conspiracy and memory, sorrow and depression, in which several of the major characters are acting out rather than acting, and doing their best to find someone to blame.”
In a similar vein, a nearly-dysfunctional story would be The Prince of Tides by Pat Conroy. It centers on a family man who, via therapy, digs through memories of a childhood trauma which has paralyzed him emotionally as an adult. He gradually heals, and goes on to repair his relationship with his family. Notably, his elderly father does not remember abusing him years earlier, leaving one wound unhealed.
Another example would be Nathanael West’s A Cool Million, which follows a clueless naif on a cross-American journey as he’s swindled, robbed, mugged, and framed. By the end, the inventory of body parts he’s lost is like counting the change in your pocket. It might be forgiven as a satire of the American dream, but A Cool Million remains a heavy-handed tale.
This leads to another point: A dysfunctional narrative is not necessarily a poorly told one. The dysfunction is not in the quality of the telling, but something more innate.
A more topical example of a dysfunctional narrative is the story of Aziz Ansari’s first-date accuser. The complaints of just about any politician or pundit who claims they’ve been victimized or deplatformed by their opponents are dysfunctional too. In almost every case, the stories feature a faultless, passive protagonist being traumatized by the more powerful or the abstract.
There’s one more point about dysfunctional narratives worth making: The problem is not that dysfunctional narratives exist. The problem is the sheer volume of them in our culture, the sense that we’re being flooded—overwhelmed, even—by their numbers. That’s what seems to concern Baxter. It certainly concerns me.
A literature of avoidance
In his essay Ur-Fascism, Umberto Eco offers this diagram:
one   two   three   four
abc   bcd   cde     def
Each column represents a political group or ideology, all distinct, yet possessing many common traits. (Think of different flavors of Communism, or various factions within a political party.) Groups one and two have traits b and c in common, groups two and four have trait d in common, and so on.
Eco points out that “owing to the uninterrupted series of decreasing similarities between one and four, there remains, by a sort of illusory transitivity, a family resemblance between four and one,” even though they do not share any traits. The traits form a chain—there is a common “smell” between the political groups.
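Eco’s “illusory transitivity” can be checked mechanically: model each group as a set of traits and intersect them. Here is a throwaway sketch (the group names and trait letters are simply the labels from the diagram above):

```python
# Eco's four hypothetical groups and their traits, taken from the diagram above.
groups = {
    "one":   {"a", "b", "c"},
    "two":   {"b", "c", "d"},
    "three": {"c", "d", "e"},
    "four":  {"d", "e", "f"},
}

# Neighboring groups overlap...
print(sorted(groups["one"] & groups["two"]))     # ['b', 'c']
print(sorted(groups["two"] & groups["three"]))   # ['c', 'd']
print(sorted(groups["three"] & groups["four"]))  # ['d', 'e']

# ...but the ends of the chain share no traits at all.
print(sorted(groups["one"] & groups["four"]))    # []
```

Every adjacent pair intersects, so the chain holds together by overlap alone; “one” and “four” are linked only through intermediaries. That is Eco’s family resemblance.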
Not all dysfunctional narratives are exactly alike, or have the exact traits as the rest, but they do have a common “smell.” Even if a 9/11 conspiracy theory seems utterly unlike A Cool Million, they both may be dysfunctional.
Likewise, in the traits that follow, just because a story doesn’t include all of them doesn’t mean it “avoids dysfunction.” Rather, dysfunctional narratives are built by the storyteller selecting the bricks they need to buttress their message:
A disarmed protagonist
An absent antagonist
Minimal secondary characters
An authorial thumb on the scale
“Pre-moralized”
A vaporous conclusion
Authorial infallibility and restricted interpretations
The most common trait of the dysfunctional narrative is a faultless, passive main character. Baxter calls this the “disarmed protagonist.” Baxter differentiates between “I” stories (“the protagonist makes certain decisions and takes some responsibility for them”) and “me” stories (“the protagonists…are central characters to whom things happen”). Dysfunctional narratives are the “me” stories.
And the errors these “me” characters make—if any—are forgivable, understandable, or forced upon them by dire circumstances. Compare this to the mistakes the people around them make—monstrous, unpardonable sins:
…characters [in stories] are not often permitted to make interesting and intelligent mistakes and then to acknowledge them. The whole idea of the “intelligent mistake,” the importance of the mistake made on impulse, has gone out the window. Or, if fictional characters do make such mistakes, they’re judged immediately and without appeal.
Power dynamics are a cornerstone of all narratives, but one “smell” of the dysfunctional variety is an extraordinary tilting of power against the main character. The system, or even the world, is allied against the protagonist. Close reads of these narratives reveal an authorial thumb on the story’s moral scale, an intuition that the situation has been soured a bit too much in the service of making a point. This scale-tipping may be achieved many ways, but often it requires a surgical omission of detail.
Hence how often in dysfunctional narratives the antagonist is absent. A crime in a dysfunctional novel doesn’t require a criminal. All it needs, in Robinson’s words, is for the main character to have endured some great wrong: “The work of one’s life is to discover and name the harm one has suffered.”
Name the harm, not the perpetrator. Why not the perpetrator? Because often there’s no person to name. The harm is a trauma or a memory. The perpetrator may have disappeared long ago, or died, or have utterly forgotten the wrongs they inflicted (as the father does in Prince of Tides). The malefactor may be an abstraction, like capitalism or sexism. But naming an abstraction as the villain does not name anything. It’s like naming narcissism as the cause of an airliner crash. This is by design. Abstractions and missing antagonists don’t have a voice. Even Satan gets to plead his case in Paradise Lost.
No ending is reached in a dysfunctional narrative, because there’s only a trauma, or a memory, or an abstraction to work against. These injuries never heal. Memories may fade, but the past is concrete. By telling the story, the trauma is now recorded and notarized like a deed. “There’s the typical story in which no one is responsible for anything,” Baxter complained in 2012. “Shit happens, that’s all. It’s all about fate, or something. I hate stories like that.” These stories trail off at the end, employing imagery like setting suns or echoes fading off to signify a story that will never conclude.
The most surface criticism of these narratives is that we, the readers, sense we’re being talked down to by the author. “In the absence of any clear moral vision, we get moralizing instead,” Baxter writes. A dysfunctional narrative dog-whistles its morality, and those who cannot decode the whistle are faulted for it. The stories are pre-moralized: The reader is expected to understand beforehand the entirety of the story’s moral universe. For a reader to admit otherwise, or to argue an alternate interpretation, is to risk personal embarrassment or confrontation from those who will not brook dissent.
And making the reader uncomfortable is often the outright goal of the dysfunctional narrative. The writer is the presumed authority; the reader, the presumed student. It’s a retrograde posture, a nagging echo from a less-democratic time. (When I read A Brief History of Time, I was most certainly the student—but Hawking admirably never made me feel that way.) Dysfunctional narratives are often combative with the reader; they do not acknowledge the reader’s right to negotiate or question the message. With dysfunctional narratives, it’s difficult to discern if the writer is telling a story or digging a moat around their main character.
“What we have instead is not exactly drama and not exactly therapy,” Baxter writes. “No one is in a position to judge.” A dysfunctional narrative portrays a world with few to no alternatives. A functional narrative explores alternatives. (This is what I mean when I write of fiction as an experiment.)
This is why so many dysfunctional narratives are aligned to the writer’s biography—who can claim to be a better authority on your life, after all? But the moment a reader reads a story, its protagonist is no longer the author’s sole property. The character is now a shared construct. Their decisions may be questioned (hence the passive nature of the protagonists—inaction avoids such judgments). If the author introduces secondary characters, they can’t claim similar authority over them—every additional character is one more attack vector of criticism, a chipping away of absolute authority over the story itself. That’s what happened to sensitivity reader Kosoko Jackson in 2019, whose debut novel was pulped due to questions over his secondary characters.
Of all the traits listed—from the disarmed protagonist to the vaporous conclusion—the trait I find the “smelliest” is authorial infallibility and restricted interpretation. That’s why I used weasel language when I called Prince of Tides “nearly-dysfunctional”: The book is most certainly open to interpretation and questioning. In contrast, questioning a conspiracy theory could get you labeled an unwitting dupe, a useful idiot, or worse.
A Cambrian explosion
What Baxter doesn’t explore fully is why we’ve had this Cambrian explosion of dysfunctional narratives. He speculates on a couple of possibilities, such as their coming down to us from our political leadership (like Moses carrying down the stone tablets), or their arising as a byproduct of consumerism. I find myself at my most skeptical when his essay stumbles down these side roads.
When Baxter claims these stories arose out of “groups in our time [feeling] confused or powerless…in such a consumerist climate, the perplexed and unhappy don’t know what their lives are telling them,” it seems Baxter is offering a dysfunctional narrative to explain the existence of dysfunctional narratives. He claims these dysfunctional stories are produced by people of “irregular employment and mounting debts.” I strongly doubt this as well. In my experience, such people are not the dominant producers of these narratives. Rather, they are the ones who turn to stories for escape and uplift, the very comforts dysfunctional narratives cannot provide and are not intended to provide.
Rather than point the finger at dead presidents or capitalism, I’m more inclined to ascribe the shift to a handful of changes in our culture.
The term “The Program Era” comes from a book of the same name detailing the postwar rise and influence of creative writing programs in the United States. This democratization of creative writing was not as democratic as once hoped, but it still led to a sharp increase in the number of people writing fiction. Most of those students were drawn from America’s upwardly-striving classes. And, as part of the workshop method used in these programs, it also led to a rise in those people having to sit quietly and listen to their peers criticize their stories, sometimes demolishing them. (Charles Baxter was a creative writing professor and the head of a prominent writing program in the Midwest. Many of his examples in Burning Down the House come from manuscripts he read as an instructor.)
With the expansion of writing programs came a rise in aspiring writers scratching around for powerful subject matter. Topics like trauma and abuse are lodestones when seeking supercharged dramatic stakes. Naturally, these writers also drew from personal biography for easy access to subject matter.
Another reason related to the Program Era is the heavy-handed emphasis on character-driven fiction over plot-driven fiction. I explore this theory here.
Another reason is staring back at you: The World Wide Web has empowered the masses to tell their stories to a global audience. This has created a dynamic where everyone can be a reader, a writer, and a critic, and all at the same time.
The natural next step in the evolution of the above is for storytellers to strategize how best to defend their work—to remove any fault in the story’s armor, to buttress it with rearguards and fortifications. (This is different from working hard to produce a high-quality experience, which, in my view, is a better use of time.) And there’s been a shift in why we tell stories: Not necessarily to entertain or enrich, but as an act of therapy or grievance, or to collect “allies” in a climate where you’re either with me or against me. Inaction in fiction has come to be praised as a literary virtue. Stories with characters who take matters into their own hands often are derided as genre fiction.
Pick up a university literary magazine and read it from cover to cover. The “smell” of dysfunctional narratives is awfully similar to the smell of social media jeremiads.
These are not the kind of stories I want to read, but it’s becoming increasingly difficult to distance myself from them. Writers should strive to offer more than a list of grievances, or to perform acts of score-settling. If it’s too much to ask stories to explain, then certainly we can expect them to connect the dots. Even if the main character does not grow by the last page, we should grow by then, if only a little.