Last year I wrote about dysfunctional narratives, a type of story Charles Baxter first identified in the 1990s and which seems even more prevalent today. He quoted a description of this type of narrative by novelist Marilynne Robinson, who called it a “mean little myth”:
One is born and in passage through childhood suffers some grave harm. Subsequent good fortune is meaningless because of the injury, while subsequent misfortune is highly significant as the consequence of this injury. The work of one’s life is to discover and name the harm one has suffered.
In my post, I wrote about a “Cambrian explosion” of dysfunctional narratives in our culture since the 1990s, this sense that we’re being overwhelmed by them. They’re in our magazines and books, in our cinema, in our newspapers, and on social media. “Reading begins to be understood as a form of personal therapy or political action,” Baxter wrote, and his observation seems as acute today as it did back then.
Last year I offered a few explanations for what energized this explosion. Recently I thought of another reason to add to the list. It’s a concept repeated endlessly in creative writing classes and how-to guides on writing fiction, namely, character-driven fiction versus plot-driven fiction. Respectable authors are supposed to write character-driven fiction and to eschew plot-driven fiction, which is largely associated with genre fiction.
When I first heard this edict of character versus plot, I accepted it as sage wisdom, and sought to follow it closely. Over the years, I kept hearing it from instructors and successful writers, especially writers of so-called literary fiction. I heard it so much, I began to question it. What exactly is character? What is plot?
I began to pose these questions to my peers. Their response usually sounded like this:
“‘Character’ is all the things that make a character unique. ‘Plot’ is the stuff that happens in a story.” A character-driven story is supposedly rich with humanizing details, while a plot-driven piece is a fluffy story where “a lot of stuff happens.”
Aristotle is not the final word on literary analysis, but his opinions on how a story succeeds or fails are far more nuanced than what many of my peers and instructors in creative writing programs could offer.
Aristotle defines character as a set of human traits imitated in the text. Traits could be run-of-the-mill personality markers, such as a character who is studious or arrogant, or complex and contradictory, like Hamlet’s brooding and questioning nature. Before modern times, playwrights often used traits associated with the four humors to define characters in a play.
For Aristotle, plot is the series of decisions a character makes that propels the story forward. These decisions generally take two forms: The character speaks, or the character acts. In line with the saying “actions speak louder than words,” Aristotle holds that a character’s actions are more significant, and more revealing, than the words they mouth.
When one of the salesmen in Glengarry Glen Ross announces he’s going to close a big sale that night, and then crosses the street to have a cocktail, his actions reveal the hollowness of his words. Both decisions (speaking and acting) are also plot. Plot proves what character traits merely suggest.1
In other words, plot is not “stuff that happens.” (Note the passive voice, as though plot elements are forced upon the characters.) Rather, plot is a sequence of decisions made—and readers are very interested in a character’s decisions.
To be fair, inaction by a character is a kind of decision. Certainly there’s room for stories about characters who ponder a great deal and do little about it. In successful fiction, though, the final effect of inaction is almost always ironic. (Two good examples are Richard Ford’s “Rock Springs” and Thurber’s “The Secret Life of Walter Mitty.”) The problem is when inaction in literary fiction is treated as sublime.
The inaccurate, watered-down definition of plot-driven fiction—“a story where a lot of stuff happens”—has led to contemporary American literature’s fascination with flabby, low-energy narratives. I’ve met authors proud that the characters in their stories don’t do anything—never get off the couch, never pick up the phone, never make a decision of any consequence. Literary fiction has come to regard passivity as a virtue and action as a vice. A writer crafting a character who takes matters into their own hands risks having their work classified as genre fiction.
For decades now, creative writing programs have been pushing an aesthetic emphasizing character traits over character decisions. It’s frustrating to watch, year after year, the primacy of character-driven fiction getting pushed on young writers, with too many of them accepting the mantra without further consideration.
And this is why I think the Cambrian explosion of dysfunctional narratives is tied to this obsession with character-driven fiction. Passivity and inactivity are keystones of Baxter’s dysfunctional narratives. In his essay, he notes the trend toward “me” stories (“the protagonists…are central characters to whom things happen”) over “I” stories (“the protagonist makes certain decisions and takes some responsibility for them”).
This is why I’m wary of character-driven writers who do not permit their protagonists to make mistakes, instead strategically devising stories where the protagonist is blameless. No wonder plot—that is, decision-making—is being eschewed, when this is the kind of story being upheld and praised.
Aristotle’s Poetics is obviously far more complicated than my three-paragraph summary, but the gist described here holds. ↩︎
Not rethinking realism, as in rethinking philosophy’s single, objective reality, hard as rocks and nails. No, I mean rethinking realism in the sense of questioning the elevation of literary realism over the many other forms of fiction.
Realism has long been the go-to form in literature for telling a story a certain way. An entire literary style—Naturalism—sprang from the sense that Romanticism had gone too far and produced a literature divorced from the world as commonly experienced. The pendulum later swung in the other direction, and for a period of time realistic literature was derided as bourgeois and reactionary. Since World War II, with the rise of creative writing programs and a reinvigorated enforcement of upper-class distinctions, kitchen-table realism has returned to the pinnacle of literary loftiness in America.
So it’s funny to me that realism is also so important in popular entertainment. This is nowhere more true than with television, which is obsessed with depicting reality—from “you are there”-style news reporting to game shows branded as “reality TV.” When the writers of TV’s M*A*S*H killed off Col. Henry Blake in a season finale, they were inundated with letters from outraged viewers. The Emmy award-winning writing team’s response was, “Well, that’s reality.” American auteur Robert Altman famously ends Nashville with an out-of-the-blue assassination of a central character. Why? Because, he explained, that’s reality.
It’s not that these plot points are faulty or wrong-headed. My complaint is that the excuse—“It’s reality”—is a lazy defense of artistic choices. Writers should cop to their decisions rather than take the passive route and say reality made the choice for them. Writers should ask themselves if a “realistic” moment is adding to, or subtracting from, the story.
Anyone who’s attended a creative writing class, workshop, or MFA program is familiar with the high ground presumed by realism. The trendy term is “psychologically realistic fiction.” In writing programs, names like Raymond Carver, Amy Hempel, Tobias Wolff, and Tim O’Brien are tossed out as the zenith of American writing. Students are explicitly encouraged to emulate them, and their importance is implicitly signaled by their repeated presence in syllabi and required-reading lists. (I’ve read “The Things They Carried” at least eight times over the course of decades of writing groups and classes.) These authors are lionized for many reasons, but importantly, they all wrote about reality.
(There are two exceptions worth mentioning: One is magical realism, although its high regard in writing programs is tied up with identity politics. The other is Borges, whom I jokingly refer to as science-fiction for MFA students. It must be noted that both exceptions originate from outside the United States. Kafka, incidentally, is read and praised in writing programs as well, but not in such a way as to encourage emulation—I suspect my instructors liked the idea of Kafka more than Kafka’s output.)
Look at how so much literary fiction operates. Protagonists tend to be thoughtful, rational, and deliberative—often, they exhibit little to no affect. Characters in opposition tend to be boorish, thoughtless, and emotional. Dialogue is either flat and unadorned, or snappy, like the patter of a stand-up comic. Scenes flow as one character uttering a brief line, followed by paragraphs of rumination. The other character responds, and more paragraphs of rumination.
The prose might be good—it might even be inspired—but is this realism? Going through contemporary literary magazines, reading one story after another, I’m not sure one will find a lot of psychological realism, in the sense of psychiatry’s DSM-5.
Genre fiction is not immune either. Too often connoisseurs of hard-boiled detective fiction and tough-guy novels claim their favorite authors are superior because of their attention to realism. Raymond Chandler’s “The Simple Art of Murder” is wonderful and insightful criticism, but at its heart is a trashing of the classic British mystery because “fiction in any form has always intended to be realistic.” It’s one of the few arguments in the essay that I question.
Janet Burroway wrote, “Sometimes reality doesn’t make for good fiction.” It’s a tough lesson to learn, and one that even seasoned writers fail to grasp.
After all, there is no widely accepted maxim stating that the primary purpose of story is to reproduce reality. Fiction is supposed to be an expression of a writer’s inner state, not a dry report of the who, what, where, and when. Besides, why do we need to reproduce reality with such fidelity? We’re soaking in it. If you want reality, put down your phone or leave your computer screen. You have returned to reality, effortlessly.
In a writing class I attended, one of the students was a fan of horror, particularly H. P. Lovecraft and Robert Chambers’ The King in Yellow. At an end-of-semester presentation before the class, he expressed frustration at the hard-realism reading list we’d been given, and of the months of instruction requiring him to write in similar form. “Reading about reality is like reading about your job on your day off,” he told us. There’s something to that.
Story creates a transcendence within the reader. This transcendence defies reality while mimicking it—reality is Play-Doh in the hands of an adept writer. From hard realism to squishy-soft fantasy and everything in-between, great writing takes me to another place and time, a chance to live another person’s life. Books are “portable dreamweavers.”
What if I told you that there’s been a sea-change in American storytelling over the past half-century? Not merely a change in subject matter, but that the fundamental nature of American narratives radically shifted? Would you believe me?
Now, what if I told you that a writer twenty-five years ago described these “new” stories, and even predicted they would become the dominant mode in our future? Would you believe that?
In 1997, Charles Baxter published Burning Down the House, a collection of essays on the state of American literature. It opens with “Dysfunctional Narratives: or, ‘Mistakes were Made,’” a blistering piece of criticism that not only detailed the kinds of stories he was reading back then, but predicted the types of stories we read and tell each other today.
Baxter appropriated the term “dysfunctional narrative” from poet C. K. Williams, but he expounded and expanded upon it so much, it’s fair to say he’s made the term his own. He borrowed a working definition of dysfunctional narratives from novelist Marilynne Robinson, who described this modern mode of writing as a “mean little myth”:
One is born and in passage through childhood suffers some grave harm. Subsequent good fortune is meaningless because of the injury, while subsequent misfortune is highly significant as the consequence of this injury. The work of one’s life is to discover and name the harm one has suffered.
Baxter adds that the source of this injury “can never be expunged.” As for the ultimate meaning of these stories: “The injury is the meaning.”
To claim this mode of writing has become the dominant one in American culture demands proof, or at least some supporting evidence. Baxter lists examples, such as Richard Nixon’s passive-voice gloss over the Watergate cover-up (“mistakes were made”), Jane Smiley’s A Thousand Acres, and conspiracy theories, among others.
“Dysfunctional Narratives” doesn’t succeed by tallying a score, however. Rather, it describes a type of story that sounds all-too-familiar to modern ears:
Reading begins to be understood as a form of personal therapy or political action. In such an atmosphere, already moralized stories are more comforting than stories in which characters are making complex or unwitting mistakes.
Don’t merely consider Baxter’s descriptions in terms of books. News stories, the social media posts scrolling up your daily feed, even the way your best friend goes into how their boss has slighted them at work—all constitute narratives, small or large. Dysfunctional narratives read as if the storyteller’s thumb is heavy on the moral scale—they feel rigged.
It does seem curious that in contemporary America—a place of considerable good fortune and privilege—one of the most favored narrative modes from high to low has to do with disavowals, passivity, and the disarmed protagonist.
(I could go on quoting Baxter’s essay—he’s a quotable essayist—but you should go out and read all of Burning Down the House instead. It’s that good.)
Dysfunctional narratives are a literature of avoidance, a strategic weaving of talking points and selective omissions to block counter-criticism. If that sounds like so much political maneuvering, that’s because it is.
“Mistakes were made”
Let’s start with what dysfunctional narratives are not: They’re not merely stories about dysfunction, as in dysfunctional families or learning disabilities. Yes, a dysfunctional narrative may feature such topics, but that is not what makes it dysfunctional. The term describes how the story is told—the strategies and choices the author has made to tell their story.
Baxter points to Richard Nixon’s “mistakes were made” as the kernel for the dysfunctional narrative in modern America. (He calls Nixon “the spiritual godfather of the contemporary disavowal movement.”) He also holds up conspiracy theories as prototypes:
No one really knows who’s responsible for [the JFK assassination]. One of the signs of a dysfunctional narrative is that we cannot leave it behind, and we cannot put it to rest, because it does not, finally, give us the explanations we need to enclose it. We don’t know who the agent of action is. We don’t even know why it was done.
Recall the tagline for The X-Files, a TV show about the investigation of conspiracy theories: “The truth is out there.” In other words, the show’s stories can’t provide the truth—it’s elsewhere.
More memorably—and more controversially—Baxter also turns his gaze upon Jane Smiley’s A Thousand Acres, which features the use of recovered memories (“not so much out of Zola as Geraldo“) and grows into “an account of conspiracy and memory, sorrow and depression, in which several of the major characters are acting out rather than acting, and doing their best to find someone to blame.”
In a similar vein, a nearly-dysfunctional story would be The Prince of Tides by Pat Conroy. It centers on a family man who, via therapy, digs through memories of a childhood trauma which has paralyzed him emotionally as an adult. He gradually heals, and goes on to repair his relationship with his family. Notably, his elderly father does not remember abusing him years earlier, leaving one wound unhealed.
Another example would be Nathanael West‘s A Cool Million, which follows a clueless naif on a cross-American journey as he’s swindled, robbed, mugged, and framed. By the end, the inventory of body parts he’s lost is like counting the change in your pocket. It might be forgiven as a satire of the American dream, but A Cool Million remains a heavy-handed tale.
This leads to another point: A dysfunctional narrative is not necessarily a poorly told one. The dysfunction is not in the quality of the telling, but something more innate.
A more topical example of a dysfunctional narrative is the story of Aziz Ansari’s first-date accuser. The complaints of just about any politician or pundit who claims they’ve been victimized or deplatformed by their opponents are dysfunctional as well. In almost every case, the stories feature a faultless, passive protagonist being traumatized by the more powerful or the abstract.
There’s one more point about dysfunctional narratives worth making: The problem is not that dysfunctional narratives exist. The problem is the sheer volume of them in our culture, the sense that we’re being flooded—overwhelmed, even—by their numbers. That’s what seems to concern Baxter. It certainly concerns me.
A literature of avoidance
In his essay Ur-Fascism, Umberto Eco offers this diagram:
one    two    three    four
abc    bcd    cde      def
Each column represents a political group or ideology, all distinct, yet possessing many common traits. (Think of different flavors of Communism, or various factions within a political party.) Groups one and two have traits b and c in common, groups two and four have trait d in common, and so on.
Eco points out that “owing to the uninterrupted series of decreasing similarities between one and four, there remains, by a sort of illusory transitivity, a family resemblance between four and one,” even though they do not share any traits. The traits form a chain—there is a common “smell” between the political groups.
Not all dysfunctional narratives are exactly alike, or share the exact same traits as the rest, but they do have a common “smell.” Even if a 9/11 conspiracy theory seems utterly unlike A Cool Million, both may be dysfunctional.
Likewise, in the traits that follow, just because a story doesn’t include all of them doesn’t mean it “avoids dysfunction.” Rather, dysfunctional narratives are built by the storyteller selecting the bricks they need to buttress their message:
A disarmed protagonist
An absent antagonist
Minimal secondary characters
An authorial thumb on the scale
“Pre-moralized”
A vaporous conclusion
Authorial infallibility and restricted interpretations
The most common trait of the dysfunctional narrative is a faultless, passive main character. Baxter calls this the “disarmed protagonist.” Baxter differentiates between “I” stories (“the protagonist makes certain decisions and takes some responsibility for them”) and “me” stories (“the protagonists…are central characters to whom things happen”). Dysfunctional narratives are the “me” stories.
And the errors these “me” characters make—if any—are forgivable, understandable, or forced upon them by dire circumstances. Compare this to the mistakes the people around them make—monstrous, unpardonable sins:
…characters [in stories] are not often permitted to make interesting and intelligent mistakes and then to acknowledge them. The whole idea of the “intelligent mistake,” the importance of the mistake made on impulse, has gone out the window. Or, if fictional characters do make such mistakes, they’re judged immediately and without appeal.
Power dynamics are a cornerstone of all narratives, but one “smell” of the dysfunctional variety is an extraordinary tilting of power against the main character. The system, or even the world, is allied against the protagonist. Close reads of these narratives reveal an authorial thumb on the story’s moral scale, an intuition that the situation has been soured a bit too much in the service of making a point. This scale-tipping may be achieved many ways, but often it requires a surgical omission of detail.
Hence the absent antagonist in so many dysfunctional narratives. A crime in a dysfunctional novel doesn’t require a criminal. All it needs, in Robinson’s words, is for the main character to have endured some great wrong: “The work of one’s life is to discover and name the harm one has suffered.”
Name the harm, not the perpetrator. Why not the perpetrator? Because often there’s no person to name. The harm is a trauma or a memory. The perpetrator may have disappeared long ago, or died, or have utterly forgotten the wrongs they inflicted (as the father does in Prince of Tides). The malefactor may be an abstraction, like capitalism or sexism. But naming an abstraction as the villain does not name anything. It’s like naming narcissism as the cause of an airliner crash. This is by design. Abstractions and missing antagonists don’t have a voice. Even Satan gets to plead his case in Paradise Lost.
No ending is reached in a dysfunctional narrative, because there’s only a trauma, or a memory, or an abstraction to work against. These injuries never heal. Memories may fade, but the past is concrete. By telling the story, the trauma is now recorded and notarized like a deed. “There’s the typical story in which no one is responsible for anything,” Baxter complained in 2012. “Shit happens, that’s all. It’s all about fate, or something. I hate stories like that.” These stories trail off at the end, employing imagery like setting suns or echoes fading off to signify a story that will never conclude.
The most surface criticism of these narratives is that we, the readers, sense we’re being talked down to by the author. “In the absence of any clear moral vision, we get moralizing instead,” Baxter writes. A dysfunctional narrative dog-whistles its morality, and those who cannot decode the whistle are faulted for it. The stories are pre-moralized: The reader is expected to understand beforehand the entirety of the story’s moral universe. For a reader to admit otherwise, or to argue an alternate interpretation, is to risk personal embarrassment or confrontation from those who will not brook dissent.
And making the reader uncomfortable is often the outright goal of the dysfunctional narrative. The writer is the presumed authority; the reader, the presumed student. It’s a retrograde posture, a nagging echo from a lesser-democratic time. (When I read A Brief History of Time, I was most certainly the student—but Hawking admirably never made me feel that way.) Dysfunctional narratives are often combative with the reader; they do not acknowledge the reader’s right to negotiate or question the message. With dysfunctional narratives, it’s difficult to discern if the writer is telling a story or digging a moat around their main character.
“What we have instead is not exactly drama and not exactly therapy,” Baxter writes. “No one is in a position to judge.” A dysfunctional narrative portrays a world with few to no alternatives. A functional narrative explores alternatives. (This is what I mean when I write of fiction as an experiment.)
This is why so many dysfunctional narratives are aligned to the writer’s biography—who can claim to be a better authority on your life, after all? But the moment a reader reads a story, its protagonist is no longer the author’s sole property. The character is now a shared construct. Their decisions may be questioned (hence the passive nature of the protagonists—inaction avoids such judgments). If the author introduces secondary characters, they can’t claim similar authority over them—every additional character is one more attack vector of criticism, a chipping away of absolute authority over the story itself. That’s what happened to sensitivity reader Kosoko Jackson in 2019, whose debut novel was pulped due to questions over his secondary characters.
Of all the traits listed—from the disarmed protagonist to the vaporous conclusion—the trait I find the “smelliest” is authorial infallibility and restricted interpretation. That’s why I used weasel language when I called Prince of Tides “nearly-dysfunctional”: The book is most certainly open to interpretation and questioning. In contrast, questioning a conspiracy theory could get you labeled an unwitting dupe, a useful idiot, or worse.
A Cambrian explosion
What Baxter doesn’t explore fully is why we’ve had this Cambrian explosion of dysfunctional narratives. He speculates on a couple of possibilities, such as these stories coming down to us from our political leadership (like Moses carrying down the stone tablets), or arising as a byproduct of consumerism. I find myself at my most skeptical when his essay stumbles down these side roads.
When Baxter claims these stories arose out of “groups in our time [feeling] confused or powerless…in such a consumerist climate, the perplexed and unhappy don’t know what their lives are telling them,” it seems Baxter is offering a dysfunctional narrative to explain the existence of dysfunctional narratives. He claims these dysfunctional stories are produced by people of “irregular employment and mounting debts.” I strongly doubt this as well. In my experience, such people are not the dominant producers of these narratives. Rather, they are the people who turn to stories for escape and uplift…the very comforts dysfunctional narratives cannot provide, and are not intended to provide.
Rather than point the finger at dead presidents or capitalism, I’m more inclined to ascribe the shift to a handful of changes in our culture.
The term “The Program Era” comes from a book of the same name detailing the postwar rise and influence of creative writing programs in the United States. This democratization of creative writing was not as democratic as once hoped, but it still led to a sharp increase in the number of people writing fiction. Most of those students were drawn from America’s upwardly-striving classes. And, as part of the workshop method used in these programs, it also led to a rise in those people having to sit quietly and listen to their peers criticize their stories, sometimes demolishing them. (Charles Baxter was a creative writing professor and the head of a prominent writing program in the Midwest. Many of his examples in Burning Down the House come from manuscripts he read as an instructor.)
With the expansion of writing programs came a rise in aspiring writers scratching around for powerful subject matter. Topics like trauma and abuse are lodestones when seeking supercharged dramatic stakes. Naturally, these writers also drew from personal biography for easy access to subject matter.
Another reason related to the Program Era is the heavy-handed emphasis on character-driven fiction over plot-driven fiction. I explore this theory here.
Another reason is staring back at you: The World Wide Web has empowered the masses to tell their stories to a global audience. This has created a dynamic where everyone can be a reader, a writer, and a critic, all at the same time.
The natural next step in the evolution of the above is for storytellers to strategize how best to defend their work—to remove any fault in the story’s armor, to buttress it with rearguards and fortifications. (This is different than working hard to produce a high-quality experience, which, in my view, is a better use of time.) And there’s been a shift in why we tell stories: Not necessarily to entertain or enrich, but as an act of therapy or grievance, or to collect “allies” in a climate where you’re either with me or against me. Inaction in fiction has come to be praised as a literary virtue. Stories with characters who take matters into their own hands often are derided as genre fiction.
Pick up a university literary magazine and read it from cover to cover. The “smell” of dysfunctional narratives is awfully similar to the smell of social media jeremiads.
These are not the kind of stories I want to read, but it’s becoming increasingly difficult to distance myself from them. Writers should strive to offer more than a list of grievances or acts of score-settling. If it’s too much to ask stories to explain, then certainly we can expect them to connect dots. Even if the main character does not grow by the last page, we should grow by then, if only a little.
The Illuminerdi (via) reports Apple TV+ is tooling up to produce a streaming adaptation of William Gibson’s cyberpunk masterpiece Neuromancer. The big question Illuminerdi concerns itself with is which actor will play protagonist Case, a drug-abusing hacker hired to pull off a virtual heist in cyberspace.
The story buries the lede. The truly big news is that Neuromancer has a reasonable chance of being adapted to the screen. Apple TV+ may not be the leading force in streaming entertainment today, but it’s established a track record of producing high-quality material and taking some risks along the way. I know I sound like the eternal fanboy when I say this, but, “This time it might be real.”
Neuromancer is a brilliant novel, one of my favorites, and, by my lights, the book that rearranged science fiction. Just as Raymond Chandler did not invent the hard-boiled detective novel, William Gibson did not invent cyberpunk. But both authors took the earlier bricklaying done by themselves and other writers, pulled it all together, and buffed the final result to a chrome-like sheen. There’s science fiction before Neuromancer, and there’s science fiction after Neuromancer.
Hence Neuromancer on film has been a hot topic among science fiction fans since the book was first published in 1984. Every few years over the subsequent decades, news would percolate up that a movie adaptation was in the works, only for the organizers to lose interest, fail to find funding, or simply not get the green light. The Wikipedia section on Neuromancer‘s numerous aborted film adaptations doesn’t do justice to its rocky history. Fake movie trailers have been sewn together; fan-made movie posters have been photoshopped. The rumors, anticipation, and disappointments surrounding the film’s production are legion. (My response to hearing of this latest adaptation attempt: “I’ll believe it when I see it.”)
There were several sidelights along the road to this moment, starting with Johnny Mnemonic in 1996. At first glance, it appeared the perfect aperitif for Neuromancer fans: Mnemonic was an adaptation of a Gibson short story set in the same story universe. The film landed flat, though, and is pretty grating to watch. (Some call it a cult classic—I can’t tell if they’re being ironic or not.) Keanu Reeves turned in a cold performance (which he claims was intentional) within a confounding and bizarrely campy narrative. Some say Mnemonic was underfunded. Gibson said it was overfunded. Even if the studio execs were clueless in their meddling—not a stretch to imagine—I still think postmodernist director Robert Longo was simply in over his head.
(That said, I’ve not seen the new re-edit Johnny Mnemonic: In Black and White, so I’ll reserve judgment whether the film is irredeemable. I admit: The stills look damn promising.)
It took The Matrix (1999) to give hungry cyberpunks the cinematic meal they were waiting for. There are so many parallels between it and Neuromancer, you can’t help but think the writing/directing Wachowskis owe Gibson a pitcher of beer (if not a brewery). But Darren Aronofsky (Pi, Requiem for a Dream) was on to something when, after viewing the film, he claimed “Cyberpunk? Done.” By using up Neuromancer‘s best devices, as well as every philosophical question explored by Philip K. Dick, the Wachowskis came close to shutting the door on the most interesting development in genre fiction since the 1930s. The banality and repetitiousness of the next three Matrix films—including 2021’s Resurrections, which I held a sliver of hope for—only seemed to cement Aronofsky’s point.
(Cyberpunk’s heyday in the 1990s has passed, but neo-cyberpunk lives. The new breed exists where a worldwide computer network is no longer an imagined future, but a concrete element of the story’s past.)
I’m perennially suspicious of Hollywood adapting books to the screen, especially science fiction. Too often screenwriters will ditch the most memorable and gripping parts of the source material to slide in Tinseltown’s tired narrative shorthand. Amazon’s The Man in the High Castle leaps to mind. I’ve not seen the recent adaptation of Foundation, but at least one reviewer thinks Asimov’s classic hasn’t actually been adapted. Still, Illuminerdi reports William Gibson is signed on as an executive producer for Neuromancer. That gives me a touch more confidence in the direction of the project.
But only a touch. In 2015, I wrote how Hollywood has abandoned “‘tight, gapless screenwriting’ to scripts focused on world-building, sequels, expansion, rebooting.” That was written at a time when superhero franchises were claiming greater real estate at the cineplexes, and Hollywood had finished converting Tolkien’s charming tale about wee folk into an eight-hour epic-action trilogy. Cinema houses still ruled back then, but theater owners sensed a violent upheaval coming on, like a sneeze. Today, streaming services are the premier way to deliver movies to eager audiences. And that’s what worries me the most.
My dread is not that this cyberpunk classic will be adapted to television instead of the silver screen—it’s to see it adapted to a medium that expects seasons and episodes. As with High Castle and Foundation, the streaming services love season-long episodic television: All the better for binge-watching.
Episodic television ushers in the narrative shorthand that Neuromancer absolutely does not need: every hour ending on a contrived cliffhanger; the sexual tension of when-will-they-hook-up; the let-down of the couple separating (complete with the trite break-up language of television: “I need some space” or “This is going too fast”); and so on.
The Foundation reviewer I mentioned above describes exactly this dynamic:

Even if you’re coming in without having read a page of Asimov, you’ll still notice the drawn-out plots that go nowhere, the padding, and the weird choices the show has the characters make to keep the plot from moving forward. Cheap, nonsensical melodrama fills the series…The show also wants to have pew-pew laser battles and ship fights and spacewalk mishaps and junk, none of which offer anything you haven’t seen before, and are usually used to just run out the clock anyway.
He makes this sharp observation:
Then there’s the show’s terror that people might not make certain connections, so it shows something, has the character comment on it to themself, and then maybe throws in a flashback to someone saying something relevant even if it was said three minutes prior.
This comes from television writing 101: “Tell them what they’re going to see, show it to them, and then tell them what they saw.” If that sounds like how to organize a PowerPoint presentation, you’re right. It’s also why television writing in 2022 remains hard-wired to the narrative structures of I Love Lucy.
Just as Gibson’s console jockeys rewired systems to hijack signal broadcasts and repurposed wet-tech to bore holes through firewalls, let’s hope modern streaming technology is bent to Neuromancer‘s whims, and not vice-versa.
Addendum: One of the criticisms I’ve received, here and elsewhere, is that Neuromancer cannot properly be condensed into a two-hour movie, hence a series is a better fit for its adaptation.
I agree a multi-part show is appropriate for Neuromancer‘s intricate story line. I loathe condensing Neuromancer into a ninety-minute film almost as much as I loathe seeing Neuromancer: Season Two on my TV screen. However, when I originally wrote the above post, I kept fishing around for a good example of a multi-episode streaming series (for illustrative purposes), and failed to locate one.
This morning I recalled The People v. O. J. Simpson: American Crime Story (which started life on FX and moved to Netflix). Its miniseries format would work well for Neuromancer. Each segment builds the story and develops characters toward a conclusion, like chapters in a novel. There’s a beginning, a middle, and a door-closing end.
My gripe is that Apple TV+ may attempt to “episodize” Neuromancer, making it more like a soap opera or a recurring show than a single story told a chapter at a time. This is what happened to Man in the High Castle—which was more “inspired by” than a retelling of the source material—and what appears to have happened to Foundation.
Recently I picked up Conversations with Kurt Vonnegut, part of the Literary Conversations Series from University Press of Mississippi. The collection offers interviews and profiles of Vonnegut published between 1969 and 1999. The first comes shortly after the publication of Slaughterhouse-Five. The subsequent rocket ride of literary stardom Vonnegut enjoyed—or endured—follows.
The collection seems rather complete, culled from all manner of sources, right down to a softball Q&A with Harry Reasoner for 60 Minutes. The collection is breezy if thought-provoking reading, much like many of Vonnegut’s books, but it still held a few surprises for me. (Apparently after the success of Slaughterhouse-Five, Vonnegut contemplated throwing out Breakfast of Champions when he realized he could now sell any book he wrote no matter its quality.)
The more I learn about Vonnegut, the more I’ve come to see how pragmatic he was when it came to the craft of writing. Vonnegut often lists Robert Louis Stevenson as one of his favorite authors because, as a boy, he was “excited by stories which were well-made. Real ‘story’ stories…with a beginning, middle, and end.” His essay “How to Write With Style” is advice of the roll-up-your-sleeves variety, featuring watery chestnuts like “Find a subject you care about” and “Keep it simple.” More interestingly, while teaching at the Iowa Writers’ Workshop, he led a course to help students make a career out of writing after graduating—teaching, technical writing, ad copy, anything to put bread on the table. Apparently the course was not well-regarded by the other faculty.
One popular meme is Vonnegut’s lecture on the shape of stories. The audience chortles as he chalks out curves and lines graphing a set of basic story structures. (Maya Eliam’s infographics of these shapes are lucid and wonderful.) Most likely many in the auditorium thought he was being satirical when he said story forms could be graphed mathematically or analyzed by a computer, but his lecture is in earnest. This was his master’s thesis in anthropology, after all.
In a 1977 interview with The Paris Review—the most in-depth interview in the collection—Vonnegut drops a mention of his story shapes:
Vonnegut: Somebody gets into trouble, and then gets out again; somebody loses something and gets it back; somebody is wronged and gets revenge; Cinderella; somebody hits the skids and just goes down, down, down; people fall in love with each other, and a lot of other people get in the way…
Interviewer: If you will pardon my saying so, these are very old-fashioned plots.
Vonnegut: I guarantee you that no modern story scheme, even plotlessness, will give a reader genuine satisfaction, unless one of those old fashioned plots is smuggled in somewhere. I don’t praise plots as accurate representations of life, but as ways to keep readers reading. … When you exclude plot, when you exclude anyone’s wanting anything, you exclude the reader, which is a mean-spirited thing to do.
Vonnegut even compared writing novels to experiments, which I’ve explored myself. He felt experimentation was in his nature due to his education as a chemist and an engineer. (I believe this is the first time I’ve read another fiction writer describe creating fiction as a kind of experiment.) Here he talks with Laurie Clancy about Breakfast of Champions (still unpublished at this point):
Interviewer: Could you indicate what direction your new work is taking?
Vonnegut: It’s in the nature of an experiment. I don’t know how it’s going to come out or what the meaning’s going to be—but I’ve set up a situation where there’s only one person in the whole universe who has free will, who has to decide what to do next and why, has to wonder what’s really going on and what he’s supposed to do. … What the implications of this are I don’t know but I’m running off the experiment now. I’ll somehow have a conclusion when I’ve worked long enough on the book. … Regarding [God Bless You, Mr. Rosewater], I said to myself “Well, all right, what happens when you give poor people money?” So I ran the experiment off and tried to control it as responsibly as I could.
The Clancy interview is one of the best in the book. Vonnegut is engaged, thoughtful, and revelatory.
I’ve been dipping into Wayne L. Johnson’s 1980 book Ray Bradbury the past couple of months. It’s part of the Recognitions series published by Frederick Ungar, a series featuring critical work on genre writers who’ve transcended their genre.
Johnson’s Ray Bradbury is a biography of the author tracked through his output rather than a stiff-backed recounting of dates and locations of events in his life. Bradbury’s short stories are grouped by subject matter and style as a strategy for analyzing the author’s approach to fiction. Johnson’s book paints a picture of a man who delved deep into the human imagination and returned with some fantastic stories for the ages.
Ray Bradbury was one of the most prolific short story authors of the 20th century because he never abandoned the form, unlike other authors who move on from it to novel writing. Bradbury capitalized on his bounty by disguising his short story collections as longer work (The Martian Chronicles, The Illustrated Man). Even Fahrenheit 451 is itself a maturation of a shorter work first published in Galaxy Magazine.
What caught my eye (and sparked the idea for this blog post) was a brief aside in Johnson’s introduction about how Bradbury was able to sell his prodigious output of short stories across the spectrum of American publishing:
Convinced that most editors were bored with seeing the same sort of material arriving day after day, Bradbury resolved to submit stories which, at least on the face of it, seemed inappropriate to the publication involved. Rather than send “Dandelion Wine” (later a chapter in the novel) to Collier’s or Mademoiselle, therefore, Bradbury sent it to Gourmet, which didn’t publish fiction. It was immediately accepted. “The Kilimanjaro Device” was snapped up by Life, which also didn’t publish fiction, after the story had been rejected by most of the big fiction magazines. … Bradbury insists that he places complete faith in his loves and intuitions to see him through.
Bradbury was certainly a known quantity when these short stories were published but, as Johnson indicates, he still faced his share of rejection slips. I don’t think Bradbury’s wanton submissions were ignorant of market conditions; it sounds to me he was quite savvy with this strategy. (Sending “Dandelion Wine” to Gourmet magazine is kind of genius, actually.) But Bradbury’s strategy transcends the usual mantra to “study the market.”
I’ve been a front-line slush pile reader at a few literary magazines, and I can tell you Bradbury’s intuition is spot-on. When you’re cycling through a stack of manuscripts, they begin to look and read the same. Too many of those short stories were treading familiar paths. Too often they introduced characters awfully similar to those in the last story from the pile.
A story with some fresh air in it certainly would wake me from my slush-pile stupor. The magazine market has changed dramatically in the past ten years—and absolutely has reinvented itself since Bradbury was publishing “Dandelion Wine”—but I imagine similar dynamics are still in place in the 21st century. Surprise an editor with your story and you just might have a shot at publication.
And if you’re banging out short stories and fruitlessly submitting them one after another to the usual suspects, try taking a risk and following Bradbury’s lead. Trust me, if you can put on your next cover letter that your short fiction was published by Car & Driver or National Geographic, that will surprise editors too.
One of my complaints about literary magazines—both the small lit mags of university English departments and the literary lions like New Yorker, Tin House, and so forth—is the peculiar absence of up-to-date technology in their fiction. Characters don’t send much email. People rarely text each other. Voicemail is about the most modern of the Information Age conveniences in contemporary literature, and even then, it’s usually summarized by the narrator rather than “heard” by the reader. Why?
It’s no longer cyberpunk for your characters to have instant access to cyberspace in their coat pocket. It’s not science fiction for your character to read the morning news on a handheld view screen. Literary fiction often preens itself as being “realistic” compared to genre fiction, but how realistic is it today for a mother of two in Long Island not to have a 4G touch tablet in her purse or a FitBit on her wrist reminding her she’s not burned enough calories today?
Unless it’s set in the past or some truly remote locale, you forfeit your right to call your story a work of realism if your characters don’t have access to the Internet and they’re not using it fairly regularly. Digital access is simply that pervasive, worldwide. Yes, there are exceptions. I’m certain some writers think their characters or their settings are those exceptions. Probably not, though.
One reason for technology’s absence in literary fiction, I suspect, is that modern tech screws with storytelling. As greater minds than mine have pointed out, we live in an age bereft of bar bets. The Guinness Book of World Records was originally conceived to settle pub arguments, but it was Wikipedia that ultimately fulfilled that burning need. Any question we have about the world or history, the answer can be located in an instant.
It carries into personal relationships as well. People no longer craft letters and post them in a box, then anxiously await a reply over the next days or weeks. When I was young, a friend might say he would call at eight—and so I would have to wait by the phone in the kitchen at eight o’clock, telling everyone else in the house not to make a call because I was waiting for one. My parents would wake my brother and me up in the middle of the night to say hello to our Midwestern relatives because the long-distance rates dropped after 11pm. (Remember paying a premium for long distance calls?) For years, many of my extended family members were nothing more than a tinny voice at the other end of a phone line and a yellowing Kodachrome print in my mother’s photo albums.
For all the nostalgia value of these tales, I’m happy to no longer be bound by such ancient analog technology. The key word of modern communications is instant. Unfortunately, such friction-free gratification often runs counter to a lot of storytelling precepts, things like tension (which involves time) and desire (which involves immediacy).
But mostly I suspect the writers of contemporary literature simply don’t like modern tech. Receiving a pop-up on your phone for an email explaining a long-forgotten lover’s death lacks a certain airy elegance that a hand-penned note on hospital letterhead offers. The terseness of SMS and instant messaging grates against the literary author’s desires for eloquence and nuance.
More broadly, there’s a general disdain for popular American culture in our contemporary literature. SUVs and dining at Olive Garden are often literary code words for boorish, crass people. Sympathetic characters tend to read the New York Times on Sunday mornings, walk to work, raise a vegetable garden, and run into friends at farmers’ markets.
This is one reason why I don’t buy the assertion that contemporary American literature is realistic. Too often it presents a world the writer (and their readers) would like to live in. That’s not hard realism. And this restrictive view of proper living feeds back on itself: literary magazines print these stories, developing writers read these stories and think they represent “correct” fiction, and so they write and submit likewise.
Give your characters the technology they deserve. If you’re writing about the past, that’s one thing, but if your story is set in modern times, don’t shortchange your characters’ resources.
Instead of viewing commonplace technology as a liability to storytelling, consider how vital the technology has become for us. Watch this magic trick, from Penn & Teller’s Fool Us:
The audience feels the risks the emcee is taking when instructed to place his own phone in an envelope. The surprise when the mallet is brought out, the tension it raises. Look at the audience’s visceral reaction when the mobile phones are smashed. Even though Penn & Teller see through the act, there’s a kind of narrative structure to the magician’s “story.” At each step of the act, the stakes are raised.
Do this: The next time you’re out with a group (people you know and people you’ve just been introduced to), pull up a photo or a message on your smart phone, and then hand your phone to someone else. (Or, if someone offers you their phone, take it, twiddle with it, and hand it to another person.) Rare is the person comfortable with this. We don’t like these little things leaving our grasp.
That means, as writers, these devices are a goldmine.
We are wed to our new conveniences in ways we never were with “old” modern technology like microwaves, refrigerators, or even automobiles. Americans may love their cars, but they are married to their smart phones. Our mobile devices are lock-boxes of email and text messages, safe deposit boxes of our secrets and our genuine desires (versus the ones we signal to our friends and followers). Gossipy emails, intimate address books, bank accounts, baby pictures, lovers and lusts—our lives are secreted inside modern technology. This is rich soil for a writer to churn up, this confluence of personal power and emotional vulnerability.
Why dismiss or ignore this? Why not take advantage of it in your next story?