Ten years of blogging: Waiting for Neuro

Fan-made movie poster for William Gibson’s Neuromancer.

Previously: A unique manifesto

In this series reviewing my last ten years of blogging, I tend to focus on posts that either made some waves or whose subjects I want to take up again. The 2022 post I want to return to has both qualities. It concerns William Gibson’s cyberpunk classic Neuromancer.

Back then, I wrote about the Waiting for Godot-like wait the book’s fans have endured in anticipation of a movie adaptation. Rumors and announcements have come and gone over the last four decades: directors taking on the project, studios arranging funding, big-name actors being tapped, and so on. Joshua Hull, author of Underexposed!: The 50 Greatest Movies Never Made, calls the elusive Neuromancer movie “the white whale for a number of filmmakers throughout the years.”

It was yet another rumor in 2022 of an Apple TV+ adaptation that led me to ask “Will we finally see Neuromancer on the screen?” As usual, nothing came of it.

At this rate, we may reach the year Neuromancer is set—some time in the 2030s—before a movie version is produced.

Fan-made “teaser” movie poster: “The future is already here”

Whatever else William Gibson will be remembered for, his fiction has almost always centered on the proles and lumpen claiming technology and repurposing it for their own ends. Gibson put the punk into cyberpunk. As I wrote in my 2023 follow-up,

[Gibson] took computers out of the realm of men in lab coats standing over coffin-sized boxes in dust-free rooms. He put tech on the street, in the pockets of skate punks and the ears of all-night sushi line cooks. … Instead of an obscure nerd subculture, he gave exotic tech to everyone, even folks sleeping on mattresses on rain-soaked streets. Neuromancer is a book set during a perpetual war between the haves and have-nots, and the battlefield is cyberspace.

So it’s delicious how this has folded back on itself. Today’s video and photo technology has been claimed and repurposed to craft hints of a film version of Neuromancer. As exhibited here and in the 2022/2023 posts, the proliferation of fan-made Neuromancer film posters is evidence of what can be done with some Photoshop skills and a DeviantArt account. Likewise, as the fake movie trailer below proves, it’s now possible to assemble, score, and distribute worldwide a cinematic amuse-gueule of what a movie might look like.

This leads to a tantalizing question: What if someone simply made their own version of Neuromancer? Not a movie trailer, but the movie itself?

When Neuromancer was published forty years ago, the suggestion would have been ludicrous. The indie film scene was only starting to find its feet. Movies like David Lynch’s Eraserhead (1977) and the Coen Brothers’ Blood Simple (1984) were some of the earliest to gain traction outside of college-town theaters. Robert Townsend’s Hollywood Shuffle (1987) was the first film I knew of to be financed on credit cards.

As in Gibson’s Sprawl, today’s film-making technology is fast, cheap, and out of control. A new breed of auteurs exists, capable of shooting and editing movies with consumer electronics—even filming on iPhones. Distribution is as easy as clicking an “Upload” button.

Unlike most science-fiction movies, with their high production costs, Neuromancer could conceivably be made with far less in the way of costumes, set design, and special effects. Most of the book takes place in locales like neon-saturated Chiba and modern luxury hotels, locations that could be reproduced in many cities in North America. The “future tech” of Gibson’s Sprawl is off-the-shelf these days. Data gloves? Check. Virtual reality headset? Check. That leaves the bulk of the special effects to his vision of cyberspace, which an experienced computer animator could flesh out.

Perhaps I’m lowballing the effort required. Even if the production design is within reach, the make-or-break factor will be the acting. The power of Neuromancer lies in its characters and human drama, not in splashy CGI. Get good actors, and be ruthless in cost-cutting the rest of the production.

Given all this, perhaps the question to ask is: Why hasn’t an amateur fan production been mounted?

Fan-made movie poster for Neuromancer: “Jack in soon”

The obvious hang-up will be copyright law. But the dreaded DMCA take-down notice shouldn’t be feared too quickly. Others have worked around this in clever ways. Three high school friends recreated Raiders of the Lost Ark, shot-for-shot, in the 1980s, and released it wide with the blessings of George Lucas and Steven Spielberg. Charles Ross performs a “One Man Star Wars Trilogy” stage show, doing all the voice lines and sound effects, again with the permission of Lucasfilm. Innumerable Star Trek fan films, from shoddy to surprisingly worthy, abound.

In the 2000s, an acquaintance of mine in the theater business gave me his take on Charles Ross’ “One Man Star Wars” shows. He said Ross started performing the act in his living room for friends and family, and then at local theaters for low pay, until he was filling city auditoriums. He didn’t approach Lucasfilm until he had some success under his belt. He pleaded that his show was an act of love for the source material, and Lucas relented.

In other words: Start low to the ground, work your way up, and ask forgiveness after the fact rather than beg permission to proceed.

A fan-made Neuromancer film could gingerly tiptoe across the finish line with a similar strategy—and maybe even get away with it. If the movie were a labor of love, and not an easy-money cash grab, who knows? I have to believe the game of red-light/green-light Gibson has played with Hollywood over the decades has soured him on the studios, not to mention his rejected script for an Aliens sequel and all the trouble the studios made for Johnny Mnemonic.

Stephen King had a “Dollar Deal” program where he permitted filmmakers to adapt any of his short stories for one dollar, provided they (a) sent him a copy of the final product, and (b) did not exhibit it commercially without his permission. (To get an idea of its success, The Shawshank Redemption began as a Dollar Deal.) Lucas made similar agreements with the Raiders kids and Charles Ross.

Fan-made Neuromancer movie poster: “Hack Into a New Reality / December 2031”

Earlier I wondered if I was lowballing the production effort. I may be lowballing the copyright issue twice over. Thanks to a lawsuit by CBS/Paramount against the fan production of Star Trek: Axanar, the heyday of Star Trek fan films appears to be behind us. Stephen King concluded his “Dollar Deal” program at the end of 2023, with no stated reason. My guess is that the proliferation of studio streaming services—the “plusses” like MGM+, Disney+, and Paramount+—has unlocked a new revenue stream for old intellectual property, and the copyright lawyers are battening down the hatches.

Still, there is no extant movie or TV version of Neuromancer to compete against. Who knows? An under-the-radar fan production might stand a chance, if done with care and passion for the source material.

Will we finally see Neuromancer on the screen?

Ten years of blogging: A unique manifesto

Cover of The Atlantic magazine, July/August 2001
The Atlantic magazine, July/August 2001. This issue featured the first installment of B. R. Myers’ original “A Reader’s Manifesto”

Previously: Flaubertian three-dimensionalism
Next: Waiting for Neuro

My favorite blog post for 2021 would have to be my review of B. R. Myers’ A Reader’s Manifesto, a book of literary criticism with a remarkable life: It started as a different book, was published as a two-part essay in The Atlantic, and only then appeared in book form. (The history of its repeated rebirth is a story unto itself, and something I spend a little time digging into.)

Myers’ original 2001 essay sparked a debate over the state of American literature that was singular in my lifetime. His thesis was that American literature had grown pretentious, stunted, and dull. It’s wild to think there was a moment in recent American history when a good chunk of the public was discussing literature, contemporary or otherwise.

I did a bit of extra research for my post, digging into online newspaper and magazine archives to locate contemporaneous articles. (As I wrote, this was “a time when most people had their news delivered to them via bicycle.”) I also put the review through several extra editing cycles, in a lawyerly attempt to make certain everything on the page was either supported elsewhere or comported exactly with my personal viewpoint. There’s an admirable cleanness to Myers’ prose that I wanted to emulate.

Cover of A Reader's Manifesto by B.R. Myers

As an example of Manifesto‘s continuing relevance twenty years later, here’s a paragraph from the acclaimed 2021 novel Leave the World Behind:

The store was frigid, brightly lit, wide-aisled. She bought yogurt and blueberries. She bought sliced turkey, whole-grain bread, that pebbly mud-colored mustard, and mayonnaise. She bought potato chips and tortilla chips and jarred salsa full of cilantro, even though Archie refused to eat cilantro. She bought organic hot dogs and inexpensive buns and the same ketchup everyone bought. She bought cold, hard lemons and seltzer and Tito’s vodka and two bottles of nine-dollar red wine. She bought dried spaghetti and salted butter and a head of garlic. She bought thick-cut bacon and a two-pound bag of flour and twelve-dollar maple syrup in a faceted glass bottle like a tacky perfume. She bought a pound of ground coffee, so potent she could smell it through the vacuum seal, and size 4 coffee filters made of recycled paper. If you care? She cared! She bought a three-pack of paper towels, and a spray-on sunscreen, and aloe, because the children had inherited their father’s pale skin. She bought those fancy crackers you put out when there were guests, and Ritz crackers, which everyone liked best, and crumbly white cheddar cheese and extra-garlicky hummus and an unsliced hard salami and those carrots that are tumbled around until they’re the size of a child’s fingers. She bought packages of cookies from Pepperidge Farm and three pints of Ben & Jerry’s politically virtuous ice cream and a Duncan Hines boxed mix for a yellow cake and a Duncan Hines tub of chocolate frosting with a red plastic lid, because parenthood had taught her that on a vacation’s inevitable rainy day you could while away an hour by baking a boxed cake. She bought two tumescent zucchini, a bag of snap peas, a bouquet of curling kale so green it was almost black. She bought a bottle of olive oil and a box of Entenmann’s crumb-topped doughnuts, a bunch of bananas and a bag of white nectarines and two plastic packages of strawberries, a dozen brown eggs, a plastic box of prewashed spinach, a plastic container of olives, some heirloom tomatoes wrapped in crinkling cellophane, marbled green and shocking orange. She bought three pounds of ground beef and two packages of hamburger buns, their bottoms dusty with flour, and a jar of locally made pickles. She bought four avocados and three limes and a sandy bundle of cilantro even though Archie refused to eat cilantro. It was more than two hundred dollars, but never mind.

If you skipped ahead, go back and read it again. After all, Leave the World Behind won over twenty “Book of the Year” awards.

Myers called out these “skim-friendly” lists as indicators of “a tale of Life in Consumerland, full of heavy irony, trite musing about advertising and materialism, and long, long lists of consumer artifacts, all dedicated to the proposition that America is a wasteland of stupefied shoppers.” He was speaking about White Noise back then. He could very well be speaking about Leave the World Behind today.

When I first read Manifesto in 2001, and again in 2021, my view was that Myers should be taken more seriously than some would have us believe. I remain convinced. Then and today, literary readers are pressed into accepting an author as a modern-day genius by dint of their credentials, their identity, their subject matter, or their elite supporters in the press. Myers’ Manifesto is more than a work of literary criticism; it’s a work of media criticism—he’s also taking aim at the literary gatekeepers and taste-makers perched in positions of cultural privilege.

In 2001, B. R. Myers asked an emperor-has-no-clothes question: Instead of rewarding authors because of who they are or who they know, why not judge their books by story, language, themes, and meanings? That question caused a big fuss back then. It turns out, it still can, if you ask it in the right company.

Twenty Years Later: B. R. Myers, A Reader’s Manifesto

Character-driven fiction, plot-driven fiction

Charles Baxter

Last year I wrote about dysfunctional narratives, a type of story that Charles Baxter first identified in the 1990s and which seems even more prevalent today. He quoted a description of them by Marilynne Robinson, who had identified the same type of narrative. She called it a “mean little myth”:

One is born and in passage through childhood suffers some grave harm. Subsequent good fortune is meaningless because of the injury, while subsequent misfortune is highly significant as the consequence of this injury. The work of one’s life is to discover and name the harm one has suffered.

In my post, I wrote about a “Cambrian explosion” of dysfunctional narratives in our culture since the 1990s, this sense that we’re being overwhelmed by them. They’re in our magazines and books, in our cinema, in our newspapers, and on social media. “Reading begins to be understood as a form of personal therapy or political action,” Baxter wrote, and his observation seems as acute today as it did back then.

Last year I offered a few explanations for what energized this explosion. Recently I thought of another reason to add to the list. It’s a concept repeated endlessly in creative writing classes and how-to guides on writing fiction, namely, character-driven fiction versus plot-driven fiction. Respectable authors are supposed to write character-driven fiction and to eschew plot-driven fiction, which is largely associated with genre fiction.

When I first heard this edict of character versus plot, I accepted it as sage wisdom, and sought to follow it closely. Over the years, I kept hearing it from instructors and successful writers, especially writers of so-called literary fiction. I heard it so much, I began to question it. What exactly is character? What is plot?

I began to pose these questions to my peers. Their response usually sounded like this:

“‘Character’ is all the things that make a character unique. ‘Plot’ is the stuff that happens in a story.” A character-driven story is supposedly rich with humanizing details, while a plot-driven piece is a fluffy story where “a lot of stuff happens.”

Aristotle is not the final word on literary analysis, but his opinions on how a story succeeds or fails are far more nuanced than what many of my peers and instructors in creative writing programs could offer.

Aristotle defines character as a set of human traits imitated in the text. Traits could be run-of-the-mill personality markers, such as a character who is studious or arrogant, or complex and contradictory, like Hamlet’s brooding and questioning nature. Before modern times, playwrights often used traits associated with the four humors to define characters in a play.

The four humors

For Aristotle, plot is the series of decisions a character makes that propels the story forward. These decisions generally take two forms: The character speaks, or the character acts. In line with the saying “actions speak louder than words,” Aristotle holds that a character’s actions are more significant, and more revealing, than the words they mouth.

When one of the salesmen in Glengarry Glen Ross announces he’s going to close a big sale that night, and then crosses the street to have a cocktail, his actions reveal the hollowness of his words. Both decisions (speaking and acting) are also plot. Plot proves what character traits merely suggest.1

In other words, plot is not “stuff that happens.” (Note the passive voice, as though plot elements are forced upon the characters.) Rather, plot is a sequence of decisions made—and readers are very interested in a character’s decisions.

To be fair, inaction by a character is a kind of decision. Certainly there’s room for stories about characters who ponder a great deal and do little about it. In successful fiction, though, the final effect of inaction is almost always ironic. (Two good examples are Richard Ford’s “Rock Springs” and Thurber’s “The Secret Life of Walter Mitty.”) The problem is when inaction in literary fiction is treated as sublime.

The inaccurate, watered-down definition of plot-driven fiction—”A story where a lot of stuff happens”—has led to contemporary American literature’s fascination with flabby, low-energy narratives. I’ve met authors proud that the characters in their stories don’t do anything—never get off the couch, never pick up the phone, never make a decision of any consequence. Literary fiction has come to regard passivity as a virtue and action as a vice. A writer crafting a character who takes matters into their own hands risks having their work classified as genre fiction.

For decades now, creative writing programs have been pushing an aesthetic emphasizing character traits over character decisions. It’s frustrating to watch, year after year, the primacy of character-driven fiction getting pushed on young writers, with too many of them accepting the mantra without further consideration.

And this is why I think the Cambrian explosion of dysfunctional narratives is tied to this obsession with character-driven fiction. Passivity and inactivity are keystones of Baxter’s dysfunctional narratives. In his essay, he notes the trend toward “me” stories (“the protagonists…are central characters to whom things happen”) over “I” stories (“the protagonist makes certain decisions and takes some responsibility for them”).

This is why I’m wary of character-driven writers who do not permit their protagonists to make mistakes, instead strategically devising stories in which the protagonist can do no wrong and is therefore blameless. No wonder plot—that is, decision-making—is being eschewed, when this is the kind of story being upheld and praised.

  1. Aristotle’s Poetics is obviously far more complicated than my three-paragraph summary, but the gist described here holds. ↩︎

Ten years of blogging: Flaubertian three-dimensionalism

Flannery O’Connor

Previously: Writer’s block
Next: A unique manifesto

The year that was 2020 will most likely go down as one of the most significant years of my life: The COVID-19 pandemic, lock-downs and masking, the murder of George Floyd and the ensuing riots, all leading up to the most contentious presidential election in memory, one that some still deny was properly tabulated.

In contrast, 2019 had been for me a rather productive year creatively, and I wound up publishing two novels in 2020 back-to-back: Stranger Son in April, followed by In My Memory Locked in June.

That aside, as 2020 trudged onward and the pandemic fevered on, it grew apparent normalcy would not make an appearance any time soon. I began to suffer a low-grade depression, a toothy rat gnawing at the ankles of my mental health. I needed to do something creative to keep a hold on my fragile state.

I made a personal goal of putting out a compact book—my previous two were unusually lengthy for me, with In My Memory Locked clocking in at 120,000 words. I had been binging on streamed movies (and who didn’t that year?). Viewing the masterful The Day of the Jackal motivated me to pick up Frederick Forsyth’s original novel, which I learned was inspired by his tenure as a journalist in Paris reporting on the assassination attempts on Charles de Gaulle’s life.

I committed myself to write a taut thriller about the pandemic and lock-downs, short and sweet, with as little fat as possible, and saturated with paranoia and claustrophobia. The result was Man in the Middle, published in November 2020 and my most overlooked book. I’m proud of it, though, especially considering the conditions I was working under. I also believe it to be the first novel published expressly about the COVID-19 pandemic—but I cannot prove that.

As for blogging in 2020, I put out a number of short series which garnered some interest. At the start of the year, I did a mini-series on Dungeons & Dragons, including my take on Gary Gygax’s Appendix N, the list of book recommendations he included in the first AD&D Dungeon Master’s Guide. Another series took a look at Hollywood novels, which gave me a chance to write on a few books I’ve been meaning to cover for some time, including The Day of the Locust and They Shoot Horses, Don’t They?

Gustave Flaubert

But the post I’m most proud of from 2020 regarded a bit of writing advice I’ve heard on and off for years now in writing groups and at writing conferences: “Use three senses to make a scene come alive.” Invariably, this advice is attributed to Gustave Flaubert.

As far as writing lore goes, this one is rather economical in expression. It’s also not altogether obvious why it’s true. Why three senses, and not four or five, or even two? The resulting blog post was satisfying to write because investigating the origins of this saying led naturally to explaining why it appears to be true.

There appears to be no evidence Flaubert ever made this statement, at least not in such a direct manner. Rather, the textual evidence is that it originated with Flannery O’Connor, who in turn was summarizing an observation made by her mentor, Caroline Gordon.

Now, I’ve read many of Flannery O’Connor’s short stories—anyone who’s taken a few creative writing classes will eventually read “A Good Man is Hard to Find,” her most anthologized work. I had never read anything by Caroline Gordon, however, so it was fascinating to delve briefly into her work.

It’s a shame Gordon is not more widely read today. It’s probably due to her work not taking the tangents and experiments that other American modernists risked (such as Faulkner and Jean Toomer). She remained a formalist to the end. Her How to Read a Novel is an enlightening book and, while a tad dated, would make fine reading for anyone serious about writing a full-bodied, red-blooded novel.

Mostly, my pride for “Use three senses to make a scene come alive” is that it’s a solid essay: It starts out with an interesting question that leads to more questions, takes a couple of detours and unexpected side-roads on its journey, and ends on a note of successful discovery. It’s about all I can aspire to when I sit down to write.

“Use three senses to make a scene come alive”

Ten years of blogging: Writer’s block

John Turturro in Barton Fink

Previously: An all-too-familiar utopia
Next: Flaubertian three-dimensionalism

From a novel-writing perspective, 2018 and 2019 were a creative interregnum. After publishing Hagar’s Mother in late 2017, I found myself juggling energy between two books. One was the third installment of the Bridge Daughter series, the other a futuristic detective novel where society has essentially become a giant social media simulation. While I worked on the former, 2018 fizzled away with a fearful lack of progress. As 2019 marched on, a slow panic developed inside me. Would I burn off a second year with nothing to show for it?

I learned a hard lesson: Writer’s block is real. Before this, I’d read articles by well-known writers who either denied it existed, or called it a semi-phony condition covering for laziness. The cure for supposed writer’s block, they explained, was to turn off your Internet, silence your phone, and write.

The early chapters of the Bridge Daughter sequel emerged in fits and spurts. Like a teenager learning how to drive a stick shift, I couldn’t find second gear and launch the story forward. Eventually I admitted that I’d hit something like writer’s block. I recalled what the Coen Brothers did when they were blocked developing Miller’s Crossing: They wrote a movie about writer’s block, Barton Fink.

While I didn’t go that meta, I used the problem to pivot to my science-fiction detective novel. Encouragingly, I was far more productive. It was also a much longer story. As a tightly-wound mystery, it was vital the chronologies of the different characters matched up, as story events were occurring in the background that the detective only learned about later. This required a fair amount of revision to clean up and synchronize.

The pivot did unblock me, and in a big way. During a stay in Tokyo at the end of 2019, I finished the remainder of the third Bridge Daughter book over a six-week sprint. Unlike the grind of the detective novel, Stranger Son spilled forth all at once. It and In My Memory Locked were published in 2020.

Photo of cappuccino with leaves drawn in the foam
Cappuccino by Scott Rocher (CC-BY-NC 2.0)

The other writing outlet I used over 2019 to break my writer’s block was this blog. It’s no surprise my focus that year would be on the writing process itself. I blogged about keeping a writing notebook on your phone, story revision, story structure, and even (bad) cover letters. Basically, any problem I faced while writing, I at least attempted to compose a post about it. (Most were never published, trapped forever in my blog software’s Drafts folder.)

So desperate to write anything to keep the blood flowing, I even wrote about writing in cafes. It couldn’t have been more flagrant: Sitting in a cafe, desperate to jump-start the creative engine, I started writing about what I saw around me. What began as a lark grew into a lengthy diatribe on the different cafes I’d written in over two-and-a-half decades, and the varieties of cafe patrons and owners I’ve had to put up with.

The cafe I wrote that post in was near-perfect for my writing habit. Plenty of seating, open late, electrical outlets, free Wi-Fi, good drinks, good food, reasonable prices, a cozy college student vibe—and a mere one block from my apartment. That’s why at the end of the post I didn’t reveal its name. I feared it would be discovered and ruined.

Well, not long after posting, the cafe changed owners. One by one, the wonderful perks disappeared, prices crept upwards, and hours were reduced. By the end of 2019, I was on the hunt for a new cafe.

A few months later, my preference for writing in public spaces would become a very distant problem.

A quarter-century writing in cafes

Ten years of blogging: An all-too-familiar utopia

Rod Serling, 1959. Serling penned the early drafts of the script for the Planet of the Apes film.

Previously: A literary eulogy
Next: Writer’s block

Earlier in this series, when I paged through my past blog posts to locate my favorite for a particular year, one usually jumped out at me. For 2018, I find myself torn between two favorites. The tiebreaker in a case like this is: Do I have anything more to say on the subject?

On one hand is my write-up of Cat’s Cradle, a book I’ve adored and been fascinated with since I was young. I could easily write another 5,000 words on the many dimensions and subtleties of Vonnegut’s greatest work—yes, even greater than Slaughterhouse-Five. For the purposes of this series (a look back on my favorite posts over the last ten years), I’m willing to stand pat. My 2018 post doesn’t express everything I could say about the novel, but it touches on what I think are its most salient aspects.

Cover of "Cat's Cradle" by Kurt Vonnegut
1970s edition I purchased in junior high school. I still have it on my bookshelf.

The other post from 2018 I’m proud of regards Planet of the Apes—the original 1968 film, and not any of the sequels in what has become a rather exhausted movie franchise. I opened that write-up copping to the film being “a guilty pleasure,” that it is

campy, riveting, preachy, and provocative— Franklin J. Schaffner’s sci-fi classic is the very definition of middle-brow entertainment, in that it pleases the senses while challenging the mind.

It turns out that, yes, I do have a little more to say on the subject.

Often when I gear up to write about a book, I’ll go back and re-read it so it’s fresh in my mind. For my Apes post, I didn’t re-watch the movie, but rather read Pierre Boulle’s original 1963 novel, which I’d never picked up before. I didn’t spend too much time discussing the book, though, since my focus was on the film. That’s a shame, because the novel is quite the curiosity.

Boulle dismissed attempts to label his Apes as science-fiction, preferring to call it a “social fantasy.” The term comes across like a hipster pose, but it makes sense. Much as Gulliver’s Travels isn’t really about seafaring, the interstellar aspect of Apes is a literary contrivance for explaining how his character Ulysse winds up in a society run by simians.

Structurally, the book reads something like utopian literature. In works such as Ecotopia, The Dispossessed, or, obviously, Thomas More’s Utopia, the narrative is not centered around character(s) dropped into a tight situation and navigating conflicts toward some kind of resolution. Rather, utopian works spool out pages of exposition to detail the clockwork innards of a fictional society operating on principles quite different from our own.

Cover of "Planet of the Apes" by Pierre Boulle

Boulle likewise spends many precious pages explaining how the simians live, work, compete, and cooperate. So, is Planet of the Apes a utopian novel? It’s not so simple. As with the film, the human astronaut Ulysse is feared by the simians, who view his existence as a threat to their comprehension of the universe. Their plans for him are not kind.

While that might make the book sound dystopian instead, that’s a difficult label too. Prior to Ulysse falling from the sky onto their planet, things seem to be going pretty well for the apes. Their society isn’t bleak or oppressive or authoritarian. They merely have an all-too-recognizable reaction to the unexplainable, this human that talks and reasons, a creature they normally hunt for sport and trophy.

The genius of Boulle’s book is that it’s structured like a utopian novel, but instead of describing an alternate society, it describes our society, with humans swapped out for apes. (Unlike the film, the apes of the novel live in a mid-twentieth century world, with cars, telephones, and even tobacco.) Boulle’s clever twist permitted him to write about our world as though it was an exotic place. In the terminology of critical theory, it defamiliarized our society. That, in turn, permitted him to write about us from a distance. As with the movie series, the ape device became a powerful fulcrum for criticizing all manner of human activity, from animal cruelty to racism, from religion to capitalism.

I remain surprised how under-appreciated the book is today—another sad example of a successful Hollywood adaptation smothering its source material.

From Chimpan-A to Chimpanzee: The Swiftian genius of Planet of the Apes

Rethinking realism

Close-up of man's face from "The Arnolfini Portrait" by Jan van Eyck

Not rethinking realism, as in rethinking philosophy’s single, objective reality, hard as rocks and nails. No, I mean rethinking realism in the sense of questioning the elevation of literary realism over the many other forms of fiction.

Realism has long been the go-to form in literature for telling a story a certain way. An entire literary style—Naturalism—sprang from the sense that Romanticism had gone too far and produced a literature divorced from the world as commonly experienced. The pendulum later swung the other direction, and for a period of time realistic literature was derided as bourgeois and reactionary. Since World War II, with the rise of creative writing programs and a reinvigorated enforcement of upper-class distinctions, kitchen-table realism has returned to the pinnacle of literary loftiness in America.

So it’s funny to me that realism is also so important in popular entertainment. This is nowhere more true than with television, which is obsessed with depicting reality—from “you are there”-style news reporting to game shows branded as “reality TV.” When the writers of TV’s M*A*S*H killed off Col. Henry Blake in a season finale, they were inundated with letters from outraged viewers. The Emmy award-winning writing team’s response was, “Well, that’s reality.” American auteur Robert Altman famously ends Nashville with an out-of-the-blue assassination of a central character. Why? Because, he explained, that’s reality.

It’s not that these plot points are faulty or wrong-headed. My complaint is that the excuse—“It’s reality”—is a lazy defense of artistic choices. Writers should cop to their decisions rather than take the passive route and say reality made the choice for them. Writers should ask themselves if a “realistic” moment is adding to, or subtracting from, the story.

Anyone who’s attended a creative writing class, workshop, or MFA program is familiar with the high ground presumed by realism. The trendy term is “psychologically realistic fiction.” In writing programs, names like Raymond Carver, Amy Hempel, Tobias Wolff, and Tim O’Brien are tossed out as the zenith of American writing. Students are explicitly encouraged to emulate them, and their importance is implicitly signaled by their repeated presence in syllabi and required-reading lists. (I’ve read “The Things They Carried” at least eight times over the course of decades of writing groups and classes.) These authors are lionized for many reasons, but importantly, they all wrote about reality.

(There are two exceptions worth mentioning: One is magical realism, although its high regard in writing programs is tied up with identity politics. The other is Borges, whom I jokingly refer to as science-fiction for MFA students. It must be noted that both exceptions originate from outside the United States. Kafka, incidentally, is read and praised in writing programs as well, but not in such a way as to encourage emulation—I suspect my instructors liked the idea of Kafka more than Kafka’s output.)

Look at how so much literary fiction operates. Protagonists tend to be thoughtful, rational, and deliberative—often, they exhibit little to no affect. Characters in opposition tend to be boorish, thoughtless, and emotional. Dialogue is either flat and unadorned, or snappy, like the patter of a stand-up comic. Scenes unfold as one character utters a brief line, followed by paragraphs of rumination; the other character responds, followed by more paragraphs of rumination.

The prose might be good—it might even be inspired—but is this realism? Going through contemporary literary magazines, reading one story after another, I’m not sure you’ll find a lot of psychological realism, in the sense of psychiatry’s DSM-5.

Genre fiction is not immune either. Too often connoisseurs of hard-boiled detective fiction and tough-guy novels claim their favorite authors are superior because of their attention to realism. Raymond Chandler’s “The Simple Art of Murder” is wonderful and insightful criticism, but at its heart is a trashing of the classic British mystery because “fiction in any form has always intended to be realistic.” It’s one of the few arguments in the essay that I question.

Janet Burroway wrote, “Sometimes reality doesn’t make for good fiction.” It’s a tough lesson to learn, and one that even seasoned writers fail to grasp.

After all, there is no widely-accepted maxim stating the primary purpose of story is to reproduce reality. Fiction is supposed to be an expression of a writer’s inner state, not a dry report of the who, what, where, and when. Besides, why do we need to reproduce reality with such fidelity? We’re soaking in it. If you want reality, put down your phone or leave your computer screen. You have returned to reality, effortlessly.

In a writing class I attended, one of the students was a fan of horror, particularly H. P. Lovecraft and Robert Chambers’ The King in Yellow. At an end-of-semester presentation before the class, he expressed frustration at the hard-realism reading list we’d been given, and of the months of instruction requiring him to write in similar form. “Reading about reality is like reading about your job on your day off,” he told us. There’s something to that.

Story creates a transcendence within the reader. This transcendence defies reality while mimicking it—reality is Play-Doh in the hands of an adept writer. From hard realism to squishy-soft fantasy and everything in-between, great writing takes me to another place and time, a chance to live another person’s life. Books are “portable dreamweavers.”