In earlier installments, when I paged through my past blog posts to locate my favorite for a particular year, one usually jumped out at me. For 2018, I find myself torn between two favorites. The tiebreaker in a case like this is: Do I have anything more to say on the subject?
On one hand is my write-up of Cat’s Cradle, a book I’ve adored and been fascinated with since I was young. I could easily write another 5,000 words on the many dimensions and subtleties of Vonnegut’s greatest work—yes, even greater than Slaughterhouse-Five. For the purposes of this series (a look back on my favorite posts over the last ten years), I’m willing to stand pat. My 2018 post doesn’t express everything I could say about the novel, but it touches on what I think are its most salient aspects.
The other post from 2018 I’m proud of regards Planet of the Apes—the original 1968 film, and not any of the sequels in what has become a rather exhausted movie franchise. I opened that write-up by copping to the film being “a guilty pleasure,” that it is
campy, riveting, preachy, and provocative— Franklin J. Schaffner’s sci-fi classic is the very definition of middle-brow entertainment, in that it pleases the senses while challenging the mind.
It turns out that, yes, I do have a little more to say on the subject.
Often when I gear up to write about a book, I’ll go back and re-read it so it’s fresh in my mind. For my Apes post, I didn’t re-watch the movie, but rather read Pierre Boulle’s original 1963 novel, which I’d never picked up before. I didn’t spend too much time discussing the book, though, since my focus was on the film. That’s a shame, because the novel is quite the curiosity.
Boulle dismissed attempts to label his Apes as science-fiction, preferring to call it a “social fantasy.” The term comes across like a hipster pose, but it makes sense. Much as Gulliver’s Travels isn’t really about seafaring, the interstellar aspect of Apes is a literary contrivance for explaining how his character Ulysse winds up in a society run by simians.
Structurally, the book reads something like utopian literature. In works such as Ecotopia, The Dispossessed, or, obviously, Thomas More’s Utopia, the narrative is not centered around character(s) dropped into a tight situation and navigating conflicts toward some kind of resolution. Rather, utopian works spool out pages of exposition to detail the clockwork innards of a fictional society operating on principles quite different from our own.
Boulle likewise spends many precious pages explaining how the simians live, work, compete, and cooperate. So, is Planet of the Apes a utopian novel? It’s not so simple. As with the film, the human astronaut Ulysse is feared by the simians, who view his existence as a threat to their comprehension of the universe. Their plans for him are not kind.
While that might make the book sound dystopian instead, that’s a difficult label too. Prior to Ulysse falling from the sky onto their planet, things seem to be going pretty well for the apes. Their society isn’t bleak or oppressive or authoritarian. They merely have an all-too-recognizable reaction to the unexplainable, this human that talks and reasons, a creature they normally hunt for sport and trophy.
The genius of Boulle’s book is that it’s structured like a utopian novel, but instead of describing an alternate society, it describes our society, with humans swapped out for apes. (Unlike the film, the apes of the novel live in a mid-twentieth century world, with cars, telephones, and even tobacco.) Boulle’s clever twist permitted him to write about our world as though it was an exotic place. In the terminology of critical theory, it defamiliarized our society. That, in turn, permitted him to write about us from a distance. As with the movie series, the ape device became a powerful fulcrum for criticizing all manner of human activity, from animal cruelty to racism, from religion to capitalism.
I remain surprised at how under-appreciated the book is today—another sad example of a successful Hollywood adaptation smothering its source material.
Not rethinking realism, as in rethinking philosophy’s single, objective reality, hard as rocks and nails. No, I mean rethinking realism in the sense of questioning the elevation of literary realism over the many other forms of fiction.
Realism has long been the go-to form in literature for telling a story a certain way. An entire literary style—Naturalism—sprang from the sense that Romanticism had gone too far and produced a literature divorced from the world as commonly experienced. The pendulum later swung in the other direction, and for a period of time realistic literature was derided as bourgeois and reactionary. Since World War II, with the rise of creative writing programs and a reinvigorated enforcement of upper-class distinctions, kitchen-table realism has returned to the pinnacle of literary loftiness in America.
So it’s funny to me that realism is also so important in popular entertainment. Nowhere is this truer than in television, which is obsessed with depicting reality—from “you are there”-style news reporting to game shows branded as “reality TV.” When the writers of TV’s M*A*S*H killed off Col. Henry Blake in a season finale, they were inundated with letters from outraged viewers. The Emmy award-winning writing team’s response was, “Well, that’s reality.” American auteur Robert Altman famously ends Nashville with an out-of-the-blue assassination of a central character. Why? Because, he explained, that’s reality.
It’s not that these plot points are faulty or wrong-headed. My complaint is that the excuse—“It’s reality”—is a lazy defense of artistic choices. Writers should cop to their decisions rather than take the passive route and say reality made the choice for them. Writers should ask themselves if a “realistic” moment is adding to, or subtracting from, the story.
Anyone who’s attended a creative writing class, workshop, or MFA program is familiar with the high ground presumed by realism. The trendy term is “psychologically realistic fiction.” In writing programs, names like Raymond Carver, Amy Hempel, Tobias Wolff, and Tim O’Brien are tossed out as the zenith of American writing. Students are explicitly encouraged to emulate them, and their importance is implicitly signaled by their repeated presence in syllabi and required-reading lists. (I’ve read “The Things They Carried” at least eight times over the course of decades of writing groups and classes.) These authors are lionized for many reasons, but importantly, they all wrote about reality.
(There are two exceptions worth mentioning: One is magical realism, although its high regard in writing programs is tied up with identity politics. The other is Borges, whom I jokingly refer to as science-fiction for MFA students. It must be noted that both exceptions originate from outside the United States. Kafka, incidentally, is read and praised in writing programs as well, but not in such a way as to encourage emulation—I suspect my instructors liked the idea of Kafka more than Kafka’s output.)
Look at how so much literary fiction operates. Protagonists tend to be thoughtful, rational, and deliberative—often, they exhibit little to no affect. Characters in opposition tend to be boorish, thoughtless, and emotional. Dialogue is either flat and unadorned, or snappy, like the patter of a stand-up comic. Scenes unfold as one character utters a brief line, followed by paragraphs of rumination; the other character responds, followed by more paragraphs of rumination.
The prose might be good—it might even be inspired—but is this realism? Going through contemporary literary magazines, reading one story after another, I’m not sure one will find a lot of psychological realism, in the sense of psychiatry’s DSM-5.
Genre fiction is not immune either. Too often connoisseurs of hard-boiled detective fiction and tough-guy novels claim their favorite authors are superior because of their attention to realism. Raymond Chandler’s “The Simple Art of Murder” is wonderful and insightful criticism, but at its heart is a trashing of the classic British mystery because “fiction in any form has always intended to be realistic.” It’s one of the few arguments in the essay that I question.
Janet Burroway wrote, “Sometimes reality doesn’t make for good fiction.” It’s a tough lesson to learn, and one that even seasoned writers fail to grasp.
After all, there is no widely accepted maxim stating that the primary purpose of story is to reproduce reality. Fiction is supposed to be an expression of a writer’s inner state, not a dry report of the who, what, where, and when. Besides, why do we need to reproduce reality with such fidelity? We’re soaking in it. If you want reality, put down your phone or leave your computer screen. You have returned to reality, effortlessly.
In a writing class I attended, one of the students was a fan of horror, particularly H. P. Lovecraft and Robert Chambers’ The King in Yellow. At an end-of-semester presentation before the class, he expressed frustration at the hard-realism reading list we’d been given, and of the months of instruction requiring him to write in similar form. “Reading about reality is like reading about your job on your day off,” he told us. There’s something to that.
Story creates a transcendence within the reader. This transcendence defies reality while mimicking it—reality is Play-Doh in the hands of an adept writer. From hard realism to squishy-soft fantasy and everything in-between, great writing takes me to another place and time, a chance to live another person’s life. Books are “portable dreamweavers.”
Q: What was it about the Cain and Abel story that attracted you as a subject?

JN: Two brothers fighting over the affection of their parents and their place in the world, a family banished to the wilderness, the murder, the punishment—it’s so mythic and yet familiar and relevant. Incredibly, the source material is only about fifteen sentences long. The porous openings in the original story gave me handholds into it.
Blogging in 2017 was again marked by another foray into the world of Kindle Scout, this time for my Bridge Daughter sequel Hagar’s Mother. That year I also ran a three-part series discussing the crossover between writing fiction and writing code, and some short entries on how I use a writing notebook when preparing to write a novel.
The most popular entry from 2017 was, by far, on Scott McCloud’s Understanding Comics. I first read this groundbreaking book in the 1990s, and have reread it at least three times since. McCloud wrote (and drew!) more than a treatise on how comics work. It’s a manifesto praising comics as the ultimate communication form ever devised. As I wrote, McCloud is “not merely comics’ Aristotle and ambassador, he’s its evangelist. Understanding Comics may be the first foundational lit crit text written by a fan boy.” I followed up a month later with “Blood in the margins,” which takes some of the lessons McCloud offers and back-ports them to fiction.
The 2017 entry I’m most proud of is on David Kidd’s memoir Peking Story: The Last Days of Old China. Originally anthologized in 1961 as All the Emperor’s Horses, Kidd’s classic is one of those remarkable nonfiction books that has largely flown under the cultural radar. I have a theory why.
Kidd was an American, born and bred in the Midwest, who traveled to China at the end of World War II, where he married into a prominent Chinese family. When the book opens, he joins them behind the walls of their mansion compound, where they sip tea and reminisce about their family’s illustrious past. Meanwhile, the Communist insurgency is beginning to assume control over the country. Kidd pines for China’s past and mourns the loss of its ancient cultural traditions to the incoming revolutionaries. This is why I call the book “a literary eulogy.”
On the surface, it’s a wonderful read, with economical prose both graceful and straightforward, and lots of well-drawn authentic detail. Structurally, it’s as classical in its design as the Parthenon. As far as I can tell, it’s the only book Kidd authored, but what a book to rest your laurels upon.
As I wrote, Kidd was an unusual narrator for his memoir: “There are moments that read like a Graham Greene novel, the world-weary British expatriate turning up his nose at the dreary reactionaries and their anti-imperialist manifestos.” An uneasiness grows as you read between the lines. You sense that Kidd is, on one hand, a snobby and mildly myopic WASP, and on the other hand, an unrepentant Sinophile infatuated with China’s exotic past. His new in-laws, while not nearly as wealthy as their forebears, live a rather luxurious life compared to the peasants in the fields and the servants washing their clothes. Kidd seems as blithe to the inequities as his in-laws are. When I reread Peking Story for the blog post, I kept wishing Kidd would at least once acknowledge the disparity. The acknowledgement is never really offered.
And that, I think, is the stain that prevents Peking Story from becoming a true classic of nonfiction or New Journalism. It’s not due to political correctness gone amok, but a lack of social awareness that modern readers expect from authors. Kidd should be the outsider peering in, but no, he is such a Sinophile, he eagerly jumps onto the garden divan to loll about with his new Beijing family. Even Fitzgerald—who never met a person of breeding he couldn’t write about—had the necessary introspection to offer the reader asides on the absurdities of the ultra-rich.
As much as I admire Kidd’s masterpiece, I can’t help but sense that the shadow casting a pall over it is not from what he wrote, but what he left unsaid.
When I started this blog years ago, I made a private agreement with myself: I would avoid writing topical political content. Substack, social media, and the blogosphere are saturated with political commentary, providing lots of heat but little light. I don’t like trafficking in outrage, which is the fast track to success in political blogging.
However, I did write a novel about missile defense set during the Reagan Era in a town hosting a nuclear weapons research laboratory. That’s pretty political, even if the politics are fairly retro.
Due to recent events in the Middle East, the topic of missile defense has come up again, primarily thanks to Noah Smith’s Noahpinion (via). Smith’s piece on recent successes in missile defense struck a nerve with me, mostly because this is a topic I’ve followed off-and-on for forty years.
To clarify, while my book is tinged with autobiography (I did indeed grow up in Livermore during the Reagan Era while the Strategic Defense Initiative (SDI) was being developed), it’s also fiction (neither of my parents worked for Lawrence Livermore National Laboratory (LLNL), although almost all my friends’ fathers were scientists there).
This is why I reacted strongly to Noah Smith writing this:
…for most of my adult life, I believed that ballistic missile defense was a hopeless, failed cause. From the 2000s all the way through the 2010s, I read lots of op-eds about how kinetic interceptors — “hitting a bullet with a bullet” — were just an unworkably difficult technology, and how the U.S. shouldn’t waste our time and money on developing this sort of system.
He quickly adds,
Even the most ardent supporters of missile defense don’t think it could stop a nuclear strike by Russia or China. … critics of missile defense were right that missile defense will probably not provide us with an invincible anti-nuclear umbrella anytime soon. But they were wrong about much else.
Fair enough—Smith is differentiating between defense technology for stopping short- and medium-range ballistic missiles armed with conventional explosives (such as the type used to protect Israel from Iran’s attack in April) versus technology for stopping long-range nuclear ICBMs, which was the problem SDI was supposed to solve. Over the last forty years critics of missile defense have muddied these categories, taking the problems of developing an ICBM defense system, which must deal with missiles launched into the upper atmosphere, and translating them to conventional missiles, which fly at altitudes similar to prop planes.
With this proviso out of the way, Smith goes on to argue that “the way in which critics got this issue wrong illustrates why it’s difficult to get good information about military technology — and therefore why it’s hard for the public to make smart, well-informed choices about defense spending.” He proceeds into the history of short- and medium-range missile defense and its uninformed and myopic critics. It’s a thoughtful piece, well-reasoned and well-researched.
What’s my beef, then? It’s his assertion that the critics getting their prognostications wrong make it “hard for the public to make smart, well-informed choices about defense spending.” I disagree.
What I saw during the SDI years, and continue to see in 2024, is a dearth of public involvement in defense spending or development. How can the American public make well-informed choices, given the lack of transparency in budgeting or the research process?
This is not a bleeding-heart appeal (such as the 1980s bumper stickers Smith references: “It will be a great day when our schools get all the money they need and the Air Force has to hold a bake sale to buy a bomber”). Simply put, the public is not informed on the decision-making process, has not been invited into the process, and is not wanted to be involved in the process.
The evidence for this is in the history of missile defense itself. Even with four decades of critics piling on against it (which Smith thoroughly tallies), the short- and medium-range missile defense technology they mocked was approved, funded, developed, honed, and deployed. As far as I can tell, no one was voted out of office for supporting missile defense. No referendum on their development or funding was held.
Smith theorizes that with missile defense,
the information about how good America’s weapons systems are gets kept behind closed doors, unveiled in secret Congressional briefings and whispered between defense contractors. Meanwhile, everyone who wants to criticize U.S. weapons systems is on the outside, squawking loudly to the press.
My skepticism originates from Smith’s belief that the problem is an “asymmetry” between the military, which is secretive “in order to avoid alerting America’s rivals to our true strength,” and mouthy critics of short- and mid-range missile defense, who have a poor track record for predicting its failure.
Now, Smith’s point is a defensible perspective. It’s equally defensible that the reason the military is so hush-hush about their research projects is to keep the American public in the dark. When defense budgeting is discussed publicly, it’s about funding for more bread-and-butter expenses, such as better meal rations and outfitting soldiers with body armor. Who’s going to argue with supporting our troops?
In the 1980s, the public was well aware of the SDI project. It was highly publicized. Reagan announced it on national evening television. What the public did not know was how LLNL planned to stop those nuclear missiles from reaching American soil. One of the earliest approaches they explored was a theoretical X-ray laser fired from orbiting satellites at the incoming missiles.
(Defenders of ICBM missile defense point out that the X-ray laser was merely one of many approaches considered. That’s true, but it’s also true that it, and so many of the other technologies considered, were discarded as impractical, unworkable, or simply dangerous. The history of SDI and its offshoot Brilliant Pebbles goes into the many misses and problems.)
To return to the X-ray laser: How was it to be powered? The proposal was that each satellite would hold a thermonuclear device. To fire the laser, the internal device would detonate. The contained thermonuclear explosion would shed X-ray radiation, which was redirected toward the intended target.
It’s easy to satirize this proposal (which I certainly did) as though it were a plot point in a hypothetical Dr. Strangelove sequel. My point here isn’t to knock research that sounds ludicrous, or which didn’t pan out during development. The history of technology, science, and engineering is littered with wild ideas and failed approaches.
My point is: SDI was highly publicized, but the energy source of the X-ray laser wasn’t revealed until years later by a whistleblower. The public could not make an informed decision, because the most vital information about the project was being withheld. Was it withheld because the U.S. military didn’t want to tip its hand to its enemies? Or because the details would be tremendously embarrassing—H-bombs in space to defend America from H-bombs in the sky? I lean toward the latter explanation.
How many voters today are aware that anti-ICBM research is still ongoing in 2024? The SDI project was renamed multiple times over the decades, as a cynical way to sneak it through the budgeting process. It was never defunded, although, again, as Smith and others recognize, “even the most ardent supporters of missile defense don’t think it could stop a nuclear strike by Russia or China.”
Rather, I think the military learned a lesson from Reagan and SDI: Don’t tell the public about your darlings. From a perspective of secure funding, there’s more harm than good from having your pet research project on the cover of Time.
To be fair, Smith doesn’t advocate for giving the U.S. military carte blanche on spending and research: “There’s no easy solution here, other than simply being aware of these difficulties and trying very hard to counteract them. We pundits should talk to and listen to a variety of experts, not just the loudest and most confident.” I agree, although I think his advice should extend well beyond the sphere of the punditry.
And Smith’s also right about the poor track records of critics of conventional missile defense. In the case of SDI, though, the polarity is reversed. Journalist William Broad’s first book on ICBM missile defense, Star Warriors, was a hagiography of SDI and the scientists at LLNL. From the tone and tenor of the 1985 book, a reader might think ICBM defense was simply a matter of getting the brightest and best minds together in a room with a whiteboard and a pot of hot coffee.
Broad’s later account, 1992’s Teller’s War, tells a very different story, revealing not only the research failures, but more critically the deceptions, exaggerations, and maneuverings used to sell the project to Reagan, Congress, and the American public. Broad barely references his earlier Star Warriors in the later book. In fact, reading between the lines of Teller’s War, its tone comes across a little like a scorned partner who realizes far too late that their spouse had baldly lied to them about an infidelity years earlier.
A postscript. When I wrote “the public is not informed on the decision-making process, has not been invited into the process, and is not wanted to be involved in the process,” that largely assumes the public wants to be informed about defense budgeting.
In an era of political gamesmanship, where “owning” the other political team is more important than actual leadership and problem-solving, I’ve seen very little interest from the public in how our defense dollars are being spent. Perhaps the military doesn’t need to be secretive at all anymore about its research projects.
2016 was a busy year for blogging. Amazon accepted Bridge Daughter for their Kindle Scout program, which entailed a month-long nomination process before they agreed to publish it. It was the start of a fairly intense roller coaster ride, most of which I captured in blog posts along the way.
Amazon’s imprimatur on the novel opened many doors. With a single email sent on a single day of the week to a mere sliver of their customer base, Amazon could generate hundreds of book sales, as though rubbing a lamp to summon a djinn. Amazon’s backing also led to a movie production company inquiring about film rights. They read the book and they asked questions, but ultimately they passed.
(Amazon dismantled the Kindle Scout program in 2018, which I still consider a tragedy.)
Of the long-form blog entries in 2016, I produced three that I remain proud of. I’m torn which to feature here. My account of Don Herron’s Fritz Leiber tour still evokes nostalgia. Don Herron is the creator of the classic Dashiell Hammett tour in San Francisco. Getting a chance to meet Herron and take his lesser-known Fritz Leiber tour was a once-in-a-lifetime opportunity, as he no longer leads it save for special occasions.
Another piece I’m proud of is my review/analysis of the Generation X cult classic Slacker, one of my favorite films. This entry has an untold side story: A few months after posting it, an online film aficionado site on Medium asked if I was interested in adapting the review. Unfortunately, what the editor wanted me to write about wasn’t what I found interesting about Slacker, and the opportunity fizzled out.
The third is a blog post I keep returning to as a kind of manifesto: “Fiction as a controlled experiment,” a write-up of my thoughts on the book On Literature by J. Hillis Miller.
Miller was a scholar at Yale and U.C. Irvine, and known for promoting deconstruction as a means of literary criticism. I discovered On Literature on a shelf of used books in a Tokyo bookstore, and assumed it would be thick with postmodern terminology and abstruse theories. Instead, On Literature is personal and ruminative. Parts of it read like a confessional. Miller admits to a lifelong love of reading, and writes in glowing terms on several children’s books he marveled over in his youth.
What caught my attention the most, however, is when he confesses to viewing a work of fiction as a “pocket or portable dreamweaver.” He describes books as devices that transport the reader to a new “hyper-world” for them to experience. The way he describes it reminds me of the linking books in the classic video game Myst.
This quaint vision of narrative is unfashionable in the world of literary criticism. Miller’s vision is also, in my view, charitable to lay readers, who are less interested in high theory and more interested in enjoying books, and curious why some books are more enjoyable than others.
But I do think this vision—”a pocket or portable dreamweaver”—is also a useful guide for an author developing a story or a novel. Miller insists a work of fiction is not “an imitation in words of some pre-existing reality but, on the contrary, it is the creation or discovery of a new, supplementary world, a metaworld.” That is what the creation of story is—not merely revealing or reporting an already existing world, but creating a new one in the author’s mind, and, in turn, recreating it in each reader’s mind. These multiple worlds are similar but never exactly the same.
Miller died in 2021 due to COVID-related issues, one month after the death of his wife of over seventy years. Reading On Literature makes me wish I could have enrolled in one of his courses. Whereas so many of the European deconstructionists seemed intent on subverting the power of literature, Miller was plainly in awe of the written word, and strove to promote it. We need more readers like him.
This is the much needed backstory of the character of Stapleton from Conan Doyle’s “Hound of the Baskervilles”. It is exceedingly well done and in keeping with Conan Doyle’s original story. … Heartily recommended for lovers of Holmes and those looking to add to their own Sherlock Holmes collections.
Thanks, Melisende! Full review here, more information on the book here, and Amazon page here.