Everything old is new again

“Woman in brown coat,” Devon Rodriguez

Ben Davis of artnet news reports a story that sounds all too familiar these days:

A little more than a week ago, I wrote a review of an art show by the artist and TikTok sensation Devon Rodriguez, best known for live drawing subway riders. He is, by some measures, the most famous artist in the world, with many millions of social media followers. He did not like the review.

It went up on a Friday. On Saturday morning, I woke up to a tidal wave of anger from Rodriguez on Instagram, tagging me across scores of posts. Hundreds of his followers went on the attack.

Davis gives a more nuanced and thoughtful analysis of his hellish situation than could be expected from someone who received death threats over, of all things, a review of an art show. He reasons:

the only way I can understand Rodriguez’s incredibly thin-skinned reaction to my article is that he has managed to rise to this status of apex visibility without any kind of critical writing about him at all. It’s all just been feel-good profiles, so that the first critical word feels like a huge crisis. That’s a relatively new kind of situation for an artist to be in…

In the past, artists had to pass through the gatekeepers of museums and art galleries before becoming well-known to the public. Even Basquiat had to break through the establishment before securing his place in the art world. In today’s digital world, it’s possible, even desirable, to hurdle over the gatekeepers and go straight to the masses with one’s output.

A similar dynamic is at play in the world of publishing, as I’ve written about a few times. This desire to stand above criticism is, in my mind, the root motivation for dysfunctional narratives. The tenor of the attacks Ben Davis withstood sounds much like the way dysfunctional narratives are defended, such as the Rodriguez fan who snapped at Davis, “What if he was your son??”

Davis links this reaction to the notion of “parasocial relationships,” that is, “the imaginary, one-sided friendships people develop with celebrities and influencers in their heads.” This cuts to the “transitive logic” I wrote about in 2019, after an all-too-similar event in which Sarah Dessen’s followers attacked a college student who posted a relatively innocuous criticism of Dessen’s work: “The logic magnified an innocuous criticism of a single YA author to an attack on all YA fiction and its readers. Thus, the logic went, if you’re a reader of YA fiction, it’s a personal attack on you.”

Author Sarah Dessen

“Parasocial relationships” is the best term I’ve seen to describe how Dessen’s followers rose up and hounded the college student offline. Much of the outrage seemed rooted in the feeling that Dessen was not merely a YA author, but their friend. And why not? These new, online super-authors are

not merely authors, they’re brands. Many of these YA authors have crafted an online persona of a confidant and sympathizing mentor. You don’t merely read their books, you hear from them everyday. You see their vacation photos and learn about their pets. You share their ups and downs in the real world.

Wikipedia says the term parasocial interactions was first coined in 1956, no doubt inspired in part by the rise of television in the United States. The researchers described such relationships as existing prior to mass media as well, citing people’s emotional bonds to gods, supernatural spirits, or saints. Those are telling examples.

It requires little divination to predict these social media brouhahas will continue so long as artists and writers can organically grow their followings. Certainly I don’t see these kerfuffles as justification for returning to the pre-digital way, where editors and publishers decided over Negroni lunches who got published and who got to languish. But being thin-skinned about criticism, and using one’s followers to “cancel” the critic, is a bad choice no matter how you look at it.

As Davis predicts:

If there’s no criticism of [Rodriguez’s art], here’s what I think will happen: All the marketing companies and PR people looking to piggyback on Rodriguez’s popularity will stuff his feed with more and more cringe celebrity content and half-baked promo ideas until his social-media presence is bled dry of whatever charm it has.

Computer programming & writing fiction: Is coding art?

Girl with an 8-bit Earring

In my last post comparing writing programs and writing fiction, I concluded the two were similar because of their relationship with their practitioners. “Art is a kind of recruiting poster for itself,” I wrote. “An art attracts its own artists.”

Wait—is computer programming art? It’s accepted to call fiction art, but can computer programming really be considered

the conscious use of the imagination in the production of objects intended to be contemplated or appreciated as beautiful, as in the arrangement of forms, sounds, or words. (The Free Dictionary)

Think of what programming a computer really boils down to: Ordering and organizing a series of mathematical instructions followed precisely by the machine. The computer is allowed no imagination in its interpretation of those instructions (if it possessed an imagination, which it doesn’t, at least not today). If there’s an art in computer programming, it stands in the purview of the programmer, not the machine. That’s how it should be. The art of painting is not in the tubes or cans of paint, but in the painter.

But is the arrangement of those computer instructions somehow “beautiful”?

It’s important to distinguish between a computer program being art and the act of programming being an artistic form. Let’s start with the latter.

I believe programming is an art form, at least by modern notions of the term. Writing fiction and writing code both require continuous subjective decision-making throughout the process (a “conscious use of the imagination”). A personal fervor is vital for quality results. When a writer lacks that fervor, it shows in the end result, for fiction and computer programs alike.

Programming is not a rote process of memorization and recall. There is no “correct” way to write a computer program but, like writing a novel or a short story, there are many wrong ways.

Bill Atkinson, creator of the original MacPaint, painted in MacPaint. (Daniel Rehn, CC BY 2.0)

Coding requires taste, aesthetics, and an eye for detail. Programmers develop deeply personal philosophies. Some coders prefer verbosity (like Henry James in his later years) while others prefer economy (like Hemingway or Cain’s early work).
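To make the point concrete, here is a hypothetical illustration of my own (not from the original essay): two Python functions that do exactly the same thing, one written in a verbose, spelled-out style and one in an economical style. Neither is wrong; the choice between them is a matter of the programmer’s taste.

```python
# Two stylistically different but functionally identical ways to
# collect the even numbers from a list.

def evens_verbose(numbers):
    """The Henry James approach: every step spelled out."""
    result = []
    for number in numbers:
        if number % 2 == 0:
            result.append(number)
    return result

def evens_economical(numbers):
    """The Hemingway approach: one terse, declarative line."""
    return [n for n in numbers if n % 2 == 0]
```

Both return `[2, 4]` for the input `[1, 2, 3, 4]`; the difference is purely one of voice.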

There’s a scene in August Wilson’s Fences that is one of the most distilled scenes I’ve ever read: The father and son debate buying a TV to watch the World Series. On the surface a mundane domestic moment, the scene is actually Fences in miniature. The beauty of this scene mirrors brilliant software design, where each piece of the program is intimately connected to the entire application.

Years ago, I could always tell when I was working with a programmer who started coding on the Macintosh versus a programmer weaned on Microsoft Windows—the two companies have distinct programming styles and philosophies. Programmers who learned to code on those operating systems carried those styles and philosophies with them to other platforms and projects.

A computer program can be functional, operating, and seemingly free of bugs, and a programmer may still read the code and say it doesn’t “look” right. (The trendy term for this is “code smell.”) What’s more, two programmers may say a program doesn’t “look” right for entirely different reasons. This reminds me a great deal of the world of poetry, where poets may agree a poem is poorly executed and then squabble over the reasons why. (There are similar disagreements in the world of fiction, but I find them to be less…doctrinaire.)
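As a hypothetical sketch of my own (the names and example are mine, not from the original post), here is a Python function that is functional and bug-free yet still “smells”: a fellow programmer would say it doesn’t look right, even though it passes every test.

```python
# A working function that nonetheless exhibits "code smell":
# it repeats itself, loops by index for no reason, and reimplements
# a built-in. It is correct, but it doesn't "look" right.

def total_smelly(items):
    t = 0
    for i in range(0, len(items)):
        t = t + items[i]
    return t

# The same behavior, written the way the language intends.
def total_clean(items):
    return sum(items)
```

Both functions return 6 for `[1, 2, 3]`; the objection to the first is aesthetic and idiomatic, not functional, which is precisely what makes the disagreement feel like an argument among poets.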

Writing and programming both involve elements of discovery and improvisation. Even though I’m writing a series of blog posts advocating outlining stories before writing them, I don’t believe an outline can—or should—contain every detail present in the final story. An outline should not be so rigid as to prohibit discovery during the writing process.

For a long time, there was a big push to eliminate discovery and improvisation in the world of software development, as “discovery” and “improvisation” seem undesirable in a field of proper engineering. (In the 1960s, IBM famously discovered that the number of bugs a programmer produced was proportional to the amount of code he or she wrote. Their solution: limit the lines of code a programmer could write per day, a logic straight from the pages of Catch-22.) Newer software philosophies, notably Extreme Programming and Agile development practices, have flipped that thinking and embraced discovery and improvisation as healthy and necessary.

Suspicion of programming as an art form probably springs from a general lack of understanding of how programs are written. Programmers share an arcane terminology among themselves. They build and manipulate mysterious machines that have come to play a powerful, sometimes menacing, role in our lives.

Ex Machina, a 2015 film about a computer programmer who falls in love with an artificially intelligent android.

That suspicion probably also arises from stereotypes. Programmers don’t look like artists. In popular culture, programmers are portrayed as geeks more comfortable around machines than humans. Sometimes coders in film or TV even fall in romantic love with their own programs. (Never mind that this trope originated in antiquity, with Pygmalion, and concerns an artist, not a bricklayer or farmer or soldier.)

Another reason people question programming as an art is that computer programs “do” things. There’s an academic suspicion of pure art having any sort of utility, probably due to fears of commodification and commercialization. We don’t think of Vermeer’s Girl with a Pearl Earring as “doing” anything other than hanging on a wall in some drafty museum, but it must be doing something to cause people to stand in line for hours to view it. It’s funny, this idea that pure art doesn’t “do” anything when it so plainly does. If art didn’t do anything, why would we care about it?

And this rolls back to the distinction I made earlier: Is computer software itself art? I’ll challenge the question with a question in return. We regard skyscrapers and bridges and automobiles and colanders as kinds of art. We laud architects, automotive designers, and commercial illustrators as artists. Why treat computer programs and their creators differently?

Next: Iterative processes

Colander, c. 1600 - 1650, Museum Boijmans Van Beuningen

Computer programming & writing fiction: Art as a recruiting poster

IBM Selectric

How is computer programming like writing fiction? Is writing code anything like writing stories?

When I was young, perhaps seven or eight, I banged out my first short story on a second-hand IBM Selectric typewriter my mother brought home from her office. Powered on, the Selectric vibrated the whole desk and emanated a low mechanical hum, some unseen engine in the contraption idling. I still recall the smell of the ink in the typewriter ribbon and the satisfying, officious schock as the typeball jumped from its perch and tapped lettering onto the crisp onionskin paper I’d fed into the roller.

The story I wrote was a retelling of Richard Connell’s “The Most Dangerous Game” transplanted to a science-fiction setting. (In fact, I think I creatively titled it “The Most Dangerous Game in Space”.) My determination to spend hours coping with that unforgiving contraption went beyond an affinity for the classic short story. As an avid young reader, I’d come to wonder if I could pen my own fiction. My aspirations weren’t so bold as to imagine being published, only to see if I could write my own, but later that dream crept in too.

The Most Dangerous Game

Around the same time (this would’ve been 1979), I cajoled my parents into buying a home computer. Silicon Valley was marketing home computers as personal productivity assistants, devices to balance one’s checkbook, manage a home mortgage, track stocks, and so on. Home computers were also being pitched as tools to give students an edge in school. I couldn’t care less about schoolwork—and I’ll be damned if that computer ever balanced my parents’ checkbook—but with a home computer I could play video games, my only real motive for wanting one.

Innumerable hours playing video games led me to try to write my own. It was a natural progression, just as reading I, Robot set me to thinking of my own robot stories.

I never did write a video game, at least not one that anyone would want to play, but software development did become my career path, one I’m still following over 35 years later.

Likewise, although I didn’t finish that short story, writing fiction remains an important passion in my life, even more important than programming.

Walking these paths, I’m sometimes asked if writing software and writing stories are the same. Or, at least, if they bear any similarities. And my answer is, yes, there are commonalities between the two.

I’ll explore more parallels in the future, but already I’ve alluded to one thing they have in common. I’ve never met a good writer who wasn’t first an avid reader, and I’ve never met a good programmer who wasn’t first an avid computer hobbyist.

Art is a kind of recruiting poster for itself. An art attracts its own artists.

Next: Is coding art?