Two common misconceptions about critics are that they a) are secretly disgruntled artists frustrated by their own lack of success, and b) have no clue how hard it is to create art, and that if they did they’d be a lot more empathetic and soft in their critiques. The latter misconception always struck me as condescending to both the critic and the artist, since great art actually thrives when matched with a healthy critical apparatus. Good critics demand a standard; they cement taste and keep you honest. As for the former, I have found that the journey of making my own art has, if anything, made me a better critic: more insightful, more knowledgeable, and more appreciative of the nuances of creation and process than I ever could have become through consuming alone.
A lot of this disconnect about the role of criticism and the value of the critic has, to my mind, always come down to the fact that no one thinks of criticism as an art form in and of itself. Before I knew anything about my own creativity or what I wanted to "be" as an adult, I fell in love with writing primarily through criticism. Reading music or movie reviews that explored the worlds conjured by the art they assessed gave me an understanding of why people like what they like and how taste or culture forms around it. Not only did I learn how to verbalize what I like and why, I also developed an acumen for how a context or environment shapes the art it exports. What is it that makes music from New York sound the way it does? Artists there are nominally using the same tools and instruments used in Atlanta or Los Angeles, and yet there are clear differences—in sound, in tempo, in lyrics—that make the music specific to its location. When French critics developed the auteur theory, the original thinking behind it was that a few directors could take seemingly formulaic studio fare and make it their own, and do so in such a way that you knew their movies through visual cues and signifiers alone, without needing any other information. The exploration of those aesthetic differences, at once nerdy and snobbish, holds real value for the people who care deeply about it. You were learning about process, technique, and style, and in learning these things you were also crafting your own viewpoint of the world.
Over the past five years, I have been working to finish my first novel. I have been creating different forms of original work since I was a kid. I started a crudely drawn newspaper comic when I was 7. I took my father’s video camera at 10 and started shooting little shorts around the house. I learned how to edit for the first time in seventh grade. I started writing scripts in high school. Throughout this artistic journey there has, predictably, been a lot of starting and stopping: the kind that plagues most creatives with a million ideas and crippling ADHD. I would work on a thing when inspiration struck and then move on when I got bored, occasionally coming back but never finishing and therefore never releasing. When I started the novel five years ago, I decided to do something to force myself into accountability: I enrolled in grad school and made the book my thesis for an MFA in fiction. And it’s worked out … sort of. I wrote an entire draft of a book from start to finish, and I learned a lot about the craft of writing a story, how to convey tone and voice and pace, and how a good book is structured. I learned a lot about process. I also learned that getting to the end of a story is only the beginning of the process of making something.
John Swartzwelder, a legendary Simpsons writer, once told The New Yorker that he prefers editing to writing. So much so, he said, that he rushes through the writing process, sometimes writing what he knows is absolute dogshit copy, just to get to the end. He does this so he can then go back and edit, edit, edit until it’s good. I got similar advice from one of my academic advisers, and I did find it a useful strategy for finishing a draft. But I also discovered just how taxing and arduous the editing process can be.
I never particularly thought of myself as some perfectionist. I wanted to write good and interesting things, and it didn’t matter to me how that happened. Stories about authors meticulously fretting over every single sentence and every word choice, or filmmakers taking hundreds of takes just to get a line or a movement the way they wanted it—I admired that, but I saw myself as much too lazy to commit to that kind of intensity of mission. And yet, the more I wrote and edited, the more I developed those same tendencies. It became almost like a game to me: finding the right turn of phrase, the best metaphor, the perfect punchline at the right moment. The more days I dedicated to fixing things one sentence, one paragraph at a time, the more it all added up to a whole. Which is not to say this is the only worthwhile way to make art—there are great artists who understand what they want to say or make, and how they want to do it, with much more ease or efficiency. But they weren’t born with that ability. It was a muscle developed from years and years of practice, and experience, and commitment.
There's no perfect way to make anything; that's why it's all described as "theory." There are times when I worry that all this craft and meticulous harping on specifics and perfect structure is hindering spontaneity and surprise, just as my previous lack of attention to the tiny details could make things baggy and disjointed. Perfect is the enemy of good, and good is the enemy of great, and everyone is grasping at straws trying to find that perfect balancing act that conveys their true essence. Ultimately none of it matters if no one is invested at all, but that's one of those things you have to leave to the gods or whoever.
I am in the middle of reading a biography of Elaine May, the ’50s comedian and ’70s filmmaker who was too far ahead of her time and probably too smart for her own good. She is written about as a mad genius—funny and tough, but also a little abrasive and uncompromising in her vision, whether she knew how to enact it or not. That made for tough sledding in a misogynistic, unoriginal industry, and it was also tough on the people trying to work with her, because she knew exactly what she wanted and would not move off of it even when it was failing. That takes conviction, both in yourself as a person and in your artistry. And while it’s nice that May’s movies are being rediscovered and heralded now, it did nothing for her at the time; she was essentially blacklisted after the production disaster of 1987’s Ishtar.
Art is a muscle like any other. Just as athletes train for their sport, creativity is a skill that must be developed. And over the last two years of creating in particular, that has really come into focus for me. My immersion in my own artistry has also fueled the critical part of my brain and made it more necessary than ever. It is probably why I’ve been especially harsh about the things I think are bad, and really tough on the stuff I actually do enjoy. It’s also helped me understand the subtleties and nuances I previously might have been reluctant to dwell on for the sake of the macro story, be it the brushstrokes or line drawings in a great painting by Lorna Simpson or Devin Troy Strother, or the lighting in a Michael Mann or Wong Kar-wai movie. The details are what matter in a world that’s increasingly half-assed and convinced a robot could do it all just fine and for free.
Just as it is our responsibility to ensure a world where art is still made, we also have to ensure that what gets made is of quality. It’s vital that quality remain an ideal that matters to people, and that we fight the idea that media exists only to be consumed more efficiently. I care about the value my art has, but I also care that it receives a critical eye that honestly appraises its worth. Nothing can be great if we can’t even identify what makes it so.