The debate about the state of film criticism has settled—or calcified—into two camps: traditional print critics claim the Internet has replaced expertise with amateurs, fanboys, and obscurantists. Web enthusiasts counter that we’re in a new golden age of film criticism and accuse the traditionalists of jealousy, resentment, and Ludditism. In other words: idealization of the past versus idealization of the present; resolution via what Pauline Kael once referred to as “saphead objectivity.” Screw that.
Each side’s view may be equally rosy, but that doesn’t mean each can make an equal claim. I’ve been a film critic on and off for twenty-five years and have been lucky enough to take part in the tail end of the best era of print film criticism and the beginning of the Internet, when it seemed like the Web would be the new delivery system for the kind of writing that was starting to be imperiled in print. My experience tells me that not only was film criticism in better shape in the print era, but good work stood a greater chance of making an impact. Only a fool would say that there’s not good work being done on the Internet. But the nature of the medium, the way it has reshaped journalism and public discourse, makes it harder for that work to matter. In its contribution to the ongoing disposability of our cultural, political, and social life, in encouraging the cultural segregation that currently disfigures democracy, the Internet has to bear a great deal of responsibility for the present derangement of American life.
Publicly, film critics for established online publications will say that the Web has given a new home to film criticism. Off the record, many of those same critics will tell you their jobs depend on securing advertiser-pleasing hits by lavishing coverage on the worst of what’s out there, especially the superhero and fantasy movies. Editors hope to attract hits by feeding into a movie’s prerelease hoopla. What a critic actually thinks about the movie is often drowned in the ongoing publicity deluge. If a publication’s critic declines to join in the publicist-generated excitement over Green Lantern or the fifth Pirates of the Caribbean, the editor can always find a writer, usually a young one looking to get a byline, to whip up the mindless sort of “Five Great Superhero Movies” list that guarantees traffic. Editors then point to the number of hits generated by this as proof that what the readers really want is coverage of the big movies—whether or not there’s been coverage of anything else to choose from. All this deprives critics of one of the main functions of their job: to alert readers to different kinds of work. And because maintaining advertising dollars depends on keeping the clicks coming, it’s easy for even good editors to make their publication a tool of the studio publicists.
Journalists—and journalism students—are drilled in the necessity of finding a story’s hook, the thing that will justify its being written now. But relevance, a.k.a. buzz, can’t trump the basic question: is this story good and does it deserve to be told? There was no hook when James Agee wrote his Life magazine piece on forgotten silent-film comedians, the piece that brought Buster Keaton and Harold Lloyd back into the public’s consciousness. Passion—and William Shawn’s trust in the writers he hired—was the motivating factor in Kenneth Tynan’s now-revered New Yorker profile of Louise Brooks. What we have today is the likes of the Daily News, which, a few years back, refused to let a movie critic review an acclaimed film about anti-Nazi martyr Sophie Scholl because, in the editor’s words, the story was a “downer.”
Bloggers and the writers who turn out well-crafted pieces on their own Web sites are free to write what they want. The best of them, such as Dennis Cozzalio at Sergio Leone and the Infield Fly Rule or Kim Morgan at Sunset Gun or Farran Nehme Smith at The Self-Styled Siren, give public voice to the way movies function as private obsession. Their film knowledge is broad and deep, but they wear that knowledge lightly. They understand that the true appreciation of any art begins in pleasure (and not in the “work” of watching movies). To read them is to read people grounded in the sensual response to movies, in what the presence or look of a certain star, or the way a shot is lit, stirs in them. Reading these writers, I often feel that I’m in the presence of people dedicated to the notion of collective cultural memory in an era when instant obsolescence is the rule.
HAVING THE freedom to write about exactly what you want requires enormous discipline. For every Internet writer who carefully fashions his or her pieces, who gives each one time to sink in and reverberate, there are dozens more who remind me of what Pauline Kael wrote about Gena Rowlands in A Woman Under the Influence: “Nothing she does is memorable, because she does so much.”
When a site’s goal is to satisfy the great sucking maw of the Internet with a constant feed of new items, sourced or unsourced, nothing is around long enough to make an impact. When perpetual turnover is the norm, the shallow, silly, and irrelevant rule. The blog Playlist finished up its 2011 movie preview with a runt-of-the-litter roundup called, “The Leftover Question Marks of 2011: Can These Films Possibly Be Any Good?” (The obvious answer is that no one knows until they’ve been screened and critics have done the work of watching and writing about them, but reflection is so twentieth century.) One gem about the upcoming adaptation of Bel Ami stood out: “We’ve heard good things about the short story [sic] where [sic] this film draws its inspiration, but to us, right now this is just a film where a hot young thing romances Kristen Scott Thomas, Uma Thurman and Christina Ricci, while prooooobably pining for the one that got away. Would be nice to be wrong.” What would be nicer would be if the writer (or some editor . . . one?) knew enough to note that the short story is actually a novel by Guy de Maupassant. We can’t know if Bel Ami is any good until we see it, but nothing in that description would lead you to think there’s any reason to consider a Maupassant adaptation any differently than you would an upholstered bodice ripper.
There’s nothing new about this. The ranks of film critics were full of silly asses in the print days. What is new is that there is currently no must-read critic, no Pauline Kael or Andrew Sarris whose opinion can kick off a conversation or an argument.
Part of the problem is the thing often cited to prove the strength of film criticism: the sheer number of people online who are doing it. But to use this as evidence of a new golden age is simply to play a new version of equating how good a movie is with its box-office receipts. There are too many critics writing too many pieces. And even the ones who have reacted against the shallowness of the current conversation, the ones who turn out long, detailed considerations of films, have found a way to make themselves close to irrelevant. You can understand why a young critic would want to show off what he (and it’s almost always a “he”) knows. That’s part of how a young critic gets noticed. But too many Web critics affect a donnish air ludicrously beyond their years. Whatever movies are for them—objects for analysis or gnostic contemplation—they don’t sound as if they’re any fun, and they don’t communicate to readers that movies, even difficult or unusual movies, can be a pleasure. A lot of the time they don’t communicate to readers at all.
ANYONE WHO has written film criticism for a large publication in the past twenty years has been told to assume that readers know nothing (even now when it’s easier than ever for them to look up a reference). But reading some of the more earnest Web critics, I get the feeling they don’t believe in making any overtures to readers not already novitiates in the order of cinema. All the serious young cinematic men sound as if they’re writing for each other. Not showing off, but sealed off. The austere Web critics don’t sound as if they’re interested in connecting movies to any life beyond the parameters of the screen. Articles that analyze sequences in terms of lighting and editing and even shot length are presented with the deadly seriousness of a doctoral dissertation. Reading long, detailed arguments about a difference of millimeters in the aspect ratio of a new Blu-ray disc, the only shrinking millimeters I’m aware of are those of my open eyes narrowing.
When the writer Dan Kois advanced the heretical notion in the New York Times that he couldn’t pretend to enjoy movies he found boring, the reaction he got made it seem as if he had said movies could never deviate from convention and audiences should never try anything new. The film historian David Bordwell even used the word “philistinism.”
Bordwell and New York Times critic Manohla Dargis trotted out the work of a psychological researcher to show, in Bordwell’s words, “how a director can subtly guide our attention without cutting, camera movement, or auditory underlining.”
Can these critics hear themselves? Imagine someone who has never seen a movie by Hou Hsiao-hsien or Apichatpong Weerasethakul reading that cognitive theory is the key to unlocking its seemingly unyielding surface. Faced with that, who wouldn’t mutter, “Screw it” and head for the multiplex?
It’s much easier to brand people as philistines or to chatter among a small, select group that agrees with you than to make a case to readers that they should seek out something that might at first seem off-putting to them. It’s certainly an admission of having no interest or no belief in the possibility of movies as a popular art form. The reaction to Kois was a sustained example of bullying masked as erudition. And though many of these critics would be appalled to hear themselves described as fanboys, this is what they often seem in their adherence to a hive mentality. It doesn’t matter whether you’re defending The Dark Knight or The Tree of Life if you declare the people who don’t share your enthusiasm incapable of appreciating movies.
As a craft, American mainstream movies are all but dead, more the product of the merchandising department than filmmakers. American movies have replaced story and character and emotion with spectacle and noise, something you couldn’t say about the first contemporary special-effects movies, like Close Encounters of the Third Kind or E.T., or even the best comic-book movies like Richard Lester’s Superman II and Tim Burton’s Batman. The old adage is reversed. Instead of looking for something suitable for the kids, the trick now is to find something suitable for adults. An ordinary, not-bad thriller like The Lincoln Lawyer, which features good acting and a well worked-out plot, can seem like a big deal. A superb piece of craftsmanship like Tony Scott’s blue-collar thriller Unstoppable, a movie that shows some sense of how Americans live now, can feel like a miracle. Critics condescended to The King’s Speech, calling it “a crowd pleaser” despite the fact that movies that win Oscars, as this one did, are now part of the niche market, no longer expected to attract the big audiences. A few weeks after it won, it was released on DVD, reduced to background noise in the big-box media stores.
The probable death of movies as popular art, and the retreat of serious critics into contemplation cells, points up a larger problem: the falseness of the claims made for the Web as a new beacon of democracy. In many ways, the Web has been a disaster for democracy.
WHEN I started as a film critic online at Salon.com, readers could click on a link that allowed them to e-mail me directly. Within a month, I heard from more readers than I had in a decade as a print critic. Not all the letters were nice (though the rude writers often apologized if you wrote back to them and reminded them a person was on the other end of their missive), but I felt in touch with my readers. There was also an edited letters column. That all ended when the publication made it possible for readers to post directly without going through an editor. Almost immediately, I and the other writers I knew stopped hearing directly from readers. Instead, instant posting became survival of the loudest. Posturing and haranguing ruled. If the writer was female or Jewish, misogynists and anti-Semites would turn up. Why wouldn’t they? There was no editor to stop them. Bullies and bigots seized the chance to show off. And those reasonable people, the ones I and my colleagues heard from? They went nowhere near the online forums.
THIS “LIVELY forum” or “spirited debate” or whatever euphemism is now used for online bullying has always been defended by the claim that balance would be restored as reasonable respondents came in to counter the blowhards. Bullet wounds can be stitched up as well, but the damage is already done. “Communication,” the virtual reality pioneer Jaron Lanier writes in his essential book You Are Not a Gadget, “is now often experienced as a superhuman experience that towers above individuals. A new generation has come of age with a reduced experience of what a person can be, and of who each person might become.”
That kind of divisiveness is what digital culture has come to specialize in on a much broader and more insidious scale—and what gets held up as proof of the Web’s democratizing influence. The same progressives who bemoan the way Fox News has polarized political discourse in America, masquerading as news while never troubling its followers with anything that would disturb their most cherished and untested convictions, happily turn to the satellite radio station of their preferred genre or subgenre of music or seek out the support group or message board that fits their demographic, the political site that skews their way. Entering the realm of the other seems done solely to express rage.
The rigorous division of websites into narrow interests, the attempts of Amazon and Netflix to steer your next purchase based on what you’ve already bought, the ability of Web users to never encounter anything outside of their established political or cultural preferences, and the way technology enables advertisers to identify each potential market and direct advertising to it, all represent the triumph of cultural segregation that is the negation of democracy. It’s the reassurance of never having to face anyone different from ourselves.
I don’t believe it’s an accident that this segregation has become our cultural norm at a time when America is as politically polarized as it’s ever been. The hard fact of democracy, which is always a crapshoot, is that for it to work we can’t shut out who or what we don’t like, who or what we have not bothered to encounter. Popular art, which depends on crossing barriers, can’t exist in such confines. And criticism—which is meant to help people make sense of work they don’t know or assume they won’t like, or work that they know but haven’t really thought about—becomes something like samizdat in a culture set up to enforce the boundaries that art and criticism must traverse. The snarkiness of the film writers who tell us nothing is to be taken seriously, as well as the dourness of the film writers who wear seriousness as a hair shirt (and who may yet succeed in making movie watching as joyless as academics have made reading novels), play into that divisiveness, telling their followers they’re not missing anything by ignoring everything beyond their own self-prescribed compound. It’s not movies and movie criticism that are drowning in the tyranny of the “like” button. It’s democracy.
Correction: an earlier version of this essay misidentified Indiewire as a blog that published “The Leftover Question Marks of 2011: Can These Films Possibly Be Any Good?” That post was published by the Playlist, a blog hosted by Indiewire.
Charles Taylor is a writer and teacher in New York City.