Friday, May 16, 2014
Because it's quicker than an essay.
- The opening credits are the best I have seen in years, since Watchmen at least. Big, exciting music set to blocks of classified text that are swiftly redacted, leaving only the cast and crew, all superimposed over "stock" footage that I only realized toward the end was in fact giving us a condensed version of the origin of this movie's Godzilla...
- ...which it strangely elects not to tell. There's sentimental value in having had him originally make trouble in 1954 and only now reawaken, but beyond that...why? The film never really explains why he's around and has it out for these giant bugs that are the real enemy. This is a reboot if anything, and though superhero origin stories are by now well-worn, it would have been interesting for the humans in the film to treat Godzilla as a never-before-encountered force of nature rather than to largely take him for granted. A big part of characterization, even of characters who only roar at things, has to do with how they are regarded by the other characters; having Godzilla be treated as not that big a deal makes him not that big a deal.
- The biology of those giant bugs makes no sense. They are originally introduced as parasites, discovered inside the fossilized remains of a giant...something. But their whole mission in the movie is to mate, incestuously, in some hole in the ground. If anything they should be trying to fuck each other through Godzilla, which would certainly give the movie the horror kick that monster movies seem to lack these days.
- The humans' strategy doesn't make much sense either. If you're trying to get a nuclear bomb past the baddie that we know doesn't fly, why send it on a train to be attacked, and then airlift it? (Because a train attack looks cool, of course, and I'll admit, the shots of the female bug passing over and under and around the bridge are pretty neat.)
- I like the Godzilla design. Much good use is made of the dorsal fins, cutting through the surface of the ocean like a shark times a million, and the face, while strangely dog-like in its nose, is quite expressive. He's got a nice 'fuck you' sneer to these giant bugs.
HERE BE SPOILERS
- I guess this is the closest I'll come to knowing what it felt like to watch Psycho when it first came out, because I was genuinely shocked that they killed Bryan Cranston not even halfway into the movie. It's a bold gambit, but it doesn't pay off because there's no Anthony Perkins to replace his Janet Leigh. The ostensible star of the movie, the big lizard, is deliberately only glimpsed through much of the film to keep from spoiling his mystique. Which is all well and good, but that leaves us with the other humans, none of whom steps into the celebrity void Cranston leaves behind to anchor the film. Ken Watanabe has the most natural gravity, and I was hoping that Cranston's death would clear the way for the story, such as it is, to focus on a character who is actually from the nation that gave us Godzilla. But Watanabe is reduced to spouting pseudo-profundities about Godzilla 'creating balance' and man trying to control nature. After being introduced as the leader of the lab that's been keeping the bug egg a secret for a decade-and-a-half, he ends up having nothing to offer, and unlike Cranston he isn't allowed to die when he's outlived his usefulness. Aaron Taylor-Johnson instead becomes our ostensible lead, but he is no more memorable than whatshisface from Pacific Rim was. None of the characters are as stock as that film's, but neither are they compelling in their own right.
- Pacific Rim was dumb by design, and its humans generic ciphers. Godzilla tries to put some meat on these bones, the better to give some weight to the events at hand. The destruction has a greater visceral thrill for the worm's-eye view we're given of it, but the consequential violence is all front-loaded, and none of the characters matter enough anyway. And there's some good monster action to be had, including some applause-baiting money shots. But the movie's coy approach to its monster star and undue emphasis on its human stars eventually works against it, cutting away from its final battle to characters who have long since ceased to be interesting.
- So the human : monster ratio in these movies has yet to be perfected. Godzilla is probably the better film, but I can't help but wonder if Pacific Rim is the better monster movie.
Sunday, March 23, 2014
The proceedings start strong with “Bench Seat.” Two young lovers (Emma Thorne, Logan Sutherland) are up on a secluded hillside overlooking town to make out, or, maybe, to break up, as it becomes clear that the young lady has serious, serious issues. The piece is a neat study in contrasts, with Sutherland’s smooth jocularity (which eventually shades into a passive resignation) juxtaposed with Thorne’s jagged, twitchy paranoia. Like several of the pieces it goes on for too long, circling back to the same topics and comic devices, but the performers’ energy never flags and it entertains and unnerves in equal measure.
Said energy carries over to the next playlet, the best of them, “All Apologies.” It is a comic monologue, given by a man (Jonathan Berenson) to his wife (Jill Durso), in which he apologizes in his own fashion for being an abusive sack of shit. It flips the dynamic of the previous scene, with the woman now reacting, this time silently but so intently, to the increasingly unhinged man. His rambling is a tour-de-force performance in bullshit-spinning peppered with digressions on the origin and meaning of words like “awesome” and “golf,” in the end punctured simply and devastatingly by the wife’s incredulous laughter. The laugh is not in fact in LaBute’s script--she merely stares instead--but it’s a welcome addition that opens the storytelling perspective beyond the captivating and entertaining monster at the piece’s center and keeps it from letting him off the hook.
The same, alas, cannot be said for “Merge,” in which a husband (Anthony Taylor), having picked his wife (Haley Palmaer) up from a convention, tries to parse her ambiguous testimony about having been raped in her hotel room. She wasn’t raped, we learn, for she drunkenly invited all those/both those (the use of the word ‘all’ is a detail the husband clamps onto with the tenacity of a pit bull) men to join her. The script’s one-sidedness--we never get an explanation for or exploration of her nymphomania--makes the play problematic to begin with. Coupled with Anthony Taylor’s furious, accusatory take on his character, it makes the play an ugly exercise in slut-shaming, which reaches its terminal conclusion in a nihilistic ending, a murder-suicide traffic swerve that is merely hinted at in the title and stage directions. Of the night’s proceedings, it’s a considerable bump in the road.
The show gets stuck in second gear for the next piece, “Road Trip.” We’re presented with a man (Todd Litzinger) and a young girl (Caroline Jordan) on a seemingly innocuous excursion. We learn that he’s a predator, and like most predators he’s family or something like it, probably her step-father, and this trip, for the girl at least, is decidedly one-way. The relationship between the two characters and the reveal of its true horrible nature is drawn delicately, but the tension of the play exists only between the audience's omniscience and the limited perspective of the girl. Between the girl and the man there isn’t so much friction, and the piece never quite takes off because of it.
Though it shares similar subject matter, this is not true of the final and most intriguing piece, the show’s namesake, “Autobahn.” Like “All Apologies” it’s a monologue, this time given by the woman (Annie Grier) ruminating while her husband (Jonathan West) drives, about their troubled foster son’s accusations of sexual abuse. The play rightly doesn’t belabor the reveal of what it’s about and so spends its time exploring the two characters’ responses to this situation and, accordingly, forcing us to wonder what exactly the situation is. The wife’s chatter, her rationalizations and can-do optimism, increasingly feel like an evasion of a horrible truth. Meanwhile Jonathan West wears a mask of the most profound sadness--but is it a mask? Is it guilt? Self-pity? Manipulation? The play never tips its hand in either direction. This can be frustrating, playing up an ambiguity for which, as Dylan Farrow’s recently renewed allegations against Woody Allen have shown, there is no luxury in real life. As a work of drama, though, the did-he-or-didn’t-he question works, especially as a contrast to the earlier plays, whose creeps we were allowed to feel much more certain about.
The production, it should be noted, eliminates two of the seven vignettes, "Funny" and "Long Division." They are missed not for the plays in themselves, though "Funny" is quite good, but for their casts, two women and two men; the one man-one woman dynamic of the other plays, especially given Neil LaBute’s fixation on abusive sexual relationships, could use some variety. What we are given, however, is a solid offering, directed and performed with energy and attention, in service of an at-times uneven collection of plays. As a vehicle for drama, the engine occasionally threatens to stall, but the body is ever polished.
Sunday, November 25, 2012
(Some spoilers follow.)
Wreck-It Ralph should be viewed as less a movie than a cultural artifact of the early 21st century. It is not a bad movie, by any means. Its construction is sound, its technical ability accomplished, its celebrity voice casting surprisingly successful (I typically find Sarah Silverman grating, but here, no longer able to just be as obviously "offensive" as possible, she is refreshingly spunky). On entertainment grounds, it is largely successful. But far more interesting than how funny it is (quite, and quite often), is the way it trades on its audience's knowledge of video games, and depends on video game characters and iconography for its effect. So completely reliant is it that, regardless of its merits as popcorn cinema, it functions less as an independent cultural entity than as a milestone in postmodern cross-corporate artistry.
The contours of the story are deceptively familiar. In the games of Litwak's Arcade, all the characters live in a world unto themselves. The villain of Fix-It Felix Jr., Wreck-It Ralph (John C. Reilly), spends his days smashing windows that Felix (Jack McBrayer) repairs in order to eventually save the building's tenants, who throw Ralph off the building at the end. Tired of always being the villain and having to live in the dump, Ralph abandons his game one day to try to win a Hero's Medal in the alien-invasion first-person shooter Hero's Duty. In doing so, he crash-lands in a candy-themed racing game called Sugar Rush, bringing with him one of the nasty, reproducing bugs from Hero's Duty. In the process he raises the ire of Sugar Rush's King Candy (Alan Tudyk) and Hero's Duty Sergeant Tamora Calhoun (Jane Lynch). He then enlists glitchy exiled racer Vanellope von Schweetz (Silverman) to win him a medal in a race so he can get back to his own game before the arcade owner decides it's irreparably out of order without him and has it unplugged and taken away.
The scenario is classic Pixar, even though this is a Disney film. Toy Story was about the secret world of toys, A Bug's Life the secret world of bugs, Monsters, Inc....You get the picture. Yet rather than these films, Wreck-It Ralph shares a far stronger kinship with Scott Pilgrim vs. The World, a film which also used the storytelling grammar and tropes of video games to an unprecedented degree, but even then just as a filter for an otherwise grounded love story. Ralph is about video games, period, and to its end enlists an array of cameos spanning the past thirty years of video games, from Q*bert to Pac Man to Street Fighter to Sonic the Hedgehog to Super Mario Brothers and beyond, with homages to other games not mentioned by name. To understand the difference between this and the Pixar pictures, imagine if, rather than just Mr. Potato Head and Etch-a-Sketch, Andy's room in Toy Story was populated with Stretch Armstrong, Transformers, G.I. Joe, He-Man, and other such brand names. That would have been distracting at best--but here it's vital, not just to the world of the movie, but also to its humor and themes.
Nor is it simply a crass toy commercial, not the way the actual Transformers cartoon was all those years ago, but neither is it a self-contained product. A look into the 'universe' of video games would simply not be convincing without being populated by authentic video game characters, every one of them worth millions of dollars and owned by media giants. A good case could be made that 20 years after Super Mario Brothers: The Movie, the reason there are no great films based directly on a video game is that the characters are too tied to the companies that own them, and they exist solely to sell products, the games themselves, that make those companies money.
And it's not just characters that are being licensed here. An assortment of brand names appear, in all manner of capacities. A close-up shot of a Subway cup is the most blatant and annoying; embedded far deeper in the movie's being are Oreos, which are the subject of a gigglingly obvious Wizard of Oz pun, and Mentos and "Diet Cola," to which a training montage and the movie's climactic action scene are inextricably tied. As with the game characters, the product placement so completely suffuses the mise en scene that the two become bound, the art as vehicle for the ad, the ad as vehicle for the art.
It's all very entertaining, in the way that inside jokes are very entertaining to knowing insiders. (The movie is rotten with delicious candy-themed puns, which are the lowest of inside jokes, in that you only need to understand the language to understand the joke.) I'm not so much of a scrooge that I didn't enjoy myself. But still, should we not be a little depressed that every new character worth dressing up as for Halloween is now owned by some soulless mega-conglomerate?
The old criticism of Disney was that it debased classic stories and characters by reducing them to commodities. Yet in order to turn the little mermaid Ariel into a marketable Disney princess, the House of Mouse still had to create their own engaging version of a character that had existed in numerous, independently-created iterations for over 150 years. Copyright laws and media consolidation have so strangled our cultural development that our classics today are commodities, which exist and are licensed only in approved forms and tended with an eye for the bottom line. By modern laws and logic, everything that followed Hans Christian Andersen's Little Mermaid was either intellectual property theft, or fan-fiction.
The fact of this creative corporatism--or is it corporate creativity?--affects not just the film's storytelling but also its message. By focusing and placing audience sympathies on Ralph, a "villain" who wants to be good, it presents itself as an underdog story of rebellion against a prevailing order. (Tellingly, Ralph resolves to do this in a funny scene set at an AA-styled meeting for video game villains that includes original rebel, Satan himself.) Yet games, the film's creators cannily understand, operate by rules that dictate what does and does not happen in a digital world. That's what video games are, essentially: programs made up of millions of lines of code that act as pure logic. It's a very conservative way of looking at the world that in this case happens to be true: the rules, the code, can't be changed. Thus Ralph's attempts to be a hero the way Felix is are doomed to fail (the code for Sugar Rush is changed as part of the machinations of the villain, a notably arbitrary plot device that is out of step with much of the rest of the film). When it comes to his world, the world of Fix-It Felix Jr., Felix will always be the hero, and Ralph will always be the villain.
One can read into this a certain reactionary strain of thinking that says things can never truly change, that the discontented ought to just be happy with the prevailing order and get on with it, lest they destroy everything. It's certainly befitting a top-down hierarchical corporation with more money than God. Yet one can also hear in Ralph's final lines, in which he says that the best part of his day is just before he is thrown off the building, because it is then that he has the highest, greatest view of the arcade--one can hear in it echoes of The Myth of Sisyphus (indeed, is there anything more Sisyphean than living life according to a 'reset' button?):
I see that man going back down with a heavy yet measured step toward the torment of which he will never know the end. That hour like a breathing-space which returns as surely as his suffering, that is the hour of consciousness. At each of those moments when he leaves the heights and gradually sinks toward the lairs of the gods, he is superior to his fate. He is stronger than his rock.

One must imagine Wreck-It Ralph happy.
- Like The Dark Knight Rises earlier this summer, Ralph tries to split the difference in dealing with change and the status quo--for it is not just the villains but the heroes that must live by this code (there's that word again...). Towards the end of the movie, we learn that King Candy is in fact the hero of a previous racing game, Turbo Time, who grew jealous of a new racing game that took players away from him. He invaded the game and glitched it, causing both his game and the new one to be unplugged. He then secretly installed himself in Sugar Rush and overthrew Vanellope, its queen. The putative hero was subject to the same rigid system as Ralph, a "villain." It is no coincidence that Vanellope is restored to her throne, whereupon she renounces her crown in favor of "constitutional democracy," a term I'm fairly certain had never been used in a Disney animated movie until now. Democracy, of course, is the quintessence of 'splitting the difference,' which seems to be the essence of the message the film is trying to impart. America's brand of democracy today goes hand-in-hand with the kind of pervasively entrenched corporatism I was talking about earlier, and so the circle is complete: corporate art encourages corporate democracy encourages corporate art.
Wreck-It Ralph is thus as much about its own nature as a vehicle of the postmodern zeitgeist, as it is about video games and how they drive the zeitgeist itself. I hasten to add that this postmodernism, this collapsing and subverting of the old definitions, cuts both ways. It's not just that rather than art becoming commercialized, our commercials are becoming art. Twenty years ago, my seven year-old self was allowed to watch Disney movies but not to rip someone's heart out of their chest in Mortal Kombat. Now a Disney movie marketed to seven year-olds includes a scene with Mortal Kombat baddie Kano ripping the heart out of a zombie. It's very funny in the context of the movie, but broadly speaking, not so much. Game Over, man.
Wednesday, October 17, 2012
Recent history gives Argo's opening scenes an especially visceral immediacy. Cutting archival footage of the 1979 storming of the American embassy with period-perfect re-enactments, and depicting attempts within the embassy offices to deal with the rapidly escalating situation, the sequence has a terrific on-the-ground quality that starts the movie off with a bang. By so vividly recreating the events of '79, it provides an uneasy vicarious experience of the Egyptian and Libyan missions of but a month ago. (I don't know if Warner Brothers ever considered delaying the movie's release out of "sensitivity" regarding the Libya attack, but since such moves are stupid and self-defeating, I'm quite glad they did not.)
After six American diplomats manage to escape and find asylum in the home of Canadian Ambassador Ken Taylor (Victor Garber), the movie settles into what is essentially a heist flick mold. Tony Mendez (Ben Affleck), CIA rescue ops badass, is tapped by CIA mover and shaker Jack O'Donnell (Bryan Cranston) to figure out how to rescue the six Americans. He eventually settles on a plan, "the best bad option," to enter the country posing as a Hollywood producer scouting locations for a fake Hollywood science fiction movie, and to disguise the diplomats as his production crew and take them back with him. To do this he needs to create a believable dummy production--including script, concept art, and authentic trade magazine publicity--with the help of Planet of the Apes makeup man John Chambers (John Goodman) and producer Lester Siegel (Alan Arkin). Which is hard enough before having to actually go to Tehran and get the Americans out before the sweatshop workers piecing together their shredded embassy photos can alert the Revolutionary Guards that there are Americans hiding in Tehran.
The plot pretty much takes care of itself, explaining the steps to be followed and then upping the ante with a raft of complications along the way. It's solid stuff, and the script and actors deftly balance the grave seriousness of the problem with the absurdity of the solution at hand. The good humor of Arkin and Goodman (who at one point tells Affleck, who directed this movie, that you "can train a rhesus monkey to direct in a day") dominates the Hollywood-heavy first half, somewhat to the detriment of the back end. As they can only sit around and wait to be rescued, the American hideaways are only given the barest of character development, and so it falls mostly on Affleck to carry the picture once it moves to Iran. He does good work as the smartest man in the room, steely and unflappable except for a concern about having to be away from his family that is exactly as perfunctory as it needs to be. Still, it would have been nice to get to know the trapped Americans better in order to contrast them with the roles they are forced to adopt for their survival.
Without getting into too much detail, the movie's third act goes perhaps too far in the use of creative license. The action got to be enough that it took me out of the film, and I started to wonder how much of this was true. It's not a fair complaint--there really was a lot that was fictionalized, to the movie's benefit--and William Goldenberg's editing does wonders tying together three plot threads that moment-by-moment push the tension ever higher. But the movie does start to feel (ironically?) a little too Hollywoody, whereas everything that came before was believable and restrained.
Maybe it's just recent events, the "too soon" factor, that make me fault the movie for its drift into fancy. If so, it works more than one way. For not only does the September 11 consulate attack shape the way one approaches Argo, but Argo shapes the way one looks at the consulate attacks. It isn't much of a spoiler to say that the operation is successful but the U.S. government must publicly give all credit to the Canadians because the embassy hostages would otherwise face brutal reprisal. The movie ends here, but in the real world history marched on: those hostages and the failed attempt to free them, for whose sake the government buried the story of its most spectacular rescue mission, helped destroy Jimmy Carter's presidency, and it was not until he left office that they were freed.
Too often secrecy is invoked by our government today as a means of covering up information that would embarrass it. Argo presents an all-too-rare instance of secrecy that did precisely the opposite, that downplayed success for the greater good until 1997, when the mission was declassified. If there's a takeaway from the timing of the movie's release, it's that there is often more going on in international relations than we realize. Unknown unknowns, and all that. Moreover, the people who are involved in these hot spots do realize what's going on. Or, at least, they know the risks. As an end-of-movie caption informs us, all of the rescued diplomats, in spite of their harrowing experience, returned to the foreign service. One imagines Ambassador Chris Stevens would have done the same.
Sunday, September 23, 2012
Americans were not the only ones serving in Iraq.
It's a fairly obvious observation, but one that I don't recall being much considered during the darkest days of that endless war. True, we often spoke of our faithful allies, the British, but even that term 'British' itself is an elision. For, as writer Gregory Burke notes in the program for Black Watch, imported for the second time from the National Theatre of Scotland by the Shakespeare Theatre Company, "Scotland has always provided a percentage of the British Army that is disproportionate to its population size." The only time I recall Scotland entering the Iraq conversation was the public dispute between a belligerent Christopher Hitchens and Scottish MP and Saddam Hussein apologist George Galloway.
We did not think much about Scotland, much less its military class, in the context of Iraq, but they thought about Iraq, and us, at considerable length. So we find in Black Watch, an exploration of the play's namesake, a famed Scottish infantry battalion, and the outsized role it played in Iraq. The play, directed by John Tiffany, is a marvel of performance and technical skill and shows a critical moment in time from an unusual perspective. Yet a crucial component is missing: the play's original audience, without which something has been lost in translation.
The play's story moves on two tracks--an unnamed writer, ostensibly Burke, interviewing the men of the Black Watch, and the stories they tell him, of daily life and daily death in Iraq. Interspersed among the interviews and vignettes are found objects of the war: a debate between two Scottish MPs; letters from an officer to his significant other; Scottish traditionals and military tattoos. Over time the team's rude banter and idle foolery give way to frayed nerves and boiling anger as they are worn down as much by relentless shelling and suicide bombers as by the government's decision to fold the Black Watch, along with other independent regiments, into a single unit.
With the exception of the lights and sound, which work in tandem to create the deafening and blinding explosions characteristic of post-Saddam Iraq, the show's technical approach is deceptively simple. Sidney Harman Hall's proscenium arch has been reconfigured into stadium-style seating that requires much of the audience to cross the stage, where they must remain until the end of the performance (the show runs a fleet two hours with no intermission). At one end of the stage is strung a curtain that doubles as both projection screen and scrim, and on the other end is a hefty door. A rough frame on either side allows certain moments to be played from elevated heights. Set pieces are otherwise minimal, though a great deal of mileage is gotten out of a mobile pool table. The costuming is authentic, both in the Watch's military garb and in their easygoing civilian pub-wear. The swift action is realized by a helping dose of misdirection, so that new business is constantly materializing right under the audience's noses.
No single actor stands out among the ensemble cast, as it should be in a piece about a military unit. A few, though, are given greater prominence, particularly Robert Jack, pulling double duty as the timid writer and an abrasive Sergeant, and Ryan Fletcher, as the team's de facto leader Cammy. What most impresses about the cast is the lived-in quality of their characterizations, moving with both a young, hangdog masculine swagger and military precision. Their dialects can occasionally be difficult for the American ear to untangle, but their sheer physicality does a lot of the necessary communication for them.
And there is much to communicate. The play offers some unusual perspectives, particularly for an American audience. The Watch views us, for instance, with a mixture of horror and admiration at our overwhelming military superiority. So goes an exchange during a four-hour bombing campaign:
"This is nay fucking fighting. This is just plain old-fashioned bullying like."
"It’s good fun, though."
"Do you think?"
"Aye. It’s good to be the bully."
There is too the realization that ours was far from the only nation that relaxed the standards of admission to its volunteer army in order to fully staff its ranks. Late in the play we learn that one of the characters was diagnosed depressive after his first tour and should never have even been in Iraq a second time, but all that pesky medical paperwork just happened to get lost when the military needed warm bodies.
Still, for all it has to recommend it, particularly in universal moments like these, I couldn't help but feel at some remove from the play as a whole. Theatre is a live event, with each production, to say nothing of each performance, born of given circumstances. All shows are particular, but some are more particular than others. Black Watch, both the play and this iteration of it, are very much artifacts of mid-2000s Scotland, a nation the size of South Carolina with the population of Colorado. Nor is it just a play by Scotland, but of Scotland for Scotland. The show began life as a National Theatre of Scotland production at the Edinburgh Fringe Festival in an old drill hall, and it is saturated in a cultural shorthand--not just the dialects, but the politicians, the songs, the Black Watch itself--that could be taken for granted to forge the bond between performers and audience that is at the heart of a live experience. Transplanting the play to one of the fanciest venues of the most powerful city in the world robs it of both its physical and cultural intimacy.
There's nothing wrong with the show. But without that full connection to the audience, it can't but feel slightly rote. All theatre, all theatre that matters, is local, and so it goes with Black Watch. As was said of another disastrous American military venture, "You weren't there, man."
Monday, July 23, 2012
With Heath Ledger dead and the Joker thus removed, the deck was stacked perhaps impossibly against Christopher Nolan to ideally conclude his Batman trilogy. The last film literally left him hanging, and the character was such a wild popular success that his absence was always going to be felt whether the role was re-cast or the story re-tooled. I'm shooting this elephant in the living room now because it's not fair to judge the movie we have in theaters, The Dark Knight Rises, by what it now could never have been (nor is it fair to bring in the monstrous acts at the film's premiere in Aurora, Colorado, however salient the issues of citizen violence they raise may be). The movie simply is what it is. And what it is, is incredibly, if not quite terminally, problematic.
There's a great many plot developments I shan't spoil, but in essence: eight years have passed since the events of The Dark Knight. Batman (Christian Bale), blamed for the death of Harvey Dent, has disappeared from Gotham. Bruce Wayne has also retired from public life, and has sunk his fortune into what seems to be a half-baked fusion energy project headed by one Miranda Tate (Marion Cotillard). The police, thanks to new powers granted in Dent's name, have largely cleaned up the streets, such that cat burglar Selina Kyle (Anne Hathaway) is the worst of Gotham's worries. The going is so good that plans are afoot to get rid of Commissioner Gordon (Gary Oldman), who is looked up to by idealistic officer John Blake (Joseph Gordon-Levitt). All of this changes with the coming of Bane (Tom Hardy), a buff and gas-masked terrorist who plans to overthrow Gotham City's ruling order of compromised cops and parasitic rich.
This would be a lot for just a standalone movie to tackle, but in the story department Rises is actually pulling triple duty. Not only is it introducing new elements for its own story, it is also tying up loose ends from the last movie and re-establishing narrative and stylistic continuity with Batman Begins. As a result the pacing is noticeably slack, especially when compared to its predecessor, which--largely due to Heath Ledger's mesmerizing performance and Hans Zimmer's abrasive one-note Joker theme--was relentlessly paced and by the end left the viewer traumatized.
Rises simply has much more to attend to, and so resorts to corner-cutting and rule-breaking that starts getting it into trouble. Although one is never unsure of what's happening, how things got there is decidedly less clear. Where did those motorcycles come from? How does Bruce Wayne return, and where'd that Batsuit come from? The movie lacks the machine-like narrative precision of Nolan's most recent film, Inception, instead indulging in hoary plot devices like a ghost, questionable character deceptions used to justify a late plot twist, and a lot of happy, stylish coincidences. The Dark Knight fudged its logic too (the Joker's bombing schemes would have fizzled if the characters did not react exactly as they did), but this tended to be obscured because, again, that movie's structural chaos and sensory assault kept one from noticing.
The philosophical concerns of Rises are in a similar state of labored muddiness. Bane goes on with the logorrhea of an Ayn Rand character, about returning justice to Gotham City and punishing the corrupt rich, with explicit echoes of Occupy Wall Street. Yet his supporters are all thugs, no idealists except maybe for Selina, and it's all a sham anyway, a cover for his real plan--and I don't think I'm really spoiling anything here--to wipe Gotham City out entirely. This is a shame, because the movie is otherwise covering some interesting territory. The ending of The Dark Knight suggested that people needed a beautiful lie, e.g. Harvey Dent, to be inspired to goodness, but that idea is here completely exploded. Whereas Batman was previously the force that provoked an escalating criminal madness (the Joker), here the Dent myth creates a sham prosperity, a levee against Gotham's crime that will and does break. Instead of the order/chaos dichotomy that Batman's relationship with the Joker represented, Bane's overturning the status quo is the twin to Batman's vigilantism: they are both products of and solutions to a failure of public justice. This similarity seems to escape Nolan (perhaps because he spent the whole previous film arguing that Batman in fact made things worse), who instead settles for a much more optimistic and simplistic idea that erasing the past and starting anew will lead to a better tomorrow, as opposed to just beginning the same problems over again.
It's all grandiose, too much for its own good, but the film's superlative technical aspects keep it from bogging down too badly. Christopher Nolan has finally figured out how to direct visually coherent action sequences, so that the many battles going on, the Bane-Batman fisticuffs in the sewers in particular, can actually be appreciated. Pittsburgh, playing Gotham City, lacks Chicago's grittiness from the previous two entries, but in the movie's second half convincingly doubles as a Hobbesian failed state (I mean this as a compliment). Hans Zimmer's score uses much of what came before and is mostly effective, though without the straight madness of the Joker theme it does not have the same drive and at times feels less urgent than merely frantic.
The performances are typically great, but I'd like to single out Hathaway's Catwoman, who was certainly the biggest X factor of the piece. A lithe and playful anti-hero that serves as an effective foil to the largely dour and tortured Batman, she's a dependable ass-kicker and gives the movie some needed bounce in every scene she's in. She's also perhaps the most morally grounded of the characters, with a relatable sense of grievance and thwarted justice that is much more genuine than Bane's comic book-y posturing. She sells her character's ambivalence and development much more than one would think the script would allow.
So in the end, does all this sound and fury pay off? Again, the results are mixed. As a standalone film it's kind of a mess, but in the context of the series as a whole it gets the job done in bringing everything across the finish line (the possibility of which was in grave doubt following Heath Ledger's death). The central issue of the trilogy, how crime and crime fighting have deformed Bruce Wayne's character, is fully explored and pushed to its logical conclusion--before the movie beats a slight retreat at the last moment, in a way I can't decide whether or not is a cop-out. Though I question the details of how it goes about tying back in to Batman Begins, I'm glad it did so if only so that film, with its pulpy secret-society-wants-to-destroy-Gotham plot, wouldn't seem like such an anomaly among the three.
It's a question of ends and means, one which the series has fixated on for some time now. The justification for Batman's existence, once implied and here made explicit, is essentially that "the rules were once a weapon and now have become shackles," and that Batman was necessary to "dig into the filth" in order for the police to keep their hands clean. This problematic logic--to hell with principle, you're going to have to cheat at some point to win in the end--The Dark Knight Rises subscribes to not just morally, but narratively too. The story cheats when it has to, and sometimes even when it doesn't. The end result is a conclusion that, like its hero, is successful, but qualified and compromised.
Sunday, July 1, 2012
Wednesday, June 13, 2012
Monday, June 11, 2012
Thursday, May 31, 2012
Based on a 1988 Texas Monthly article by Skip Hollandsworth, Bernie tells the story of Bernie Tiede (Jack Black), a mild and ambiguously gay mortician who is beloved by the town of Carthage, Texas for the kindness he shows his clients, both living and dead. This kindness he extends to Marge Nugent (Shirley MacLaine), reputed to be the nastiest old crone in town. They become increasingly close, shopping, traveling, and seeing shows, with Nugent eventually bequeathing the entirety of her estate to Bernie. She also becomes increasingly possessive, to the point that Bernie snaps and shoots her in the back four times, then keeps her body in a freezer while giving away enormous sums of her money to the people of Carthage. When he's caught, none of the townspeople can believe he did it, or that he ought to go to jail, and so it falls on District Attorney Danny Buck Davidson (Matthew McConaughey) to see that justice is properly administered.
The story is straightforward enough, but is given all kinds of delightful wrinkles and twists. To start with, the tone is mordantly humorous, especially for a true crime story, but also rather sweet. The opening scene, in which Bernie demonstrates the finer points of corpse preparation (the angle of the head should be "neither star-gazing, nor navel-gazing"), sets the deliberately and jarringly light-hearted tone for the rest of the movie. The story is told through a mixture of straight scripted narrative and interview footage with the townspeople of Carthage, who are endearingly provincial; an old codger type has some of the funniest lines, describing southern Texas as "where the Tex meets the Mex," and referring to "The People's Republic of Austin." Many of the interview subjects are actors (one of them is Matthew McConaughey's mother), though, which adds yet another layer of unreality to the film.
The comic ambiguity extends to its three stars. Jack Black is the film's greatest asset, giving a performance that is uncharacteristically restrained and made all the funnier for it. His Tiede is mannered and precise, right down to the delicate way he walks, and has a beguiling sweetness that makes his decision to kill his sugar momma both the most natural and most unbelievable thing in the world. (All too fittingly, one scene has him playing Harold Hill in a self-directed community production of The Music Man.) It's by far the best performance of his career.
This goes for Matthew McConaughey, too, generally useless as a rom-com leading man but here displaying great comic timing as the clueless DA who ends up being the only person in town able to view the murder with the proper perspective. Shirley MacLaine at first seems like the weak link, only on rare occasions becoming the Bitch Out of Hell that the townspeople make Marge Nugent out to be, but one wonders if this isn't intentional. The Carthaginians are gossip hounds through and through, and given how much the film is elsewhere forcing us to question what is or isn't true, it's entirely possible that the heavy emphasis on Nugent's happiness when she's with Tiede is deliberately chafing against her reputation.
Bernie came from nowhere and has ended up one of my favorite flicks that I've seen in a while. It is, moreover, one of the most quotable. The last time I can remember reciting lines to my friends afterwards was Burn After Reading, almost four years ago. Bernie has slipped under the radar thus far--it's made only $2.5 million and is likely only going to play in indie theaters--but I can easily see it finding its audience on DVD. But why wait? It's a great group movie, believe you me.
Saturday, May 19, 2012
The Avengers is a great summer movie. I don't need to elaborate on why, for if you are reading this you likely have already seen it and either do not need or do not want convincing. A.O. Scott, however, even when praising the film for its entertainment value, is queasy about the cold corporate calculation of it all.
“I’m always angry,” [Bruce Banner] says at one point, and while “The Avengers” is hardly worth raging about, its failures are significant and dispiriting. The light, amusing bits cannot overcome the grinding, hectic emptiness, the bloated cynicism that is less a shortcoming of this particular film than a feature of the genre. Mr. Whedon’s playful, democratic pop sensibility is no match for the glowering authoritarianism that now defines Hollywood’s comic-book universe. Some of the rebel spirit of Mr. Whedon’s early projects “Buffy the Vampire Slayer,” “Firefly” and “Serenity” creeps in around the edges but as detail and decoration rather than as the animating ethos.
“I aim to misbehave,” Malcolm Reynolds famously said in “Serenity.” But for all their maverick swagger, the Avengers are dutiful corporate citizens, serving a conveniently vague set of principles. Are they serving private interests, big government, their own vanity, or what? It hardly matters, because the true guiding spirit of their movie is Loki, who promises to set the human race free from freedom and who can be counted on for a big show wherever he goes. In Germany he compels a crowd to kneel before him in mute, terrified awe, and “The Avengers,” which recently opened there to huge box office returns, expects a similarly submissive audience here at home. The price of entertainment is obedience.

The corporatism goes deeper than cinematic aesthetics. Marvel Comics quite infamously screwed Jack Kirby out of proper compensation for co-creating the Avengers and dozens of other iconic characters that Marvel simply would not today exist without. They imposed limits on his rights to his own artwork and have made billions of dollars off his creations. In the months leading up to The Avengers' release there was talk of a boycott on these grounds. The number of people who enlisted in the cause is moot, however, in the face of a $200 million opening weekend that has unstoppably grown the movie, Hulk-like, into a billion-dollar baby.
In a way this reminds me of liberals' moral dilemma when it comes to the question of re-electing Barack Obama. Politically speaking, Obama is The Avengers to liberals: hip, energetic, pushing all the right buttons. He stanched the bleeding of an economy in freefall, saved Detroit, passed Health Care Reform and financial regulation, repealed Don't Ask, Don't Tell, and is the first president to come out in support of gay marriage. He saved the day and looked cool and smart doing it.
Yet his presidency is littered too with illiberal policies that, were they done by a Republican, would have liberals howling: "extrajudicial killings, violating the War Powers Resolution, waging war without Congressional approval, violating the Geneva Conventions, whitewashing torture, warring on whistleblowers," to name a few. The liberal wish list is being dutifully checked, while fundamental issues of the rule of law have been left to atrophy or, worse, have been outright attacked. Like The Avengers, the slick surface sheen obscures a fundamental emptiness.
Tim Brayton, in a positive if weary review, referred to the Marvel mashup as "Transformers: Dark of the Moon for literate people who enjoy wit." Implicit in the complaint is a reluctant defense: "Have you seen the alternative?" With yesterday's release of Battleship, which grafts an alien invasion plot onto the mechanics of a Milton Bradley guessing game, we get a stark view of just how much worse it can be.
So Avengers, so Obama, whose opponent is a man described by his own underling as an Etch-a-Sketch in what was supposed to be a compliment, who once promised to "double Guantanamo," wants to start a war with Iran, and can barely even be bothered with pretending to care about the law. Unlike movies, elections are zero-sum competitions. One of these two will come out victorious, and anyone who votes for a third-party candidate or abstains out of protest would do well to keep that in mind.
The problem remains that the better of the viable options is still far from ideal. The Avengers is ultimately insubstantial, as is Obama's approach to the law. But these choices don't present themselves out of the blue. They are both, in fact, animated by the same thing that drives their vastly inferior competition: corporate cash and popular sentiment. Like the Jack Kirby case, Obama's legacy is handicapped by monied interests. The health care and financial regulation bills both conceded numerous demands in the interest of receiving industry cooperation. When industry sets the terms of regulation, the word has lost all meaning (that the Wall Street-backed Republicans tried to attack the legislation as a giveaway to the financial industry merely demonstrates, again, the debased nature of the choices we have).
A deeper problem still is in fact a core feature of democracy: the wisdom of the crowd. Obama is one of our canniest politicians, such that even a humane gesture like supporting gay marriage is calibrated toward pacifying constituencies, garnering votes and donations. That's just the nature of running for office. Yet where is the civil liberties constituency? I'm not even speaking of liberals, but of the broader electorate that any candidate must court in order to win. Will they lend their franchise to appeals for due process, executive transparency, and an unwinding of the security state--or to promises of jobs and protection from whoever may or may not be trying to take away their influential minority's rights? Whether it's Jack Kirby or Khalid Sheikh Mohammed, the American public has far less interest in upholding fairness and justice than in self-gratification, particularly when life is so dire already.
Obama was once promised as transformative. That just as well describes Mitt Romney, both for his ideological shape-shifting, and his policies, which have their cinematic analogue in the empty-headed mayhem of the Transformers movies. Obama and The Avengers are both compromised, but as a product of the broken systems in which they operate. For all the valid criticisms of them that do exist, one must ever ponder the alternative. Revenge of the Fallen was by all accounts a terrible movie; we don't need to watch it again.
Friday, May 18, 2012
Charles Murray has historically promulgated racism and profoundly stupid culture war claptrap, but he has a recent piece for The New Criterion about today's supposed lack of powerful, enduring art that's actually worth grappling with. To be sure, it's essentially an elaborate "get off my lawn" demand. But it is an argument that a great many people likely find compelling, and it is at least responding to actual issues, so it's worth unpacking how wrong it really is.
Murray begins by saying there is a dearth of great art today, meaning the next generation that will create tomorrow's art has nothing to draw on.
The insight that great accomplishment begets more great accomplishment goes back two thousand years to a Roman, Velleius Paterculus, who first analyzed the clustering of genius in Athens and concluded that “genius is fostered by emulation.” In the modern era, that insight has been confirmed in rigorous quantitative studies, and it is one of those social science findings that shouldn’t surprise anyone. If children who have the potential for creating great art are watching a Leonardo da Vinci set the standard, they are more likely to create art like Michelangelo, Dürer, or Raphael did. This is relevant for thinking about the future of American accomplishment in the arts because, as far as I can see, we do not have any great models in the current generation who will produce greatness in the next generation.

Murray lists the exhaustion of forms ("What’s the point of writing a great symphony in the classical style (from the ambitious composer’s point of view), when we already have so many of them?"), and the persistent obscurity of abstract, nonrepresentational, and atonal work among the general public among reasons for concern. Hamlet can only be written once; with hundreds of years of low-hanging artistic and literary fruit having been plucked, artists have to push into ever more esoteric forms in order to break new ground.
The key to innovation, thinks Murray, is technology, which opens up new forms of expression. This is true, though it's largely premised on the idea that nothing of value is being created in the old fields, which the creators and audiences would vehemently dispute. Notably, Murray makes something of an exception for film:
The richest new organizing structure of the twentieth century was the motion picture. It is also the only organizing structure that does not show signs of being filled up. A plausible case can be made that the film industry is still making products that rank somewhere among the all-time best, and there is reason to hope that even better are yet to come.

I suspect this has less to do with film's superiority as an artistic medium or even its relative youth, than with its ease of transmission. Movies require a relatively short investment of time, are promoted across TV and the internet by multi-million dollar ad campaigns, and are easily reproduced and distributed so that they can leave an enormous cultural footprint. The most brilliant work of art will never be canonized if there isn't a mass awareness of it first. Theatre and the fine arts struggle with this, as do books, a non-visual medium in an image-saturated media environment (book trailers are an illustration of this dilemma).
More importantly, technology has lowered the barriers to entry in many creative fields, and paradoxically made it far more difficult for any single person or work to tap into the multiplicity of zeitgeists. This is to say nothing of the replacement of high culture's previous trickle-down significance with popular culture--fifty years on, I think it's safe to say that posterity remembers the Beatles more than it does, say, Philip Glass.
All of this, it ought to be said, is enabled by the moral promiscuity of post-industrial capitalism. The entertainment industry cares for no virtue but profitability. Robert Bork glimpsed this truth when he sighed that “You almost began to want to put the [Berlin] wall back up,” because of crude American rock music's unimpeded flow into post-Soviet East Germany. Murray is not so observant as the already obtuse Bork, but as it turns out, he shares with him a similar reactionary strain.
All the discussion of form and medium is actually peripheral to Murray's bigger argument, which stands on much shakier ground. The deeper problem, he thinks, is a nihilistic mindset, characterized by a rejection of God and religion, that has gripped the intelligentsia and now the culture at large. This has led to an absence of the transcendent, of a sense of "the good," that animates great art:
Beauty is not the only transcendental good that the arts require. A coherent sense of the good is also important—perhaps not so much for great music (though I may be wrong about that), but often for great art and almost always for great literature. I do not mean that a great painting has to be beautiful in a saccharine sense or that great novels must be moral fables that could qualify for McGuffey’s Readers. Rather, a painter’s or a novelist’s conception of the meaning of a human life provides the frame within which the artist translates the varieties of human experience into art. The artistic treatment of violence offers an example. In the absence of a conception of the good, the depiction of violence is sensationalism at best—think Sam Peckinpah. When the depiction of violence is taken to extremes, it can have the same soul-corroding effect as pornography. But when it is informed by a conception of the good, the depiction of violence can have great artistic power—think Macbeth. So whereas some great works of art, music, and even literature are not informed by a conception of the good, the translation of this concept to the canvas or the written word is often what separates enduring art from entertainment. Extract its moral vision, and Goya’s The Third of May 1808 becomes a violent cartoon. Extract its moral vision, and Huckleberry Finn becomes Tom Sawyer.
To generalize my argument regarding the importance of the transcendental goods, I believe that when artists do not have coherent ideals of beauty, their work tends to be sterile; when they do not have coherent ideals of the good, their work tends to be vulgar. Without either beauty or the good, their work tends to be shallow. Artistic accomplishment that is sterile, vulgar, and shallow does not endure.

Murray singles out Peckinpah as an example of sensationalist violence that "can have the same soul-corroding effect as pornography." The Wild Bunch's vision of humanity--summed up in its opening scene of children setting a fire ant colony after a couple of scorpions during a preacher's sermon--isn't especially uplifting, but it still has considerable power forty years on. A much more accurate target would be something more tawdry and commercial and disreputable; the Saw franchise, perhaps, which certainly doesn't lack for sensationalism and aims for gross disgust rather than any deeper horror.
Moreover, the accusation of 'cultural nihilism' is just a broad-stroke evasion. The "rejection of traditional religion... among intellectual and artistic elites" didn't just happen in a vacuum. The intellectual groundwork was already laid in the 19th century by challenges from the usual suspects, Darwin, Freud, and Nietzsche. Since the 1990s, the period with which Murray is most concerned, the moral authority of the traditional American religion has collapsed, both with the Catholic Church's complicity and conspiracy in child rape and in conservative Christianity's archaic views on sex, birth control, and gays. To call this nihilism is to take the moral rectitude of Christianity and its institutions for granted, regardless of their real-world effects and the changes in attitudes surrounding them. Merely wishing a return to the older, more comforting zeitgeist is not going to bring it back, nor is it even necessarily desirable.
This is not to say that the current order is without its problems. In the arts, the absence of a guiding moral sense can lead to solipsism and a self-impressed cleverness (Damien Hirst and the Young British Artists are for me some of the worst offenders). Yet it isn't like no one else was aware of this. Most visibly, David Foster Wallace grappled with the perils of postmodern irony and its tendency toward hall-of-mirrors vacuity:
For me, the last few years of the postmodern era have seemed a bit like the way you feel when you’re in high school and your parents go on a trip, and you throw a party. You get all your friends over and throw this wild disgusting fabulous party. For a while it’s great, free and freeing, parental authority gone and overthrown, a cat’s-away-let’s-play Dionysian revel. But then time passes and the party gets louder and louder, and you run out of drugs, and nobody’s got any money for more drugs, and things get broken and spilled, and there’s a cigarette burn on the couch, and you’re the host and it’s your house too, and you gradually start wishing your parents would come back and restore some fucking order in your house. It’s not a perfect analogy, but the sense I get of my generation of writers and intellectuals or whatever is that it’s 3:00 A.M. and the couch has several burn-holes and somebody’s thrown up in the umbrella stand and we’re wishing the revel would end. The postmodern founders’ patricidal work was great, but patricide produces orphans, and no amount of revelry can make up for the fact that writers my age have been literary orphans throughout our formative years. We’re kind of wishing some parents would come back. And of course we’re uneasy about the fact that we wish they’d come back–I mean, what’s wrong with us? Are we total pussies? Is there something about authority and limits we actually need? And then the uneasiest feeling of all, as we start gradually to realize that parents in fact aren’t ever coming back–which means “we’re” going to have to be the parents.

Wallace, though, was a part of the group under examination and had a stake in seeing the art redeemed, unlike Murray, who uses the state of the arts as a Trojan Horse for attacking the current liberal social order.
Toward the end of the essay Murray gets to his real point, which is (what else?) to criticize the welfare state. Essentially, ours is an aging society, such that in 50 years the old will outnumber the young. Part of the reason for this is extended life expectancy, which Murray believes robs people of their sense of urgency in life and their motivation to make great art:
In a world where people of all ages die often and unexpectedly, there’s a palpable urgency to getting on with whatever you’re going to do with your life. If you don’t leave your mark now, you may never get the chance. If you live in a world where you’re sure you’re going to live until at least eighty, do you have the same compulsion to leave your mark now? Or do you figure that there’s still plenty of time left, and you’ll get to it pretty soon? To what extent does enjoying life—since you can be sure there’s going to be so much to enjoy—start to take precedence over maniacal efforts to leave a mark?

Naturally, this mindset finds its fullest expression in the conservative bogeyman of Europe:
I believe this self-absorption in whiling away life as pleasantly as possible explains why Europe has become a continent that no longer celebrates greatness. When I have spoken in Europe about the unparalleled explosion of European art and science from 1400 to 1900, the reaction of the audiences has invariably been embarrassment. Post-colonial guilt explains some of this reaction—Europeans seem obsessed with seeing the West as a force for evil in the world. But I suggest that another psychological dynamic is at work. When life has become a matter of passing away the time, being reminded of the greatness of your forebears is irritating and threatening.

Murray looks at European history and sees only its intellectual achievements. These are vast, no one would argue otherwise. But it is simply willful blindness to not see that as enlightened as these advances were, they stand amid a backdrop of serfdom, high infant mortality, and any number of scourges and follies that the modern liberal project has devoted itself to minimizing if not outright eradicating. Goya and his audience could afford to be great--the peasantry, not so much.
Which is all to point out that Murray is complaining about standards of living being higher than they ever were.
It's of a piece with the poisonous reactionary nostalgia that has gripped the right of late--whether in the form of fundamentalist Christianity, neo-conservative war hawkishness, or the Tea Party, whose shrieks about big government (at least the government that isn't benefiting the "deserving" them) are the closest match for Murray's gripe. The urge to turn back the clock, whether with sexual mores, military glory, or anti-government individualism, comes from a blinkered view of history and an inability to cope with the world as it is today. To the extent that postmodernism has its discontents, a retreat into the past is no solution at all but intellectual surrender.
It may well be that under the welfare state, and the consumer capitalist system that exists alongside it, life accomplishment is viewed with less urgency than it was in more trying times (though I seriously doubt it--ask any creative person if she doesn't feel a "'this-is-what-I-was-put-on-earth-to-do' motivation to create great work," and while you're at it, ask a poor person if, living "In a world where people of all ages die often and unexpectedly," he feels inspired to live out his full potential). But even if we were to grant all this, it would not make Charles Murray's complaints any more valid. Art is inextricably tied to the circumstances surrounding its creation. It's circular to say that 19th century art could have only been produced in the 19th century, but there you go. Really, I'm eager to revisit this argument forty years down the line, if only because at that point I may be filling Murray's role; after all, today's liberal is tomorrow's conservative.