METAMERICANA: TOO MANY COOKS Is a Political Statement Worth Hearing

The argument for the recent viral short Too Many Cooks
being a postmodern parody is easy to make—too easy, in fact. Sure, on the face
of it, Casper Kelly’s eleven-minute video for Cartoon Network’s “Adult Swim”
viewing block is a deconstruction of the opening credits often found on
cheesy 80s sitcoms, police procedurals, and sci-fi knock-offs. And yes,
the fact that the running conceit in the video is the power that language has
over us (the actors’ names, which appear beneath them in the usual way of all opening
credits, ultimately become a terrorizing force more human than the humans
they’re attached to) does tend to support the claim that the
postmodernist principle that we are all constructed by and in language is in play. But Too Many Cooks is mixing together too many opposite inclinations, effects,
and plot structures to be adequately described as “postmodern.” Instead, it
seems to intend, as so many Adult Swim videos do, to be inscrutable rather than
analytical, contradictory rather than instructive, and simultaneously
deconstructive and constructive rather than merely deconstructive.
 
For
all its fragmentation—the video moves rapidly between
television subgenres, even as it endlessly recycles the same theme song
(with
slightly different lyrics each time)—Too Many Cooks has a story to
tell that’s surprisingly conventional. First, there’s a villain: a
mysterious,
cannibalistic killer who’s introduced early on, whose name isn’t known,
whose
motives beyond bloodlust are inscrutable, who’s frightening in
appearance, whose
early victims are caught unawares, who understands his local environment
much
better than any of the good guys, and who towards the end of his
homicidal spree faces a “final girl” (an attractive young female more
canny than all the victims preceding her).
Sound familiar? It should, as it’s every horror movie ever made, other
than
meta-commentaries like Scream or Joss
Whedon’s The Cabin in the Woods. Too Many Cooks even features hapless law
enforcement, as several police officers fail to notice the killer even when
he’s literally right under their noses.
 
Just as it has a fairly conventional villain, Too Many
Cooks
has a hero whose placement is conventional even if certain of his
descriptive particulars are not. Smarf the Cat, described by The New Yorker
as the product of “Alf mating with a cat rather than eating
one,” is
introduced early on in a way that makes him endearing. Smarf has special
gifts that
others don’t immediately see (e.g., he can shoot rainbows from his hands
and
lasers from his eyes), has an apprehension of danger that exceeds that
of law
enforcement and all the other good guys, and in the end kills the
villain but is
gravely injured himself. Smarf’s role in Too Many Cooks is undergirded
by so human an inclination that it belies the fact that he’s the only
non-human in the video: he’s trying to put everything back to normal.
“Back to normal,” in
the terms of the world of Too Many Cooks, means finally ending the
opening-credits loop all the characters in the video are caught up in;
Smarf, though grievously wounded, does
this by pressing a giant red button, after which he appears to die.
But—surprise!—he doesn’t die. In fact he’s fine, though the
cliff-hanging ending of Too Many Cooks suggests that Smarf’s still
caught up in the cycle of danger we’d assumed he’d escaped. All of which
should surprise no one,
as it’s exactly how the hero of a conventional horror film is dealt
with.
 
So why are so many commentators in major media (including not just The New Yorker, but also The Daily Beast and others) referring to "Too Many Cooks" as postmodern, or using terms common to postmodernist literary theory
(like “parody”) to explain the operations of Kelly’s intricately networked art-house flick? The
answer seems to be that “postmodern” is the term we use habitually, even
instinctively, for things we don’t understand and don’t really care to. Too
Many Cooks
is blindingly fast in its transformations, and
repeatedly obscure in its deconstructions of iconic images and ideas, so it
must be “postmodern” in some way—that is, beyond our understanding.
 
In fact, the new avant-garde in the arts, and particularly
in the visual arts, very much wants to
be understood. It wants you to be able to follow with little difficulty what
you’re seeing, even as the effect it has on you pushes you simultaneously
toward several internally contradictory extremes. Too Many Cooks is at once funny and
horrifying, mesmerizing and cloying, exhilarating and depressing, filled with
obvious references to popular culture and entirely indifferent to whether you
can catch even a fraction of them. If it seems in a sense ironic—as it clearly
does take a dim view of the formal constraints that typified 80s
television programs—it’s also earnest enough to want to give you everything you
expect from a fantasy: a villain, a hero, a plot, some tragedies, some
emotional manipulations, and a resolution that both satisfies and keeps you
guessing about what could come next.
 
“Classic” postmodern art emphasizes that meaning falls apart
at every critical juncture, and therefore usually requires specialized academic
training to fully interpret and appreciate. If and when it seeks a popular
audience, it does so to shock, distress, or otherwise disgust its viewers; even
Andy Warhol’s paintings, while easy enough to “get” on a first look, were
intended to provoke anxious debate over what is and is not art. Too Many
Cooks
is a different breed of artwork entirely because it requires little debate
regarding its central premise but still provokes significant emotional anxiety among
its viewership. If postmodern literature usually sends us running to our scholars for assistance, Too
Many Cooks
is much more likely to have you singing its theme song in
the shower. We’ve moved from a time when avant-garde art wanted to
unsettle our
minds to a time when it wants to unsettle our nerves and give us
immediate pleasure simultaneously. What’s at stake in this
movement from the postmodern paradigm to what’s lately being called
“metamodernism”? It’s a good question, and by now there’s enough visual
art like Too Many Cooks out there that we do well to consider the
omnipresence of contemporary art that ostentatiously combines opposing ideas in a way most of us can’t
readily process.
 
Metamodern art like Too Many Cooks is trying to
make an end run around those institutions we once turned to for communal
sense-making: mass media, the academy, and non-academic "experts" within
their subfields.
When Too Many Cooks was released, everyone began forwarding it to
everyone
else via social media and email, whether or not anyone doing the
forwarding had
yet processed their emotional reactions to the video. The currency of Too Many
Cooks
became attention itself, not understanding, and the power to pass
on that currency resided in any person with access to the Internet, not
just those specifically empowered with cultural capital (for instance,
via
higher education) to tell everyone else what’s worth sharing and viewing
and
what isn’t. If we live in a time of great cynicism about media,
academic, and
of course political institutions, art that’s designed to virally
infect all of us with emotions we can’t process is subversive by definition.
 
Consider the way Too Many Cooks moved through the culture:
it at once became a hot topic on the websites of The New Yorker, New York Magazine,
and CNN,
even as it was still burning its way through every discussion board on
countercultural hotbeds Reddit and 4chan. The disconnect between those
two
audiences—one attracted to High Art, the other, broadly speaking, to
Low—was so
great that Reddit and 4chan users were heard loudly complaining that
their
enjoyment of Too Many Cooks was being co-opted by those whose values
and tastes they didn’t and couldn’t share.
In other words, Too Many Cooks was destroying class distinctions by
appealing
to basic human emotions all of us contend with, regardless of income,
education, or
institutional affiliation. To call Too Many Cooks mere parody when it
speaks so directly to and about longstanding story structures and
psychosocial conventions is to unfairly cast it as deliberately obscure. It’s
a strange thing: we
live in an age in which we treat as obscure that which is simple in
order to avoid
seeing that it’s our simplicities that unite us, and that we all
struggle daily to resolve contradictory ideas and emotions. Too Many
Cooks
may suggest a worldview troubled by the overload of information
we all experience in the Internet Age,
but it’s also trying to remind us that, for now at least, we’re all in
the same kitchen
and eating the same food.

Seth Abramson is the author of five poetry collections, including two, Metamericana and DATA,
forthcoming in 2015 and 2016. Currently a doctoral candidate at
University of Wisconsin-Madison, he is also Series Co-Editor for
Best American Experimental Writing, whose next edition will be published by Wesleyan University Press in 2015.

ARIELLE BERNSTEIN: Girl Found: GONE GIRL’s Boring Masochism

Before I saw Gone
Girl,
I had seen enough plot spoilers to know that Amy Dunne was the icy
villain, a femme fatale who devours male victims like a praying mantis. I
expected rage; what I didn’t expect was her willingness to hurt herself. Amy’s
aggressive behavior and her ability to manipulate the system hinge on how she
cuts, bleeds, tears at and otherwise desecrates her own body. 

I know, I know. Feminist champions of Gone Girl claim that Amy’s ability to play with the cookie-cutter roles
that women are cast in is somehow triumphant, but Amy’s self-inflicted wounds,
coupled with her meticulously constructed calendar, complete with yellow sticky
notes questioning whether now would be a good time to kill herself, struck me
as boring, rather than subversive. While male villains like Batman’s The Joker and American Psycho’s Patrick Bateman thrill
us as they play the role of sadists, female villains, even at their most evil
and vindictive, are still relegated to the role of masochists.

Just as horror films love to torture their female victims,
feminist films and literature are often obsessed with female debasement. We
watch brilliant 19th century women slowly deteriorate into insanity
in stories like “The Yellow Wallpaper.” We lament the smart, talented young
women who try to off themselves in Girl,
Interrupted. We watch Dove ads where rows of normal-looking women shed
tears when talking about the pressure to have poreless skin and gaps between
their thighs. From Beyoncé’s “Pretty Hurts” to the return to Twin Peaks and its obsession with the
tragic death of the young and beautiful Laura Palmer, what defines femininity
today is pain. The recently released animated short “Sidewalk,” by
Celia Bullwinkel, shows a girl’s journey to womanhood and old age, during which
she is always uncomfortable in her skin. She endures stares and whistles from
men as she enters puberty, the discomfort of pregnancy, the pressures placed on
older women’s bodies and, finally, the invisibility of old age. “Sidewalk” is
touted as a journey to “self-love,” but when the protagonist reaches old age
and helps a young girl walk along the same sidewalk, the mood is one of
resignation, rather than joy, the path to womanhood still presented as an
obstacle, rather than a pleasure.

This downtrodden story of what it means to be a woman is
just as limited a view of the female experience as the more cheerful,
empty-headed views of womanhood portrayed in such musical numbers as “I Enjoy
Being a Girl” from the 1958 musical Flower
Drum Song
and “How Lovely to Be a Woman” from the 1963 musical Bye Bye Birdie. Both songs feature a
young, beautiful woman enjoying her sexy new curves and newfound attention from
men. Certainly these songs, along with 80s and 90s jams like Cyndi Lauper’s
“Girls Just Want to Have Fun” and Shania Twain’s “Man! I Feel Like a Woman!” aren’t
particularly deep or challenging of gender norms, but at least their view of
the female experience is upbeat.

In contrast, our modern-day obsession with female suffering
is as much a throwback to earlier tropes as it is a kind of pushback against a
consumer culture that claims that by purchasing the right product women can be
happy and free. Amy Dunne’s desire to disappear certainly fits this model. In her
now famous “Cool Girl” speech, she describes the social pressure on women to
fit into a man’s fantasy, at once inhabiting and also casting off the “Cool Girl”
persona in the process.

Perhaps Amy’s “Cool Girl” theory would have been more
meaningful to me had I thought that Amy was truly making a feminist manifesto
and wasn’t just angry that her husband was having an affair with a “younger,
bouncier Cool Girl.”  Throughout the
film, Amy is not only vicious to her philandering husband and other men whom she
tortures using her feminine wiles; she is equally hostile to women,
speaking ill of the “stupid” neighbor she tries to quickly befriend, and
throwing venomous barbs at the large-breasted student her husband is having an
affair with. Amy’s self-involved, beautiful, blonde, white, trust-fund brand of
feminism just sounds tone-deaf to me in a world where women of all colors,
creeds and classes are claiming the feminist mantle in the name of justice
rather than as a plea to “have it all.” Amy’s self-victimization presents feminism
as its worst possible caricature: one of spoiled rage and privilege, rather
than a very real call for women’s stories to be told and women’s voices to be
heard.

In this way, Gone
Girl’s
heroine is not reclaiming her identity when she stages her escape;
she’s just another in a long line of self-destructive women, obsessed with
finding ways to disappear completely.

Arielle Bernstein is
a writer living in Washington, DC. She teaches writing at American
University and also freelances. Her work has been published in
The
Millions, The Rumpus, St. Petersburg Review and The Ilanot Review. She
has been listed four times as a finalist in
Glimmer Train short story
contests
. She is currently writing her first book.

KICKING TELEVISION: Re-imagining the Sitcom


The sitcom is dead. Though we’re
continually told we’re living in the New Golden Age of Television, a quick
survey of the situational comedy landscape suggests that this is not the case.
Since The Sopranos gave television
permission to tell stories in more cinematic and innovative ways, we have been
blessed with unparalleled artistry and achievement on its dramatic side. Breaking Bad, Lost, Mad Men, True Detective, The Walking Dead, Friday
Night Lights
and their brethren have treated audiences to heretofore-unseen
storytelling and production on the small screen. And yet on the comedy side,
we’re left with The Big Bang Theory,
capable if uninspiring television that is forgotten moments after the credits
roll.

It wasn’t that long ago that the
sitcom ruled the airwaves. In the ‘90s, Seinfeld
and Friends were not just the most
watched shows on TV—they were part of the cultural zeitgeist. Before that, Cheers and Roseanne reveled in blue-collar settings with grace and humour.
Their predecessors, like Maude and All in the Family, contributed to the
greater discourse, addressing societal change and issues beyond what TV had
discussed previously. The sitcom wasn’t just entertainment time-filler. It was
art.

And then came Chuck Lorre.

I’m certainly not blaming the
creator of Two and a Half Men and The Big Bang Theory for the demise of
the medium, but rather pointing to these productions as indicators of the
critical flaws in the sitcom. These shows lack ambition. The writing is
borrowed from episodes we’ve seen ad infinitum. The characters are stock. The
format is flat. Consider the new sitcoms cancelled already this fall season: Bad Judge, A to Z, Manhattan Love Story,
and Selfie. There was nothing
memorable or exciting about them. There was nothing we hadn’t seen before.
Bland versions of those same four shows have been rolled out each season,
pillaged from the pile of pilot season dreck. And even more bland versions will
be rolled out midseason.

There is some hope. In the
instances in the post-Seinfeld TV-scape
where the industry has been ambitious, there has been success. The Office in its first few seasons was as funny and clever as
anything that has ever fit beneath the sitcom umbrella. Arrested Development was punished for its ingenuity, a victim of
poor scheduling and a network that failed to see its burgeoning cult status. It’s Always Sunny in Philadelphia was Seinfeld on crack, before it became It’s Always Sunny in Philadelphia on
crack and lost its way. Party Down
was imaginative and inventive, and yet its location on the upstart Starz network
and its micro-budget couldn’t sustain its momentum or its cast. Louie is more original than most, but it
limits itself and fails to step too far beyond the confines of the genre,
despite the overwhelming sensation that it wants to. Community was the one great hope: a show that satirized the genre
and defied its tropes. But NBC did its best to kill it, and now it’s left with
a fraction of its original cast in the unknown wasteland of Yahoo TV, whatever
that is. But what we’re left with, what the industry trumpets as successful, is
Modern Family, a fading Parks and Rec, a middling Mindy Project, and a sea of forgettable
offerings that don’t resonate with the audience and don’t challenge the medium.

(You’re the Worst, as I
have previously written
, is absolute genius and exempt from this tirade.)

And that’s it, other than a few
episodes here and there and a cancelled-too-early show that had promise that
we’ll never see realized. Is Modern
Family
really the best sitcom the industry can offer, as the Emmy voters
would contend, or is it simply the least incompetent? It’s overly celebrated
in a manner that proves my thesis: It is the best of a genre that doesn’t try;
it is inoffensive and forgettable. In reality, it’s a milquetoast offering that
offends no one and takes up twenty-two minutes of twelve million people’s
Wednesday night. It is not appointment viewing. It is not Must See TV. Quite
simply, it’s all that’s on.

So is the sitcom really dead, or
is it just on life support, in desperate need of a shot of adrenaline or
whiskey or Wes Anderson?

Writing, in any of its
incarnations, is simply about telling a story. At its best, it’s telling
stories in ways that are interesting. I don’t know if the Vassar MFA grads who
currently make up 80% of the sitcom writer pool are afraid to be progressive or
are just cursed with moderate talents, but it’s time the industry looked past a
writers’ room that couldn’t get an honest guffaw without a bag of shrooms and a
laugh track.

While television dramas have
mined external resources for auteurship, the sitcom has stayed with the
tried-and-tired formula of an unambitious rotation of series creators with
pilots directed by James Burrows. David Fincher (House of Cards), Frank Darabont (The Walking Dead), and Martin Scorsese (Boardwalk Empire) are just a few of the prominent filmmakers who
have made successful forays into serial storytelling on the small screen during
the unprecedented rise of the drama in the past decade or so. Nic Pizzolatto
was a celebrated novelist, a finalist for the Edgar and National Magazine awards,
an honourable mention for the Pushcart Prize, and the winner of the Prix du Premier Roman étranger, as well
as a creative writing professor, before True
Detective
took him out of the classroom and the Barnes &amp; Noble discount
bin.

And yet, in the sitcom world,
we’re still saddled with shows “from the creators of Suburgatory and According to
Jim
.” In an industry that loves to attempt to Xerox success, why has the
comedy side of television refused to learn from its drama cousins? Would we not
be interested to see what inventive and progressive comedic filmmakers could
do with a television comedy? This trend may be slowly beginning, with TV
projects forthcoming from Mark and Jay Duplass (Togetherness) and Jason Reitman (Casual). Wouldn’t you love to see what Anderson could do with the
medium? Nicholas Stoller? Lorene Scafaria? Jonathan Dayton and Valerie Faris?

And why be so shortsighted as to
stay within the Hollywood bubble? Did HBO’s moderate success with Bored to Death, from the celebrated
novelist Jonathan Ames, not prove that literary quality has a place on
television? I’d love to see a sitcom born of the mind of George Saunders, or Jennifer
Egan, or Irvine Welsh, or Chuck Klosterman, or Sam Lipsyte, or Sloane Crosley,
or Elna Baker, or… the list borders on infinite. At least their adaptation of Pygmalion (ahem, Selfie) would come from people who had actually read the play.

The limits of the contemporary
sitcom are not confined to its writing. The entire production has become stale. Let’s
put up the fourth wall once and for all, and be done with the live studio
audience, shall we? I suppose the multi-camera sitcom was meant to be the television
version of a play, but the genre has become tired. What was the last
multi-camera sitcom to be interesting or innovative? (And if your answer in any
way suggests a Chuck Lorre production, your punishment is to watch Mom and only Mom for eternity.) The last multi-cam sitcom of any significant cultural
value was likely Seinfeld, and it
went off the air in 1998. Since then, every September and February, networks
march out a slew of carbon copy multi-camera endeavours that are rarely funny,
never innovative, and suffer tremendously at the will of their tropes.

And the laugh track? How in the
name of the Charles brothers does the laugh track still exist? I think an
audience knows when to laugh without 240 tourists on the Warner Brothers lot
telling us for twenty-two minutes.

Twenty-two excruciating minutes.

Does anyone know why the sitcom
is only a half-hour (with commercials)? Why is comedy limited and tragedy
open-ended? Would you rather laugh for an hour or cry for an hour? And from a
purely budgetary standpoint, why do Mark Harmon and Jon Cryer make the same
amount of money per episode for the same mediocre and unimaginative drivel? If
comedic and dramatic films can be of similar length, who is to say that the
same can’t be done on television?

Beyond the temporal structure of
the sitcom, its aesthetic structure is in need of contemporization and
ambition. The industry has limited the genre to two options: multi-camera and
single camera. The worlds of sitcoms are confined, insulated. They exist on
three to five sets. They are painted in the same colours, shot with the same
filters, and staged as they were three generations ago.

The incredible six-minute-long
take from the episode "Who Goes There" of True Detective is an example of what the talents of an innovative
director like Cary Fukunaga can bring to the medium. Why can’t sitcoms be
visually inventive? We have seen glimpses of such inventiveness in Pushing Daisies and to a certain extent
in the aesthetic of Community, but
its absence elsewhere on television is untenable. Why the reluctance to push
boundaries and challenge formula the way dramatic television has?

The answer to most of these
questions is that the television industry is remarkably stubborn and unimaginative,
for a business that requires creative minds. But the ability of dramatic
television to evolve in the past decade suggests that comedic television could
do the same, if just given the chance. Cable and streaming television have
reinvigorated an industry once limited by the whims of the four major networks.
The exodus of talent from film to TV has proved that the small screen is not
limiting to artistic or material aspirations among the Hollywood elite.
Removing the antiquated reins from the sitcom would certainly produce a
defining new era of the medium, and no doubt reduce the number of half-hours of
our lives ruled by Chuck Lorre.

Mike Spry is a writer, editor, and columnist who has written for The
Toronto Star, Maisonneuve, and The Smoking Jacket, among
others, and contributes to MTV’s
 PLAY
with AJ
. He is the author of the poetry collection JACK (Snare
Books, 2008) and
Bourbon & Eventide (Invisible Publishing, 2014), the short story collection Distillery Songs (Insomniac Press,
2011), and the co-author of
Cheap Throat: The Diary of a Locked-Out
Hockey Player
(Found Press,
2013).
Follow him on Twitter @mdspry.

METAMERICANA: Christopher Nolan’s INTERSTELLAR Offers Us a New Theory of Everything

Scientists have recently claimed that a possible
“theory of everything,” an escape from our dreary four-dimensional reality, resides
in “M-theory,” an eleven-dimensional unification of all extant superstring
equations. As crazy as a mathematical maxim that resides in the eleventh
dimension may sound, M-theory is endorsed by renowned genius Stephen Hawking
and others of his ilk as a sort of universal codebreaker—what the alchemists of
old would have called the Philosopher’s Stone, and what religious people in all
periods have loosely thought of as God. If we presently feel bounded by our
limited understanding of the universe, M-theory would obliterate that sense of
imprisonment.

Simultaneously, poets have striven for a similar
escape, only through words. However, they have not kept pace with their
opposite numbers in the sciences. This is in part because they’ve come to
believe themselves mathematicians’ competitors. For the last forty years, the
most innovative Western poetry has been so layered and nuanced that it has
written itself out of all sociocultural coherence. Not only is it no longer a
counterweight to the intricacies of science, it no longer speaks to the great mass
of persons now living. The belief that innovative poetries must be every bit as
theorized and conceptually indecipherable as M-theory is to most of us has
guaranteed poetry a marginalized place in our collective consciousness, if that.

Christopher Nolan’s new film Interstellar, which addresses both science and poetry in
implicit and explicit ways, offers us a possible “theory of everything”—one
in which the simple beauties of art are conjoined with the complex mathematics
of science in a middle space between the two, with that middle space
corresponding to the pathway from our collective reality so many of us have
been seeking for so long.

That scientists have always looked to the stars
(literally) and higher dimensions (figuratively) for the key to unlocking all
we can’t access is no surprise; the notion that poets have been engaged in the
same task from the very beginning of art is perhaps a more controversial
submission. Don’t the best poets find timeless ways to drill down on individual
words and phrases and ideas, rather than creating and testing out entirely new
realities through new forms of speech? A cynic might say so, but French critic
and theoretician Jacques Derrida thought otherwise: he imagined that speech and
the written word could transcend spacetime. Derrida suggested that language can
outlive both its author and its intended recipient, providing
a pathway to unanchoring language from its moorings in time and space. The
much-vaunted “death of the author” Derrida’s (and French theorist Roland
Barthes’) work eventually heralded in Western literature was intended as a
freeing of language, not its imprisonment. So those who study and perform the
capacities and incapacities of language have always, in their own way, been
reaching for the stars—even if the way they’ve gone about it of late is to
surround their work with such a volume of theory and abstraction that it looks
and sounds to most like quantum physics.

“Love is the only thing we can observe that transcends
space and time,” says astronaut Amelia Brand (Anne Hathaway) to Cooper
(Matthew McConaughey) in Interstellar, and as cornball as that sentiment
sounds out of context, it happens to be true. Though “love” is a term that
should by all rights require the presence of two entities—an author and an
intended recipient who are both necessary if interchangeable—in fact love often
survives the separation of entities by space and time. We continue to love
those who’ve left us, whether they’ve left us figuratively (by emotional
detachment), geographically (by distance in space), or literally and finally
(by dint of death). So maybe Interstellar is on to something. The film’s
suggestion that just as quantum physics now resides in the fifth and higher
dimensions, so too must the simple emotions both art and life invoke in us, is
less a play on our heartstrings than an actionable suggestion for living.
Perhaps art and science were intended to take dramatically different paths
toward the same conclusion, not so much because each can independently come up
with a satisfactory answer to the problem of everything but because the two
jointly just might. If many of this decade’s newest forms of innovative art
find ways to juxtapose polar opposites like sincerity and irony or cynicism and
optimism, perhaps they ought to add to those generatively contending forces art
and science. Perhaps art must be as different from science as it can possibly
be—while maintaining a common purpose—in order for it to fulfill its implicit
promise to the species.

For much of its lengthy run-time, Interstellar
is a slow and quiet movie, but once it picks up it amps up its melodrama. The
film’s elegantly simple visuals are finally matched by equally simple
sentiments that run the risk of mawkishness. Yet somehow the film always stays
on the right side of that line. Perhaps that’s because watching four astronauts
seek habitable planets in order to save the species—a species, in the
near-future world of Interstellar, starving from food shortages and
choking on unpredictable dustclouds—is not, actually, something we can detach
ourselves from sufficiently to smother it with our cynicism and irony. So the
film’s final solution to the problem of getting astronauts decades out into
space and then having them send helpful messages back to Earth—the idea that
love is to art what gravity is to science, i.e. transdimensional—seems less
like treacly wisdom and more like something today’s creative avant-garde would
do well to consider.

In the realm of the scientific, increasing degrees
of complexity are welcome so long as they’re intellectually solvent; in the
realm of art, perhaps increasing degrees of simplicity should be welcome as
long as they’re spiritually mimetic—that is, as long as they trace human
experience as faithfully as the tenets of physics do. The late great David
Foster Wallace once predicted that the next authentic literary avant-garde
wouldn’t need tenured boosters in the academy to sell it, or pedigreed authors
to write it, or a sufficiently jaded populace to read it, as in fact it would
endorse just the sort of “single-entendre principles” that already guide our
lives (however imperfectly). Though the means of their operation is frequently
hidden from us, our guiding stars as civic and creative beings are still basic
principles like courage, integrity, charity, empathy, grace, kindness, and
inquisitiveness. These are not ideas we need to shroud in the coded language of
theory to enact; in fact, as important as these ideas are to the contemporary
arts—every bit as important as unfathomably intricate equations are to quantum
physics—they require no steeping in elevated language to remain fully
operational.

The final thirty minutes of Interstellar are as
strange a cinematic experience as you’ll ever have, so strange an experience
that their logic at times seems beyond the grasp of anyone but a Hawking or the
equivalent. But in fact the emotional and creative logic of Interstellar
is every bit as simple as its science is complex. This doesn’t mean that its
emotional and creative logic is less advanced than its science; instead, it
merely reminds us that the boundaries we need to push in art are not
necessarily those of science, even as the two are collaborators (not
competitors) in the development of a theory of everything. Just as the new
science looks absolutely nothing like the old science, however much it builds
on the discoveries of mathematicians long dead, our new art will look (and
read) absolutely nothing like our old art, however much it couldn’t have been
produced without the countless generations of poets and other artists who
preceded it and who reached for transcendence and fell short. Show me a theory
of the avant-garde in art that is as plainly spoken and easy to understand as M-theory
is beyond my grasp, and I’ll show you a step forward in time our leading lights
in the arts have yet to take.

Seth Abramson is the author of five poetry collections, including two, Metamericana and DATA,
forthcoming in 2015 and 2016. Currently a doctoral candidate at
University of Wisconsin-Madison, he is also Series Co-Editor for
Best American Experimental Writing, whose next edition will be published by Wesleyan University Press in 2015.

METAMERICANA: BIRDMAN Is the IRON MAN Finale You’ve Been Waiting For


Rumor has it that Robert Downey Jr. will appear in Iron Man 4—and probably Iron Man 5—but
surely that particular Marvel Cinematic Universe franchise, however
lucrative, has to end sometime. Or does it? Does anyone doubt that Iron Man 10 would still earn its studio backers a truckload of coin? Maybe the question isn’t how many Iron Man
sequels (and, soon enough, prequels) can be made, but how long
the superhero genre Iron Man epitomizes can be the toast of Hollywood.
While it sometimes takes much longer than it should, American art genres do
evolve over time, and sooner or later American arts will evolve such that playboy
anti-heroes with mechanical or innate superpowers will get left behind.
What Michael Keaton’s Birdman makes clear is that even the end of superheroism would be insufficient to end the relentless onslaught of Iron Man
vehicles. This metamodern period in America may tire of
its superheroes, but the real question is when or whether our
superheroes will tire of America. So we can imagine, sometime in 2030,
an Iron Man 11 in which Robert Downey Jr. plays Robert Downey Jr., the former "Iron Man" of ten Hollywood films by that name. Iron Man 11
would be a superhero movie for the Age we live in, a movie in which our
collective exhaustion with spectacle would be conjoined with our
collective boredom at the absence of spectacle; in Iron Man 11
Downey would play both himself and Iron Man simultaneously, and from
minute to minute we wouldn’t know what the point of distinguishing between
the two really was.
But maybe we won’t have to wait that long.
In Birdman,
Michael Keaton—who played Batman in the 1989 and 1992 films featuring the
Caped Crusader—plays, more or less, Michael Keaton. Sure, the credits
say he’s playing "Riggan Thompson," a washed-up celebrity made famous by
playing "Birdman" in three superhero films, the most recent being
(ahem) a 1992 release, but anyone over thirty watching Birdman knows full well this is Keaton-as-Keaton—or at least
a lightly tweaked version of the Keaton we believe Michael Keaton to
be.
Say what you will, Michael Keaton’s career as an A-list actor basically ended with Batman Returns in 1992, a few roles (charitably, 1996’s Multiplicity and 1997’s Jackie Brown)
notwithstanding. So watching "Riggan Thompson" stage a self-written and
self-directed Raymond Carver adaptation on Broadway as a way of
"finally doing something honest" strikes about as close to home as it’s
supposed to. In other words, if Keaton’s fictional Thompson was the star
of Birdman, Birdman 2, and Birdman 3, this iteration of the Birdman franchise might as well be titled Birdman 4: Riggan’s Return, or Keaton’s Batman Returns Again, or, twenty years from now, Iron Man 11.
The film asks us to consider what happens when a
celebrity-cum-superhero tries to take off his Lycra jumpsuit, only to
find out that it can’t ever be taken off. Riggan is stuck as Birdman
both figuratively and literally, as playing the role has left Thompson
hearing the hectoring, hateful voice of "Birdman" in his head at all
hours of the day. He even believes himself capable of Birdman’s two
foremost powers: the power of telekinesis and the power of flight. (You
can probably see where that’s headed, though in the end Birdman surprises even on that score.)
But it’d be wrong to call Birdman
merely a "meta" superhero film, just as it would be wrong to call it—as
one might be tempted to do—a "meta" film about actors or a "metamodern"
film about how reality and fiction collide to form a higher order
experience that draws from both reality and fiction but is finally
neither. Like most films that try to capture a cultural moment in which
we’re equally attentive to, distracted by, enamored with, and
distrustful of all manmade stimuli, Birdman
doesn’t want to settle for being any one thing. Much like the Internet,
it has about forty messages it would like to deliver, and also like the
Internet, it would prefer to deliver them all at once.
The
film’s first message is that attention is power. At one point Riggan’s
daughter shows him a viral YouTube clip and says, "Believe it or not, this is power…" Birdman
submits that because attention in a fully networked world is in fact a
substantive good—it can briefly nourish the spirit of the sort of
temperamental, ego-surfing American our present Age has birthed—the
power that comes from being paid attention to is by no means an empty or
merely formal gesture. The second message Birdman
delivers is that admiration is not love, but because so many of us are
unable to make the distinction, it might as well be. A third message is
that choosing truth as an end-game isn’t the same as living truthfully. A
fourth message is that our eccentricities strengthen us in the long
term, but only by weakening us in the short term—thereby forcing a
confrontation, perhaps sooner than we’d like, with how unlivable our
eccentricities sometimes cause our lives to be. A fifth message we
encounter is that distinct artistic genres can never be confused for one
another, except when, paradoxically, they become one another—for instance, by making a film appear (as Birdman
is made to appear) to have been filmed the way a play is performed,
with a single tracking shot and in a single take. A sixth message is
that turning one’s faults into a narrative doesn’t bring one any closer to
transcending them, as all narrative is necessarily a reentrenchment of
archetypes rather than a recasting of terms. And yet another message
available to Birdman viewers is
that there’s a difference between popularity and prestige, between being
a celebrity and being an actor, between knowing how to interpret art
and knowing how to enact it.
There are several dozen more throughlines in Birdman,
all equally close to the surface of Riggan’s interactions with his
resentful attorney-cum-assistant (Zach Galifianakis); his resentful,
diva-like leading lady (Naomi Watts); his resentful, brooding, "purist"
male lead (Edward Norton); his resentful yet strangely hot-and-cold
ex-wife (Amy Ryan); and his stereotypically rebellious daughter—played
as a resentful sort of girl by Emma Stone. The point of all these
disparate messages—some internally contradictory, some merely
contradictory to one another—is that they be delivered all at once, in
an onrushing cacophony, making Birdman
at once a terminal superhero flick, a black comedy about celebrity, and
a metafiction about cross-genre acts of creative narration. It’s a
credit to its terrific ensemble cast that Birdman is a superlative example of each of these cinematic subgenres. 
It used to be the case that someone would say to you, "If you like Butch Cassidy and the Sundance Kid, you’ll love The Assassination of Jesse James by the Coward Robert Ford."
In other words, if you like a genre you like a genre, or if you like a
movie you’ll also like its tangentially related contemporary update. Now,
a moviegoer is more likely to hear, "If you’re excited about the
upcoming, fourth-wall-breaking Deadpool movie, you’ll like Birdman; also, if you like the hardcore "meta" bent of the Philip Seymour Hoffman-starring Synecdoche, New York;
also, if you admire Michael Keaton’s ineffable, decade-spanning ability
to play Michael Keaton; also…" and so on. It’s fair to say, that is,
that the films that speak most effectively into and out of America in
2014 are those that give us everything we want all at once and with no
clear direction on what to do with it all. Not coincidentally, that’s
exactly how one might feel after having just been granted a superpower; or
having been granted an elongated career in Hollywood; or having just
been made the parent of a someday-to-be resentful child; or–and perhaps
this is really the point–having just been born into our collective
four-dimensional reality as a human. When we say a film is "metamodern,"
as we must certainly say of Birdman,
we are saying that it enacts the joining of Art and Life, or artifice
and authenticity, that all of us inherit merely by virtue of being
alive—and that it performs this elegant symphony of contradictions
without offering us any interpretation or any hope of reducing our
experience to a series of helpfully labeled micro-philosophies.
In
the end, Keaton-Riggan-Birdman gets his heart’s desire, or maybe he does;
embraces the hybridity of his self-identity, or maybe he does; makes good on the
promise of his natural talent, or maybe he does. He looks, in other words, the
way all of us do from the great height of higher dimensions of space and
time: like a simultaneously perfected and imperfect philosophical
vehicle who still has to put his Lycra jumpsuit on one leg at a
time.

Seth Abramson is the author of five poetry collections, including two, Metamericana and DATA,
forthcoming in 2015 and 2016. Currently a doctoral candidate at
University of Wisconsin-Madison, he is also Series Co-Editor for
Best American Experimental Writing, whose next edition will be published by Wesleyan University Press in 2015.

ARIELLE BERNSTEIN: PARKS AND RECREATION, A Feminist Utopia

At the end of the penultimate season of Parks and Recreation, our heroine Leslie Knope gets everything—the
man, the kids, the high profile job. She even manages to move her new position right smack into the middle of her beloved hometown Pawnee.
Though the finale of the most recent season was wrapped up in a very pretty bow, it still felt genuinely satisfying, as well as genuinely
subversive. In a world where the T.V. show Girls
portrays sex and romance as empty and unsatisfying for its female leads, and
heroines in shows from Game of Thrones
to American Horror Story navigate a
landscape where sexism is rampant and men are often depicted as deeply
misogynistic, Leslie Knope’s triumphant success felt like a kind of joyful
respite and relief from a terrifying and cruel world.

One of the reasons Parks
and Recreation
has succeeded as a feminist T.V. show is not simply that
the female characters have remained funny, dynamic, ambitious, unique and
interesting, but also that the show presents male characters
who are equal parts strong, vulnerable, and silly, and who are staunch advocates for the
rights and successes of the female characters throughout the series. Male and
female characters in Parks and Recreation
actively root for one another, rather than tearing each other down. Ben and
Leslie’s marriage is a model of egalitarianism; April and Andy’s young
love is presented as a string of silly, ridiculous games and make-out parties,
with each character deeply invested in helping the other grow. Even Ron
Swanson, staunch individualist and rugged he-man, is distinguished throughout
the series by his commitment to women’s rights. By the end of the series he is
a proud dad and loving husband, all without having to give up his signature “strong
silent type” brand of masculinity. Ron’s appreciation of feminism doesn’t
diminish his hatred of vegans, or devotion to woodworking—it simply makes him a
much more interesting and funny character.

Many of the current debates about female representation
onscreen are about granting female protagonists access to male spaces. We saw
this in the 80s and 90s when there was a proliferation of women-as-warrior
motifs from Xena: Warrior Princess
to Buffy the Vampire Slayer.
Recently, a slew of articles have called for women to have access to the joys
and pitfalls of the antihero world as well, as we praise the glorious brutality
of Orange is the New Black’s character
Vee and go to theaters in droves to see the icy villainy of Gone Girl. 

Heroines today are more diverse and complex than ever
before, yet few serious dramas that feature a cast of strong female characters
showcase romantic relationships that are genuinely egalitarian, the way we see
romance unfold in Parks and Recreation.
Often, female protagonists who are strong and willful are presented as
rejecting male romantic interest. In modern Disney Princess films like Brave, the heroine often makes a big
deal about not needing a man or romantic partner. In some films, like The Hunger Games, the romantic scripts
are flipped and male romantic interests are portrayed as doting, helpful and encouraging
mates. In truth, while many
cluck their tongues
at the unhealthy dynamics presented in teen romances
like Twilight and their adult
equivalent, Fifty Shades of Grey, one
of the pleasures of both these series is the positioning of boys and men as
being the objects of desire, even if the female protagonists within these
worlds aren’t particularly interesting in and of themselves.

The suggestion that strong female characters are the sole
hallmark of feminist media may simply not be setting the bar high enough. In
order to really dismantle the patriarchy we need to see more varied
presentations of men. This is not to say that we should do away with the
douchey bros, bullies and alpha assholes that have become a mainstay in popular
media. Complex villains are fascinating, but excellent dramas like Breaking Bad, Mad Men, American Horror
Story
and The Walking Dead too
often pit men and women against each other, as if one gender’s success is
another’s loss.

The great T.V. dramas of today are about creating immersive
fantasies where we are transported to different times, places and worlds. The
adherence, then, to the narrative that men and women are consistently at odds
with one another is not about portraying a kind of gritty realism; it’s about
perpetuating the status quo and limiting our imagination about the
possibilities for a feminist future. I’d like to see a media landscape that
acknowledges the changing roles of men and women with greater nuance and
compassion, and also recognizes that there are many men today who are
incredibly happy to be living in a world where they aren’t shackled to one
particular model of male strength. Parks
and Recreation’s
greatest feminist success is not simply that the heroine
is allowed to “have it all” but that it creates a world where male and female
characters are equally one another’s allies.

Arielle Bernstein is
a writer living in Washington, DC. She teaches writing at American
University and also freelances. Her work has been published in
The
Millions, The Rumpus, St. Petersburg Review and The Ilanot Review. She
has been listed four times as a finalist in
Glimmer Train short story
contests
. She is currently writing her first book.

Children in Horror Films: The Kids Are Not Alright

Your life is going along normally, and then it happens: you
or someone you love suddenly finds that something is growing inside, a life
form that feeds.  In the morning you are
nauseous, and as it grows you shift uncomfortably through the night, struggling
to sleep.  You feel it moving inside you,
shifting, kicking, altering your moods, influencing your thoughts.  And then, finally, after nine long months, it
wants to get out.  You want this too,
desperately, but the emergence is violent, excruciating, prolonged.  After what seems like days of pain, you hear
a cry, a wail, from the just opened mouth of a being who is at once a part of
you and utterly alien.  As it grows, it
begins to do things, without warning, without discernible motive: knocking a
juice glass off the table, pulling kitty’s tail, hitting another child on the
playground.  When scolded, it cries, and
you feel guilty for being so harsh, until later, with little provocation, it
breaks into a fit of rage, screaming, kicking. 
Gradually, you become isolated, a veil drawn between you and the friends
you used to see; you sleep poorly, awakened by cries; even your spouse seems
far away, separated by the daily and nightly routine of caring for the
child who has taken over your life.

Is it any wonder there are so many horror movies about
children?  We regard children with pious
adoration, yet lurking just beneath this reverence is a sense of dread, an
awareness of how little we really know about our kids.  And for every family sitcom or melodrama
celebrating the wonders of parenting and childhood, there is a horror film that
dwells in the child’s dark shadow.

The mainstream rebirth of the horror film in the seventies
happened through a child.  Linda Blair’s
uncanny performance as the possessed twelve-year-old Regan MacNeil in The
iconic moments in the genre’s history, making audiences squirm as they watch
a loving daughter turn into a bile-spewing monster.  The transformation is so horrifying because
we first experience the parental love of Ellen Burstyn’s Chris through
touchingly candid moments of mother-daughter laughing and cuddling.  As Regan is taken through a nightmarish
battery of painful tests to discover why her personality is changing, we
experience these horrors from both mother’s and daughter’s perspective.  Yet when Regan goes entirely over to the dark
side, she becomes another being altogether, one that we have only glimpsed in
isolated moments.  Although the tale is
one of demonic possession, it works because we have all seen such isolated moments
of uncanny child behavior—talking to no one, staring into the distance,
inexplicable bursts of anger—and wondered what it meant.

Before The Exorcist
there was Rosemary’s Baby (1966),
which focused on how a child can take one’s life over even before it’s
born.  Roman Polanski’s vision is a
powerfully feminist one, as the narrative focuses on the ways in which a
woman’s body can be appropriated by men. 
John Cassavetes’s Guy Woodhouse essentially sells his wife’s womb to the
devil in exchange for a boost in his acting career, and while the supernatural
element is strong, his betrayal serves as a metaphor for all of the selfish
reasons men might have for wanting children—either for public prestige or for
want of an heir, a kind of immortality. 
As Mia Farrow’s Rosemary grows increasingly ill, however, we enter the
special hell that for some women is the experience of pregnancy.  As Rosemary is stymied at every turn in her search for personal
and professional help, the film frustrates our need, and hers, to discover
whether her fears are real or only in her head. 
Yet when she finally discovers the truth, Rosemary’s acceptance of her
child is at once touching and repulsive, and we are left with the feeling that
the mother-child bond is something unknowable, uncanny.

Larry Cohen’s masterpiece It’s Alive follows a similar arc, as Frank and Lenore Davis are
initially repulsed by, but gradually learn to love, their monstrous
progeny.  The film begins with one of the
most horrifying portrayals of childbirth ever filmed, with a delivery room strewn
with gore, and as the fanged, clawed child escapes, the body count grows.  Desperate for sustenance, he attacks a
milkman, feeding on fresh meat along with that more traditional baby food, milk. Indeed, a
stream of milk and blood flows from the delivery truck, a raw image of the fluids with which mothers have always sustained their children.  Yet somehow out of these horrors comes love. As Frank comes to understand and even embrace the creature he produced, the
film miraculously transforms into a moving meditation on the strange powers of
parental affection.

For the record, I must confess that I have never had such
feelings. My wife and I remain happily childless, and have no urge to change
that.  The topic came up when we got to
the ticket counter to see George Ratliff’s criminally underrated Joshua (2007), and the usher asked, “Um,
do you have kids?”  “No,” I replied.  “Are you thinking of ever having kids?”
“Definitely not,” said my wife, laughing. 
The usher smiled and said, “Then you’re going to love this film!”  And certainly nothing I have seen better
expresses all of the reasons one might not want to have a child.  Vera Farmiga gives a magnificent performance
as a mother who tries, but fails, to love her creepy son, Joshua.  As she nurses their second child, a girl, the
older boy’s behavior grows increasingly strange, as he asks questions about
embalming techniques and hovers around his baby sister’s crib in the dark.  Sensing his parents’ growing fear of him, he
digs out old videotapes of his childhood, and discovers that as a baby he
nearly drove his mother insane with his incessant crying and screaming.  As his behavior grows more disturbing, father
Sam Rockwell begins to unravel, and Joshua knows just how to push him over the
edge without incurring any blame.

The precocious monster theme is fairly prevalent in child
horror films, but most compelling is the apocalyptic subgenre that imagines
such children taking over the world. 
Perhaps the best example is Village
of the Damned
(1960), which manages to conjure a fully realized alternate
world of dread in merely 77 minutes. 
Everyone in the quiet English village of Midwich simultaneously falls
asleep, after which unexplained event all of the women of childbearing age
discover themselves to be pregnant. 
Later, they all give birth on the same day, to children with golden eyes
and pale blonde hair, somewhere between alien humanoids and Nazi youth.  Their uncanny mental powers place them in the
realm of science fiction, yet the fears they evoke—of our growing obsolescence
and eventual replacement by a new generation, better adapted to a changing
world—are very real.

More subtle and ultimately more troubling is the
slow-creeping apocalypse imagined in the Spanish horror film Who Can Kill a Child? (1976), in which a
young English couple on vacation make the mistake of visiting an island where
the children have violently seized power from the adults.  Director Narciso Ibáñez Serrador cunningly
opens the film with a montage of black and white photographs of child-victims
of war, juxtaposed with data recording the number of children fallen victim to
the world’s major conflicts.  Although we
share the English couple’s horrified point of view as they struggle to survive
against the malicious onslaught of a new breed of children, we have also been
shown how little right we have to the sovereign power of adulthood.  The film lays bare the naked self-interest
and condescension that lie beneath our sentimental reverence of childhood and
self-aggrandizement of parenthood, as we discover that the real answer to the
question asked by the film’s title is: adults.

Jed Mayer is an Associate Professor of English at the State University of New York, New Paltz.

Why Alex Ross Perry’s LISTEN UP PHILIP Is the Kindest Movie You’ll See All Year

Alex Ross Perry’s LISTEN UP PHILIP, besides featuring Jason Schwartzman’s best acting job and wrestling remarkable turns from Jonathan Pryce and Elisabeth Moss, performs an act of kindness for its viewers. This tale of an abusive, alienated, successful novelist’s spiral into loneliness lays out, in excruciating detail, the relationship between cause and effect that can govern the shape a human life takes. In showing us, painfully clearly, the results of novelist Philip Lewis Friedman’s poor behavior, both within his own life and in the reactions of those around him, Perry advocates strongly against such behavior, making his film the equivalent of watching a Biblical punishment unfold onscreen. The critical reception has focused almost entirely on Philip’s meanness, and the entertainment value therein, and not on why such a story might be told. Philip’s behavior is not, in fact, the most interesting part of the film–there is no novelty in the idea of a cruel, clever writer. That story’s been told, many times, and without such a shaky camera. There is, however, a great deal of novelty and originality in holding that cruel, clever writer accountable, at length, and in so doing, prodding at viewers’ consciences. The play’s the thing, after all.

This reviewer will confess that it is a great relief to see Schwartzman out from under the thumb of Wes Anderson’s coddling genius. So deft and believable is his performance as Philip that I hated practically every nasty word that came out of his mouth. I disliked his smarmy smile. I found his walk annoyingly strident. I was aghast that his girlfriend, played with reserve and likable cool by Moss, might find herself, for even one second, happy in his presence–unless her character was, in fact, akin to his. At some points, I hated his chin. When Philip discloses, in an intimate moment, that his parents died when he was young, and describes that as the source of "sadness," I will confess to thinking, "Cry me a river, you stupid, pathetic cliché. Are you even telling the truth?"

In any event, what of the story being told here? It’s a simple one. Philip decides, upon the release of his second book, to forgo all tours or publicity, choosing instead to go upstate and lick the boots of Ike Zimmerman, a well-established and successful novelist who is Philip’s elder spiritual doppelganger: blunt, anti-social, manipulative, in search of the perfect quip at all times, vigorously dismissive. And alienated from his daughter, who, while not exactly a charmer herself, has a few beautifully executed moments of pain at Zimmerman’s hands. In so retreating to the country, Philip lands himself an adjunct teaching position–a cruel hand to be dealt, as most holders of such positions could tell him, since such jobs are generally unglamorous, poorly paid, uninsured, and short-lived. As circumstances prove true to that last characteristic, Philip makes no friends and finds himself bounced from his position, nevertheless managing to charm a French colleague whose initial action upon meeting him was to persuade all of his colleagues to dislike him.

Throughout the film’s miserable sojourn, Philip is told off numerous times, by people from various walks of life, including a former college roommate who calls him a "Jew bastard" and a former girlfriend who responds to his request for a kiss by running away. The sad part, but the part which is the root of the film’s charity: Philip has it coming. He is arrogant towards his students in the face of open worship; he treats his agent badly (and is called an "asshole" for it); when he learns that a journalist who was supposed to interview him committed suicide, he laments that it would have been a great piece for him. These moments of cruelty have some entertainment value, but for anyone who’s known a lot of writers, they’re unremarkable, since most writers know that, from the time of James Joyce onwards, the capacity for cruelty in literary sorts is as bottomless as the River Lethe. What’s remarkable here is what happens. And what is that? Well, Philip happens. In our last sighting of him, we see him walking down a crowded street, carrying a box of his belongings, alone, bereft of his former girlfriend, who wouldn’t even open the door for him; the suggestion is that he’s walking towards more of the same. Are these his just deserts? Does he deserve to be this alone, to have all these people shouting at him, to be patronized by a writer he worships, to be shown such anger by those around him? Yes, he does. If you have to ask why, then perhaps you should watch the movie again.

American culture, it must be understood, generally congratulates selfishness. It’s not typically seen as such, this quality, but it manifests itself that way. Slavish attention to career advancement, fierce competition with others, establishment of political alliances solely for the purpose of said advancement, dismissal of people, things, and ideas lying outside of one’s world view: these actions will, typically, make one successful and content in the world at large. The better car, the better phone, the better TV set, the better shirt, the better face: these things matter. Celebrity homes, celebrity surgeries, celebrity photos, celebrity "selfies," celebrity photo leaks: these things matter as well, perhaps more than we even think. The impact on human behavior of the absorption of these values is insidious. Talking becomes less important; a phone call becomes a rarer and rarer thing; and a handwritten letter? Forget it. The self is all. And if, one day, there’s a shooting in a mall, or a school, we cry mental illness, when in fact what we mean is national illness.

It’s doubtful that Perry, in telling this story–and an old-fashioned story it is, with plenty of contrasting motivations, an antagonist, a protagonist, a climax, and a resolution (though perhaps antagonist and protagonist have switched costumes here)–intended it to be a fable, with a clear moral. It’s a character portrait, after all, an experiment, in effect, to see what happens if, instead of ignoring callousness and accepting it, we hold it up to a "hard Sophoclean light." The experiment, as conducted, performs a valuable service, providing a cutaway, of sorts, into a human psyche in the process of decay, or hardening; the cutaway is explicit, and gory, and eye-opening about the potential rebound effects of cruelty. It could be said that such a cutaway speaks out strongly in favor of kindness, of the opposite of Philip’s behavior. Beyond this, though, in the manner of all good experiments, Listen Up Philip points a way forward: towards different movies about writers, and perhaps different films about people, in which we take a good look at characters’ flaws and virtues, instead of waiting for them to sprout wings or replace their microchips. One might then hope that, as time passes, life might come to imitate art.

Max Winter is the Editor of Press Play.

A NEW COLUMN BY MIKE SPRY: KICKING TELEVISION: It’s Time to Bring Back The Muppets. Again.


When I was a kid, there were few things I enjoyed as much
as the Muppets. The worlds created by Jim Henson dominated and cultivated my
childhood. Sesame Street, Fraggle Rock, and all things Muppet were
my earliest, fondest memories of entertainment. My mother had read an article
in the late ‘70s that claimed children should be limited to no more than an
hour-and-a-half of television per day–so most of the TV my sister and I were
allowed to consume involved Henson. Despite my parents’ insistence on the
dangers of television then, there has always been a virtue to Henson’s
productions. Sesame Street taught you
about the number 7, the letter M, what it was like to live on the Upper West
Side, and unrequited love. Fraggle Rock extended
one’s imagination, taught us about issues of class, and radishes, and
unrequited love. The Muppet Show brought
us into the realm of the subversive, prepared our young minds for Saturday Night Live, reveled happily in
absurdity and slapstick, and taught, of course, the lessons of unrequited love.
The Muppet Show was the star of them
all, the crown jewel of the Henson universe. And given the current sad
landscape of programming for kids, it’s
time to play the music, it’s time to light the lights, it’s time to meet the
Muppets on The Muppet Show tonight. Again.

It’s time to reboot The Muppet Show.

For the most part I couldn’t give a flying fish about
television for kids. I don’t have kids, don’t really understand the desire to
have kids, doubt that unconditional love could be any more thrilling than clean
towels, and I think children should be unseen and unheard until they’re old
enough to watch and disseminate Breaking Bad. But my sister has two kids and
offers a wealth of opportunities for
unpaid babysitting internships, and so I’ve found myself, over the past eight
years, confronted by what passes for televised entertainment for children. And
it’s god-awful. What the hell are Wiggles? Isn’t a sponge in someone’s pants
counterintuitive? Why does Lego suddenly talk? In an infinite channel universe,
there’s nothing on (except the timeless Sesame Street) that challenges,
entertains and does not insult children, while
maintaining a subversive adult narrative and humor for Disney Channel-weary
parents and uncles.

What made, and makes, the Muppets such an enduring and
iconic part of the cultural landscape is their ability to treat children like
adults while allowing adults to be children. To a kid, “The Swedish Chef” is a
funny-looking mustachioed foreigner speaking gibberish and making a mess. It’s
hilarious. Pee-inducing. To an adult, the sketch is a perfect satire of the
cooking shows and inane cooking segments on The Today Show and its
talk-formula brethren. Also pee-inducing. “Pigs in Space”
to a child’s eyes is a bunch of talking pigs being silly, superfluous, insane.
Those of us past our adolescence recognize it as a parody of Star Trek, Lost in Space, and early sci-fi. Kids don’t care that Dr. Julius
Strangepork is a reference to Dr. Strangelove, but the allusion doesn’t
diminish their enjoyment of the sketch, and it provides safe passage for adult viewing.
The list of clever, funny, and remarkably well-written and well-crafted
sketches is endless. The intelligent and hilarious satire raised the level of
the show beyond the condescending time-filler programming that infects present
day children’s television, pandering nonsense which serves only as a virtual
babysitter, absent of form or substance.

Furthermore, The Muppet Show borrowed from variety shows of the era like SNL
by having guest stars who were
unknown to children but comforting to adults, giving them permission to watch
the show even in the absence of children. And though kids didn’t know who
Johnny Cash or Elton John or John Cleese were, the guest stars’ participation
in the program slowly introduced youngsters to a grander cultural discourse.
The contemporary equivalent of this would be celebrities lending their voices
to animated TV shows or films. But in those roles they are rarely themselves,
and are included in order to increase ratings or box office revenues, not to
present a production that respects a cross-generational demographic.

The Muppets are the property of the Walt Disney Company,
currently charged with the task of reviving the Star Wars franchise. Their successful return to the big screen
suggests that a revival of the seminal variety show is not without merit or
possibility. The Jason Segel-Nicholas Stoller-led The Muppets re-invigorated the franchise in 2011 (after a long
stretch of poorly conceived, straight-to-video releases) by employing the
elements of clever satire, well-placed cameos, and musical theatrics that made
the show (and films) so successful. The film commented on the folly of reality
TV, the economic disparities of the day, and the tropes of romantic comedies.
The soundtrack was playful and accomplished, and appealing to both children and
adults. Every generation can appreciate a puppet barber shop quartet covering
“Smells Like Teen Spirit”. Its follow-up, the commercially less successful but
equally endearing Muppets Most Wanted,
solidified the Muppets as a viable entity for the studio to invest in and
returned them with prominence to the cultural zeitgeist. So why not revisit the
production that started it all?

The Muppet Show was revived briefly by ABC in 1996 as Muppets Tonight, but
failed to attract enthusiastic audiences. My memory of the
show is that I have no memory of the show, which speaks volumes as to its
failure. But the landscape of television has changed drastically since then.
The medium is more intelligent, more ambitious, and has far more outlets than
ever before. A venue like Netflix, tailor-made for parents to provide viewing
entertainment and respite on their own schedules, would be perfect for a
rebooted Muppet Show. Two generations
have had to withstand the inanities of the Teletubbies,
Dora the Explorer, and Barney, programming that is nothing but
refined sugar and starch and shows contempt for tired adults. 

In one of my earliest experiences with my niece and nephew
left in my charge, we watched the 2011 The Muppets. Admittedly, I was rather
nervous. What if they didn’t like it, didn’t get it, didn’t want to finish
watching it? Even worse, what if I didn’t like it? George Lucas had broken my
generation’s heart in 1999 with the poorly conceived Star Wars Episode I: The
Phantom Menace, and broke it twice more in the summers that followed. Lucas did it again when he produced the Indiana Jones
film whose name shall not be mentioned. There was reason for skepticism. But The Muppets exceeded my expectations,
and my niece and nephew and I have watched it together too many times to count,
singing along, reveling in the wonder of its genius and that of the Jim Henson
universe. Pretty soon they’ll be too old for the Muppets, having reached that
strange period known as adolescence–puberty–when you hate everything. The
promise of a rebooted Muppet Show
would extend this connection we have. Hell, it may even encourage me to have my
own kids.

 


Mike Spry is a writer, editor, and columnist who has written for The
Toronto Star, Maisonneuve, and The Smoking Jacket, among others, and
contributes to MTV’s PLAY with AJ. He is the author of the poetry collections
JACK (Snare Books, 2008) and Bourbon & Eventide (Invisible Publishing, 2014),
the short story collection Distillery Songs (Insomniac Press, 2011), and the
co-author of Cheap Throat: The Diary of a Locked-Out Hockey Player (Found
Press, 2013). Follow him on Twitter @mdspry.

METAMERICANA: Hawkeye, Normcore Avenger: A (Mellow) Revolution from Marvel Comics’ Matt Fraction and David Aja


So-called “normcore fashion,” a bizarre combination of countercultural
radicalism and bourgeois complacency, is the only way anyone has found thus far
to re-envision mainstream culture as avant-garde. In normcore culture,
twenty-something hipsters who have already established their countercultural
bona fides by dressing in the uniform of their kind for years (think
thick-rimmed glasses, skinny jeans, sportcoats, bow ties, and brogues) turn
these customs on their head by returning to the white, upper-middle class clothing
stores of their youth. Thus, a herd of excruciatingly self-aware young people seems
to dress like either their parents or their suburban peers, and outside
observers are none the wiser about their intentions. Normcore is ironic to
those who know it when they see it, and painfully earnest to those who see
someone wearing clothes from The Gap or Abercrombie & Fitch and assume it’s
the result of thoughtlessness rather than design. Of course, the more generous
view of normcore suggests that those who subscribe to its fashion wing simply
no longer wish to be distinguished from others on the basis of their attire.
Better, then, to say that the wearing of jeans and tee shirts by normcore
aficionados is merely a “detached and knowing” decision, and not necessarily an
“ironic” one. But what happens to our hipster calculus when normcore culture
goes supernatural?

Superheroes are the hipsters of English-language graphic novels: discernible
almost immediately by their accoutrements, superheroes may want to be like you
and me (hence, secret identities) but before long are sure to do something—lift
a car, shoot an eye-beam—that places them outside mainstream culture. They can’t
help themselves. And millions of us read about their exploits in comic books
because we, too, can’t help ourselves. Following the adventures of costumed
counter-culturists is the nerdy equivalent of sitting on a park bench
people-watching in the Williamsburg section of Brooklyn. Which is why, when
comic book writer Matt Fraction and artist David Aja decided to portray the
least-popular Avenger, Hawkeye, not as a bow-wielding badass but as an
unremarkable, hoodie-wearing bro hanging around his apartment, it felt—to those
of us who enjoy comic books but are tired of their poor writing, cinema-ready
plotlines, and cutout characters—like something of a revolution.

Fraction and Aja’s Hawkeye depicts its titular character in his
traditional (at least since the Avengers movie) purple and black get-up on the
cover of its first two paperback collected editions. In both cases, “Hawkeye”/Clint
Barton—Iowan; former carnie; superpower-less master archer—is carrying his
trademark bow. It’s an intentional misdirection, as in the pages of My Life
As a Weapon
(collecting Hawkeye #1-5 and Young Avengers Presents
#6) and Little Hits (collecting Hawkeye #6-11) Hawkeye rarely
uses his bow and is almost never in his Avengers uniform. Instead, he putters
about his Bed-Stuy apartment and does, well, not very much. A breakdown of the
early issues:

Hawkeye #1: Hawkeye recuperates in a hospital, adopts a dog, attends a
neighborhood barbecue, helps a single mom avoid eviction, and buys his
apartment building so he can become its landlord.

Hawkeye #2: Hawkeye practices shooting his bow, attends a gala event,
stops a gang of petty thieves (but in a tux), and has a long phone call with a
young female protégé who has a crush on him.

Hawkeye #3: Hawkeye organizes his arrows, buys a new car, sleeps with a
stranger, and fights off some heavies hired by a slumlord who wants Clint’s
apartment building back.

Hawkeye #4: Hawkeye attends a neighborhood barbecue, gets interviewed by
the Avengers, travels to the Middle East, has his wallet stolen in a taxi, and
attempts to buy at auction an item that could destroy his reputation if it
falls into the wrong hands.

And so on. Clint virtually never gets into uniform, virtually never faces a
super-villain, never uses any superpowers, and views any excitement he
experiences as a distraction from what he really wants to be doing: hanging out
with his neighbors at rooftop barbecues and petting his adopted dog (“Pizza
Dog,” so named because this iteration of Clint Barton isn’t very witty, either,
so he simply names his dog after the mutt’s favorite food). In Little Hits,
the second Hawkeye collected edition, the low-key vibe continues, and
if anything is doubled down upon by Fraction and Aja:

Hawkeye #6: Hawkeye sets up his stereo system, saves the world from a
terrorist organization (presented, however, via just a two-page pictorial
summary), argues with the maintenance man at his apartment building, attends a
neighborhood barbecue, fights off some slumlord heavies, watches TV, and
considers going on vacation.

Hawkeye #7: Hawkeye helps a neighbor move during a hurricane, and later
rescues him from drowning in his new basement. Hawkeye’s protégé Kate Bishop
attends a wedding, goes to a pharmacy, and stops a robbery in progress.

Hawkeye #8: Hawkeye deals with a new (and crappy) romantic relationship,
tries to fight slumlord heavies but ends up in jail, and complains about his
new girlfriend messing up his comic book collection.

While the news recently came down that the Fraction/Aja Hawkeye series
will come to a close with issue #22, the fact remains that this writer-artist
duo has given us an entirely new way of thinking about not just comic books but
ourselves. There are a number of things Hawkeye does in this series that no one
without superlative archery skills could do. However, these acts of heroism are
overwhelmed in both number and vividness by the roster of things Clint Barton
does in Bed-Stuy that nearly any of us could do: make an effort to meet
and befriend our neighbors; help someone move or avoid eviction; finally unpack
our boxes and set up our new apartment; adopt a stray; or make a property
investment with an eye toward making the lives of others a little less bleak.
There’s nothing preachy about Hawkeye, however—it can’t be said that
Fraction and Aja have any evident interest in making us all better people. What
they want, I think, is no more than what Barton himself wants, and what, if we
go back into the annals of Western literature, David Copperfield once wanted:
to be the hero of one’s own life story, whatever banalities and unremarkable
tribulations that story will so often, inevitably, entail.
 

In other
words, Fraction and Aja have somehow captured the temperament of our Age:
neither naively fixated on the possibility of heroism nor (anymore) captivated
by anti-heroes. The earnestness of the conventional superhero has begun to irk
us, but so too, however slowly, has an unwilling and unlikely hero like
Deadpool, a mercenary whose running commentary on his own antics—droll,
fourth-wall-breaking—is steeped in petulant cynicism. In an ongoing tug-of-war
that mirrors what’s happening now in video games (cf. “#gamergate”), there’s a
divide between those who want a comic book that simply “plays well”—meaning, it
touches all the usual plot, tight-pant, and monologuing-baddie bases—and one
that is reflexive enough about its aesthetics and ambitious enough about its
aims to qualify as Art. Fraction and Aja have given us a comic book series that’s
decidedly in the middle in all particulars—even its interior art is somehow,
despite its stylishness, understated—and in doing so have found a sweet spot that’s
exactly where most of us already live. This new Clint Barton is neither a hero
nor an anti-hero; he’s simply . . . unremarkable. Which makes him as
remarkable a superhero as we’ve seen in a very, very long time.

Seth Abramson is the author of three collections of poetry, most recently Thievery (University of Akron Press, 2013). He has published work in numerous magazines and anthologies, including Best New Poets, American Poetry Review, Boston Review, New American Writing, Colorado Review, Denver Quarterly, and The Southern Review.
A graduate of Dartmouth College, Harvard Law School, and the Iowa
Writers’ Workshop, he was a public defender from 2001 to 2007 and is
presently a doctoral candidate in English Literature at the University of
Wisconsin-Madison. He runs a contemporary poetry review series for
The Huffington Post and has covered graduate creative writing programs for Poets & Writers magazine since 2008.