Understanding Your Academic Friend: Job Market Edition


“If you and your spouse don’t like living 400 miles apart, why don’t you just get jobs at the same university?”

“You miss living near your mom? Well, there are like 5 colleges in her town — just work at one of those!”

“You still don’t know anything about that assistant professor job? Didn’t you apply to it 9 months ago?”

“Wow, your salary is terrible. Why don’t you work for a school that pays better wages?”

“Want me to talk to my friend’s mom, the dean at University X? I’ll bet she can hook you up with a job there and then we’ll live closer to each other!”

I’ve had to answer all of these questions — or some variation of them — ever since I completed my PhD 7 years ago and began looking for tenure track jobs. The people asking these questions are friends and family who love me very much but who just cannot understand why a “smart, hard-working” lass like me has such limited choices when searching for permanent employment as a professor. When I’m asked these questions I need to pause and take a deep breath because I know the rant that’s about to issue forth from my mouth is going to sound defensive, irate and even paranoid to my concerned listener. When I finish the rant, I know my concerned listener is going to slowly back away from me, all the while secretly dialing 9-1-1.

In the interest of generating a better understanding between academics and the people who love them, I’ve decided to write a post explaining exactly how the academic job market works for someone like me, a relatively intelligent, hard-working lady with a PhD in the humanities. My experiences do not, of course, represent the experiences of all academics hunting for jobs, nor do they represent the experiences of all humanities PhDs (they do, of course, represent the experiences of all humanities unicorns though). I think this post will prove useful for many academics as they return to the Fall 2014 Edition of the Job Market.

So, my dear academics, the next time a friend says “I just don’t understand why a smart, hard-working person like you can’t get a job,” you can just pull out your smart phone, load up this post, and then sit down and have a stress-free cocktail while I school your well-meaning friend/ mother-in-law/ neighbor about what an academic job search entails and, more importantly, how it feels. I should note that I have been successful on the job market (which is why I’m currently employed) but for the purposes of this post I’m going to describe (one of my) unsuccessful attempts at the job market, during the 2013-2014 season. Enjoy that sweet sweet schadenfreude, you vultures.

 ***

Spring 2013

Though job ads usually don’t go live until the fall, the academic job search usually begins the spring before. At this point all you really need to be doing is selecting three individuals in your field (preferably three TENURED individuals) who think you’re swell and asking them if they will write a letter of recommendation. It’s necessary to make this request months in advance of application deadlines since many of these folks are super busy. You should also lock yourself in your bedroom and do dips, Robert-De-Niro-in-Cape-Fear style, because upper body strength is important. Who knows what the fall may bring.

Summer 2013

Job ads still haven’t been posted yet, but at this point any serious job candidate is working on her job materials. These are complex documents with specific (and often contradictory) rules and limits. Here’s a breakdown of some (not all, no, there will be so much more to write and obsess over once actual job ads are posted) of the documents the academic must prepare in advance of the job season:

1. The Cover Letter

The cover letter is a nightmare. You have 2 pages (single spaced, natch) to tell the search committee about: who you are, where you were educated, why you’re applying to this job, why you’re a good fit for this job, all the research you published in the past and why it’s important, all the research you’re working on now and why that’s important, the classes you’ve taught and why you’ve taught them, the classes you could teach at University X, if given the chance, and your record of service. You explain all of this without underselling OR overselling yourself and you must write it in such a way that the committee won’t fall asleep during paragraph two (remember, most of these jobs will have anywhere from 200-400 applicants so your letter must STAND OUT). You will draft the cover letter, then redraft it, then send it to a trusted colleague, revise it a few more times, send it to several more trusted colleagues (henceforth TCs), obsess, weep, and revise it one more time. Then more De Niro dips.

2. The CV

The curriculum vitae is not a resume. Whereas the primary virtue of a resume is its brevity, the curriculum vitae goes on and on and on. Most academics keep their CVs fairly up-to-date, so getting the CV job market ready isn’t very time-consuming. Still, it’s always a good idea to send it along to some TCs for feedback and copyediting. And don’t worry about those poor, overworked TCs: academics love giving other academics job market advice almost as much as mothers like to share labor and delivery stories with other mothers. There is unity in adversity. We also drink in the pain of others like vampires.

Is that a CV in your pocket or are you just happy to see me?

3. Statement of Teaching Philosophy

The statement of teaching philosophy (aka, teaching statement) is basically a narrative that details your approach to education in your field. You usually offer examples from specific classes and explain why your students are totally and completely engaged with the amazing lessons and assignments you have created for them. What’s super fun about these documents is that every school you apply to will ask for a slightly different version (and some, bless them, might not request it at all). Some search committees want a one-page document and others want two-page documents and still others don’t specify length at all (a move designed specifically to fuck with the perpetually anxious job candidate). Some search committees might ask that you submit a combined teaching and research statement, which, as you might guess, is the worst. So when you draft this document in the summer it’s just that: a draft. It’s preemptive writing. And it’s only just begun.

4. Statement of Research Interests

You know all the stuff you said about your research in the cover letter? Well say all of that again, only use different words and use more of them. This document could literally be any length come fall so just settle in, cowboy.

August 2013

Job ads have been posted! JOB ADS HAVE BEEN POSTED! JOB. ADS. HAVE. BEEN. POSTED.

Commence obsessing.

September 2013

At this point job ads are appearing in dribs and drabs, so you’re able to apply to them fairly quickly. If you were obsessive in preparing your materials over the summer, your primary task now is to tailor each set of materials to every job ad. This process involves: researching the individual department you’re applying to as well as the university, hunting down titles and descriptions of courses you might be asked to teach, and poring over every detail of the job ad to ensure that your materials appear to speak to their specific (or as it may be, general) needs. This takes more time than you think it will.

Also keep in mind that every ad will ask for a slightly different configuration of materials. Some search committees are darlings and only ask for a cover letter and CV for the first round of the search, while others ask for cover letter, CV, letters of recommendation, writing samples, teaching statements and all the lyrics to “We Didn’t Start the Fire.”

It’s also important to keep in mind that most folks on the academic job market are dissertating and/or teaching or, if you’re like me, already have a full-time job (and kids). But still, things haven’t gotten too stressful yet — the train’s barely left the station.

October-November 2013

Loads of jobs have been posted over the last few months and you are applying to ALL OF THEM. Well-meaning friends will send you emails with hopeful subject lines like “This job seems perfect for you!” and a link to a job you will not get. You apply to it anyway.

Also? Remember all those jobs you applied to back in September? Well, right now you might also start receiving automated rejection emails that look something like this:

[Screenshot of an automated rejection email]

Neat huh?

If you are lucky, though, the search committee will send you an email asking you to “submit more materials” — Ah, it feels good even to type those words — and at that point you do a happy, submit-more-materials jig in front of your computer. Yay! More materials! They like me!

Every search committee will ask for something different at this point. Almost every school requests a writing sample and letters of recommendation at this stage. Some schools will ask the candidate to submit sample syllabi while still others ask for the candidate to design an entirely new syllabus. It’s kind of a free-for-all.

Oh, you might *also* be doing phone or Skype interviews with departments that don’t attend the annual MLA convention in early January, where many humanities-based schools conduct face-to-face first round interviews (more on those later). It’s far more humane to allow candidates to interview from home, so I’m always pleased when this is presented as an option. Of course, interviewing from home generates its own share of problems when, for example, your cat and your toddler simultaneously demand entrance to your office in the middle of a Skype interview for which you have put on a pressed button-down, suit jacket and a pair of pajama pants.

December 2013

Ah December, December. As the days get shorter, the Job Wiki gets longer. Most job candidates now have a pretty good idea about how the market is “going.” Spoiler alert: it is going terribly. 

Even if you haven’t received a lot of rejections yet, it doesn’t mean you haven’t been rejected dozens of times. It just means that the university is going to wait until an offer is made and accepted by The One in the spring before sending you the automated rejection notice I posted above. Usually though, we don’t need to wait that long. If University X has already contacted the standard 10-15 candidates for first round interviews (which you know because you check the goddamn Wiki every 5 minutes) and they haven’t contacted you (which you know because you checked your spam folder twice and had your husband call your phone to be certain that it was working properly), then baby, you’re out.

Yes, December is a dark month for the job market candidate. As the winter holidays arrive, your dear academic friend has invested over six months in a job search which has, at best, offered ambiguity and at worst, pummeled her with outright rejection. Your friend, if she’s lucky, has some MLA interviews scheduled by now or maybe even … a final round interview! … lined up for just after the holidays. So try to pull her away from her interview flashcards. Treat her with care. Make her get drunk with you the day after Christmas in some crappy bar you two liked to frequent in your younger, more carefree days because listen: shit is about to get real for your friend.

to be continued…

[Note: In the coming weeks I will post Part II of "Understanding Your Academic Friend: Job Market Edition" or "When Shit Gets Real"]

So academic friends, have any to add to this timeline? What else should the friends and family of job-seeking academics (henceforth FFJSA) know before the job season begins in earnest next month? Share below…

Tell Us A Story is Back!

I don’t usually cross post between blogs, but since Tell Us A Story returns today from summer vacation, I wanted to give them a shout out. I also wanted to encourage all of you readers to submit your true stories HERE.

To kick off the 2014-2015 season, Adam Rose brings us chemotherapy two ways and tells us exactly what it’s like to pump poison through your body:

“It’s been two days since my third round of chemotherapy. I needed two Ativan on my way to treatment in hopes that they would keep me calm enough for Roxanne and Mark to insert the tube into a vein. Turns out it was a two person job even with the dopey drug running through my system. The Ativan made my body slow down and my mind fuzz over like frost on a windshield. I squeezed Mark’s hand to pump up the reluctant vessels while focusing on the painting of a rodeo clown leaping over a bull. Roxanne struggled to find a vein that was relaxed enough for the needle.”

Click HERE to read the rest.

Notes on a Riot

Police presence in Ferguson on August 11th, 2014 (image source: PBS NewsHour)

Like many professors, I live on the same campus where I work. As a result, I’ve watched drunk East Carolina University students urinate and puke on my lawn and toss empty red Solo cups into the shrubbery around my home. But one evening I had a more troubling run-in with a college student. It began when I woke to the sound of my dog barking. It took me a minute to orient myself and understand that my dog was barking because someone was knocking on the front door. It was 2 am and my husband was out of town, but I opened the front door anyway. On the stoop was a college-aged woman dressed in a Halloween costume that consisted of a halter top, small tight shorts, and sky-high heels. The woman was sobbing and shivering in the late October air and her thick eye makeup was running down her face. She was incoherent and hysterical — I could smell the tequila on her breath — so it took me a while to figure out what she wanted.

She told me that she was visiting a friend for the night and that she had lost her friend…and her cell phone. She had no idea where she was or where to go. I think she came to my door because my porch light has motion detectors and she must have thought it was a sign. As she rambled on and on I could hear my baby crying upstairs. I told the woman to wait on my stoop, that I had to go get my baby and my phone, and that I would call the police to see if they could drive her somewhere. “Nooooooo,” she wailed, “don’t call the police!” I urged her to wait a minute so I could go get my baby and soothe him, but when I returned a few minutes later with my cell phone in hand, she was gone.

Renisha McBride (image source: nydailynews.com)

I felt many emotions that night: annoyance at being woken up, panic over how to best get help for the young woman, and later, guilt over my inability to help her. But one emotion that I did not feel that night was fear. I was never threatened by this young woman’s presence on my stoop and I never felt the need to “protect” my property. Why would I? She was a young woman, no more than 19 or 20, and though she was drunk and hysterical, she needed my help. I was reminded of this incident when I heard that Renisha McBride, a young woman of no more than 19 or 20, was shot dead last fall after knocking on Theodore Wafer’s door in the middle of the night while drunk and in need of help. Wafer was recently convicted of second-degree murder and manslaughter (which is a miracle), but that didn’t stop the Associated Press from describing the Wafer verdict thusly:

[Screenshot of the Associated Press’s description of the Wafer verdict]

McBride, the victim, a young girl needlessly shot down by a paranoid homeowner, is described as a nameless drunk, even after a court ruling establishing her victimhood beyond a shadow of a doubt. Now, just a few days later in Ferguson, Missouri, citizens are actively protesting the death/murder of Michael Brown, another unarmed African American youth shot down for seemingly no reason. If you haven’t heard of Brown yet, here are the basic facts:

1. On Saturday, Michael Brown, an unarmed African American teenager, was fatally shot by a police officer on a sidewalk in Ferguson, a suburb of St. Louis, Missouri.

2. There are 2 very different accounts of why and how Brown was shot. The police claim that Brown got into their police car and attempted to take an officer’s gun, leading to the chain of events that resulted in Brown fleeing the vehicle and being shot. By contrast, witnesses on the scene claim that Brown and his friend, Dorian Johnson, were walking in the middle of the street when the police car pulled up, told the boys to “Get the f*** on the sidewalk” and then hit them with their car door. This then led to a physical altercation that sent both boys running down the sidewalk with the police shooting after them.

3. As a result of Brown’s death/murder the citizens of Ferguson took to the streets, demanding answers, investigations, and the name of the officer who pulled the trigger. Most of these citizens engaged in peaceful protests while others have engaged in “looting” (setting fires, stealing from local businesses, and damaging property).

Michael Brown at 16 (image source: blogs.riverfronttimes.com)

Now America is trying to make sense of the riots/uprisings that have taken hold of Ferguson the last two days and whether the town’s reaction is or is not “justified.” Was Brown a thug who foolishly tried to grab an officer’s gun? Or, was he yet another case of an African American shot because his skin color made him into a threat? 

I suppose both theories are plausible, but given how many unarmed, brown-skinned Americans have been killed in *just* the last 2 years — Trayvon Martin (2012), Ramarley Graham (2012), Renisha McBride (2013), Jonathan Ferrell (2013), John Crawford (2014), Eric Garner (2014), Ezell Ford (2 days ago) — my God, I’m not even scratching the surface, there are too many to list  — I’m willing to bet that Michael Brown didn’t do anything to *deserve* his death. He was a teenage boy out for a walk with his friend on a Saturday night and his skin color made him into a police target. He was a threat merely by existing.

Given the number of bodies that are piling up — young, innocent, unarmed bodies — it shouldn’t be surprising that people in Ferguson have taken to the streets demanding justice. And yes, in addition to the peaceful protests and fliers with clearly delineated demands, there has been destruction to property and looting. But there is always destruction in a war zone. War makes people act in uncharacteristic ways. And make no mistake: Ferguson is now a war zone. The media has been blocked from entering the city, the FAA has declared the air space over Ferguson a “no fly zone” for a week “to provide a safe environment for law enforcement activities,” and the police are shooting rubber bullets and tear gas at civilians.


But no matter. The images of masses of brown faces in the streets of Ferguson can and will be brushed aside as “looters” and “f*cking animals.” Michael Brown’s death is already another statistic, another body on the pile of Americans who had the audacity to believe that they would be safe walking down the street or knocking on a door for help.


What is especially soul-crushing is knowing that these events happen over and over and over again in America — the Red Summer of 1919, Watts in 1965, Los Angeles in 1992 — and again and again we look away. We laud the protests of the Arab Spring, awed by the fortitude and bravery of people who risk bodily harm and even death in their demands for a just government, but we have trouble seeing our own protests that way. Justice is a right, not a privilege. Justice is something we are all supposed to be entitled to in this country.

***

When the uprisings in Los Angeles were televised in 1992 I was a freshman in high school. All I knew about Los Angeles was what I had learned from movies like Pretty Woman and Boyz N the Hood — there were rich white people, poor black people with guns, and Julia Roberts pretending to be a prostitute. On my television these “rioters and looters” looked positively crazy, out of control. And when I saw army tanks moving through the streets of Compton I felt a sense of relief.


That’s because a lifetime of American media consumption — mostly in the form of film, television, and nightly newscasts — had conditioned my eyes and my brain to read images of angry African Americans, not as allies in the struggle for a just country, but as threats to my country’s safety. I could pull any number of examples of how and why my brain and eyes were conditioned in this way. I could cite, for example, how every hero and romantic lead in everything I watched was almost always played by a white actor. I could cite how every criminal, rapist, and threat to my white womanhood was almost always played by a black actor. And those army tanks driving through the outskirts of Los Angeles didn’t look like an infringement on freedom to me at the time (as they do now). They looked like safety because I came of age during the Gulf War, when images of tanks moving through war-torn streets in regions of the world where people who don’t look like me live came to stand for “justice” and “peacemaking.” Images get twisted and flipped and distorted.


The #IfTheyShotMe hashtag, started by Tyler Atkins, illuminates how easily images — particularly the cache of selfies uploaded to a Facebook page or Instagram account — can be molded to support whatever narrative you want to spin about someone. The hashtag features two images which could tell two very different stories about an unarmed man after he is shot — a troublemaker or a scholar? a womanizer or a war vet? The hashtag illuminates how those who wish to believe that Michael Brown’s death was simply a tragic consequence of not following rules and provoking the police can easily find images of him flashing “gang signs” or looking tough in a photo, and thus “deserving” his fate. Those who believe he was wrongfully shot down because he, like most African American male teens, looks “suspicious,” can proffer images of Brown in his graduation robes.

Of course, as so many smart folks have already pointed out, it doesn’t really matter that Brown was supposed to go off to college this week, just as it doesn’t matter what a woman was wearing when she was raped. It doesn’t matter whether an unarmed man is a thug or a scholar when he is shot down in the street like a dog. But I like this hashtag because at the very least it is forcing us all to think about the way we’re all (mis)reading the images around us, to our peril.

The original images used in Tyler Atkins’ #IfTheyShotMe hashtag

The same day that the people of Ferguson took to the streets to stand up for Michael Brown and for every other unarmed person killed for being black, comedian and actor Robin Williams died. I was sad to hear this news and even sadder to hear that Williams took his own life, so I went to social media to engage in some good, old fashioned public mourning, the Twitter wake. In addition to the usual sharing of memorable quotes and clips from the actor’s past, people in my feed were also sharing suicide prevention hotline numbers and urging friends to “stay here,” reminding them that they are loved and needed by their friends and families. People asked for greater understanding of mental illness and depression. And some people simply asked that we all try to be kind to each other, that we remember that we’re all human, that we all hurt, and that we are all, ultimately, the same. Folks, now it’s time to send some of that kind energy to the people of Ferguson and to the family and friends of Michael Brown. They’re hurting and they need it.

No End in Sight: Academic Research and “Time Off”

 


“Can you read these words to me, Amanda?” my first grade teacher asked, pointing at the cover of The Wheedle on the Needle. I shook my head and smiled, thinking this was some kind of trick. How the hell would I know how to read those letters? Later, I asked my friends if they had been able to decipher the book cover, assuming they were as lost as I had been. “The Wheedle on the Needle,” my friend replied, almost casually. The others nodded and I felt betrayed: when did everyone learn to read? This was 1983, when it was not assumed that children would enter kindergarten knowing how to read. But still, somehow, between kindergarten and first grade, I had fallen behind my peers.

Soon after my fateful reading test our teacher sorted us into reading groups. I was, of course, placed in the “remedial” reading group while all of my friends were in the “advanced” group. Though I had no way of knowing this earlier — this was the first time any kind of judgment had been made, implicitly or explicitly, about our intelligence — I now had confirmation: I was stupid.

I decided then and there that I would learn to read, as quickly as possible, and I would get the hell out of the remedial group. After several months of intense concentration and effort — it was the first time I can recall applying myself fully to academics — I was in the advanced reading group.  It felt good to be back with my friends and sure, it felt good to learn how to read. But the biggest lesson I learned that day was that I was built for studying: a natural born student.

Fast forward to 1999, my first year of graduate school. I had just graduated magna cum laude from an Ivy League institution and I was pretty confident in my intellectual capabilities. As an undergrad I had stuffed my brain with the likes of Doris Lessing, Tom Stoppard, Toni Morrison, Euripides, and T.S. Eliot, but I quickly learned that these names meant nothing to my new classmates. They had abandoned the text, that frivolous playground of undergraduate English majors, and moved on to more challenging writers with unfamiliar names like “Foucault” and “Deleuze” and “Baudrillard.” When did this happen? Why did I not get the memo? I was behind everyone else and grad school had barely started. It was first grade all over again.

To cope with this brand new bout of imposter syndrome, I set to work “catching up” with my peers. I made lists of “essential” books and essays — the stuff I thought I should have already read, before coming to graduate school — and tried to fit them in after completing all of my assigned coursework (which was impossible since my coursework took up almost all of my time). How does one cope with such an impossible work load? Easy: you never stop working. And when you do stop working, you must berate yourself about your decision to not-work because, in the world of the scholar, you can always be working. That’s why alcohol is so useful for graduate students. No one feels bad about not reading Foucault while intoxicated.


Sometimes I would be in my apartment, rereading an incomprehensible passage in The Acoustic Mirror for the fourth time, and I would be seized with a bottomless sense of doom, like I was free falling down a long dark well, only it was the inside of me that was falling. The only way I knew how to keep my body from collapsing in on itself, like a black hole of dread, was to get into bed, squeeze my eyes tight, and breathe deeply until my internal gravitational pull slowed to a stop. Sometimes this took minutes, other days it took hours. Then I would get out of bed, pick up The Acoustic Mirror and my yellow highlighter, take a deep breath, and begin again.

At the time I had no idea that there was a name for these episodes: panic attacks. I just thought I was too dumb for graduate school and had a hard time coping with that reality. But after some consultations with my doctor and my parents I realized that the best thing for me to do was to take a leave of absence after completing my Masters. I hoped that a year off might help me to decide whether I should continue on to do a PhD or move into some profession that would not cause my body to regularly seize up with dread or cause the skin on my face to erupt in angry pulsing nodules of adolescent acne.

The year off was good for me. I worked for AmeriCorps, watched a lot of movies, read all of the Harry Potters, got a puppy, and learned how to share a home with the man who would eventually become my husband and the father of our two kids. At the end of the year I felt refreshed and returned to the University of Pittsburgh, fully ready to begin a PhD in film studies. I still had the occasional panic attack, suffered from imposter syndrome, and regularly believed that there would never be enough hours in the day to complete all of the reading, viewing and writing that I thought I needed to complete. But I also knew that being a scholar was what I liked best and so the constant anxiety, a kind of low-level hum — my body’s own white noise — was the penalty I had to pay to do what I loved.

During those 5 years I was always wondering if I was doing “enough” to succeed. I distinctly remember sitting around with my fellow PhDs, comparing the number of hours we spent on our coursework each week — not to brag or one-up each other — but out of a genuine desire to determine whether what we were doing was truly “enough.” Because there was no other way to measure the knowledge we were slowly and painfully accumulating. Was 50 hours enough? 60? 70? (Answer: it is never enough).

Of course anyone who pursues a post-graduate degree  — doctors, lawyers, nurses, veterinarians — finds themselves devoting all of their free hours to their studies. But the difference for professors is that this frantic need to always be reading or writing, to always be a student, never really “ends.” In this profession we are made to feel as if teaching and committee work and the occasional article or book are not enough. If we’re not publishing books with the top presses or publishing articles in the top journals or being offered jobs at R1 schools, then we don’t really matter in the field. If we’re not always working (and I mean always working) then we don’t exist.

William Pannapacker addressed this issue quite well in a piece for The Chronicle of Higher Education, which is worth quoting at length, because it is fantastic:

If someone asks, “How are you?,” I sigh, shrug, and say, “Busy, like everyone else.” If pressed, I will admit that I spent some time with my family—the way a Mormon might confess to having tried a beer, once. For more than 20 years, I have worn what Ian Bogost has called “the turtlenecked hairshirt.” I can’t help it; self-abnegation is the deepest reflex of my profession, and it’s getting stronger all the time…

Surely, the Catholic tradition of monastics and mendicants lies behind this tendency that I share with my profession, but there are other traditions at work here. As H. L. Mencken said, Puritanism is “the haunting fear that someone, somewhere, may be happy.” Happiness is worldliness, and idleness is sin: Work is an end in itself, as Max Weber observed in The Protestant Ethic and the Spirit of Capitalism. Likewise, there’s an old, unspoken commandment, “A professor shall not be seen mowing the lawn on weekdays.”

This “turtlenecked hairshirt” doesn’t go away when you finish your dissertation, or (if you’re lucky) snag your first tenure track job. It doesn’t even end when you get tenure. I know professors who have climbed as far as they can up the academic hierarchy (and it is a woefully stubby ladder to begin with), but who still regularly churn out monographs and anthologies as if they are getting paid by the word. But here’s the thing: they’re not getting paid by the word. Or the chapter. They’re barely clearing a few hundred dollars for what is often years of tireless research and writing. No, academics are “paid” in positive reviews, citations, and ego stroking. We’re paid with tenure or new job opportunities. Those of us on the tenure track are “paid” in new titles: Assistant Professor, Associate Professor, Full Professor.

***

I am a tenured professor working at a state university that has ceased to offer raises (including cost of living raises) to its faculty. When I started my job in 2007 I was making approximately $53,000, a solid starting salary for an Assistant Professor circa 2007. Today, after 7 years at the same institution, I’m proud of my research profile, the classes I’ve taught, the students I’ve mentored and the film studies program I’ve helped build, but my salary is a mere $2,000 more than it was when I started 7 years ago. I have been told by numerous administrators that I should not get my hopes up for a raise, that money is tight (even though newbie professors fresh out of graduate school are hired every year at much higher salaries). The $2,000 I received for getting tenure is likely going to be “it” for a very long time. Yes that’s correct, the only raise I’ve received in 7 years is $2,000 for getting tenure. Oh, you can also call me “Associate Professor” now. I know academic titles carry a lot of weight so I wanted to make sure y’all knew about that, too.

I had planned to spend my summer — as most academics do — working on a major research project, in this case, my next book project. I would find a way, as I always do, to fit research and writing into the pieces of time left over after teaching a summer class, driving my kids to their various activities, and visiting the family and friends who live too far away to visit during the school year. My summer research projects always drain away the time I spend with family and friends, but I have done this every summer since I can remember: to get a job, to get tenure, and because I was always advised to work for the job I want, not the job I have.

“Why are you always working in the summer, aren’t you a teacher?” my non-academic friends often ask me, while my academic friends usually ask “What are you working on this summer?”

***

A few months ago, after a failed attempt to get a job at a university that might actually pay me a salary commensurate with my rank and experience, I came to the realization that the stress and late nights, the self doubt and loathing, were now unnecessary. I am not going to get a better-paying job and my current employers, no matter how many books I publish, how many students I mentor, or how many committees I serve on, are not going to give me any more money. Or at least not much money. Initially this realization made me despondent: if no one is paying me more money to produce more work, and very few people read the peer-reviewed articles or monographs I’m trying to crank out, then what happens? What happens when a professor no longer has any incentive to work at the breakneck pace at which she has been encouraged to work since she first embarked upon that great and arduous journey towards a career in academia?

Nothing. Nothing happens. And, dear reader, it is glorious.

Yes, this summer I decided to stop: panicking, working at 9pm after the kids go to bed, working on Saturday afternoons, bringing “work” with me on vacation, making myself feel guilty for not working on vacation, complaining about how “busy” and “stressed” I am all the time in real life and online, writing articles or presenting at conferences just to add a line to my CV, writing shit that no one will be able to read because it’s locked behind a paywall, viewing the success of my friends and colleagues as an indictment of my own (non)success, and staring at my computer screen while my kids ask when I will be done working so I can play with them. Plus, most people believe that professors are lazy layabouts in the summer anyway, so I decided to start living up to the stereotype.

So this summer I’ve been on vacation — a real, honest-to-goodness vacation. Sure, I taught a 5-week class and I’ve answered urgent emails. I’ve spoken with colleagues about conference panels and workshops. And right now I’m writing this blog post. But I’ve stopped with the “musts” and the “shoulds.” I’m only working on what I want to work on. And sometimes, even when I really do feel like I’d like to, say, brush up on the history of broadcast television, I decide to go out to lunch with my kids instead. Just because. I’m saying “no” to “Would you like to chair this blah blah blah…” and “yes” to “Would you like to sit in this chair and drink a cocktail?” And I’m enjoying my family and my life in a way that I haven’t been able to since…well, since I started graduate school back in 1999.

I want to be clear: I love writing and researching. I love the feeling of finishing a sentence and knowing that it says exactly what I want it to say. I love following an idea through all the way and producing scholarship that is readable and functional. I’m incredibly proud of my first book and I think it’s doing something useful in the subfield of genre studies. But my scholarship won’t cure cancer. It doesn’t provide fresh drinking water to drought-stricken regions. It’s not even the kind of writing people stay up all night reading and then eagerly discuss with their book club the next day, like Twilight. That’s just not how humanities scholarship works. So I’m in no big rush to publish my next piece of scholarship. While I love doing good scholarship I don’t love feeling like a hamster on a wheel: working, working, working for no tangible reward and with no end in sight. At least the hamster is getting exercise.

***

Last week my children and I drove up to Connecticut to spend a few days with a dear friend and her family. They swam and dug holes and her kids taught my kids how to catch (and release) frogs. They were having the kind of summer I remember having when I was young — days that unspool in no particular hurry, with no clear agenda. As we walked home in the twilight, holding hands, my daughter said to me “This is the best vacation ever!” And she’s right, it is.

Frog catching

 

(Aca) Blogs are Like Assholes…

You’ve heard the joke, right? There are over 152,000,000 blogs on the internet. And in one small corner of the internet are the academic blogs, the aca-blogs. I define “aca-blogs” as blogs written and moderated by an individual (as opposed to a collective) currently involved in academia (whether as a student, instructor or administrator). The content of these blogs varies widely but it is usually at least tangentially related to the blogger’s field of academic study. Most of these bloggers write in a looser, more informal style than they would for a more traditional scholarly publication, like a peer-reviewed journal or a monograph published by a university press (i.e., the kind of documents that — at least at one time — would get you a job or tenure).

Now, I’ve never been an early adopter. I’m a proud member of the “early majority,” the folks who watch and see what happens to the early adopters before taking the plunge. I was late to Facebook (August 2008), Twitter (March 2009), and (aca)blogging (August 2009). I only started blogging in the wake of the medium’s “golden age” (an era which, like all golden ages, varies wildly depending on who you consult). I use the term “golden age” to signal a time when a large portion of the academics I interacted with on social media also had blogs, and posted to them regularly (see my blogroll for a sizable sample of media studies bloggers). Starting a blog was common for people like me — that is, for people who liked talking about popular culture in a looser, more informal way, online, with other fans and academics. And with gifs.

Part of what (I think) my early readers enjoyed about my blog is that I was using my PhD, a degree that (supposedly) gives me the ability to provide nuanced arguments and historical context about the popular culture they were consuming.  I like that my online friends (including folks I went to elementary school with, my Mom’s friends, my kids’ friends’ parents) can read my mom’s Oscar predictions or why I think the Jersey Shore cast is a lot like the Teenage Mutant Ninja Turtles and they don’t need to buy a subscription to a journal or be affiliated with a university to do so. That’s important. If we, as Humanities-based scholars, are terrified about the way our discipline is being devalued (literally and metaphorically) then we need to show the public exactly how valuable our work is. How can we say “people need media literacy!” but only if they enroll in my class or pay for a journal subscription? That just supports the erroneous belief that our work is elitist/useless when it’s not. I know this work is valuable and I want everyone to have access to it. I also like the timeliness afforded by this online, open-access platform. I can watch Mildred Pierce the night it airs and have a review published on my personal blog the next day, which is exactly when folks want to read it. If I want to do some detailed research and further thinking about that series, then sure, I’d spend several months on a much longer piece and then send it to a journal or anthology.

Indeed, Karra Shimabukuro, a PhD student who maintains two different blogs, explains her interest in blogging this way:

I like [blogging] because it lets me share my work, and in this day and age perhaps get people to know my work and me. Now that I’m in my PhD program, I try to post stuff pretty regularly, and I always link to Twitter when I do, so get more views. I think it’s important to share my research. I read quite a few blogs, usually when I am looking for something specific though- job market, conference, early career advice type stuff. 

In the early days of my blog’s life I posted frequently (several times per week) and my posts were generally short (less than 1000 words). These posts were written quickly, often in response to an episode of television I had just watched or a conversation I had just had with someone on Twitter (or Facebook, or occasionally, real life). My early posts were also interactive. I almost always concluded posts with questions for my readers, invitations to engage with me on the platform I built for just that purpose.

Ben Railton, a professor who blogs at American Studies, told me via email:

For me individually, blogging has been infinitely helpful in developing what I consider a far more public voice and style, one that seeks to engage audiences well outside the academy. Each of my last two books, and my current fourth in manuscript, has moved more and more fully into that voice and style, and so I see the blog as the driving force in much of my writing and work and career.

 And collectively, I believe that scholarly blogs emphasize some of the best things about the profession: community, conversation, connection, an openness to evolving thought and response, links between our individual perspectives and knowledges and broader issues, and more.

Looking back at these early posts I’m surprised by the liveliness of the comments section — how people would talk to me and each other in rich and interesting ways. In 2009 my blog felt vibrant, exciting, and integral to my scholarship. A few of my posts became longer articles or conference talks. Writing posts made me feel like I was part of an intellectual community exchanging ideas back and forth in a productive kind of dialogue.

In hindsight it’s strange to me that I blogged so much in 2009 and 2010 because those years mark one of the most challenging periods of my life — just before the birth of my second child, a beautiful boy who never ever (ever) slept. During the brief snatches of time when my newborn son was asleep, or at least awake and content, I would grab my laptop and compose my thoughts about The Hills or Google+ (LOL, Google+!). I found that, when the muse comes calling, you have to write then, not sooner and not later, or she’ll go away. So I wrote posts in the middle of the night and even while nursing my son. Blogging felt vital to me then, like a muscle that needed stretching. And when the words came, they came in a stream. The sexual connotations here are purposeful — blogging was satisfying to me in the same way sex can be satisfying. And like sex, sometimes when you try to blog, you just can’t get it up: the moment’s not right, the inspiration vanishes.

But things are different in 2014. I’ve had tenure for a year. I just completed a manuscript and turned it in to the press. My son (now 4 and a half) sleeps through the night (almost) every night and I find that I can work while lounging in a hammock next to my 8-year-old daughter as she reads. In other words, I have plenty of time to stretch my blog muscle. Yet I’m just losing my desire for blogging. It used to be that if I went more than a few weeks without writing a post, I got twitchy, an addict in the midst of withdrawal. But now, my blog’s stagnation engenders no such discomfort. It’s like the day you realize you’re over an old love. Dispassion and neutrality abound.

Taking stock of her own blogging hiatus last year, Slaves of Academe writes “As it turns out, walking away from one’s blog was relatively easy, given the surplus of competing screens.” And I suppose that that’s the first reason why I blog less frequently than I did 5 years ago. Back in 2009 it seemed that the internet was quite interested in the proto-scholarship offered up by the academic blog. There was an excitement there of seeing new scholarship take shape right before our eyes. And Michael Newman, a media studies professor writing about this same topic on his own personal (neglected) blog, zigzigger, explains:

 People mixed personal and professional. They’d get first-persony and confessional even in efforts at engaging with intellectual concerns. They’d make the blog as much about process as product. No one was editing or reviewing your blog, so it had a raw immediacy missing from more formal writing. 

Newman notes the rise of academic blog collectives (like Antenna), a move which has, for better or worse, worked to legitimize the process of academic blogging:

As blogs become more legitimate and serve these more official functions, they seem less appropriate for the more casual, sloppy, first-drafty ponderings that made the format seem vital in the first place.

This has certainly been true for me. I often find myself starting to write a post and then abandoning it for its lack of intellectual “rigor.” I second-guess my posts more often now, worrying that they might be too frivolous, too self-indulgent, too weird. But of course, that’s what my blog has always been. It just seems like that sort of casual, stream-of-consciousness style writing is less acceptable now among academics. Or maybe everyone is just bored with it.

Justin Horton, an ABD who has been blogging since 2012, has noticed an overall decrease in the number of posts coming out of personal blogs. He tells me:

Personal blogs have been diminished by other web spaces (Antenna, etc), but there is still a place for them, and oddly, it seems be occupied by very young scholars (who haven’t gotten their names out there) and senior scholars whose names are widely known and have a built-in audience (I’m think of Bordwell, Steven Shaviro, and so forth).

Years ago it seemed like blogs represented the next wave of academic scholarship: short bursts of freeform thinking published immediately and set in dialogue with other robust online voices. But blogging has not yielded the legitimacy many of us hoped for. While I still put my blog in my tenure file, citing (what I believe to be) its value, I understand that my department’s personnel committee does not view it as a major component of my research, teaching or service (the holy trifecta of academic values), even though it has greatly contributed to all three. So without institutional legitimacy or scholarly engagement, what purpose does the academic blog hold today? Has its moment passed?

I had a chat, via Facebook message, with three fellow aca-bloggers — the aforementioned Michael Newman, Kristen Warner of Dear Black Woman, and Alyx Vesey, of Feminist Music Geek — to get some answers. I’ve pasted our discussion below:

[Screenshot of the conversation]

Kristen started things off by addressing the rise of the so-called “critic culture”:

[Screenshots of the conversation]

Editor’s note: I really really love Google books.

[Screenshots of the conversation]

Editor’s note: here is a link to Kristen’s post on Jessica Pare.

[Screenshots of the conversation]

Editor’s Note: Alyx is referring to Myles McNutt, of Cultural Learnings (and the AV Club and my heart).

[Screenshots of the conversation]

 No, the slow disappearance of the personal aca-blog isn’t exactly a crisis — not like the academic job market crisis, or the humanities crisis, or the crisis in higher education. But the downtick in blogging in my field does give me pause because I see real value in the kind of intellectual work performed on blogs. Posts are loose, topical, and invite others to join in. They’re accessible in a way that academic journal articles usually are not. And unlike the think pieces and recaps I most frequently read online (and which I enjoy), personal blog posts are rarely subjected to the rabid feeding frenzy of misogyny, racism and obtuseness that characterizes so many comment sections these days. The personal blog affords a certain level of civility and respect. If we disagree with each other — and we often do, thank God — we’re not going to call each other cunts or trolls or worse. At least not in public for everyone to see. We’re…classy.

So while my blogging has slowed, I’m not quite ready to give up on the platform yet. I still think there’s value in this mode of intellectual exchange — in the informality, the speed with which ideas can be exchanged, and, of course, the gifs.

So, what do you think (all 10 readers who are still reading)? Is the aca-blog dead? Does it matter? Did you like my gifs? Comment below. And please don’t call me a cunt.

Everyone’s a Little Bit Rapey?


Let’s get this out of the way: I love Louis CK. I’ve watched (and enjoyed) all of his stand up concert films and every episode of his FX series, Louie. Louis CK’s humor appeals to me because it makes me squirm: it makes me examine the terrible parts of myself and question my belief systems. He does what, in my opinion, all great comedy should do: “it walks the line between hilarity and horror; make me laugh when my first instinct is to cry.” (yes, I just quoted myself; don’t judge me). A great example of how Louis CK achieves this fine balance of horror, humor and humility can be found in the lengthy stand-up segment of last night’s episode, “Pamela Part I,” a bit which I first saw back in March, when he delivered it as part of his opening monologue on Saturday Night Live. It’s a great bit, reeling us in with the funny, then surprising and shaming us, then finally, making us laugh. For example, CK talks about how the Bible refers to God as “our Father” and as male, even though it would make more sense for God, if s/he truly exists, to be a female:

The point is: Women birthed us, women raised us. So why aren’t they running things? I think I know why. I think it’s because, millions of years ago, women were in charge, and they were mean, they were horrible! They made us walk around naked, and then they’d laugh at you and flick your penis when you walk by… They were AWFUL! But what could you do? It’s your Mom and her friends, like what could you possibly do about it? And then one guy punched his mom, and we’re like: “We can hit them!” And then we did the whole thing. 

After hearing this bit I actually turned to my husband and said “I should show this to my students to explain the concept of patriarchy!” Louis CK has that kind of effect on me. For this reason I’m willing to give Louis CK the benefit of the doubt when he takes a risk in his comedy. True, Louie has been an uneven series; for example “The Elevator,” a 6-episode story arc focusing on Louie’s chaste courtship of Amia (Eszter Balint), a Hungarian woman temporarily staying in Louie’s apartment building, was not always successful (in my humble opinion). For example, it’s hard to understand why two fortysomething adults would hang out with each other for hours on end without being able to communicate (Louie doesn’t speak any Hungarian, Amia doesn’t speak any English) and without having sex. No sex? No conversation? What were they doing all month? However, I forgave this unbelievable communication gap (have these two never heard of Google Translate? It’s free, Louie!) because it paid off very well in “The Elevator, Part 6,” when Amia takes Louie to a Hungarian restaurant and begs a waiter to translate her love letter into English.


During the six episodes of “The Elevator” we only heard Louie’s point-of-view. He tells his friends, and anyone who will listen, that he loves Amia, despite the communication gap (and only knowing her for one month).  But we never hear Amia’s (English) words. So when the waiter sits down at Louie and Amia’s table, puts on his spectacles, and begins reading “Dear Louie…” I was almost as excited as Louie was to hear what she has to say. As the waiter reads Amia’s words, my eyes stay fixed on Louie, who is (charmingly) both embarrassed and delighted by the sudden rush of emotions he can now attribute to his love object. A month of unsaid thoughts and desires come pouring out of the waiter’s mouth until Louie grips his hand and asks him to stop. It’s too much at once; Louie can’t take it all in. He’s not accustomed to women reciprocating his desires. The revelation is bittersweet, of course, because Amia will soon return to Hungary permanently, to be with her son and friends and life. Their love is doomed.


Of course, it’s worth pointing out that this touching love scene was preceded by Louie venturing out into the wilds of Brooklyn in the middle of a hurricane to rescue his ex-wife and two daughters from their slowly-flooding apartment building. Why did these three women need rescuing? As Louie’s ex-wife (Kelechi Watson) says, more than once, her husband is out of town! Yes, when her man is out of town, Janet, a normally resourceful, independent woman, turns into a wailing mess of panic and throws her arms around her ex-husband and sobs in relief when he shows up to save her and her daughters. This scene was so over-the-top in terms of its macho, hero-complex pacing that I almost expected it all to be just a fantasy in Louie’s head, an attempt to make up for the deflating experience of finally getting to screw the woman he loves (or at least lusts after) and then having her run off into the rain, muttering in Hungarian. Placing Amia’s love letter scene directly after Louie’s heroic rescue of his (all-female) family makes it feel too much like a “reward,” as something he earned for “manning up.” But maybe that was the point? Was Louis CK trying to demonstrate how his character has such a lowly sense of self that he can only be loved and receive love after performing an over-the-top rescue mission of three helpless women? Is this perhaps a commentary on the character’s deep neuroses? Maybe. Maybe.

I’m willing to forgive the masculinist fantasies at the heart of “Elevator, Part 6”; however, I am far more ambivalent about the key scene in “Pamela, Part I” in which Louie appears to/tries to rape his friend/crush, Pamela (Pamela Adlon). Recall that Pamela is Louie’s longtime love interest who repeatedly shot down his attempts to romance her. Let’s revisit the speech Louie makes to Pamela back in season 2:

Pamela, I’m in love with you. Yeah, it’s that bad. You’re so beautiful to me. Shut up! Lemme tell you. Let me. Every time I look at your face or even remember it, it wrecks me – and the way you are with me – and you’re just fun and you shit all over me and you make fun of me and you’re real. I don’t have enough time in any day to think about you enough. I feel like I’m going to live a thousand years cause that’s how long it’s gonna take me to have one thought about you which is that I’m crazy about you, Pamela. I don’t wanna be with anybody else. I don’t. I really don’t. I don’t think about women anymore. I think about you. I had a dream the other night that you and I were on a train. We were on this train and you were holding my hand. That’s the whole dream. You were holding my hand and I felt you holding my hand. I woke up and I couldn’t believe it wasn’t real. I’m sick in love with you, Pamela. It’s like a condition. It’s like polio. I feel like I’m gonna die if I can’t be with you. And I can’t be with you. So I’m gonna die – and I don’t care cause I was brought into existence to know you and that’s enough. The idea that you would want me back it’s like greedy.

Amazing shit, right? But Pamela isn't into it. She only likes Louie as a friend, so she gets on a plane and moves, permanently, to Paris. That is, until she returns in "Elevator, Part 3," contrite, hoping that she and Louie can "pursue something, a girl/guy kissing thing." Pamela doesn't sound convinced, even as she tries to convince Louie, and he gently turns her down because he has fallen for Amia.

But in "Pamela, Part 1" Louie is heartbroken ("walking poetry," according to the pragmatic Dr. Bigelow [Charles Grodin], resident sage of Louie) and decides to give Pamela a call. Like any self-respecting person, Pamela sees the rebound for what it is, and Louie doesn't deny it. Still, Louie attempts romance once again one night, after Pamela babysits his daughters. In a scene that echoes the first time Louie and Amia kiss (and later, make love), Louie awkwardly leans in to kiss Pamela. After she ducks his mouth, he tries again. And again. And AGAIN. He grabs and pulls at her. He drags her small frame from room to room. He reminds her that she wanted to do some "girl/guy kissing stuff," but Pamela isn't having it. Is it because she can't bring herself to admit that she's attracted to Louie? Or is it because she would really like to be attracted to a "nice guy" like Louie but just…isn't?

Image courtesy of: www.designntrend.com

Ultimately, it doesn't matter what Pamela did or did not "truly" want in that moment. What matters is what her mouth was saying and what her body was doing — both were communicating, quite clearly, no. Old Louie would have given up after the first pass. Like a turtle retreating into his shell, it takes little for old Louie to disengage. But new Louie, the Louie who can single-handedly rescue three women from a Brooklyn apartment, who won over the recalcitrant Hungarian, doesn't retreat. He is clearly frustrated by Pamela's hot/cold routine. He believes that if he can just fuck her, or just kiss her, then she'll know, unequivocally, that she is, in fact, attracted to him. Louie is a large man, tall and broad, and Pamela is small. After a lengthy struggle, Pamela finally frees herself and screams "This would be rape if you weren't so stupid. God! You can't even rape well!" After he secures a pseudo-kiss from Pamela (still under duress), she escapes his apartment and we see Louie's expression: it is not one of shame but of triumph.

Throughout this entire ordeal I was horrified, not because I haven't seen this scene before — the trope of the woman who resists and resists and resists until, finally, she collapses in a man's arms is a tried and true cliche — but because I didn't expect to see it in an episode of Louie. Now, I've read several recaps of this episode that point to Louie's lengthy bit about patriarchal oppression (quoted above) being strategically placed before this scene. In other words, because Louis CK was aware that this scene was "rapey," it's okay. It's honest and real. It's about how date rape happens. It's about how all men are just a little bit rapey. Maybe. Maybe. But coming in the wake of the University of California, Santa Barbara shootings less than two weeks ago, in which a young, troubled man murdered six people because he was tired of "not getting the girl," this episode felt like salt rubbed in a very raw wound.

In his (mostly) thoughtful reflections on this episode for the AV Club, Todd VanDerWerff writes:

 The thing it does more bracingly than any episode of TV I’ve seen is place us in the point-of-view of a man who would force himself—no matter how mildly—on a woman and have us see how easily that could slip over into being any man if the circumstances were right, if his feelings were hurt just so or if she lashed out at him while crying on their bathroom floor. To be a man is to remember constantly, daily, that you are, on average, bigger than the average-sized member of half the population, that your mere presence can be scary or threatening to them, especially in the wrong circumstances, and that it is up to you to be on guard against that happening, no matter how unfair that might seem.

But here's the thing: I'm tired of trying to understand the man's point of view in this situation. I don't want to know any more about the PUAHaters and their hurt feelings. I don't want to hear about how men think about sex all the time (newsflash: SO DO WOMEN). I don't care what led up to Louie's attempted rape of Pamela. I don't care about his low self-esteem or hurt feelings. I don't want to sympathize with this point of view anymore. Louis CK and other well-meaning men want to tell us how hard it is to be a big strong horny man who just wants that cocktease to finally…give…in. But damn, Louis CK, I'm just not here for that.

I know lots of men who would rather die than force themselves on a woman. I know lots of men who are not in the least bit rapey. I know lots of men who can control themselves. So let's do ourselves a favor: let's stop pretending that rape is a man's default setting when a woman says no, because it's not. I want think pieces about men who don't rape women. I want to see entire episodes of television in which a man does not rape a woman, or attempt to rape a woman. I would like rape-free TV this summer.

But, as Louis CK says, "…we're like, 'We can hit them!' And then we did the whole thing."

 

The Postfeminist Gift of Gwen Stacy, or Gwen Stacy is SOME PIG!

When my 4-year-old son asked me if I would take him to see The Amazing Spider-Man 2 (Marc Webb) on Memorial Day, I'll admit that I wasn't even aware the sequel (to the reboot) had been released. I was also unaware that X-Men: Days of Future Past (Bryan Singer) or Captain America: The Winter Soldier (Anthony and Joe Russo) were playing in the same theater. I guess I've lost my taste for super hero films. I used to love them. In fact, when I was 13 years old I became obsessed with Batman (1989, Tim Burton). I had posters and collected trading cards and listened obsessively to the soundtrack:

My interest in films like Tim Burton's Batman and Sam Raimi's Spider-Man (2002) had less to do with their super hero antics (the amazing weapons, the acrobatic fight scenes, the spectacle of urban destruction) and far more to do with the idea of normal people who feel an obligation to act on behalf of others. Because I loved these dark, brooding, almost-noirish heroes, I forgave these films for their lifeless female characters. Or rather, I never thought much about them. I never once identified with Vicki Vale (Kim Basinger) or Mary Jane Watson (Kirsten Dunst). I didn't want to be rescued by Batman (Michael Keaton) or Spider-Man (Tobey Maguire); I wanted to be Batman and Spider-Man and rescue folks myself. Now, it's no secret that super hero movies have a major gender (and race and ethnicity and sexuality) problem. Almost all of the major stars of the super hero franchises are white, heterosexual, cis men. And after a while, the white male fantasies of control and power over a chaotic and inherently evil world were no longer interesting to me. I stopped going to see super hero movies.

Though I did not see the first film in the franchise reboot, The Amazing Spider-Man (2012), I wasn’t expecting to see anything other than 142 (!!!) minutes of CGI fight scenes, smashed cars, franchise-building, and pretty girls who need rescuing — and that’s exactly what I got. Now I don’t want to shit on Gwen Stacy (Emma Stone). She’s adorable. Her outfits are amazing (amazing!).  She’s the valedictorian of her graduating class. Her hair curls in all the right places. She snagged a sweet job (internship?) at Oscorp Industries immediately upon graduating. She wrinkles her little nose when she laughs. She even got into Oxford to study sciency stuff. She also uses her knowledge of high school science to help Spider-Man magnetize his web shooters, a key trick allowing Spider-Man to wrangle with Electro (Jamie Foxx).

All of these character traits and plot points appear, on the surface, to elevate Gwen above the usual superhero girlfriend role. Indeed, when Gwen finally decides to leave New York for England (Oxford! Science stuff!), Spider-Man cribs a move from Charlotte's Web by writing the words "I Love You" in giant web-letters. He then tells Gwen that he will follow her to Oxford. He will follow Gwen anywhere. He will be her trailing Spidey-spouse, doing fixed-term work across the Pond. OMG, swoonsville, right ladies?

Screen Shot 2014-05-27 at 2.50.47 PM

Gwen Stacy is SOME PIG. Image source: http://i.dailymail.co.uk/

But even as someone who knows nothing about the Spider-Man franchise, I know that shit is not going to happen. Spider-Man cannot give up his gift, his "great responsibility," for the love of a woman. He can't be secondary because he's primary. He's the protagonist. And Electro is totally sucking up all of New York City's power, so Spidey basically says "Now I'm gonna tie your ass to this police car with some of my webs. Bye."

This enrages Gwen, who is all "Fuck off, Spider-Man," because she is a modern postfeminist woman (with GIRL POWER!) and she makes her own choices, and no one, not even fucking Spider-Man, is going to tell her what to do. She yells something to this effect and it is adorable but pointless because, as we all know, this is not Gwen's movie. Still, Gwen pulls out some scissors or a Swiss Army knife or something and hacks away at those sticky webs and then shows up at the big showdown between her boyfriend and Electro at some magical place in New York City where all the electricity is kept. Gwen uses her vast knowledge of New York City's power grid (what?) to help Spider-Man destroy Electro and save New York City from a blackout, which is a super dire situation because then planes crash.

Despite Gwen's key contribution to this epic CGI battle, the whole scene felt a lot like the scene at the very end of the film (SPOILER ALERT!) when a little boy, dressed up in a Spider-Man costume, attempts to face off against an Oscorp-generated villain, Rhino (Paul Giamatti). It's admirable and it's adorable (his costume is too big for him!), but ultimately, we breathe a deep sigh of relief when Spider-Man finally appears on the scene, thanks the little boy for his bravery, pats him on the head, then delivers him into the arms of his weeping mama. Gwen is like that little boy: we admire her, she's adorable and brave, but ultimately, she needs to move aside so the real heroes can do their work. Superheroing is a (white) man's game. It is not for women and children. It's not for poor, lonely, invisible Electro either.

This became most apparent in the final battle of the film between Spider-Man and the newly villainized Harry/Green Goblin (Dane DeHaan, looking like a cracked-out, lost member of One Direction). Because although Gwen reminded Spidey about how magnets work and knew how to access New York City's power grid (again, I must ask, how does an 18-year-old who just started working at Oscorp know this?), she is, at the end of the day, just a woman. And a woman's main value in cinema, especially a summer blockbuster reboot of a successful comic book franchise, is in her to-be-looked-at-ness. That is, Gwen's purpose is to be an object of the Gaze: Spidey's gaze, Green Goblin's gaze, and the audience's gaze. Her greatest value and power in the film lies in what she means to Peter Parker/Spider-Man. Gwen's photograph appears all over Peter's bedroom. She is an image to be adored. She is Peter's everything. She is his crime-fighting muse. After they break up, Spider-Man sits atop New York City buildings, stalking watching Gwen going about her day. We watch her too. Her outfits are amazing.

Gwen exists to be looked at and she exists as an object of exchange. Harry/Green Goblin values her only because Peter values her. That is, Gwen’s worth is determined by the men around her. As Gayle Rubin argues in her seminal essay, “The Traffic in Women: Notes on the ‘Political Economy’ of Sex” (1975):

Screen Shot 2014-05-27 at 11.13.54 AM

If Harry possesses Gwen, he can exchange her for something he values, in this case, the blood of Spider-Man (which Harry believes will save his life). Gwen gains nothing in this exchange of her body (other than, she hopes, the opportunity to remain alive) because she is the object, the gift, that the powerful white men toss back and forth like a beautiful little rag doll. In the film’s (almost) final battle scene, Harry, now in full Green Goblin mode, scoops up little Miss Gwen and carries her off to a Dangerous Place. Spider-Man, predictably, chases after his love, intent on both saving her life and stopping Green Goblin.

And so, near the end of The Amazing Spider-Man 2, we find ourselves in a familiar situation: our beautiful damsel, our muse, the gift/ransom exchanged between two men (one selfless, the other selfish), is literally dangling by a string. Here Gwen becomes more valuable than ever because she is now the audience's gift. Because we identify with Spider-Man, the protagonist, Gwen's peril is intended to fill us with the worst kind of dread. If she dies, how will Spider-Man feel? I mean, it's gonna really fuck him up, right? Gwen's fate is the film's climax.

And when that thin cord of webbing snaps and Gwen begins her freefall through one of the many dangerous old warehouses that seem to lurk around every corner in super hero movies, we are meant to feel the weight of her loss. She is falling away from Peter Parker, our proxy, and his loss is our loss. We see the terror in Gwen's eyes (good acting, Emma Stone, seriously) and we know those eyes will haunt poor Spider-Man for months or even years. So much good brooding is ahead! And when Spider-Man cradles Gwen's lifeless body in his arms, seconds after it hits the ground with a sickening crunch, we can only wonder about the impact of this death on our hero.
Here the film offers us a series of flashbacks of Peter's time with Gwen, happy times when they were eating frozen yogurt and standing atop bridges and going to interviews at Oxford in very, very tiny skirts. Each of these flashbacks is shown from Peter's point of view. We see Gwen through his eyes: as he watches her or goes in for an embrace. Because even in her death montage, Gwen has no subjectivity. Her loss can only be registered as Peter's loss. She is forever the object of his longing gazes. We see Peter go to the graveyard season after season, all to the detriment of his important super hero duties. Here again, Gwen's value lies only in how she impacts Spider-Man's ability to be Spider-Man.

Look, I get it. The movie's title is The Amazing Spider-Man 2, not Gwen Remembers How Magnets Work or Gwen Goes to Oxford. Of course every supporting character's role is there to do just that — to support the story of the Amazing Spider-Man. But I suppose it's Gwen's postfeminist accoutrements that leave a sour taste in my mouth. I almost wish Gwen were more helpless and passive, stupider and more frightened. Instead, it's the fact that Gwen is so damn capable that rankles: she's pretty and smart and plucky and brave and has the love of a good man. She is living the postfeminist dream (until she dies, that is!) and for that, she gives the film the appearance of some kind of gender equality. "Look, she helps Spidey! Look, she's pursuing a career!" At the end of the day, these stories belong to the same white men they've always belonged to.

This is how summer blockbusters work. They are not for me. They are for white, heterosexual teenage boys. The rest of us are just there for the ride. We, like poor Gwen, are hanging by that slender thread of webbing, hoping Spider-Man can hold on to us just a few minutes longer before we’re dropped to the ground.