Note: I was asked to give a sermon at the local Unitarian Universalist Church. The audience was mainly folks over 60, and I think they found this talk useful. I hope you will, too.
I’m a Democrat but I come from a long line of Republicans, so it’s always been difficult to hold different political beliefs from my family. But things have never been so taxing as they have been since the election of Donald Trump. Yes, my family and I disagree on many of Trump’s policies and approaches. But more worrying to me, as a scholar of the media, is how difficult it has become to support claims with evidence that will be accepted. I’ll offer an example to illustrate:
Last weekend my family drove north to central Pennsylvania to see my mother. As with most visits, our conversation invariably turned to politics, specifically, a widely reported story about how the US Navy hung a tarp over a warship docked in Japan, where President Trump was giving a speech last week. The warship in question, the USS John McCain, is named after the father and grandfather of the late Senator John McCain. As I’m sure you all know, Donald Trump maintained a contentious relationship with John McCain, a member of his own party, in both life and death. This particular story—the fact that the warship was covered by a tarp during Trump’s visit—was reported in prominent sources including The New York Times, The Washington Post, USA Today, NPR, CNN, NBC News, ABC News, CBS News, and yes, even Fox News. While there is no full consensus on who made this request, all of these news sources cite a statement released by the US Navy’s chief of information confirming that a request was made by the White House “to minimize the visibility of USS John S. McCain.”
My mother, a staunch Republican, agreed that the story was embarrassing, but then told me that her close friend Richard, a strong Trump supporter, disputed the veracity of the story.
“Richard said the reason the boat was covered with a tarp is because it was being repaired.”
“And where did he hear that?”
“Fox News,” she replied, but then, before I could say anything, quickly followed up, “But Fox News is where I heard about the story in the first place.”
In other words, both my mom and Richard, staunch Republicans who watch Fox News regularly, learned about the covered USS John McCain through the same news source, but came away with very different conclusions. I decided to investigate the root of Richard’s story. With a little Googling I found a series of articles on Fox News that mention that the warship’s name was obscured by a tarp and a paint barge, but that this was due to repairs on the ship, not a request from the White House. Faced with dozens of sources reporting that the White House requested the warship be covered and just one source reporting that the ship was simply under repair, my mother threw up her hands and concluded: how can we ever know the truth?
This question troubled me because my mother reads the local newspaper every morning, and the New York Times on Sundays, and is generally aware of both national and international current events. Of all the people in her educated Boomer demographic, she should know where and how to find reliable and consistent information about the world. So what happened? The first problem my mother faced, and which so many Americans are facing right now, is the misguided belief that there are two sides to every story, and that, at the end of the day, it is simply a matter of one’s opinion. The second problem my mother faced is a deficit in media literacy. While my mom now knows how to minimize a window, print a PDF, and share articles on Facebook, she is less aware of how information functions in today’s media environment. In this respect, my mother is like the great majority of us, not just the Boomers, who consume and share content in an ever-shifting online environment. How does content circulate online and how do we know which sources to trust? This morning, I’d like to unpack these two causes of fake news and the spread of misinformation.
One of the most lauded virtues of American society is the idea of free speech and that a plurality of voices is always preferable to the restriction of some in favor of others, hence the appeal to hearing “both sides” of an argument. This feels objectively true and logical, but like all things in life, functions quite differently in different scenarios. Both Sides-ism causes problems when we incorrectly decide that all views on a single topic deserve the same amount of consideration. All political issues elicit a range of opinions, depending on who you’re talking to, but these opinions, these “sides,” do not always carry equal weight. I might be pro-choice because I believe women should have sovereignty over their bodies or because I just hate babies. Both are indeed opinions but one carries far more validity (and morality) than the other. The phenomenon of Both Sides-ism, or false balance, occurs when viewpoints are presented in public discourses as having roughly equal weight, when they objectively do not.
In a New York Times column, published in the final delirious months of the 2016 election season, economist Paul Krugman described the phenomenon of Both Sides-ism, this “cult of balance,” as “the almost pathological determination to portray politicians and their programs as being equally good or equally bad, no matter how ludicrous that pretense becomes.”
The consequences of Both Sides-ism are most destructive when all views about personhood are given the same consideration. Take, for example, the recent case, Masterpiece Cakeshop Ltd. v. Colorado Civil Rights Commission, in which a baker refused to make a cake for a same-sex wedding, claiming that to do so conflicted with his religious beliefs. The plaintiffs argued this was discrimination, while Masterpiece Cakeshop argued that the refusal was not discrimination but an exercise of religious freedom. In this way, Both Sides-ism, this false balance in our discourse, works to legitimate opinions which are not legitimate, because equal rights are not up for debate. Another consequence of Both Sides-ism is that, just as it converts opinions into truths, it turns truths into opinions. Because my mother’s friend Richard reported to her that the USS John McCain was covered with a tarp due solely to repairs being made, a story supported by one news source, she felt compelled to give this claim the same weight as the story reported by dozens of news sources, which is that someone in the Trump administration requested that the ship be covered.
We need to be mindful of these slippages and logical fallacies. While there are dozens of opinions on every major political issue, we must always remember that some are more valid than others. And entertaining some of these opinions, like the idea that same-sex marriage is a sin, is damaging. While there is much fake news out there, that does not negate the fact that real, reliable news and facts exist. But how can you determine how to find reliable news? That brings me to my second point…
Scholars who study the media and the way its content is shaped and deployed have long known that much of the news on cable channels like CNN, Fox, and MSNBC, as well as on network and local newscasts, tends to focus on the sensational at the expense of the newsworthy. These scholars have also long known that, in a bid for more eyeballs (and advertising dollars), news headlines, and even news content, can be purposely misleading, incorrect, or yes, even “fake.”
However, in the lead up to, and following, the contentious 2016 U.S. Presidential election, the term “fake news” was deployed more frequently, and in more contexts, than ever before, and not because Americans were suddenly becoming more media literate. The prevalence of the term “fake news” coincided with a rise in conspiracy theories getting traction on social media and then finding their way into public discourse. Take, for example, when White House advisor Kellyanne Conway cited something she called the “Bowling Green Massacre,” an event invented out of whole cloth, as a justification for the Trump administration’s controversial travel ban in February 2017. Our ability to discern truth from lies has changed in relation to the technology we have developed for communicating facts. That is, technology conditions the way we understand the look and sound of reality and truth. Let’s take a quick trip into our technological past to illustrate what I mean.
Before the global spread of the printing press in the sixteenth century, information about anything outside what you could personally verify was simply not available. When we move forward in time to the circulation of newspapers in the seventeenth century, we are able to learn factually verifiable things about the world outside our immediate purview, and we are consuming the same facts as our neighbors reading the same newspaper. There is an implicit trust that what we are reading is, in fact, from a reliable source and that this source would have no motivation for misleading us.
The development and deployment of radio and television for the mass distribution of information in the twentieth century likewise carried an implicit trust, inherited from the media that preceded them (namely print newspapers and the newsreels that ran before cinema screenings). Rather than buying a newspaper or venturing out to the movies to watch a newsreel, consumers could enjoy the radio, and the information it provided, without leaving the comfort or the intimacy of their home. Radio also acted as a democratizing technology: it gave anyone with a radio common access to events and entertainments that only a tiny minority had been able to enjoy previously. In the 1920s and 1930s, the medium of radio made the world feel like a smaller place, when everyone with a radio could listen to the same content being broadcast at the same time.
Just after WWII, when the new medium of television gained a foothold in the American consciousness, it was likewise viewed as a tool for the production of social knowledge, part of a postwar television cultural moment that embraced the medium for its ability to convey realism. The technology of television—its ability to record human behavior and broadcast it live—was linked to a general postwar interest in documenting, analyzing and understanding humanity. By 1960, 90% of American homes had television (compared with 9% in 1950). More and more, Americans were able to consume the same information at the same time, all across the country.
Until the mid-1980s, American television programming was dominated by a small number of broadcast networks: ABC, NBC, CBS, and, from 1986, Fox. Cable television technology existed as early as the 1940s, as a solution to the problem of getting TV signals into mountainous areas. But cable subscriptions rose steadily throughout the 1970s and 1980s as new FCC regulations and policies allowed the technology to grow. What’s important for our discussion of fake news is that cable provided access to targeted audiences, rather than the more general demographic targeted by the networks. Special interests, homogeneous groups, had content made specifically for them.
The development of cable had a massive impact on journalism because, rather than being relegated to the morning or evening news, cable allowed the news to run 24 hours per day. In 1980, Ted Turner launched CNN, which provided “saturation coverage” of world events like the Tiananmen Square protests of 1989 and the first Gulf War. Instantaneous and ongoing coverage of major events as they happen is incredibly useful but also has its drawbacks.
For example, this type of coverage demands an immediate response from politicians, even before they have a chance to inform themselves on developing stories, aka “the CNN effect.” Former Secretary of State James Baker said of the CNN effect: “The one thing it does, is to drive policymakers to have a policy position. I would have to articulate it very quickly. You are in real-time mode. You don’t have time to reflect.” CNN’s saturation coverage also increases pressure on cable news to cover an issue first, leading to errors in reporting and even false and misleading information.
In 1996, Rupert Murdoch launched Fox News as a corrective to the liberal bias he argued was present in American media. The new channel relied on personality-driven commentary by conservative talk radio figures like Sean Hannity, Bill O’Reilly, and Glenn Beck, but Fox framed right-wing positions and coverage as “news,” not “opinion.” Thus, the channel dealt a decisive blow to the boundary between fact and opinion in journalism. This blurring of fact and opinion, along with the drive for 24 hours of content, has led to an onslaught of information of varying levels of utility.
The development and widespread use of the internet and social media platforms like Facebook and Twitter over the last 20 years has profoundly impacted our relationship with facts and truth. Engineer Vannevar Bush first addressed how the creation and publication of research and data was far outpacing the human mind’s ability to organize, locate, and access that information in a timely manner in a 1967 article titled “Memex Revisited.” He was particularly concerned about a problem that plagues us today: information overload. If attention is a finite commodity and information is increasingly boundless, how can we reconcile the two?
Bush’s essay proposes a solution: a hypothetical microfilm viewing machine, a “memex,” which mimics the way the human brain recalls information and makes connections between different concepts and ideas. While most file storage systems of his time were structured like indexes, with categories, subcategories, and hierarchies of information, Bush’s hypothetical memex was structured by association, working much as the human brain does when searching for an answer. Bush foretold the development of the modern search engine, which can process 40,000 queries per second in search of the exact information a user seeks.
We often think of the ways technology shapes the way we think, but Bush’s essay highlights how the ways we think can also shape the structure of technology. Indeed, complex search algorithms, like Google, make information retrieval and organization easier and faster, thereby giving human brains more free time to think and do and make. Bush described this as the “privilege of forgetting.” But when this perspective bumps up against our current experience of the internet, a memex that exceeds Bush’s wildest dreams, we can also see how the privilege of forgetting might also be one source of the current distrust of the news and the rejection of facts and science.
With that in mind, I put together a handout outlining basic tips and tricks for figuring out whether the news you’re consuming can be trusted. We have to hang onto the truth, and find ways to help those around us hang on, too. Please feel free to share it with family and friends (especially your racist uncle): UU Detecting Fake News
 Hunt Allcott and Matthew Gentzkow, “Social Media and Fake News in the 2016 Election,” Journal of Economic Perspectives 31, no. 2 (2017): 211–236.
 Andy Guess, Brendan Nyhan, and Jason Reifler, “Selective Exposure to Misinformation: Evidence from the Consumption of Fake News During the 2016 U.S. Presidential Campaign,” 2018, https://www.dartmouth.edu/~nyhan/fake-news-2016.pdf.
 Samantha Schmidt and Lindsey Bever, “Kellyanne Conway Cites ‘Bowling Green Massacre’ that Never Happened to Defend Travel Ban,” The Washington Post, February 3, 2017.
 David Hendy, “Technologies,” in The Television History Book, ed. Michele Hilmes (London, U.K.: BFI, 2003), 5.
 Vannevar Bush, “Memex Revisited,” in Science Is Not Enough (New York: William Morrow & Co., 1967).
As some visitors to this blog may already know, I curate a true story blog over at tellusastoryblog.com. For our first three years of existence, we aimed to publish one new true story every Wednesday (we took summers and holidays off). But we found that model to be unsustainable so we have just switched to a quarterly format, which we are very excited about! We also redesigned the look and functionality of our site.
Tell Us A Story is proud to announce the publication of our first ever quarterly edition of Tell Us A Story, featuring the best work submitted to us over the last few months. Give it a read and a share! Click here to read volume 4, issue 1.
Look for our next issue in Fall 2016!
Several months ago I published a 2-part guide to the academic job market right here on my blog (for free!!!!!!!!!!), as a way to help other academics explain this bizarre, yearly ritual to family and friends. Indeed, several readers told me that the posts really *did* help them talk to their loved ones about the academic job market (talking about it is the first step!). Yes, I’m working miracles here, folks. And then, this happened:
“A few months ago, as I was sitting down to my morning coffee, several friends – all from very different circles of my life – sent me a link to an article, accompanied by some variation of the question: “Didn’t you already write this?” The article in question had just been published on a popular online publication, one that I read and link to regularly, and has close to 8 million readers.
Usually, when I read something online that’s similar to something I’ve already published on my tiny WordPress blog, I chalk it up to the great intellectual zeitgeist. Because great minds do, usually, think alike, especially when those minds are reading and writing and posting and sharing and tweeting in the same small, specialized online space. I am certain that most of the time, the author in question is not aware of me or my scholarship. It’s a world wide web out there, after all. Why would someone with a successful, paid writing career need to steal content from me, a rinky-dink blogger who gives her writing away for free?
But in this case, the writer in question was familiar with my work. She travels in the same small, specialized online space that I do. She partakes of the same zeitgeist. In fact, she had started following my blog just a few days after I posted the essay that she would later mimic in conceit, tone and even overall structure.
Ethically speaking, idea theft is just as egregious as plagiarism, especially when those ideas are stolen from free sites and appropriated by those who actually make a profit from their online labor.
When pressed on this point, the writer told me that she does read my blog. She even had it listed on her own blog’s (now-defunct) blogroll. But she denied reading my two most recent posts, the posts I accused her of copying. Therefore she refused to link to or cite my blog in her original piece, a piece that generated millions of page views, social media shares, praise and, of course, money, for both her and the publication for which she is a columnist.
So if a writer publishes a piece (and profits from a piece) that is substantially similar to a previously published piece, one which the writer had most certainly heard of, if not read, is this copyright infringement? Has this writer actually done something wrong?”
Well, Christian Exoo and I decided to try to find out. To read our article “Plagiarism, Patchwriting and the Race and Gender Hierarchy of Online Idea Theft” at TruthOut, click HERE.
Academic writing has taken quite a bashing since, well, forever, and that’s not entirely undeserved. Academic writing can be pedantic, jargon-y, solipsistic and self-important. There are endless think pieces, editorials and New Yorker cartoons about the impenetrability of academese. In one such piece, “Why Academics Can’t Write,” Michael Billig explains:
Throughout the social sciences, we can find academics parading their big nouns and their noun-stuffed noun-phrases. By giving something an official name, especially a multi-noun name which can be shortened to an acronym, you can present yourself as having discovered something real—something to impress the inspectors from the Research Excellence Framework.
Yes, the implication here is that academics are always trying to make things — a movie, a poem, themselves and their writing — appear more important than they actually are. These pieces also argue that academics dress simple concepts up in big words in order to exclude those who have not had access to the same education. In “On Writing Well,” Stephen M. Walt argues:
jargon is a way for professional academics to remind ordinary people that they are part of a guild with specialized knowledge that outsiders lack…
This is how we control the perimeters, our critics charge; this is how we guard ourselves from interlopers. But, this explanation seems odd. After all, the point of scholarship — of all those long hours of reading and studying and writing and editing — is to uncover truths, backed by research, and then to educate others. Sometimes we do that in the classroom for our students, of course, but even more significantly, we are supposed to be educating the world with our ideas. That’s especially true of academics (like me) employed by public universities, funded by tax payer dollars. That money, supporting higher education, is to (ideally) allow us to contribute to the world’s knowledge about our specific fields of study.
So if knowledge-sharing is the mission of the scholar, why would so many of us consciously want to create an environment of exclusion around our writing? As Steven Pinker asks in “Why Academics Stink at Writing”:
Why should a profession that trades in words and dedicates itself to the transmission of knowledge so often turn out prose that is turgid, soggy, wooden, bloated, clumsy, obscure, unpleasant to read, and impossible to understand?
Contrary to popular belief, academics don’t *just* write for other academics (that’s what conference presentations are for!). We write believing that what we’re writing has a point and purpose, that it will educate and edify. I’ve never met an academic who has asked for help with making her essay “more difficult to understand.” Now, of course, some academics do use jargon as subterfuge. Walt continues:
But if your prose is muddy and obscure or your arguments are hedged in every conceivable direction, then readers may not be able to figure out what you’re really saying and you can always dodge criticism by claiming to have been misunderstood…Bad writing thus becomes a form of academic camouflage designed to shield the author from criticism.
Walt, Billig, Pinker and everyone else who has, at one time or another, complained that a passage of academese was needlessly difficult to understand are right to be frustrated. I’ve made the same complaints myself. However, this generalized dismissal of “academese,” of dense, often-jargony prose that is nuanced, reflexive and even self-effacing, is, I’m afraid, just another weapon in the arsenal of those who believe that higher education is populated with up-tight, boring, useless pedants who just talk and write out of some masturbatory infatuation with their own intelligence. The inherent distrust of scholarly language is, at its heart, a dismissal of academia itself.
Now I’ll be the first to agree that higher education is currently crippled by a series of interrelated and devastating problems — the adjunctification and devaluation of teachers, the overproduction of PhDs, tuition hikes, endless assessment bullshit, the inflation of middle management (aka, the rise of the “ass deans”), MOOCs, racism, sexism, homophobia, ableism, ageism, it’s ALL there, people — but academese is the least egregious of these problems, don’t you think? Academese — that slow nuanced ponderous way of seeing the world — we are told, is a symptom of academia’s pretensions. But I think it’s one of our only saving graces.
The work I do is nuanced and specific. It requires hours of reading and thinking before a single word is typed. This work is boring at times — at times even dreadful — but it’s necessary for quality scholarship and sound arguments. Because once you start to research an idea — and I mean really research, beyond the first page of Google search results — you find that the ideas you had, those wonderful, catchy epiphanies that might make for a great headline or tweet, are not nearly as sound as you assumed. And so you go back, armed with the new knowledge you just gleaned, and adjust your original claim. Then you think some more and revise. It is slow work, but it’s necessary work. The fastest work I do is the writing for this blog, which I see as a space of discovery and intellectual growth. I try not to make grand claims for this blog, mostly for that reason.
The problem, then, with academic writing is that its core — the creation of careful, accurate ideas about the world — is born of research and revision and, most important of all, time. Time is needed. But our world is increasingly regulated by the ethic of the instant. We are losing our patience. We need content that comes quickly and often, content that can be read during a short morning commute or a long dump (sorry for the vulgarity, Ma), content that can be tweeted and retweeted and Tumblred and bit-lyed. And that content is great. It’s filled with interesting and dynamic ideas. But this content cannot replace the deep structures of thought that come from research and revision and time.
Let me show you what I mean by way of example:
Stanley has already taken quite a drubbing for this piece (and deservedly so), so I won’t add to the pile-on. But I do want to point out that had this profile been written by someone with a background in race and gender studies, not to mention the history of racial and gendered representation in television, this profile would have turned out very differently. I’m not saying that Stanley needed a PhD to properly write this piece; what I’m saying is: the woman needed to do her research. As Tressie McMillan Cottom explains:
Here’s the thing with using a stereotype to analyze counter hegemonic discourses. If you use the trope to critique race instead of critiquing racism, no matter what you say next the story is about the stereotype. That’s the entire purpose of stereotypes. They are convenient, if lazy, vehicles of communication. The “angry black woman” traffics in a specific history of oppression, violence and erasure just like the “spicy Latina” and “smart Asian”. They are effective because they work. They conjure immediate maps of cognitive interpretation. When you’re pressed for space or time or simply disinclined to engage complexities, stereotypes are hard to resist. They deliver the sensory perception of understanding while obfuscating. That’s their power and, when the stereotype is about you, their peril.
Wanna guess why Cottom’s perspective on this is so nuanced and careful? Because she studies this shit. Imagine that: knowing what you’re talking about before you hit “publish.”
Or how about this recent piece on the “rise” of black British actors in America?
Carter’s profile of black British actors in Hollywood does a great job of repeating everything said by her interview subjects but is completely lacking in an analysis of the complicated and fraught history of black American actors in Hollywood. And that perspective is very, very necessary for an essay claiming to be about “The Rise of the Black British Actor in America.” So what is someone like Carter to do? Well, she could start by changing the title of her essay to “Black British Actors Discuss Working in Hollywood.” Don’t make claims that you can’t fulfill. Because you see, in academia, “The Rise of the Black British Actor in America” would actually be a book-length project. It would require months, if not years, of careful research, writing, and revision. One simply cannot write about hard-working black British actors in Hollywood without mentioning the ridiculous dearth of good Hollywood roles for people of color. As Tambay A. Obenson rightly points out in his response to the piece:
Unless there’s a genuine collective will to get underneath the surface of it all, instead of just bulletin board-style engagement. There’s so much to unpack here, and if a conversation about the so-called “rise in black British actors in America” is to be had, a rather one-sided, short-sighted Buzzfeed piece doesn’t do much to inspire. It only further progresses previous theories that ultimately cause division within the diaspora.
But the internet has created the scholarship of the pastless present, where a subject’s history can be summed up in the last thinkpiece that was published about it, which was last week. And last week is, of course, ancient history. Quick and dirty analyses of entire decades, entire industries, entire races and genders, are generally easy and even enjoyable to read (simplicity is bliss!), and they often contain (some) good information. But many of them make claims they can’t support. They write checks their asses can’t cash. But you know who CAN cash those checks? Academics. In fact, those are some of the only checks we ever get to cash.
Academese can answer those broad questions, with actual facts and research and entire knowledge trajectories. As Obenson adds:
But the Buzzfeed piece is so bereft of essential data, that it’s tough to take it entirely seriously. If the attempt is to have a conversation about the central matter that the article seems to want to inform its readers on, it fails. There’s a far more comprehensive discussion to be had here.
A far more comprehensive discussion is exactly what academics have been trained to do. We’re good at it! Indeed, Obenson has yet to write a full response to the Buzzfeed piece because, wait for it, he has to do his research first: “But a black British invasion, there is not. I will take a look at this further, using actual data, after I complete my research of all roles given to black actors in American productions, over the last 5 years.” Now, look, I’m not shitting all over Carter or anyone else who has ever had to publish on a deadline in order to collect a paycheck. I understand that this is how online publishing often works. And Carter did a great job interviewing her subjects. It’s a thorough piece that will certainly influence Buzzfeed readers to go see Selma (Ava DuVernay, 2014). But it is not about the rise of the black British actor in America. It is an ad for Selma.
Now don’t get me wrong, I’m not calling for an end to short, pithy, generalized articles on the internet. I love those bite-sized spurts of knowledge. I may be well-versed in film and media (and really then, only my own small corner of it) but the rest of my understanding of what’s happening in the world of war and vaccines and space travel and Kim Kardashian comes from what I can read in 5 minute intervals while waiting for the pharmacist to fill my prescription. My working mom brain, frankly, can’t handle too much more than that. And that is how it should be; none among us can be an expert in everything, or even in more than a few things.
But here’s what I’m saying: we need to recognize that there is a difference between a 100,000 word academic book and a 1500 word thinkpiece. They have different purposes and functions and audiences. We need to understand the conditions under which claims can be made and what facts are necessary to support them. That’s why articles are peer-reviewed and book monographs are carefully vetted before publication. Writers who are not experts can pick up these documents and read them and then…cite them! In academia we call this “scholarship.”
No, academic articles rarely yield snappy titles. They’re hard to summarize. Seriously, the next time you see an academic, corner them and ask them to summarize their latest research project in 140 characters — I dare you. But trust me, people — you don’t want to call for an end to academese. Because without detailed, nuanced, reflexive, overly-cited, and yes, even hedging writing, there can be no progress in thought. There can be no true thinkpieces. Without academese, everything is what the author says it is, an opinion tethered to air, a viral simulacrum of knowledge.
Sometimes I try to write creative non-fiction. Luckily, the good folks at Word Riot, a site I greatly admire, thought this was acceptable for publication in their December 2014 issue. I’m super honored and would love if you’d read it. It’s about my idol, Diane Rehm.
The link to “Diane” is here.
Here are some recent news stories about women:
In Afghanistan, a 3-year-old girl was snatched from her front yard, where she was playing with friends, and raped in her neighbor’s garden by an 18-year-old man. The rapist then tried, unsuccessfully, to kill the child. Currently this little girl is in intensive care in Kabul, fighting for her life. But even if this little girl survives this horrifying experience, her parents tell the reporter, she will carry the shame and stigma of being raped for the rest of her life. The parents hope to bring the rapist to court, but as they are poor, they are certain their family will not receive justice. The child’s mother and grandmother have threatened to commit suicide in protest.
In Egypt, Raslan Fadl, a doctor who routinely performs genital mutilation surgery on women, was acquitted of manslaughter charges. Dr. Fadl performed the controversial surgery on 12-year-old Sohair al-Bata’a in June 2013, and she later died from complications stemming from the procedure. According to The Guardian, “No reason was given by the judge, with the verdict being simply scrawled in a court ledger, rather than being announced in the Agga courtroom.”
Washed-up rapper Eminem (né Marshall Mathers) leaked portions of his new song, “Vegas,” in which he addresses Iggy Azalea (singer and appropriator of racial signifiers) thusly:
“Unless you’re Nicki
grab you by the wrist let’s ski
so what’s it gon be
put that shit away Iggy
You don’t wanna blow that rape whistle on me”
Azalea’s response was, naturally, disgust and a yawn:
This story was followed, finally, by a story on the growing sexual assault allegations against Bill Cosby. Cosby has been plagued by rumors of sexual misconduct for decades. However, a series of recent events, including Cosby’s ill-conceived idea to invite fans to “meme” him and Hannibal Buress’ recent stand up bit about the star, brought the issue back into the national spotlight. As Roxane Gay succinctly notes “There is a popular and precious fantasy that abounds, that women are largely conspiring to take men down with accusations of rape, as if there is some kind of benefit to publicly outing oneself as a rape victim. This fantasy becomes even more elaborate when a famous and/or wealthy man is involved. These women are out to get that man. They want his money. They want attention. It’s easier to indulge this fantasy than it is to face the truth that sometimes, the people we admire and think we know, are capable of terrible things.”
I cite these horrific stories happening all over the world, to women of all ages, races, and class backgrounds, because they are all things that happen to women because they are women. These are all crimes in which women’s bodies are seen as objects for men to take and use as they wish simply because they can. The little girl in Afghanistan was raped because she has a vagina and because she is too small to defend herself. Cosby’s alleged victims were raped because they have vaginas and because they were naive enough to assume that their boss — the humanitarian, the art collector, the seller of pudding pops — would not drug them. And Iggy Azalea, bless her confused little heart, makes a great point: why is it that when men disagree with women, their first threat is one of sexual assault? Why doesn’t Eminem write lyrics about how Azalea is profiting off of another culture or that her music sucks? Because those critiques have nothing to do with Azalea’s vagina. If you want to disempower or threaten or traumatize a woman, you have to remind her she is, at the end of the day, nothing more than a vagina that can be invaded, pillaged and emptied into.
But you know this, don’t you, readers? Why am I reminding you of the fragile space women (and especially women of color) occupy in this world, of the delicate tightrope we walk between arousing the respect of our male peers and arousing their desires to violate our vaginas? Because of International Men’s Day.
“There’s an International Men’s Day?” you’re asking yourself right now, “What does that entail?” Great question, hypothetical reader. This is from their official website:
“Objectives of International Men’s Day include a focus on men’s and boy’s health, improving gender relations, promoting gender equality, and highlighting positive male role models. It is an occasion for men to celebrate their achievements and contributions, in particular their contributions to community, family, marriage, and child care while highlighting the discrimination against them.”
When I opened up my Twitter feed on Wednesday, I noticed the #InternationalMensDay hashtag popping up in my feed now and then, mostly because my friend, Will Brooker, was engaging many of the men using the hashtag in conversations about the meaning of the day and its possible ramifications.
Now, I’m no troll (and neither is Will, by the way). Yes, I like to talk shit and I have been known to bust my friends’ chops for my own amusement (something I’ve written about in the past), but generally, I do not spend my time, in real life or on the internet, looking for a fight. But International Men’s Day struck me as so ill-conceived, so offensive, that I couldn’t help myself.
Within minutes I had several irate IMD supporters in my mentions:
These men were outraged that I could so callously dismiss the very real problems men had to deal with on a day-to-day basis:
Yes, apparently International Men’s Day is needed because all of the feminists are sitting around cackling about the high rates of male suicide, or the fact that more men die on the job than women, or that more men are homeless than women. And since women have their own day on March 8th — and African Americans get the whole month of February! — then why can’t men have their own day, too? After all, men are people, right? Of course they are. But that’s not the point.
As a Huffington Post editorial put it:
“The problem with the IMD idea is that men’s vulnerabilities are not clearly and consistently put into the context of gender inequality and the ongoing oppression of women. For example, a review of homicide data shows that where homicide rates against men are high, violence against women by male partners is also high (and female deaths by homicides more likely to happen). Or, for example, men face particular health problems because we teach boys to be powerful men by suppressing a range of feelings, by engaging in risk-taking behaviors, by teaching them to fight and never back down, by saying that asking for help is for sissies — that is, the values of manhood celebrated in male-dominated societies come with real costs to men ourselves.”
Yes, the problem with IMD is that the real problems faced by men are not the direct result of the fact that they are men. Let me offer a personal example here to explain what I mean. I am a white, upper-middle-class, high-achieving woman. According to studies, I am more likely to develop an eating disorder than other women. And eating disorders are very much tied to gender in that women face more pressure to be thin than men do. But does that mean there should be an entire day for white, upper-middle-class, high-achieving women in order to bring awareness to the fact that we are more likely to acquire an eating disorder than others? No. Because the point of having a “day” or a “month” devoted to a particular group of people is to shed light on the unique challenges they face and the achievements they’ve made because otherwise society would not take notice of these challenges and achievements. Let me say that again: because otherwise society would not take notice of these challenges and achievements.
We do not need an International White, Upper-Middle-Class, High-Achieving Woman Day because I see plenty of recognition of the challenges and achievements of my life; in the representation game, white women fall just behind white men in the amount of representation we get in the news and in popular culture. Likewise, we do not need an International Men’s Day because, really, every day is men’s day. Every. Single. Day.
As more and more angry replies began to fill up my Twitter feed, I knew I should abandon ship. I would never convince these men that they do not need a day devoted to men’s issues since “men’s issues,” in our culture, are simply “issues.” But I couldn’t help myself. These men were so aggrieved, so very hurt that I could not see how they were victims, suffering in a world of rampant misandry:
I realize that giving an oppressed group of people their own day or month is a pretty pointless gesture. It could even be argued that these days serve to further marginalize groups by cordoning off their needs, their history, their lives, from the rest of the world. Still, after #gamergate, and after Time magazine readers voted to ban the word “feminism” — to name two recent public attacks against women — it’s hard for me not to see International Men’s Day as a tit-for-tat attack on women, and on feminists in particular.
So yeah, I realize that by trolling the #InternationalMensDay hashtag I did little to promote the cause of feminism or to educate these men about why IMD might be problematic. But I didn’t do it to educate anyone or to promote a cause. I did it, you see, because sometimes in the face of absurdity, our only choice is to cloak ourselves in sarcasm and great big mugs of mascara-flavored bitch tears.
You may have heard that Twin Peaks, the beloved cult television show of my adolescence, is getting a third season on Showtime. That won’t happen until 2016. In the meantime, I’m going to quietly weep about it. Why am I blue? I explain over at Antenna and talk about it with two other fans, Jason Mittell and Dana Och.
Here’s an excerpt:
“I started watching Twin Peaks when ABC aired reruns in the summer of 1990, after some of my friends started discussing this ‘crazy’ show they were watching about a murdered prom queen. During the prom queen’s funeral her stricken father throws himself on top of her coffin, causing it to lurch up and down. The scene goes on and on, then fades to black.
I started watching based on that anecdote alone and was immediately hooked. Twin Peaks was violent, sexual, funny and sad, all at the same time – I was 13 and I kept waiting for some adult to come in the room and tell me to stop watching it. My Twin Peaks fandom felt intimate, and, most importantly, very illicit.
One month before I turned 14, Lynch’s daughter published The Secret Diary of Laura Palmer, a paratext meant to fill in key plot holes and offer additional clues about Laura’s murder. But really, it was like an X-rated Are You There God? It’s Me, Margaret. The book was far smuttier than the show and my friends and I studied it like the Talmud. That book, coupled with Angelo Badalamenti’s soundtrack, which I played on repeat on my tapedeck, created my first true immersive TV experience.”
Read the whole thing here.