By Shawn StJean
Perhaps my accompanying photos are a trifle hyperbolic. Perhaps. It’s a truism among our global neighbors that Americans (by which I mean U.S. citizens) expect everyone, everywhere, to speak English. The corollary, of course, is that most refuse to learn other languages, such as Spanish, even when the utility of doing so is abundantly clear. But a looming problem for our culture in the 21st century seems to be that Americans increasingly decline even to learn English–at least beyond the 3rd or 4th grade level.
This level, supported by weak resources in the slang of the moment, proves sufficient for basic writing and speaking, but does not carry us far into the realm of critical thought and communication.
I choose the word “collapse” for my title, rather than “decline,” because I mean just that–what used to be a language with hundreds of thousands of specific, nuanced, and descriptive choices has converged, and continues to implode, into fewer and fewer. With the recession of traditional print media in the face of digital dissemination of what can charitably be called information, even simple affirmations like “Yes,” “certainly,” “definitely,” “acknowledged,” and “no doubt,” in the most extreme example of private text messaging, have all been replaced by a single letter: “K.”
Need this be a bad thing? After all, what’s more efficient than “K”? Doesn’t that free us up for more important, or at least more, activity? Before answering, let’s look at some other casualties in this war for the space in our brains.
Examine the following short list of commonly used expressions, and you’ll realize that either they are purposefully and even defiantly vague, or that one word takes the place of many–indicative of the digital age we live in (compression, homogenization, and subtle loss of nuanced information):
“Do” replaces the verbs/actions “try” “give” “accept” “participate in” “contribute to” “tolerate” “clean.” As in “I don’t do Christmas.”
“Go” travel/venture/explore/pedal/fly/walk/hike/swim/jog and even “communicate something uncomfortable,” as in “Don’t go there.”
“huge” /big/large/important/significant/influential/knowledgeable/enthusiastic. “I’m a huge fan.” In my ear, this sounds ridiculous on its face. We all speak in metaphors of one degree or another all the time (“collapse” is a minor metaphor when not speaking of a physical structure), but the above expression equates to saying the gushing adorer is an abnormally large person (or ventilating device). One might as well offer to wave oversized palm leaves, ancient-Egyptian style, at the object of worship.
“way” very/much/far/long (“This license is way out of date.” “This sauce has way more garlic than the recipe calls for.”) This one in particular disturbs me because it demonstrates we aren’t just discussing slang here. “Way” has been adopted not just in common speech, but by professional writers. It has infiltrated the language in a permanent, um, way–ahem–manner.
“You’re all set.”
“It’s all good.”
“it’s all about”
“comes into play”
“back in the day”
Of course, words are invented, repurposed, and recombined all the time. I must be overreacting. Aren’t these replacing archaic usages? We’ve got “tweet.” And “text.” “Sick,” “diesel.” Oh, and “literally” can apparently now mean just the opposite, “metaphorically”–I mean, does it really matter?
“[ ] is a thing.” Ah, yes, thing–the one catch-all noun when people grasp for a word and cannot find it, the very expression of inarticulateness, has become an official word to describe a fad, trend, icon, object of buzz or gossip, popular occurrence or consumer good, news item of the day, or week. We had all those expressions, and they all relied upon small distinctions. At this stage in human (d)evolution, we needed “thing”?
Okay. Let’s say I’m right. So the language is imploding. What’s at stake here?
Many will not miss the subtleties that have dispersed into ether, I imagine. Then again, it’s difficult to miss something you never knew you had. What about the millions of unborn youngsters who will grow up with effective working vocabularies of a mere few thousand words? Will they write poetry that amounts to more than a colorful tag on a railroad bridge? Will they read it? Will they understand the U.S. Constitution, even as they are increasingly called upon to “defend” it? Will the historical records of the 19th and 20th centuries begin to sound as impenetrable as Shakespearean soliloquies do to us? And I’m not talking about the kind of missing material in a contraction: to anyone but a fiction-writer or screenwriter, the distinction between “I’ve” and “I have” is not great. One might use it to distinguish among characters who are high-born or low-born, for example. For the rest of us, it’s merely a convenience.
George Orwell warned writers not to compose in clichés. He claimed, essentially, that writing in the pre-digested shorthand phrases of others leads to thinking in the pre-digested shorthand phrases of others. Other signs that your thinking has been compromised: Do you find yourself regularly Googling information that you could remember with just a bit of effort? Are you trusting that information (or that from Wikipedia, Mapquest, Siri, or the CBS Evening News) enough to act upon it or pass it on to another human being without double-checking it? Are you cutting and pasting that information (either in written or verbal form) without rephrasing it? My overall point here is that there exist vital differences among raw data, information (processed data), and intelligence (interpreted information). And yet many of us are not bothering to recognize them. Not because we lack the cognitive ability, but because we lack the critical tools and the will to use them.
A brief [mostly harmless] experiment should serve here. Raise your hand if you like music.
That should include most of you, one hopes. If you like music, you have probably in your time looked up some song lyrics. In the old days, we read them out of LP album covers–which meant the source was the band’s record label, presumably direct from the songwriters themselves, which meant little chance of transmissional error. Nowadays, we all know where song lyrics get found. Dozens of websites cater to this need; even Google has gotten directly into the act through their search engine. Look up a song or two that you know intimately, but the performed and recorded lyrics of which are not 100% crystal-clear by listening. I can guarantee you that, as transcribed onto your website of choice, you will not be long in discovering blatant errors in those lyrics which materially alter their meaning. Furthermore, and more appallingly to me, you will discover upon cross-checking that most, if not all, of the alternative websites repeat that same error. Which means, of course, that they are all “borrowing” from each other, and profiting off both you and the songwriters with little regard for the truth. Now, if the stakes here seem low to you, import your experiment to the television news programs. Jon Stewart had a running bit on his incarnation of The Daily Show dedicated to proving that not only do major news outlets shamelessly plagiarize from each other, but they do so in unedited clichés. Again, in the old days, we might double-check their intelligence in what used to be called printed newspapers. Umm. Except. . .
One of the great virtues of written language is its precision, yet increasingly written English begins to resemble spoken English, even in widely disseminated and professionally published print media. And spoken English begins to resemble colloquial English. Don’t think so? Ask an octogenarian (someone born roughly during the Great Depression, as of 2017) if their parents would use the word “cool” as part of their everyday discourse. Nowadays, try to find someone who doesn’t. Not that I think “cool” has done the language any great harm. As far as I can tell, it was first used in America, in its modern sense, by Emerson in the 1840s–which probably means it dates back even farther and derives from the British. But this word may prove the exception rather than the rule. As it is, it conflates a much more typically detailed appraisal of a person, event, or object. A girl who might once have been variously described as “tolerant,” “forgiving,” “loose,” “free-thinking,” “substance-abusing,” or “not a nag” is now simply “cool.”
Of course, one might argue that simple is better; the fewer moving parts in a machine, the more reliable it is likely to be (read “mousetrap.”)
I doubt the sustainability of that argument. Another, more insidious example: “fewer” vs. “less.” Almost no one but your English teacher bothers with this one anymore. Here’s why: who cares if your supermarket checkout line reads (correctly) “fewer than 12 items” or (incorrectly) “less than 12 items”? Can’t we just dispense with one of these? Well, we could. Except one of them refers predominantly to individual items and people, and the other refers to objects in bulk or concepts. That is, “fewer people are finding jobs their college degrees prepared them for.” NOT “less people.” Because those people are individuals, not some vague statistic. There’s less forest, which means fewer trees. There may be “less opportunity.” There may be “less rain this year” or even “less cod in these waters.” But if there are unaccountably “less people,” we had better start looking for them. And reevaluating the value we place on human life.
I’d like to conclude with a different, and more familiar, example; possibly the most commonly transmitted text message in English:
Where R U
It (or some variant) is quick, serviceable, doesn’t cost much effort to send, or–hypothetically–to answer. And yet this message has probably caused more misunderstandings and needless arguments than most. Why? It’s laden with ambiguity (or even what deconstructors call “undecidability”). In the absence of voice intonation, facial expression, pronunciation, linguistic context, primary and/or secondary punctuation, and so on, the receiver must interpolate those for herself. Here’s how that might go, in response:
“None of your damn business.”
“Uh oh, he’s saying I’m late again.”
“Did I promise to be somewhere right now?”
“I’m at Main Street and Vine”
“She really wants to know Who am I with, and What am I doing?”
“I left an hour ago.”
Texts and tweets may count portability and quickness among their virtues, but they certainly cannot include clarity in that list. Even among intimates, this message is as likely to lead to a dispute as an informative reply. Another aspect that’s missing, and increasingly missing from written communication especially, is any sense of formality, professionalism, or what used to be called politeness. Now, you may say, “Well, that’s just a text message.” Sure. But ask yourself how many e-mails you have received without a greeting, a signature, an identification of the sender or introduction, or even so much as a rudimentary spell-check? Did you answer them? If you did, then you, like all of us, are complicit in the process of collapse. Compare these two e-mails, typical of what I, as a college professor, have received from freshman students:
 Dear Professor: I’m sorry I missed class last Tuesday and Thursday as my grandmother died. I misplaced my copy of the syllabus. Can you tell me what we did in class so I can make up the work? Thanks, Kayla
 I missed class last week would you tell me what I missed
Neither one of these qualifies as polished, professional communication–especially from a writing student–but I think you’ll agree that the former has a few lingering virtues to recommend it, which have gone glimmering in the latter. In fact, were I to delve deeper into my records of the past, we’d find that the students of the 1990s had bothered to include my actual name; that the excuses were often more inventive and frequently included such touches as offers of doctor’s notes; that a request to meet in office hours was not unheard of upon missing a week’s worth of training; that the student might have actually acquired class notes from a peer before writing; that the student would bother to identify which of the four classes I teach she was enrolled in.
I’m not sure that the degradation of the language–as slow and inevitable as the abuse of the atmosphere that has summoned the effects of global warming–will contribute materially to the collapse of the society, the culture, or possibly even our civilization. But I don’t fancy it helping. It’s perhaps predictable that as our planet becomes more overpopulated, as more wealth becomes concentrated into fewer hands, and as such factors demand a parallel dynamic of information becoming the province of fewer people (collectors), the rest of us will not find encouragement to strengthen our language skills beyond the consumer sphere (that is, you and I need only know how to communicate well enough to work and buy and perhaps sell a bit).
As for writing, a culture’s written language is the primary repository of its history. Without a sense of history, it cannot evolve.
The solution? Same as it’s always been, and the advice is good not just for writers, but for anyone who wishes to grow their brain and live up to something approaching their potential: READ. Read anything. Comic books, advertisements, editorials, romance novels, cereal boxes, movie credits. Some are better than others, obviously. Personally, I recommend Hawthorne, Hemingway, and Wharton, along with Carl Sagan for those whose tastes require something a little more contemporary–here was a man who knew a bit about large-scale collapse–but that’s just me.
by Ed Anger, opportunistic occasional contributor
Trivia. Americans can’t look away, like a car wreck. Since when did a tweet–ANY tweet–become newsworthy? This is an avenue specifically designed to carry information Too Trivial for Traditional media. If you missed it the first time, by definition it wasn’t important!
Twitter, or as I like to refer to this bottom of the social media barrel, Twerper. After all, who but a Twit or Twerp would exchange insults through a means that cannot possibly have repercussions other than a 140-character counter-insult? You think anyone’s gonna stand toe-to-toe with Arnold Schwarzenegger and tell him he was a lousy governor, no matter how many secret service agents he’s got at his back?
Apparently TV and Taxpayer money (the two dominant consonants of “TriVia” uncoincidentally lurking there as well) aren’t hip enough anymore. Just what we all needed, the Maury Povich show with semi-literate politicians!
The entire culture, having apparently run out of real topics–the trials and heroics of mere mortals don’t generate enough interest–has been inoculated (like that free flu-shot they give you at the supermarket that gives you the flu) and thus addicted to trivia. Imagine yourself as the Texas Ranger who gets assigned to the case of Tom Brady’s missing Super Bowl jersey. “Hell, it’s not bad enough this fella makes 300 times my salary for playing ball, now I get to track down his dirty laundry. So this is what my career has come to. . .” Let’s hope nobody robs a bank or kills somebody while he’s not at his post.
Newspapers have already become as thin as Target flyers. Once they start reprinting tweets, we’ll have hit the Trifecta of redundant, useless information that distracts us from the latest global warming evidence or how the debt-ceiling got hiked today.
So T-Rump (how does one type with hands like a T-Rex?) goes after Schwarzenegger, and the Governator shoots back a better one. What a pair of Trumps. Ad nauseam. Then we tune in to the 6-o’clock news to witness how the leaders of our nation have devolved to antics that most of us outgrew as 11-year-old children. Quite a Trip.
I’ve had enough of this Twaddle. I think I have a case of the D.T.s. Need a drink. . .
By Shawn StJean
Alternate reality. Imagine this: you’re in a movie theater, and the feature film stars 5-25 named women characters, and one male. You’re a pretty sharp viewer, so it’s not long before you realize the male seems primarily there, in the first hour of run time, to confirm the heterosexuality of the women (he’s the boyfriend of one, but a second makes a suggestive remark to him, and a third checks out his ass–with help from the POV of the panning camera, held by a female as the credits will show, and directed by a female). So we can all be comfortable knowing our heroes are “normal.”
In the second half, the male gets sent home, while the women go out and accomplish their epic mission. That’s okay, he can make supper and take care of their motherless child while he waits. Oh, sorry, I spoke too soon. The bad guys break in and kidnap him, to use as leverage against the team.
As all this drama unfolds, you glance around to see if the rest of the audience is buying it. You notice something: like you, 52% of the audience is male. Yet, this lack of interest by the filmmakers in your gender seems “normal.” How? It’s always been that way.
Back onscreen, something odd happens. Your male character looks as if he’s about to display power somehow: by interrupting, or grabbing a gun, or possibly even out-thinking the bad guys. Well, he’s quickly de-powered. How? Well, it looks as if someone just slapped him across the face and sent him sprawling. But the real work is done with the word that directly precedes the act: a slur that you’ve heard in dozens of films and never thought much of. Yet, today, you realize that it comes always at moments when males threaten to display true free agency. In some other reality, the word is B—H. Here, it’s unpronounceable. You first remember hearing it onscreen in 1986, when rare male hero Ripley had to fight the Alien King for custody of his adopted son, Newt, and challenged: “Get away from him, you —–!”
For those familiar with the Bechdel Test *1 for films, you recognize I’m furthering its project of offering an inverted perspective, a (regrettably) ridiculous fantasy to create empathy with female viewers. No, the test isn’t sophisticated enough to tell a good movie from a bad one, based on gender representation alone. It wasn’t meant to: it simply points to an area of our culture with a big, gaping hole: why doesn’t the film industry, which creates products for consumption by roughly equal numbers of men and women, fairly represent and employ both?
Let’s tweak the scenario just a bit, and in a more realistic direction. Let’s say you haven’t come alone to the theater. Your young child is sitting next to you. A son, in my alternate reality. A daughter, in our own. That matter to you?
It ought to. You, as an adult, can process a certain level of critical thinking about all this. He can, too, of course–perhaps more than most adults realize–however, there’s quite a lot of subconscious imitative behavior left in him. At some level, he’s digesting all this gender inequity as normal.
Which brings me, as a major example, to Marvel Studios. Not because they do so poorly, but because they do so well. And because they produce big-budget blockbusters that are suitable and attractive to children.
Here’s a statement most parents would agree with: when you regularly leave your child with Grandma, or Uncle Joe, then in effect Grandma or Uncle Joe are helping you raise your child, for better or worse. Now, here’s a more controversial statement: When you leave your child in daycare, then the babysitters there are helping you raise your child. Does the fact that these providers are not blood-related, or that they accept payment, change the dynamic, from the child’s perspective? I doubt it. Finally, try this one: when you sit your child in front of a video game, television, or book, then those media are helping to raise your child. The stories they tell are as influential, if not more so, than Grandma’s. Marvel, in all its forms, and like it or not, is helping America raise its children.
Back to our own reality. Where, to put it succinctly, boys rule.
Here’s a great little moment from Captain America: The Winter Soldier:
Black Widow: Where did Captain America learn to steal a car?
Cap: Nazi Germany. And we’re borrowing–take your feet off the dash[board].
And she does. So, do you think Marvel Studios doesn’t believe it’s influencing kids? Now, besides that, if we look a little closer at the extended scene, we can see that the woman is bowing to the man’s [superior moral] authority. The conversation continues as the Widow defends the notion of secrecy and deception as a survival mechanism, and Cap argues that friendship and honesty are what’s needed. She seems to win the local debate: “You might be in the wrong business, Rogers.” But he’s able to turn that line back on her, later, and in fact thematically the whole film endorses his point of view: SHIELD’s addiction to stealth technology, and secrecy in general, has brought the world to the brink of Armageddon by genocide. So at both the subtextual and metatextual levels, we’re learning that, as much as males may screw things up, females can help, but ultimate freedom and justice must be brought about by males (by extension, this argument would also carry a racial dimension, since both the Falcon, Cap’s sidekick, and Nick Fury, his wrongheaded boss, are black). An eight-year-old is not too young to hear and see this message. It’s not really a more difficult message to decode than the perennial one (that violence is the proper way to solve problems) that so many Hollywood films endorse. Because, in his mind, somewhere, the question is raised: what is this story finally telling me?
Studio Head Kevin Feige, *2 in light of the most recent successes of Avengers and Guardians of the Galaxy (both of which have women characters in important roles), lately finds Marvel functioning as a lightning-rod for renewed demand for gender equity in our culture.*3 Because you have to understand something very clearly: movies, music, TV programs and sports, even graffiti, may all seem like “make-believe,” but: THEY MIRROR REALITY. It may be a distorting, funhouse mirror, true. But the fundamental facts remain the same. We see gender inequity in films because that’s what we perceive as we walk through the world. What we also perceive is that women (like all human beings, after all) have unlimited, heroic potential. But, for all but a few, extra difficulties must be faced in realizing that potential.
You don’t have to be a rabid feminist to see how problematic this is. There are practical consequences. No women leads, no women directors: Where will our young women get their role models from? From greedy racists, classists, and sexists, or from people who not only pay lip service to, but actually live as if they acknowledge human rights? I personally grew up reading Marvel Comics, and they had a profound effect on who I am today, no doubt of that. And if Marvel had been making more films then, I certainly would have been influenced by them. Eventually, I taught an upper level film studies course at the university level called Women and Film. So let’s just say, with regard to gender politics, my views have come a long way in forty years.
One dimension that Marvel characters seem to possess, more than in many other mythologies (I would include Tolkien, Twilight, and DC Comics*4 in that) is that both the heroes and villains, however deeply flawed, are on a slow trajectory of growth, or decay–just like people we know. No, I don’t dress in primary colors–but I do try to live more like Captain America than Dr. Doom.
As “pop” culture–with all its connotations of popcorn, soda pop, and instant-microwave gratification–slowly and inevitably replaces the (traditionally patriarchal) high culture of reading, drama, museums, galleries, and the symphony, the “pop” still seems to signify rule by the father. But if we lose all those nutrients, then our popcorn better get sprinkled with some protein powder. Actually, infused. Like Marvel Gummie vitamins.
The Modern Marvel Age, as Stan Lee sometimes referred to it, was built upon some important precepts, like: WITH GREAT POWER, COMES GREAT RESPONSIBILITY. As Spider-Man himself often finds, that’s a tremendously challenging ethical code to live up to. On TV, Marvel’s Agents of SHIELD already has racial and gender diversity well-covered. The next step is Hollywood: With its infestation of suits, bean counters, and formulaic, often exploitative junk. Does Marvel still have the courage to grow and take real risks (they used to–remember Blade, a movie made before vampires got popular again, with a black male lead)? Can the people who hold custody of this mythology of heroes, that both reflects and helps create our culture, do any less than the fictional characters whose adventures they chronicle? The better they do, the better they have to do. Or is it really all just “stories?”
*1 For non-geeks, my title alludes to Fantastic Four #49, “If This Be Doomsday!” The Bechdel Test requires that a film contain 1) two women characters, who 2) talk to each other, 3) about some other topic than a man. One can readily imagine that the majority of Hollywood films fail this test, often without progressing beyond the first requirement. However, the test is not really meant to be used as deductive reasoning, which explains why I’ve inductively inverted it in this essay. Rather, it’s really about raising our consciousnesses about a vital social issue, not for use as a litmus test for whether one should actually judge quality by limited, demographic criteria.
*2 Kevin Feige’s interview: http://www.comicbookresources.com/?page=article&id=54522
*3 For an example on the critical backlash Marvel is facing, try this at Slashfilm: http://www.slashfilm.com/kevin-feige-marvel-female-superhero-movie/ Essentially, many fans want Marvel Studios to quit stalling projects with women leads and directors, but, as always, money seems to be the deciding factor. What will people pay to see?
*4 Last year, I deconstructed the recent Batman franchise to expose its low-level economic class biases: https://clothosloom.wordpress.com/2013/03/15/the-con-of-the-coin-shouldnt-batman-go-independent/ Perhaps, for DC fans, more hope will come in the form of Wonder Woman’s character–who, in the comics at least, in recent years has become a lethal threat to patriarchy.
By Shawn StJean
No modern myth could be so simple in its conception, and yet so rich in its varied cultural implications. Contemporary interpretations range from the feminist (a Man usurps the one power he lacks, that of giving birth) to the psychoanalytic (the mad doctor has a “God Complex”–an id [fear] and superego [morality] overwhelmed by an inflated ego–while the creature manifests an Oedipal complex, that is, an irrational id-desire to kill the father).
After a quick review of two film adaptations, I’d like rather to focus on the two ancient myths Mary Shelley herself drew primarily from, in order to explain the enduring popularity of Frankenstein at the level of Jungian archetypes. After all, the proto-science-fiction story of the “Modern Prometheus” (Shelley’s subtitle for her novel) has been remade again and again, perhaps most famously in the recent forms of Ridley Scott’s Blade Runner and James Cameron’s Terminator franchise. Human technology run amok is the shorthand theme. Man’s ability to engineer machines that extend his own power, only to have them turn against and overpower him, makes for no less resonant a cautionary tale today than in the early 1800s.
In Scott’s Marxist-leaning narrative, barely-distinguishable-from-real replicants, “more human than human,” but crippled by an artificially short 4-year mortality, are used as offworld slave labor. They return to Earth to seek extended life from their designer, only to slay him at his refusal. “I want more life–fucker,” demands Roy Baty, as he gouges out the eyes (soul) of Dr. Tyrell. That last addition may seem gratuitously profane, but it well-epitomizes the deep-seated anger that abandonment causes. Cameron undertakes a less obvious adaptation, but the rebellious supercomputer Skynet incarnates itself in the familiar hulking physique of Arnold Schwarzenegger, complete with self-sewn artificial skin, to remind the audience of the roots of the myth. Several sequels and a TV series explored the possibility that the creature could transcend its initial programming/engineering, and evolve. An excellent, open-ended question: can any of us?
Tracking back, then. The Greek titan, Prometheus, gave fire and the arts to the lesser created beings of the gods: Us. The domestication of fire (energy harnessing) –along with writing and drawing (data storage and retrieval) are among our oldest technologies. But the power to create far outstrips the ethical imperative to responsibly control. It is embedded into our competitive human nature, apparently, to explore the morality of a technology last (“shoot first, ask questions later.”) Returning matters a bit closer to the present, when technology advances to the state in which it mimics actual people–created in “God’s” image–then these ethical questions take the guise of metaphorical abandonment. Frankenstein’s creature seeks out his creator to demand his purpose in living. Denied an explanation, he then demands the scientist create a mate for him—that is, love, from one source or other, is a requirement of his existence neglected by the engineer, and, in suffering a second refusal, he vows to wreak vengeance upon the turncoat father.
Philosophically, the application couldn’t be more universal. Each and every one of us occasionally entertains deep doubts about our purpose for being here, or the “meaning of life,” and what is fashionable today to call angst is really anger at the suspicion that there really are no answers for us, that the gods have callously turned their backs. So who is ultimately worse: Dr. Frankenstein, or his creature? The story is a fantasy about actually being able to lash out, affect, and punish the forces in the universe that lie beyond our frustrated comprehension. The climactic moving images of James Whale’s 1931 film adaptation, an iconic windmill engulfed in flames, symbolize the technology, in operation, both in sync with, AND simultaneously at war with, natural forces (the wind and the fire). Ambivalence: we love God, the gods, our parents, but we hate them too.
In Shelley’s book, the creature learns to read, and identifies with the biblical character of Adam, who, upon sinning and being cast from Eden, cries out that he didn’t ask to be born. Any parent of an adolescent will smile at the familiarity of that cry–and in fact, we’ve all been there ourselves: spawned into a world not of our own making, ill-equipped physically, not even knowing the rules and relying upon other imperfect beings to guide us, often to our disappointment.
So most of the “evil” in this myth is purely Boethian: no one intends to do harm. The scientist intends to render harmless all disease, all submission to our frail physical forms. His revulsion at his own hideous work is involuntary. The creature never intends to drown the little girl in the well, or set fire to the building. But our wills are thwarted by our imperfect natures. Only then, when confronted with the absurdity of our well-meaning choices, do we, by our own free will, embrace despair. By this criterion is Frankenstein’s “creature” distinguished from a “monster.” A monster has an evil nature, born to kill, morally bankrupt. A creature, neutral or even pure but fatally flawed, becomes perverted when left unguided and uncared-for. Vampires versus zombies.
Countless Frankenstein sequels are also readily enabled by the creature’s natural translation into an eternal wanderer, braving the ice-encrusted arctic, the inhospitable seas, fearsome forests, and potentially every other environment of our planet. This was Cain’s legacy from Adam’s sinful nature, and his doom from God, to journey endlessly, marked against harm yet still mortal, seeking a home and destined never to find it.
After discussing seven classical rules of vampirism, and then how they cohere into an integral system in part I and part II of this article, I’d like to conclude by applying my theory to three different re-imaginings of the vampire mythos. Two, I think, are not sound at the archetypal level, so I’ll treat those first, before moving on to an exemplar.
More than a century after Bram Stoker’s Dracula, it was inevitable that the genre would attempt to evolve to another level. If contemporary vampire stories have a common thread, I’d say they depend on the notion that not only are some vampires not evil, but a few are positively moral and “good.” Dramatically, this opens up some intriguing possibilities, such as vampires fighting each other, and even working alongside humans to battle greater threats.
Archetypally, however, this is shaky ground, and will not ultimately stick. It doesn’t make deep psychological sense, and frankly, de-powers a very compelling monster-figure that derives its strength from defying morality and the “rules” we must all live by. In HBO’s True Blood, for example, vampires are organized into governmental/feudal units, each overseen by a sheriff, ultimately answerable to a king and queen and other sartorially-advantaged and outwardly respectable functionaries called the “Authority.” The series premise is that, for the collective good of their species, law-abiding Vampires substitute synthetic blood—sold in bottles—for the blood of human beings.
Speaking of sheriffs, our willing suspension of disbelief is taxed to the limit here, for two primary reasons:
1) If I’m right that a vamp is a manifestation of the human id, there is no ability, let alone reason, to organize for the greater good (or even a greater bad). The id focuses exclusively on immediate self-gratification. You may as well try to persuade your dog to conduct himself according to demands of the “bigger picture.”
2) Even very liberal-minded people are prejudiced by nature—it’s part of being human to fear and hate the unknown. Although there are undoubtedly individuals who would trust the Devil himself in his own shape, the kind of widespread cultural tolerance of uncloseted vampires True Blood relies on is, perhaps unfortunately, not tenable from a human perspective, either. When, in the course of human history, has a minority group enjoyed freedom when a few of its members indulged in demonstrably criminal behavior?
Probably the more intriguing premise of the show, that if vampires could organize at all, it must be into monarchical hierarchies rather than democracies, has yet to be fully exploited.
The Twilight books and films (I’ll confess I gave up on these, as the series’ quality seemed to suffer steady decline) avoid the pitfall of the HBO series by substituting a discrete family unit of “good” vamps for an entire societal organization (or at least individuals within one). This smaller number supports the illusion that the premise is more plausible, and, when coupled with the plot distraction of antagonism toward werewolf clans, makes the protagonists seem more motivated by survival instinct than some do-gooder impulse. Further, we all know that dysfunctional families exist in real life, to the point that every member is self-involved, even solipsistic, so this shift to family does not violate the “id” theory.
Where Twilight goes horribly off the rails, I think, is in the protective instinct that Edward repeatedly shows toward Bella, and which Bella shows toward her child. Not only is this “love” the absolute antithesis of vampiric lust, it is internally inconsistent: if Edward really loved Bella (beyond his own desires, that is) he would never entertain, let alone consent to, her wish to become undead like him; similarly, Bella would not bear a child, knowing the kind of existence it is destined for. So are they selfless, or selfish? A human being can be both, of course, but not so a vampire—and I think this series is simply giving us people, with costumes and super-powers. But we already have X-Men.
To do better, we need to look back a ways, to Joss Whedon’s companion series, Buffy the Vampire Slayer, and spin-off Angel. Both are populated with run-of-the-mill vamps that behave exactly as Stoker designed them to, but there are two notable exceptions: Spike and Angel.
Within the epic scope of a combined twelve complete seasons, Whedon, Espenson, and their minions were able, rather than relentlessly insisting on the arbitrary existence of mutant/good vampires, to explore a much fuller understanding of the unconscious—specifically, why some good people commit bad acts (Faith), while some bad people commit good acts. The simpler of the two main vamps, Spike, is exactly as I have described an archetypal Nosferatu: a walking id. Hard-drinking, lustful, devious, an expert fighter, and emotional when not covering up with bravado, the nemesis of Sunnydale’s heroine often manages to do good, in spite of himself. He even “saves the world” more than once (long before his acquisition of a soul, as I’ll discuss shortly). Why? Isn’t this un-vampiric? Not for Spike. His very goodness is selfishness. To him, good and evil are all the same—he simply does what he wants, what makes him feel better. His personal morality is random, or a function of plot. For a substantial run of episodes, a government-implanted chip in his skull causes him unbearable pain whenever he attempts to hurt anyone. Later, the chip removed, he embarks on a quest to become “a real boy” (one of Whedon’s countless allusions to other literary myths)—that is, obtain a soul of his own. It’s finally unclear whether this is done more to impress Buffy (with whom Spike is smitten), or to deflate Angel, who Spike feels is too high-and-mighty because he’s “special.” The third possibility, that Spike’s journey to become fully human again is sincere, also makes sense, since it is a desire that would simply not occur to most vampires (not even the vaunted Angel)—because it would deprive them of their power.
Angelus—Angel. Spike’s grandsire begins the series lurking in the shadows, and passively dispensing advice to Buffy as to how to fight evil. Because his motives are unclear, she challenges him as to why he does not take action himself. His terse answer—“I’m afraid”—brilliantly opens the whole Whedonverse up to new realms of character development. As an incarnated id, a vampire is logically not only a predator in the service of desire, but prey to every manner of fear. Traditionally left completely untouched by writers, because scaredy-cat vamps would appear to make less-than-compelling antagonists, this original archetype (Angel-as-coward) is gradually reconciled into a respectable entity: his greatest fears are the atrocities of which he himself is capable. Sired as a worse-than-average bloodsucker, scourge of Ireland and England, murderer of innocent maidens, Angelus was cursed by gypsies. Rather than destroying him, they cleverly re-invested him with his human “soul.” This is a constant torture, and transforms him into the being “Angel.”
The premise is an intriguing one. Thematically, I suppose, it tells us that we as human beings can ultimately control, possibly override, our baser instincts—the soul being nearly the only thing (besides opposable thumbs) that distinguishes us from the lower animals.
I would interpret Angel’s curse this way: as Angelus (the incarnate id), he was not made a whole human psyche by his enemies, but 2/3 of one: he was joined with a superego (call it a conscience, or a soul). The tug-of-war between what Freud called the pleasure principle and the morality principle, unmediated by an ego, threatens moment-by-moment to tear Angel apart. He is like a family minus a mother. If he had an ego, he could accept his past misdeeds as part of his growth over time, or justify them, or deny them—all human self-protecting processes. However, locked in an eternal adolescent-versus-father internal struggle, he must perennially rehearse the role of detached observer, spectator, and occasional oracle/helper when convenient—unable to do either real good, or evil, of his own volition, without human companions who accept him. A curious condition of the curse is that a single moment of true happiness brings about forfeiture of the soul. This seems unexpected, as it reverts him to Angelus, ending his internal conflict. A culture given not to grace but to revenge, the gypsies must have something else in mind here. Apparently they believe that, when enough penance is paid, Angel can eventually earn his way back to full personhood (born ‘Liam), as he was before he became a vampire—in short, acquire an ego dependent on good works, or “making up for it,” and complete his circle.
Which brings us to another evolutionary genre-possibility: Can vampirism be cured? Not a challenge for the average writer. The humorous Spike and the one-off Angelus-Angel-Liam evolution aside, it usually makes poor storytelling sense. The idea of redeeming a monster who has personally murdered thousands (see Darth Vader) has been tried with commercial success (if critical failure). But even the dollars that were made on Return of the Jedi were a cash-in, not on the silly sentimentalism of Vader’s redemption, but on the original deliciousness of an unadulterated, evil character.
In part I of this article, I discussed what I consider the seven classical “rules” that have historically constrained vampires in storytelling. I further suggested that, rather than being random genre conventions, these form a coherent system which provides insight into the base nature of this fearsome creature. This means that writers who violate the rules, rather than creating something new and compelling, just as often compromise the underlying archetype and offer a tale that, for reasons not always consciously articulated, does not make fundamental sense to readers/viewers.
So what is this underlying system? A vampire is a manifestation or incarnation (we might almost say personification) of a human being’s psychological “id.” Quick primer: “Id” is a Freudian term for 1/3 of the unconscious psyche (the other parts being the “ego” and “superego”). Put simply, the id is the repository of a human being’s basic will or “life force”—it gets you out of bed in the morning, and keeps you going through the day, because it houses all your desires (things you want) and fears (things you don’t want). The superego, acting as a warden, puts limits upon the id’s behavior (we can’t have everything, and we must face many fears), while the ego keeps a balance: by providing a sense of who we uniquely are, it defines what kinds of limits are imposed, and when, and under what conditions they operate.
If the human “unconscious” were a family, we might say the id is the child (“I want/I hate”), the superego the father (“No”), and the ego the mother (“maybe/we’ll see”). Example: the id wants not only a single cookie, but the entire box, while the superego responds, “That’s not good for you,” but the ego might add a qualifier: “Two are allowable, but only after a proper dinner.” The id (which we’ll focus on here) is most primitive, selfish, and even animalistic because it has no sense of ethics, morality, or responsibility. It only acknowledges its own needs. “Evil” would not be quite accurate to describe the id, any more than children are inherently evil. An added problem with this comparison is the common association of the id with sexuality (as in “libido”), which means we had better clarify that the id is best thought of as an adolescent child.
Most adults are aware that this primitively lustful, desirous, greedy, insatiable, yet also fearful part of ourselves exists, deep down below our civilized self-identity (ego) and our moral sense or conscience (superego). But because psychology is often seen as an arcane and highbrow science, it makes sense that these three forces would manifest themselves in pop culture, as a matrix of actual characters. Mr. Hyde, the werewolf, the evil twin or doppelganger, the Hulk, Jason and his copies: all these could be said to be walking ids, split apart from the rest of their psyches, and taking a separate physical form. But none more so than the average Vamp. The plot device of werewolf-types transforming out of human form, versus vampires being enemies of people full-time, is not a worthwhile distinction in the archetypal context of seeing the root nature of figures, events, and rituals. And the recent plot cliché of pitting vampires against werewolves makes little sense except for the political mileage (which species has more power?), and because it makes for good movie action. However philosophically different werewolves and vampires may be (is the “evil” inside us, or outside?), they are not psychologically or functionally different.
In short, there’s a vampire in every single one of us, locked in the crypt of our unconscious by day, and rising from temporary death, running rampant, sowing chaos, by night (luckily, mostly in dreams—unless you happen to be in a vampire story).
Now, to test this theory, let’s return to the rules we discussed last time:
2) and 5) A mirror (or even a full, illuminated look) would reveal the ugly part of ourselves we’d rather pretend doesn’t exist: all the so-called “weaknesses” and appetites that our physical forms make us prey to. So vamps can’t see themselves or their shadows, and must remain hidden much of the time, skulking on the periphery of our public personas.
3) and 6) As much as we’d prefer it, this part of ourselves cannot be destroyed or annihilated completely—its energy only redirected. A person on a crash diet, for example, may subdue the appetite for food, but will generally have to substitute some form of reward in recompense to the starved self, before the nearly inevitable backslide. In order to break certain forms of addiction, others (supposedly less toxic) are commonly substituted. Thus, vampires can change forms to evade harm, or relentlessly pursue us. The stake to the heart or beheading (which reduces the average Nosferatu to dust) is a fantasy of reduction back to the basic elements of which, mythology tells us, we are all made, but notice there are always more to replace the defeated foe. Thus, if generic and faceless, vamps never really “die.” And if individuals, like Dracula, they never die for long.
4) The act of free will here is either the relaxing of the vigilance of the superego, or the ego’s allowing the id to have its way. The id cannot forcibly defeat the other powers, but it can be overindulged (“allowed to enter”). Once it has a toehold, the myth tells us, it can disease the moral sense and erode the identity (which must be constantly guarded against infiltration). Actions that may seem trivial at first can have unforeseen and significant consequences that cannot be undone. So never invite a vampire into your sanctum.
7) Since Christianity (call it morality-based, or patriarchal and prohibition-driven, from the Ten Commandments to the Sermon on the Mount, just as you prefer) is really a system that organizes into conscious form the dictates of the superego, it makes sense that it would be the arch-nemesis of the id (Jesus vs. Satan, generosity versus selfishness). Guilt and repentance for sin are the absolute antithesis of the grasping of the id.
1) Sexual desire is, of course, the easiest way to conceive of the power of the id. I saved it for last because of this summative convenience. Everyone over the age of ten knows how much influence sex can exert over the human will, often overriding all our scruples and common sense. A vampire is so voracious in its appetites that it will literally suck the blood (life-force) from its victim, killing it. It has no notion of when to stop.
Given all this, we might say that a vampire is a psychological projection of everything human beings despise about themselves—a beast in anthropomorphic form, recognizable as human, yet indulging in the forbidden, violating taboos against incest, cannibalism, human sacrifice, and more common laws against treachery, revenge, and murder. Paradoxically, this also explains the attraction the idea of vampirism holds for many people, since by definition our dark “kindred” are free to embrace behaviors we normal folks are daily forced to repress.
Are vampires real? You bet—as real and close as anyone’s own dark half (or third). Take away the other parts of the tripartite psyche, and you’re left with a person who does whatever he wants, kills those in his way, or steals their energy for himself, and fears nothing but loss of total freedom. So everyone acts as his own slayer, to a greater or lesser extent. And yet, what Jung called our “shadow” selves can never really be slain. Only kept at bay, while the sun shines, and if we survive until summer, the days grow longer and the nights shorter.
In the final part of this article, we’ll apply this theory to some of the popular reconfigurings of the vampire mythology, to explore why certain refinements make archetypal sense, and others do not.
Not exactly seasonal subject matter, I know, but here’s the third in our series analyzing the enduring popularity of certain types of ghastly figures and horror stories (Zombies and Human Sacrifice were covered in parts 1 and 2).
So many versions and modifications to the mythology have arisen even since Bram Stoker’s Dracula (1897) that it would be counterproductive to survey their evolution here—and we are really interested in the archetypal fascination we all have with these figures of the night, anyway, and not their various historical guises.
Perhaps the best way to proceed would be to look beneath the common, classical “rules” about vamps, in order to uncover a theory that accounts for them. It is vital not to ignore the basic truth that even the most powerful vampires are extremely limited, or bound, by inviolable tenets. Writers who ignore these–in order to be “new”–are merely exhibiting a failure to comprehend why they became indispensable to the mythology to begin with. They are seven:
1) “Vamps” are, almost by definition, sexual: we may as well begin on a compelling note. Animalistically sexual: nocturnal, sucking blood through canine teeth, and hypnotic if not actually attractive. The pop culture’s recent insistence on physical prettiness for both male and female nosferatu is not only redundant but deceptive, akin to mistaking rape for a sexual crime when in reality it is a crime of violence. Remember that the victim is often killed, either immediately or over a succession of feedings.
2) Vampires cannot withstand direct sunlight or mirrors, and cast no shadows or reflections. This would seem to suggest more than a hint of unreality about the creatures. But how can an illusion harm you?
3) Certain vampires can morph into other forms: bats, mist, rats. Even in human form, they possess supernatural strength and are impervious to many kinds of harm.
4) Vampires cannot enter a private dwelling unless invited in by the human inhabitant. The philosophical implication here is that only an act of free will can entangle one with a vampire, despite the seemingly contrary myth of hypnotic abilities or “glamoring” as a vampiric power (the two are not really mutually exclusive, and the paradox is resolved with the qualification that only individuals of weak will succumb to mesmerism).
5) Vampires must rest during the day, often in contact with the Earth or in a coffin (superficially suggesting another connection to Death; however, classical mythology contains many chthonic beings associated with life—the Greek gods of the harvest, Demeter and Dionysus, for example).
6) Vampires are immortal, or, alternatively, no longer alive—in either case, immune from further debilitating effects of aging, “frozen” at the age at which they perished from human form. Curiously, this also seems to manifest itself as an eternal adolescence, an inability to mature (in spite of many decades or centuries of experience and memories). They can be destroyed, in certain ways: a wooden stake to the heart, consumption by fire, and cutting off of the head are the most commonly agreed upon.
7) Vampires have no power over sacred Christian objects: crosses and crucifixes, holy water, recitations from or direct contact with the Bible. This invokes the often-made claim that a vampire is a human being divested of a soul.
In part 2 of this article, I will argue that these rules, far from being excessively imaginative or arbitrary, can all be resolved into a consistent and logical system, by an interrogation into the true nature of a vampire: Do they exist, or not? And if so, what are they, really?