Evil Archetypes of Pop Culture: The Base Betrayer and the Panicking Fool


By Shawn StJean

In this latest entry in our popular ongoing series at Clotho’s Loom, we examine a pair of minor character-types that recur throughout literature and film.  They might both be subsumed under the blanket of moral coward, but I prefer to get a bit more granular for the benefit of the writers out there, because one is a more ancient incarnation, and the other, more contemporary.

The two were brilliantly used together in James Cameron’s Aliens (1986), the militarized sequel to Ridley Scott’s original (1979).  Paul Reiser was appropriately cast as the slimy, sweating Burke, a corporate lackey who attempts to impregnate the heroine Ripley and her adopted daughter, Newt, with xenomorph embryos for the Company’s bio-weapons division.  When she discovers her so-called friend’s plot, Ripley acts as Cameron’s mouthpiece: “Y’know Burke, I don’t know which species is worse; you don’t see them fucking each other over for a goddamn percentage.”  I think anyone who has labored in an office building of any kind has known a Burke.

Long before his well-known stint as nice-guy explorer in Cameron’s Titanic, the late Bill Paxton earned a place in the director’s troupe as Private Hudson, an all-bluster Marine who kicks ass and takes names, until, that is, the tide of battle turns against his unit:  “I don’t know if you’re keeping score, but we just got our asses kicked, pal!”  In the hands of a Panicking Fool, a gun becomes a weapon against everyone around him, and friendly fire becomes as great a hazard as a battalion of enemy combatants.  This figure turns up especially in the more modern (particularly Vietnam-era) war films.  It is probably the advent of firearms into narrative that makes him so much more viable for writers: running away only endangers oneself, whereas indiscriminate gunfire endangers everyone.

Dante reserved his 9th circle of Hell (Treachery, the worst of said circles) for the base betrayer.  Virgil leads the author-narrator to a position to witness the sufferings of Judas Iscariot, Brutus, and Cassius (traitors to Julius Caesar).  While Judas’ motives have in later times been complicated by artists such as Martin Scorsese, the original motive is the ever-popular one: silver (Burke).  Brutus, of course—best known from Shakespeare’s presentation—believed he acted in the best interest of the Roman republic, but the fact remains that he literally stabbed his friend without warning.  He did not challenge him to open combat, or a political battle of wits, or so much as a game of chess.  If judged only by his actions and not by his motives, Brutus qualifies as a moral coward.

The notion of moral cowardice and betrayal is complicated in western literature by the presence of women characters, from Medea to Gone Girl, because women are not traditionally expected to possess the analogue of moral courage.  This is a patently false prejudice of our culture, of course, but it has become so pervasive that, in a typical narrative stocked by male characters, any female will automatically fall suspect when a betrayal occurs.  Writers often exploit this.  They can, because male-centric narratives operate from a male perspective—including the idea that women’s motives and perspectives are, by definition, irrational, impenetrable, and mysterious.  “If there’s one thing I can’t stand, it’s a dirty, double-crossing dame,” insists Big Jim Colfax, one of the male arch-criminals of 1946’s The Killers, a quintessential, Hemingway-based film noir that sets the tone for the next 70 years, and counting.  Because for a man, murder, theft, and betrayal are par for the course—but if you’re a woman, well, let’s just say that things are about to get warm for you.

Inasmuch as many genre stories (horror movies and sci-fi, westerns too) function as microcosms (literally “small worlds”) in which groups of characters represent the types of people one would find in real life, a betrayer is bound to turn up, as well as a fool (if not a panicking one—the person who needs to have the hysteria slapped from her mouth).  Panickers lose the ability of rational thought in a crisis, and in fact never tend to think ahead in the first place.  So if you know someone who leaves the house in cold weather without a coat, who drives a car around without a spare tire, a first aid kit of any kind, or so much as a screwdriver, and relies on a cellular telephone and a smile to bail her out of any crisis, you may know a potential panicking fool.  By contrast, a betrayer may be a cold and calculating logician, weighing beforehand the risk-versus-reward factor of every situation, and acting upon best opportunities.  It should go without saying that such a person will perfect the art of lying.  But with a little experience, these too can be spotted in the wild.  Anyone who spreads a lie or gossip about anyone else to you is sooner or later going to do likewise about you.  Don’t kid yourself otherwise.  Unfortunately, the experience of learning what true friendship is the hard way is dearly bought.

One reason archetypal characters function so well is that we, as reader/viewers, recognize the part of ourselves that they personify.  Deep inside of all of us there’s a little voice that shouts “Run! Scream! Run!”, and also a voice that looks at another person and says “If zombies chase us, I’m tripping you.”  Because high-stress situations bring out our most primitive, base instincts.

Luckily, panicking fools may not be as familiar to us in life as in texts, outside video-games (wherein every deathmatch map has one person running around in the open, firing wildly.  Snipers love this guy.)  But anyone who has gone through an American secondary school knows the person who will sell you out, to further their own agenda: from miscarriers of misplaced confidences, to boyfriend-stealers (and cheating boyfriends), to friends who abandon you for better friends at the earliest opportunity.  And therein, I believe, lies the enduring popularity of the base-betrayer figure.  To the extent that all adult human beings are damaged, at least some of it can usually be traced back to betrayal.  When a person abuses your trust, they don’t simply harm you; they harm every single person, for the remainder of your life, who will be worthy of your trust.




The Collapse of American English


By Shawn StJean

Perhaps my accompanying photos are a trifle hyperbolic.  Perhaps.  It’s a truism among our global neighbors that Americans (by which I mean U.S. citizens) expect everyone, everywhere, to speak English.  The corollary, of course, is that most refuse to learn other languages, such as Spanish, even when the utility of doing so is abundantly clear.  But a looming problem for our culture in the 21st century seems to be that Americans increasingly decline even to learn English–at least beyond the 3rd or 4th grade level.

This level, supported by weak resources in the slang of the moment, proves sufficient for basic writing and speaking, but does not carry us far into the realm of critical thought and communication.

I choose the word “collapse” for my title, rather than “decline,” because I mean just that–what used to be a language with hundreds of thousands of specific, nuanced, and descriptive choices has converged, and continues to implode, into fewer and fewer.  With the recession of traditional print media in the face of digital dissemination of what can charitably be called information, even simple affirmations like “Yes,” “certainly,” “definitely,” “acknowledged,” and “no doubt,” in the most extreme example of private text messaging, have all been replaced by a single letter: “K.”

Need this be a bad thing?  After all, what’s more efficient than “K”?  Doesn’t that free us up for more important, or at least more, activity?  Before answering, let’s look at some other casualties in this war for the space in our brains.

Examine the following short list of commonly used expressions, and you’ll realize that either they are purposefully and even defiantly vague, or that one word takes the place of many–indicative of the digital age we live in (compression, homogenization, and subtle loss of nuanced information):

“Do” replaces the verbs/actions “try” “give” “accept” “participate in” “contribute to” “tolerate” “clean.”  As in “I don’t do Christmas.”

“Go” travel/venture/explore/pedal/fly/walk/hike/swim/jog and even “communicate something uncomfortable,” as in “Don’t go there.”

“huge” /big/large/important/significant/influential/knowledgeable/enthusiastic.  “I’m a huge fan.”  In my ear, this sounds ridiculous even on the face of it.  We all speak in metaphors of one degree or another all the time (“collapse” is a minor metaphor when not speaking of a physical structure), but the above expression equates to saying the gushing adorer is an abnormally large person (or ventilating device).  One might as well offer to wave oversized palm leaves, ancient-Egyptian style, at the object of worship.

“way” very/much/far/long (“This license is way out of date.” “This sauce has way more garlic than the recipe calls for.”)  This one in particular disturbs me because it demonstrates we aren’t just discussing slang here.  “Way” has been adopted not just in common speech, but by professional writers.  It has infiltrated the language in a permanent, um, way–ahem–manner.

“You’re all set.”

“It’s all good.”


“it’s all about”

“comes into play”

“deals with”

“back in the day”

Of course, words are invented, repurposed, and recombined all the time.  I must be overreacting.  Aren’t these replacing archaic usages?  We’ve got “tweet.”  And “text.”  “Sick,” “diesel.”  Oh, and “literally” can apparently now mean just the opposite, “metaphorically”–I mean, does it really matter?

“[   ] is a thing.”  Ah, yes, thing–the one catch-all noun when people grasp for a word and cannot find it, the very expression of inarticulateness, has become an official word to describe a fad, trend, icon, object of buzz or gossip, popular occurrence or consumer good, news item of the day, or week.  We had all those expressions, and they all relied upon small distinctions.  At this stage in human (d)evolution, we needed “thing”?

Okay.  Let’s say I’m right.  So the language is imploding.  What’s at stake here?

Many will not miss the subtleties that have dispersed into ether, I imagine.  Then again, it’s difficult to miss something you never knew you had.  What about the millions of unborn youngsters who will grow up with effective working vocabularies of a mere few thousand words?  Will they write poetry that amounts to more than a colorful tag on a railroad bridge?  Will they read it?  Will they understand the U.S. Constitution, even as they are increasingly called upon to “defend” it?  Will the historical records of the 19th and 20th centuries begin to sound as impenetrable as Shakespearean soliloquies do to us?  And I’m not talking about the kind of missing material in a contraction: to anyone but a fiction-writer or screenwriter, the distinction between “I’ve” and “I have” is not great.  One might use it to distinguish among characters who are high-born or low-born, for example.  For the rest of us, it’s merely a convenience.

George Orwell warned writers not to compose in clichés.  He claimed, essentially, that writing in the pre-digested shorthand phrases of others leads to thinking in the pre-digested shorthand phrases of others.  Other signs that your thinking has been compromised: Do you find yourself regularly Googling information that you could remember with just a bit of effort?  Are you trusting that information (or that from Wikipedia, Mapquest, Siri, or the CBS Evening News) enough to act upon it or pass it on to another human being without double-checking it?  Are you cut-and-pasting that information (either in written or verbal form) without rephrasing it?  My overall point here is that there exist vital differences among raw data, information (processed data), and intelligence (interpreted information).  And yet many of us are not bothering to recognize them.  Not because we lack the cognitive ability, but because we lack the critical tools and the will to use them.

A brief [mostly harmless] experiment should serve here.  Raise your hand if you like music.

That should include most of you, one hopes.  If you like music, you have probably in your time looked up some song lyrics.  In the old days, we read them out of LP album covers–which meant the source was the band’s record label, presumably direct from the songwriters themselves, which meant little chance of transmissional error.  Nowadays, we all know where song lyrics get found.  Dozens of websites cater to this need; even Google has gotten directly into the act through their search engine.  Look up a song or two that you know intimately, but the performed and recorded lyrics of which are not 100% crystal-clear by listening.  I can guarantee you that, as transcribed onto your website of choice, you will not be long in discovering blatant errors in those lyrics which materially alter their meaning.  Furthermore, and more appallingly to me, you will discover upon cross-checking that most, if not all, of the alternative websites repeat that same error.  Which means, of course, that they are all “borrowing” from each other, and profiting off both you and the songwriters with little regard for the truth.  Now, if the stakes here seem low to you, import your experiment to the television news programs.  Jon Stewart had a running bit on his incarnation of The Daily Show dedicated to proving that not only do major news outlets shamelessly plagiarize from each other, but they do so in unedited clichés.  Again, in the old days, we might double-check their intelligence in what used to be called printed newspapers.  Umm.  Except. . .

One of the great virtues of written language is its precision, yet increasingly written English begins to resemble spoken English, even in widely disseminated and professionally published print media.  And spoken English begins to resemble colloquial English.  Don’t think so?  Ask an octogenarian (someone born roughly during the Great Depression, as of 2017) if their parents would use the word “cool” as part of their everyday discourse.  Nowadays, try to find someone who doesn’t.  Not that I think “cool” has done the language any great harm.  As far as I can tell, it was first used in America, in its modern sense, by Emerson in the 1840s–which probably means it dates back even farther and derives from the British.  But this word may prove the exception rather than the rule.  As it is, it conflates a much more typically detailed appraisal of a person, event, or object.  A girl who might once have been variously described as “tolerant,” “forgiving,” “loose,” “free-thinking,” “substance-abusing,” or “not a nag” is now simply “cool.”

Of course, one might argue that simple is better; the fewer moving parts in a machine, the more reliable it is likely to be (read “mousetrap.”)

I doubt the sustainability of that argument.  Another, more insidious example: “fewer” vs. “less.”  Almost no one but your English teacher bothers with this one anymore.  Here’s why: who cares if your supermarket checkout line reads (correctly) “fewer than 12 items” or (incorrectly) “less than 12 items”?  Can’t we just dispense with one of these?  Well, we could.  Except one of them refers predominantly to individual items and people, and the other refers to objects in bulk or concepts.  That is, “fewer people are finding jobs their college degrees prepared them for.”  NOT “less people.”  Because those people are individuals, not some vague statistic.  There’s less forest, which means fewer trees.  There may be “less opportunity.”  There may be “less rain this year” or even “less cod in these waters.”  But if there are unaccountably “less people,” we had better start looking for them.  And reevaluating the value we place on human life.

I’d like to conclude with a different, and more familiar example; possibly the most commonly transmitted text message in English:

Where R U

It (or some variant) is quick, serviceable, and doesn’t cost much effort to send, or–hypothetically–to answer.  And yet this message has probably caused more misunderstandings and needless arguments than most.  Why?  It’s laden with ambiguity (or even what deconstructors call “undecidability”).  In the absence of voice intonation, facial expression, pronunciation, linguistic context, primary and/or secondary punctuation, and so on, the receiver must interpolate those for herself.  Here’s how that might go, in response:

“None of your damn business.”

“Uh oh, he’s saying I’m late again.”

“Did I promise to be somewhere right now?”

“I’m at Main Street and Vine”

“She really wants to know who I’m with, and what I’m doing.”

“I left an hour ago.”

Texts and tweets may count portability and quickness among their virtues, but they certainly cannot include clarity in that list.  Even among intimates, this message is as likely to lead to a dispute as an informative reply.  Another aspect that’s missing, and increasingly missing from written communication especially, is any sense of formality, professionalism, or what used to be called politeness.  Now, you may say, “Well, that’s just a text message.”  Sure.  But ask yourself how many e-mails you have received without a greeting, a signature, an identification of the sender or introduction, or even so much as a rudimentary spell-check?  Did you answer them?  If you did, then you, as we all are, are complicit in the process of collapse.  Compare these two e-mails, typical of what I, as a college professor, have received from freshman students:

[2007]  Dear Professor:  I’m sorry I missed class last Tuesday and Thursday as my grandmother died.  I misplaced my copy of the syllabus.  Can you tell me what we did in class so I can make up the work?  Thanks, Kayla

[2017]  I missed class last week would you tell me what I missed

Neither one of these qualifies as polished, professional communication–especially from a writing student–but I think you’ll agree that the former has a few lingering virtues to recommend it, which have gone glimmering in the latter.  In fact, were I to delve deeper into my records of the past, we’d find that the students of the 1990s had bothered to include my actual name; that the excuses were often more inventive and frequently included such touches as offers of doctor’s notes; that a request to meet in office hours was not unheard of upon missing a week’s worth of training; that the student might have actually acquired class notes from a peer before writing; that the student would bother to identify which of the four classes I teach she was enrolled in.

I’m not sure that the degradation of the language–as slow and inevitable as the abuse of the atmosphere that has summoned the effects of global warming–will contribute materially to the collapse of the society, the culture, or possibly even our civilization.  But I don’t fancy it helping.  It’s perhaps predictable that as our planet becomes more overpopulated, as more wealth becomes concentrated into fewer hands, and as such factors demand a parallel dynamic of information becoming the province of fewer people (collectors), the rest of us will not find encouragement to strengthen our language skills beyond the consumer sphere (that is, you and I need only know how to communicate well enough to work and buy and perhaps sell a bit).

As for writing, a culture’s written language is the primary repository of its history.  Without a sense of history, it cannot evolve.

The solution?  Same as it’s always been, and the advice is good not just for writers, but for anyone who wishes to grow their brain and live up to something approaching their potential: READ.  Read anything.  Comic books, advertisements, editorials, romance novels, cereal boxes, movie credits.  Some are better than others, obviously.  Personally, I recommend Hawthorne, Hemingway, and Wharton, along with Carl Sagan for those whose tastes require something a little more contemporary–here was a man who knew a bit about large-scale collapse–but that’s just me.


Evil Archetypes of Pop Culture: The Crusader


By Shawn StJean

It seems as if Universal Studios’ inauguration of their “Dark Universe” franchise, beginning with The Mummy, should have monster-genre fans everywhere uncovering Easter eggs and salivating, in werewolf-fashion, for future installments.  What’s next?  Creature from the Black Lagoon?  Dracula?  Frankenstein?  Given the success of Marvel Studios and its web of interconnected sagas, and the generosity of audiences even toward the far less compelling DC Comics movie adaptations, this seems a logical gamble in the Hollywood and Pinewood of 2017 and beyond.  More interesting to me, as I combine a sort of hit-and-run mini-review here with a broader, deconstructive cultural analysis, is how the real villains of The Mummy are not the title character and the soulless zombies she creates from humans by draining their life-force (souls) to revivify herself.

The film itself makes a dubious start at re-introducing the Dark Universe into the 21st century, relying as it often does on relentlessly flat jokes and, worse, convoluted exposition in place of any attempt at plot or characterization.  One could call Tom Cruise merely miscast, if his part weren’t so deplorably underwritten: supposedly a profiteering soldier who steals Mideastern relics, he mostly blinks his eyes and shakes his head through the bulk of the narrative, eventually alerting the audience for the dozenth time that the Mummy has telepathic entry “inside his head.”  Rather, it’s Russell Crowe, playing Cruise’s antagonist Dr. Henry Jekyll, and his minions, who warrant our serious attention here.

Jekyll, keeping his nefarious Hyde persona barely at bay with regular injections, leads a secret society of monster-hunters with the self-appointed mission to rid the world of evil.  He’s willing to go so far as to facilitate the Mummy’s obsession with her “Chosen,” Cruise, allowing him to be killed with a sacrificial dagger and incarnate the Egyptian god of Death, Set, so that he can then be “obliterated” under controlled conditions, “a sacrifice for the greater good.”  I’ve discussed human sacrifice at length previously in this series, but here it points us back to the deep motivations of Jekyll, our modern-day crusader.  Edward Hyde, grappling with Cruise, points out “It’s Jekyll who wants to kill you,” whereas he wants a partnership with the sergeant.

The archetype of the crusader, not one that springs to mind immediately, nevertheless forms part of the canon of recurring iconic figures in myth.  Self-adorned in the garb and accoutrements of a White Knight, the Crusader pursues a single-minded, holy mission that brings him/her repeatedly into a death-struggle with what s/he imagines to be incarnated evil (but which is only a projection of the knight’s own private sin), and s/he may even suffer a savior/God complex, as here.  Think Ahab: the white whale represents, to him, the sum total of all evil–not coincidentally having deprived the Captain of his own leg in a previous encounter.  In order to slay Moby-Dick, Ahab will sacrifice his ship and the lives of his entire crew, yet rationalizes this insane quest just as Jekyll does here: keep an eye on the big picture, fellas.

The film has great potential in this regard, but squanders several opportunities to fully realize its themes.  Cruise is called a thief and mercenary on the surface with the soul of a good man attempting to emerge, whereas Crowe is a respectable doctor and leader hiding a soul of evil and “chaos” just beneath his respectable exterior.  His three-piece suit is the shield and cloak and sword-oath of a crusader, 900 years later, working ostensibly in a righteous cause, while committing atrocities along the way.  The capture and subsequent torture of the Mummy should help us realize this, but the film has buried any sympathy we might have had for the title character under its unnecessary agenda of portraying her as wholly evil.  And the contrast between Crowe and Cruise is never made direct enough; the yin and yang never bleed into each other.  Crowe’s Jekyll does indeed have the final words: “it takes a monster to fight a monster,” but it echoes too much like sequel-pimping.  We haven’t been shown the knights of the second Crusade desecrating Mesopotamian and Egyptian crypts, only told.  The crusader-knights turned to zombies, a refreshing turn from the usual T-virus, should help.  The mummy has reanimated their corpses to continue the mission they had in life–mindlessly carrying out someone else’s political agenda.  But the film is neither so subtle as to emphasize this impressionistically, nor so obvious as to have someone shout it out.  And honestly, as for sequels, I’d be happy never to see Cruise’s character again.

A crusader, like any soldier drafted into a foreign war, has to believe in the worth of the cause.  And yet, the deep disillusionment in the face of true horror in and around battlefields transforms the idealistic campaigner into a monster.  The cycle of post-Vietnam movies imported the process to American cinemas.  The best of the protagonists become world-weary and learn to hate the crusade itself; of course, since Universal is hoping to kick off a franchise, Jekyll can experience no such awakening.  We saw it most explicitly in Oliver Stone’s Platoon, but it may have been Sean Connery in 1976’s British Robin and Marian who expressed it most succinctly:  “I keep thinking of all the death I’ve seen. I’ve hardly lost a battle, and I don’t know what I’ve won. ‘The day is ours, Robin,’ you used to say, and then it was tomorrow. But where did the day go?”  Perhaps only coincidentally premiering during the year of America’s bicentennial and following the final withdrawal from Vietnam, the film nevertheless carried the message of counterculture from an empire whose sun had finally set on ocean slicks of blood.

And thus Cruise and his army cronies import the crusade against “insurgents” into modern-day Iraq.  The ruins and crypts and mass graves and the walking dead are what empires leave behind in their quest for manifest destiny.  See, Egypt hadn’t much taste for expansion–the film stretches noticeably to bury the mummy near the Persian Gulf, in ancient Mesopotamia, seeming to want us to make the connection to the global political stage of the modern day, to function as social criticism against American Empire–yet it can’t resist the weight, or rather lack of it, of its special effects, star power, and declared identity as a traditional, if updated, “monster movie.”  Essentially, it sells out.

What’s missing from the narrative of The Mummy, failing a major rewrite, is for Cruise to have a genuine epiphany, whether accomplished through his psychic connection or whatever silly device: it’s men exactly like Jekyll, in complicity with men like himself, that create the mummies of the world in the first place.  The civilians, the displaced farmers, the maimed and burned children, the revengers all haunting the wasted landscape.  But this would bring him and Crowe’s character into an irreconcilable conflict.  Instead, Universal seems to want to move them into formation of some half-assed Scooby Gang.

One final note: it’s perhaps a curious feature that the Mummy made her way to London, but watch for that motif as the series progresses: Stoker’s Dracula and Shelley’s Frankenstein both relied upon the device, which goes back at least as far as Beowulf.  A curse earned abroad must always, whether in the diseased persons of returning soldiers, or in boxes of stolen treasure, or in the more amorphous forms of displaced, refugee souls, make its way home.

The T-Virus: (p)Resident Evil Makes Sure It Doesn’t Miss You

By Ed Anger, opportunistic occasional contributor

Trivia.  Americans can’t look away, like a car wreck.  Since when did a tweet–ANY tweet–become newsworthy?  This is an avenue specifically designed to carry information Too Trivial for Traditional media.  If you missed it the first time, by definition it wasn’t important!

Twitter, or as I like to refer to this bottom of the social media barrel, Twerper.  After all, who but a Twit or Twerp would exchange insults through a means that cannot possibly have repercussions other than a 140-character counter-insult?  You think anyone’s gonna stand toe-to-toe with Arnold Schwarzenegger and tell him he was a lousy governor, no matter how many secret service agents he’s got at his back?

Apparently TV and Taxpayer money (the two dominant consonants of “TriVia” uncoincidentally lurking there as well) aren’t hip enough anymore.  Just what we all needed, the Maury Povich show with semi-literate politicians!

The entire culture, having apparently run out of real topics–the trials and heroics of mere mortals don’t generate enough interest–has been inoculated (like that free flu-shot they give you at the supermarket that gives you the flu) and thus addicted to trivia.  Imagine yourself as the Texas Ranger who gets assigned to the case of Tom Brady’s missing Super Bowl jersey.  “Hell, it’s not bad enough this fella makes 300 times my salary for playing ball, now I get to track down his dirty laundry.  So this is what my career has come to. . .”  Let’s hope nobody robs a bank or kills somebody while he’s not at his post.

Newspapers have already become as thin as Target flyers. Once they start reprinting tweets, we’ll have hit the Trifecta of redundant, useless information that distracts us from the latest global warming evidence or how the debt-ceiling got hiked today.

So T-Rump (how does one type with hands like a T-Rex?) goes after Schwarzenegger, and the Governator shoots back a better one.  What a pair of Trumps.  Ad nauseam.  Then we tune in to the 6-o’clock news to witness how the leaders of our nation have devolved to antics that most of us outgrew as 11-year-old children.  Quite a Trip.

I’ve had enough of this Twaddle.  I think I have a case of the D.T.s.  Need a drink. . .


Stick to the Routine: “Bad Niggers” versus “Good Niggers” in the Election Aftermath


By Shawn StJean

“Always do the right thing.”

Whether or not you recognize that line from 1989’s film of the same name, if the terms of the statement raise any kind of question in your mind, then you probably noticed the quotation marks around it, and in my title: as an educator, I have about as much love for racial slurs as the author of 1885’s Adventures of Huckleberry Finn.  But they do continue to exist.  Both Spike Lee and Mark Twain interrogated institutional racism, a century apart–and by extension sexism and classism–in America, by the risky means of inverting usual terms of right and wrong (among them, the n-word: “He was a mighty good nigger, Jim was,” Huck assures his readers, in a horrifyingly realistic moment of combined affection and condescension.)

We’d like to believe things evolve with time.  Will America be any further along in 2089?  How about 2019?  Evolution is a process of fits and starts.

Many feel the recent election results, installing Donald Trump in the White House, do not signal progress for the immediate future.

The white, male, and (racially, if not economically) privileged part of me agrees with the other whites out there, including those who wanted neither Clinton nor Trump for a leader: it’s a society, we have rules, the man won the election under those rules.  Get over it.  BUT the part of me that feels threatened by the undeserving wealthy and the corrupt system of campaign finance and two parties, and knows the rules were made and bent to serve some and not others, agrees with Wanda Sykes and the protesters, and even the rioters: it’s not a time to be good, play the humble, obedient citizen, and smile.  It’s a time to resist and show anger and use bad words.

The girls smile

and people forget

the snow packs

a skier tracks

and people forget

forget they’re hiding

The Who’s 1982 song “Eminence Front” reminds us we are asked to, and demanded to, wear masks, of one type or another, all the time: that the face of Bruce Wayne is in fact more false than Batman’s cowl.  We wear them so constantly in our waking hours, like eyeglasses, that we forget we do it; and life is one long costume ball.  Only when someone drops their mask do we remember, and project our resentment.

Black comedian Wanda Sykes dropped her mask a few days ago during the Comics Come Home charity fundraiser for cancer care in Boston, abandoning any pretense of her usual routine and instead ranting about Trump, for which she was booed off the stage.  She shouted and gestured obscenities at the audience in response.  Not what they expected; totally inappropriate for the occasion, I imagine many said.  Denis Leary, who reportedly got big laughs for his jokes about Trump earlier in the evening, later said publicly that it is not the business of the event-runners to censor the performers.

One needn’t invoke Malcolm X here for a precedent for Sykes’ actions.  Dr. King himself, a minister by trade and protester by necessity, in his “Letter from Birmingham Jail,” made it clear that change will not come from conforming to the rules others set, or from waiting for an appropriate time or behaving in conventionally acceptable ways to suit an occasion.  I have always believed this was why the Occupy Wall Street protests were culturally ineffective.  Those folks set up tents in a pen in Zuccotti Park and grazed there within its confines, using the portable toilets, policing their own trash, doing as they were told, behaving, essentially, and utterly failing to disrupt business as usual until they left less than two months later.  The intentions were politically and morally correct; but the tactics were about as threatening as a Boy Scout jamboree.

On the same night as the Sykes debacle, a few hundred miles south, another black comedian, Dave Chappelle, hosted Saturday Night Live, and received almost universal acclaim for his monologue on Trump and the racial divide in America.

Don’t mistake me: we all have our varied talents, and Chappelle killed it, as they say in show-biz–I have no quarrel with him, no iota of disrespect for anyone following their true conscience.  Sketches following the monologue, notably one in which Chappelle was joined by Chris Rock, assured us all that the white-liberal idea that racism is dead in America is laughable.  It’s creative protest.  I believe Dr. King would approve–with reservations.  We all need to remember that, in the final analysis, Chappelle is being what Ralph Ellison characterized Invisible Man‘s (1952) protagonist as: a “good nigger.”  Not so much because of what he said, but what he did.  He’s helping, like President Obama himself, to defuse a potentially explosive situation, and diffuse the mass of energy that has collected and creates the danger of blowing up the status quo: “we been here before.”  And while it might not be pretty, folks of the Wanda Sykes temper, white or black, whether ranting or rioting, are refusing to exit the stage as quietly and quickly as Hillary Clinton and the rest of the so-called white liberals out there.  For them, the stakes are higher.

Chappelle uses the n-word freely, as so many black comics have done over the past decades, and we laugh.  The level of discomfort is lessened because he’s part of that “historically disenfranchised” group, and he’s earned custody of the word.  As he said, “If I could quit being black today, I’d be out’ the game.”  And we’re reminded subliminally by that one word, in the context of a comedy routine, of all the associations of it: slavery, lynchings, racial profiling and shootings, gentrification, poverty, institutional prejudices from schools to the military to the workplace to the neighborhoods.  And in the hermetically sealed, artificially constructed TV bubble of SNL, we’re authorized to laugh–and we do, because we know those things are all true.

By contrast, the image of the “bad nigger,” as epitomized for many by Malcolm X, had by the 1990s become so familiar that white filmmaker Kevin Smith parodied (thus depowering) it in Chasing Amy,* following Tarantino’s more tenuous effort in Pulp Fiction (for which, in a real-life travesty, Samuel L. Jackson was misclassified at the Oscars as “supporting actor” to John Travolta).  Both filmmakers capitalized on the ur-image of fear in America: a man of color with a gun.  MCG.  Not just a mouth–an equalizing weapon.

As much as some of us like to refer to “most people” as if we were above all the fear and hate, human beings are, at bottom, visceral creatures.  We are still a primitive species, as of the dawn of the 21st century.  It’s possible that all the pundits have over-sophisticated the explanations for Trump’s victory; that fundamentally, it all comes down to change, or more precisely: backlash.  After eight years with a black man as national leader, no matter how good or bad, there was no way a woman could win, here and now.  People wanted the package of white patriarchal values back, they’re comfortable with it, and, in the great pendulum swing of the rise and fall of American civilization, they again have it.

I recently listened to a college radio station near Amherst, MA (supposedly one of our country’s many pockets of liberalism.)  Three white, male sophomores discussed, in the perfect comfort of privilege, which professional sports teams’ names and logos should be abolished, and which are okay.  Cleveland’s Indians, apparently, are fine with them, but Washington’s Redskins are offensive.  Being an inheritor of power, a “fortunate son,” usually entails such presumptions.  These are the spiritual grandsons of our president-elect.  They decide; the team owners decide; maybe the players get a vote; the people living on reservations are not consulted–or perhaps worse, their opinions are collated and presumed for them in one deft, hypothetical/hypocritical-hybrid gesture.

As a great man–though not an American–once said: “…they know not what they do.”  But that won’t stop them from passing judgment.

Chappelle’s act and his concluding promise to “give Trump a chance,” followed by the script’s ritualistic “We got a great show tonight!”, receive universal attaboys from such white folks, while Sykes will suffer condemnation and consequence for her 15 minutes of infamy.  Some people will undoubtedly try to blacklist her (our very language reveals its lingering biases).  She left her place.  But did she do the right thing?  Did he?  Is staying within one’s designated sphere, keeping appropriate, saying please and thank you, using the servant’s entrance, waiting for approval, bowing, conceding, hoping for the best in the face of overwhelming evidence; are these actions the “right thing” when one perceives a clear and present danger?

I’m not certain throwing middle fingers or throwing rocks or looting are the most effective tactics; then again, I’m not entirely sure they’re not.  As opposed to the over 50% of all voting Americans who didn’t really want their candidate in office, but voted for them in order to keep the other candidate out, and who will now shrug and do nothing (“wait and see”), or those like myself who declined to vote at all, at least Sykes (and Chappelle) are doing something.  You want change?  Doing something, then, will always be more right than doing nothing.



*Of course, there exist enough layers of irony in Smith’s film to cast doubt on exactly what he’s parodying.  The character I refer to turns out to be a gentle, gay (black) man who wears the public mask of “Hooper X,” the creator of White-Hatin’ Coon, in order to profit off the cultural hero worship of the racial/rebel hero.  Is his tirade against white supremacy then rendered inauthentic?  Is he merely an opportunist?  Unfortunately, all this being a lot to sort out for a comedy, won’t the individual viewers take away whatever interpretation most supports their own perspective?

Holiday GoodReads Giveaway–win a bound copy of Clotho’s Loom



I am NOT contributing to pre-Holiday hype, Black-Friday-style.  There’s a reason for this post appearing so early: you’ll want to enter early and often.  (Please note that you must be a member of the GoodReads website to do so–there are Login and Join buttons on the page.)

It’s been a while since I plugged GoodReads, the massive online community of folks who share news and resources about all things literary.  This year I managed to get organized well enough to have a giveaway end right on Christmas day.  The novel that gives this website its name will make a thoughtful gift for that literate person on your list–or even yourself. . .

And speaking of lists, Clotho’s Loom is on a dozen of them, including “Novels about Motherhood” and “Veteran Recommendations.” So please vote for it via this second link when you get a minute, and increase its visibility!


Go, Thief! Writing as Collaborative Piracy



“A thief who steals from a thief is pardoned for one hundred years”—Eli Wallach as Calvera (Magnificent Seven/Seven Samurai remake soon in theaters near you)

As a grad student taking creative writing classes, I did a lot of workshopping, but received little practical advice.  Most everything learned is earned, not given.  However, the best counsel I got was as an undergraduate toiling away in my Intro to Fiction course: “If you see something you like, steal it” (the professor/novelist who uttered these words will obviously not mind my failing to attribute them to him here.)

There’s not much stealing of money by novelists, short-fictioneers, playwrights, and poets going on (we leave that to the publishing houses).  On the other hand, the best and worst of us do steal material pretty liberally from each other.  Some of this is unconscionable laziness, but I think those who take their craft seriously do hold themselves to a few self-imposed rules, which I’d like to codify here by supplementing my old professor’s advice with what I call the “Rule of Three O’s”: “By all means, steal: but try not to steal too often; nor too obnoxiously; nor too obviously.”  Penalties exist for each.

  • Too often. No one likes to be labeled an unoriginal hack.  I mean, if you do this daily, you might as well become a television journalist and get paid well.  None of them seems to have recognized that Donald Trump has lifted most of his campaign platform from Adolf Hitler (“Make _______ Great Again,”) but you know that if one did, they’d all be parroting it.  Because there is no honor among thieves: they turn on each other.   Genre writers are in greatest danger of returning to the well too often, killing the golden goose, choose your cliché (a word-level version of this crime).
  • Too obnoxiously. You wouldn’t carjack a Corvette and then drive it around the same county without at least a re-paint, would you?  That’s just not right.  A plot, for example, needs to be sufficiently re-dressed to make it palatable.  Some recognize that the story of Jason Bourne is a retelling of Frankenstein, just as Blade Runner was (and to some extent the Wolverine and Deadpool tales): Scientist manipulates human limitations; scientist gets re-visited over and over by the subject of his experiment (“Why would he come back now?”)  It’s a good story with much psychological depth and breadth, as well as moral/ethical implications, which is why it gets told every five years.  Another version of obnoxious theft is a too-clever playing with familiar phrases.  Some writers get so good at this, they’re dangerous:  “Code of Dishonor,” “Twice Bitten,” “Can’t Stand In Heat.”  Seriously, why would you even crack the cover of a book entitled “For a Few Zombies More?”  Do you expect the writing to improve after that?
  • Too obviously. If last week’s marquee title is The Terminator, and you rush-premiere a B-movie called The Decimator, The De-resonator, or The Decaffeinater, even the trolls on the forums will crucify you (now there’s a story worth retelling,) without even watching it. And you deserve it.

Which leaves us with the question of how to do stealing right.  There are perhaps a hundred ways, so let’s “borrow” a few from the greats:

Allusions. Usually at the word level, these nuggets are on full display for those in your audience who may have read more than three or four books.  When John Steinbeck cribbed his novel’s title, In Dubious Battle, from the proem of John Milton’s epic Paradise Lost, it was more than a pretty phrase he admired.  He wanted to signal, perhaps, that workers in contemporary America (the many and weak) were being warred upon by Satanic forces (the few and powerful).  Steinbeck, in fact, grifted several of his titles from Biblical or semi-biblical sources (The Grapes of Wrath, East of Eden), and others from fellow writers (Of Mice and Men, from Robert Burns).

Homages/parodies. In Young Frankenstein, Mel Brooks wasn’t trying to get away with piracy, but instead to rely on the audience’s familiarity with both the story and previous remakes from Hollywood.  The result is wonderful: How I Did It is the title of the Baron’s journal. The great danger I see, today, is that in a semi-literate culture, exposure to 2nd, 3rd, and farther-removed parodies takes the place of reading the originals, rather than supplementing it.  Children, of course, will claim they can survive on candy; and so it’s no surprise to hear twenty-somethings argue they can distill the important news from The Daily Show and Saturday Night Live.

Shakespeare, yes even he of the cranium enormous, raided the plot and trappings of Hamlet, Julius Caesar, and probably Othello and Romeo and Juliet from earlier sources. Now, given what he accomplished with them, and the relative scarcity of masterplots, this is forgivable.  How many people recognize that The Terminator is a thematic retelling of Sophocles’ Oedipus Rex (by attempting to avoid predestined events, one can actually bring them to pass)?  Cameron simply converted prophecy to time-travel (two sides of the same coin).

Update/Remake for a modern audience. Emerson said that every generation had to reinvent its stories (here I’m paraphrasing—at least I’m giving the guy credit!)  Especially the out-of-copyright ones (wink to the publishers and movie studios there.)  I suspect 2016’s Birth of a Nation will not much resemble the original.  You did know that was first made by D.W. Griffith in 1915, adapted from Thomas Dixon’s novel The Clansman (1905,) didn’t you?  Film and literary critics fascinate themselves with analyzing how remakes tell us much about the culture that produced them, by emphasizing and deemphasizing certain elements of the ur-story.

Recasting from a different perspective. Euripides is perhaps the most prolific of western writers here.  He recast much of Greek mythology from the point-of-view of “the other,” the marginalized characters: Medea, The Trojan Women, The Bacchae.  A very neat and risky trick.  How many Americans do you know who would enjoy a film about how the Russians sacrificed twenty million souls to defeat the Nazis in World War II?  And yeah, that did happen.

Which brings us to pseudo-history. Spielberg’s Amistad, Mann’s The Last of the Mohicans, Stone’s JFK, anything by Michael Moore in a more documentary mode, are all masterful narratives.  They are not history, by any serious definition.  BUT, they weren’t meant to be: they ARE meant to raise the spirit of inquiry in the audience, to challenge them to learn more and seek the truth themselves.  Poe did this with his unreliable narrators, but the solutions lay within his stories themselves.  Here, the facts lie outside the story, in other accounts one would have to research.  Sadly, this is all too infrequently done, and the pseudo-history stands as the somewhat-removed Truth.

So writers, don’t worry so much about books getting stolen, in either analog or digital form.  One way or another, it’s all in the public domain, there for the taking–isn’t it?  Put it this way–no one ever promised to pay you, anyway.  Given the choice, you’d rather give it away than keep it to yourself.