Pittsburgh’s Laura Smith Interviews Cranky Bear Wakes Up story-sketchbook author, Shawn StJean

Poet, editor, and author of several children’s books, including The Castle Park Kids, Laura Smith works tirelessly on the Indie Publishing scene.  One of her public services is to interview authors to aid their discoverability amongst the flood of self-published works out there.  In the interview, you’ll find insights into the new picture book illustrated by Todd StJean, Cranky Bear Wakes Up, and a few about Shawn StJean’s older and forthcoming prose works, too. . .

Here’s a sample:  “So the book has an element of allegory. Put simply, we all need friends–and not just on days of personal crisis, but every day–especially the days when they need us.  It’s also a ‘story-sketchbook,’ which means kids are encouraged to color and draw in it themselves (as the back cover clearly shows, I hope).  There’s no more important attribute for a child to cultivate than an active imagination, I believe.”

Please SHARE one of the following interview links among your network:

Cranky Bear Wakes Up: Todd and Shawn StJean’s New Children’s Paperback Published by Glas Daggre

Just in time for the Holidays (wink, wink), a new “story-sketchbook” suitable for kids ages 3-10 will be on the shelves this week from Glas Daggre, the independent publisher that makes this website its headquarters.  The illustrated, 8×10″ trade paperback features artwork by my ever-prolific brother, Todd, and a humanistic adventure narrative in which cute and fuzzy mammals–though initially misguided–learn to make room on our planet for the birds, insects, fish, and other creatures that share it.  It makes ideal read-aloud material before bedtime.

Amazon purchase page (the legitimate retail price should be $13.99): https://www.amazon.com/dp/1981271864/ref=sr_1_1?s=books&ie=UTF8&qid=1512521353&sr=1-1 (or you can search Amazon for “StJean” or the full book title).  There will be no e-book as yet, but a hardcover should follow next year.

Barnes and Noble: https://www.barnesandnoble.com/w/cranky-bear-wakes-up-dr-shawn-stjean/1127591429?ean=9781981271863

PLEASE REVIEW on Amazon.com or in your favorite venue, should you give this work a try.  It should be turning up in the usual sales channels during the coming months.  Of course, you can contact the publisher directly for a copy with competitive pricing, and, upon request, an author’s signature.


Trailer-itis: Where’s the Justice in this League?


Movie Review by Shawn StJean

[spoilers follow]

When superhero-fatigue finally kills that movie genre’s big profitability, as it inevitably will, we diehard fans can lay part of the blame on trailers.  They’re symptomatic of the general ripoff the neighborhood cinema experience has become: once you walk in the door, your appetite will be ruined by the prospect of a six-dollar soda pop and a nine-dollar popcorn.  Trailers can make mediocre movies look good enough to get your bum in the seat; and apparently the philosophy of some studios, these days, is that the film itself holds no obligation to deliver on that promise.

A trailer–as the four that preceded Justice League certainly did–can strip-mine the film for its best fragments of dialogue and a few choice CGI effects, along with a licensed, catchy song.  It may hold back one or two plot twists, but if the film is disappointing enough, this might not even be a bad thing, from a promotional standpoint.

Much like a trailer, the JL feature film appears to me to have been cut with a blunt instrument, too quickly and heedlessly: too many repetitive shots of Aquaman and Wonder Woman bashing/being bashed by Steppenwolf, too many slo-mo shots of Wonder Woman in general.  All this to disguise the almost total lack of a plot.  It’s nonsense unworthy of a Supergirl episode.  And as a villain, Steppenwolf is a second-rate enforcer in his best moments, and not nearly as entertaining as the comedic Grandmaster of the vastly superior Thor: Ragnarok.  His evil parademon minions make the flying monkeys of The Wizard of Oz look menacing.  Anyone anticipating so much as a cameo by Jack Kirby’s New Gods’ uber-villain, Darkseid (which Steppenwolf’s appearance in the trailers surely suggests) will go away empty-handed.

What the trailer doesn’t show us is the pathetic resurrection of Superman from his supposed death at the end of Batman v Superman.  Since when do good guys hold (CGI-)seances, anyway, and when in the history of storytelling did that ever come to good?  The film even acknowledges this with two direct references to Stephen King’s Pet Sematary, and two of the five league members saying “this is a bad idea.”  Of course, whatever Superman might have been missing from his resurrection is taken care of in a ten-minute fight scene and a visit from Lois Lane.  Any decent writer/director team–or the editor. . .someone–should have realized that two characters digging up a grave would remind everyone over the age of 35 of Young Frankenstein–though with far less humor.

The film itself completely contradicts the message of the trailers, which feature Batman “putting together a team.”  Joss Whedon has shown repeatedly throughout his career, most recently for Marvel, that this Seven Samurai plot can be recycled to good effect, but no one here seems to have bothered to try.  The Justice League proves absolutely ineffectual without Superman, and no number of Beatles covers of “Come Together” can shroud that fact.  They might as well have used “Help!”  And, by the way, the script doesn’t attempt to make any sense of the idea of “Justice,” whatsoever.

There are a few laughs, and fun punches, and quiet moments: for example, Diana re-sockets Batman’s shoulder:  “You can’t do this forever, Bruce.”  “I’m barely doing it now,” he replies, in a nod to all of us mere mortals.  It’s not a horrible film: but if popcorn costs nine bucks, it had better not leave you hungry.  It should be noted that Gal Gadot’s Wonder Woman is the standout character for the third film in a row, and the trailers rightly emphasize this fact.  Her lasso of truth produces some of the decent moments of the movie, and Superman appears not to bat her around as easily as he does the others.  Don’t be surprised, then, if she continues, like Atlas, to carry the DC movie universe on her shoulders for the foreseeable future.



Evil Archetypes of Pop Culture: The Base Betrayer and the Panicking Fool


By Shawn StJean

In this latest entry in our popular ongoing series at Clotho’s Loom, we examine a pair of minor character-types that recur throughout literature and film.  They might both be subsumed under the blanket of moral coward, but I prefer to get a bit more granular for the benefit of the writers out there, because one is a more ancient incarnation and the other more contemporary.

The two were brilliantly used together in James Cameron’s Aliens (1986), the militarized sequel to Ridley Scott’s original (1979).  Paul Reiser was appropriately cast as the slimy, sweating Burke, a corporate lackey who attempts to impregnate the heroine Ripley and her adopted daughter, Newt, with xenomorph embryos for the Company’s bio-weapons division.  When she discovers her so-called friend’s plot, Ripley acts as Cameron’s mouthpiece: “Y’know, Burke, I don’t know which species is worse; you don’t see them fucking each other over for a goddamn percentage.”  I think anyone who has labored in an office building of any kind has known a Burke.

Long before his well-known stint as nice-guy explorer in Cameron’s Titanic, the late Bill Paxton earned a place in the director’s troupe as Private Hudson, an all-bluster Marine who kicks ass and takes names–until, that is, the tide of battle turns against his unit:  “I don’t know if you’re keeping score, but we just got our asses kicked, pal!”  In the hands of a Panicking Fool, a gun becomes a weapon against everyone around him, and friendly fire becomes as great a hazard as a battalion of enemy combatants.  This figure turns up especially in the more modern (Vietnam-era) war films.  It is probably the advent of firearms into narrative that makes him so much more viable for writers: running away only endangers oneself, whereas indiscriminate gunfire endangers everyone.

Dante reserved his 9th circle of Hell (Treachery, the worst of the circles) for the base betrayer.  Virgil leads the author-narrator to a position to witness the sufferings of Judas Iscariot, Brutus, and Cassius (traitors to Julius Caesar).  While Judas’ motives have in later times been complicated by artists such as Martin Scorsese, the original motive is the ever-popular one: silver (Burke).  Brutus, of course—best known in the presentation by Shakespeare—may have believed he acted in the best interest of the Roman republic, but the fact remains that he literally stabbed his friend without warning.  He did not challenge him to open combat, or a political battle of wits, or so much as a game of chess.  If judged only by his actions and not by his motives, Brutus qualifies as a moral coward.

The notion of moral cowardice and betrayal is complicated in western literature by the presence of women characters, from Medea to Gone Girl, because women are not traditionally expected to possess the analogue of moral courage.  This is a patently false prejudice of our culture, of course, but it has become so pervasive that, in a typical narrative stocked by male characters, any female will automatically fall suspect when a betrayal occurs.  Writers often exploit this.  They can, because male-centric narratives operate from a male perspective—including the idea that women’s motives and perspectives are, by definition, irrational, impenetrable, and mysterious.  “If there’s one thing I can’t stand, it’s a dirty, double-crossing dame,” insists Big Jim Colfax, one of the male arch-criminals of 1946’s The Killers, a quintessential, Hemingway-based film noir that sets the tone for the next 70 years, and counting.  For a man, murder, theft, and betrayal are par for the course—but if you’re a woman, well, let’s just say that things are about to get warm for you.

Inasmuch as many genre stories (horror movies and sci-fi, westerns too) function as microcosms (literally “small worlds”) in which groups of characters represent the types of people one would find in real life, a betrayer is bound to turn up, as well as a fool (if not a panicking one—the person who needs to have the hysteria slapped from her mouth).  Panickers lose the ability of rational thought in a crisis, and in fact never tend to think ahead in the first place.  So if you know someone who leaves the house in cold weather without a coat, who drives a car around without a spare tire, first aid kit of any kind, or so much as a screwdriver, and relies on a cellular telephone and a smile to bail her out of any crisis, you may know a potential panicking fool.  By contrast, a betrayer may be a cold and calculating logician, weighing beforehand the risk-versus-reward factor of every situation, and acting upon best opportunities.  It should go without saying that such a person will perfect the art of lying.  But with a little experience, these too can be spotted in the wild.  Anyone who spreads a lie or gossip about anyone else to you, is sooner or later going to do likewise about you.  Don’t kid yourself otherwise.  Unfortunately, the experience this costs, of learning what true friendship is the hard way, is dearly bought.

One reason archetypal characters function so well is that we, as reader/viewers, recognize the part of ourselves that they personify.  Deep inside of all of us there’s a little voice that shouts “Run! Scream! Run!”, and also a voice that looks at another person and says “If zombies chase us, I’m tripping you.”  Because high-stress situations bring out our most primitive, base instincts.

Luckily, panicking fools may not be as familiar to us in life as in texts, outside video-games (wherein every deathmatch map has one person running around in the open, firing wildly; snipers love this guy).  But anyone who has gone through an American secondary school knows the person who will sell you out, to further their own agenda: from miscarriers of misplaced confidences, to boyfriend-stealers (and cheating boyfriends), to friends who abandon you for better friends at the earliest opportunity.  And therein, I believe, lies the enduring popularity of the base-betrayer figure.  To the extent that all adult human beings are damaged, at least some of it can usually be traced back to betrayal.  When a person abuses your trust, they don’t simply harm you; they harm every single person, for the remainder of your life, who will be worthy of your trust.



The Collapse of American English


By Shawn StJean

Perhaps my accompanying photos are a trifle hyperbolic.  Perhaps.  It’s a truism among our global neighbors that Americans (by which I mean U.S. citizens) expect everyone, everywhere, to speak English.  The corollary, of course, is that most refuse to learn other languages, such as Spanish, even when the utility of doing so is abundantly clear.  But a looming problem for our culture in the 21st century seems to be that Americans increasingly decline even to learn English–at least beyond the 3rd or 4th grade level.

This level, supported by weak resources in the slang of the moment, proves sufficient for basic writing and speaking, but does not carry us far into the realm of critical thought and communication.

I choose the word “collapse” for my title, rather than “decline,” because I mean just that–what used to be a language with hundreds of thousands of specific, nuanced, and descriptive choices has imploded, and continues to implode, into fewer and fewer.  With the recession of traditional print media in the face of digital dissemination of what can charitably be called information, even simple affirmations like “Yes,” “certainly,” “definitely,” “acknowledged,” and “no doubt,” in the most extreme example of private text messaging, have all been replaced by a single letter: “K.”

Need this be a bad thing?  After all, what’s more efficient than “K”?  Doesn’t that free us up for more important, or at least more, activity?  Before answering, let’s look at some other casualties in this war for the space in our brains.

Examine the following short list of commonly used expressions, and you’ll realize that either they are purposefully and even defiantly vague, or that one word takes the place of many–indicative of the digital age we live in (compression, homogenization, and subtle loss of nuanced information):

“Do” replaces the verbs/actions “try” “give” “accept” “participate in” “contribute to” “tolerate” “clean.”  As in “I don’t do Christmas.”

“Go” replaces travel/venture/explore/pedal/fly/walk/hike/swim/jog and even “communicate something uncomfortable,” as in “Don’t go there.”

“huge” replaces big/large/important/significant/influential/knowledgeable/enthusiastic.  “I’m a huge fan.”  In my ear, this sounds ridiculous even on the face of it.  We all speak in metaphors of one degree or another all the time (“collapse” is a minor metaphor when not speaking of a physical structure), but the above expression equates to saying the gushing adorer is an abnormally large person (or ventilating device).  One might as well offer to wave oversized palm leaves, ancient-Egyptian style, at the object of worship.

“way” replaces very/much/far/long (“This license is way out of date.”  “This sauce has way more garlic than the recipe calls for.”)  This one in particular disturbs me because it demonstrates we aren’t just discussing slang here.  “Way” has been adopted not just in common speech, but by professional writers.  It has infiltrated the language in a permanent, um, way–ahem–manner.

“You’re all set.”

“It’s all good.”


“it’s all about”

“comes into play”

“deals with”

“back in the day”

Of course, words are invented, repurposed, and recombined all the time.  I must be overreacting.  Aren’t these replacing archaic usages?  We’ve got “tweet.”  And “text.”  “Sick,” “diesel.”  Oh, and “literally” can apparently now mean just the opposite, “metaphorically”–I mean, does it really matter?

“[   ] is a thing.”  Ah, yes, thing–the one catch-all noun when people grasp for a word and cannot find it, the very expression of inarticulateness, has become an official word to describe a fad, trend, icon, object of buzz or gossip, popular occurrence or consumer good, news item of the day, or week.  We had all those expressions, and they all relied upon small distinctions.  At this stage in human (d)evolution, we needed “thing”?

Okay.  Let’s say I’m right.  So the language is imploding.  What’s at stake here?

Many will not miss the subtleties that have dispersed into ether, I imagine.  Then again, it’s difficult to miss something you never knew you had.  What about the millions of unborn youngsters who will grow up with effective working vocabularies of a mere few thousand words?  Will they write poetry that amounts to more than a colorful tag on a railroad bridge?  Will they read it?  Will they understand the U.S. Constitution, even as they are increasingly called upon to “defend” it?  Will the historical records of the 19th and 20th centuries begin to sound as impenetrable as Shakespearean soliloquies do to us?  And I’m not talking about the kind of missing material in a contraction: to anyone but a fiction-writer or screenwriter, the distinction between “I’ve” and “I have” is not great.  One might use it to distinguish among characters who are high-born or low-born, for example.  For the rest of us, it’s merely a convenience.

George Orwell warned writers not to compose in clichés.  He claimed, essentially, that writing in the pre-digested shorthand phrases of others leads to thinking in the pre-digested shorthand phrases of others.  Other signs that your thinking has been compromised: Do you find yourself regularly Googling information that you could remember with just a bit of effort?  Are you trusting that information (or that from Wikipedia, Mapquest, Siri, or the CBS Evening News) enough to act upon it or pass it on to another human being without double-checking it?  Are you cutting-and-pasting that information (either in written or verbal form) without rephrasing it?  My overall point here is that there exist vital differences among raw data, information (processed data), and intelligence (interpreted information).  And yet many of us are not bothering to recognize them.  Not because we lack the cognitive ability, but because we lack the critical tools and the will to use them.

A brief [ mostly harmless] experiment should serve here.  Raise your hand if you like music.

That should include most of you, one hopes.  If you like music, you have probably in your time looked up some song lyrics.  In the old days, we read them out of LP album covers–which meant the source was the band’s record label, presumably direct from the songwriters themselves, which meant little chance of transmissional error.  Nowadays, we all know where song lyrics get found.  Dozens of websites cater to this need; even Google has gotten directly into the act through their search engine.  Look up a song or two that you know intimately, but whose performed and recorded lyrics are not 100% crystal-clear on a listen.  I can guarantee you that, as transcribed onto your website of choice, you will not be long in discovering blatant errors in those lyrics which materially alter their meaning.  Furthermore, and more appallingly to me, you will discover upon cross-checking that most, if not all, of the alternative websites repeat that same error.  Which means, of course, that they are all “borrowing” from each other, and profiting off both you and the songwriters with little regard for the truth.  Now, if the stakes here seem low to you, import your experiment to the television news programs.  Jon Stewart had a running bit on his incarnation of The Daily Show dedicated to proving that not only do major news outlets shamelessly plagiarize from each other, but they do so in unedited clichés.  Again, in the old days, we might double-check their intelligence in what used to be called printed newspapers.  Umm.  Except. . .

One of the great virtues of written language is its precision, yet increasingly written English begins to resemble spoken English, even in widely disseminated and professionally published print media.  And spoken English begins to resemble colloquial English.  Don’t think so?  Ask an octogenarian (someone born roughly during the Great Depression, as of 2017) if their parents would use the word “cool” as part of their everyday discourse.  Nowadays, try to find someone who doesn’t.  Not that I think “cool” has done the language any great harm.  As far as I can tell, it was first used in America, in its modern sense, by Emerson in the 1840s–which probably means it dates back even farther and derives from the British.  But this word may prove the exception rather than the rule.  As it is, it conflates a much more typically detailed appraisal of a person, event, or object.  A girl who might once have been variously described as “tolerant,” “forgiving,” “loose,” “free-thinking,” “substance-abusing,” or “not a nag” is now simply “cool.”

Of course, one might argue that simple is better; the fewer moving parts in a machine, the more reliable it is likely to be (read “mousetrap.”)

I doubt the sustainability of that argument.  Another, more insidious example: “fewer” vs. “less.”  Almost no one but your English teacher bothers with this one anymore.  Here’s why: who cares if your supermarket checkout line reads (correctly) “fewer than 12 items” or (incorrectly) “less than 12 items”?  Can’t we just dispense with one of these?  Well, we could.  Except one of them refers predominantly to individual items and people, and the other refers to objects in bulk or concepts.  That is, “fewer people are finding jobs their college degrees prepared them for.”  NOT “less people.”  Because those people are individuals, not some vague statistic.  There’s less forest, which means fewer trees.  There may be “less opportunity.”  There may be “less rain this year” or even “less cod in these waters.”  But if there are unaccountably “less people,” we had better start looking for them.  And reevaluating the value we place on human life.

I’d like to conclude with a different, more familiar example: possibly the most commonly transmitted text message in English:

Where R U

It (or some variant) is quick, serviceable, and doesn’t cost much effort to send, or–hypothetically–to answer.  And yet this message has probably caused more misunderstandings and needless arguments than most.  Why?  It’s laden with ambiguity (or even what deconstructors call “undecidability”).  In the absence of voice intonation, facial expression, pronunciation, linguistic context, primary and/or secondary punctuation, and so on, the receiver must interpolate those for herself.  Here’s how that might go, in response:

“None of your damn business.”

“Uh oh, he’s saying I’m late again.”

“Did I promise to be somewhere right now?”

“I’m at Main Street and Vine.”

“She really wants to know who I’m with, and what I’m doing.”

“I left an hour ago.”

Texts and tweets may count portability and quickness among their virtues, but they certainly cannot include clarity in that list.  Even among intimates, this message is as likely to lead to a dispute as to an informative reply.  Another aspect that’s missing, and increasingly missing from written communication especially, is any sense of formality, professionalism, or what used to be called politeness.  Now, you may say, “Well, that’s just a text message.”  Sure.  But ask yourself how many e-mails you have received without a greeting, a signature, an identification of the sender or introduction, or even so much as a rudimentary spell-check.  Did you answer them?  If you did, you, like all of us, are complicit in the process of collapse.  Compare these two e-mails, typical of what I, as a college professor, have received from freshman students:

[2007]  Dear Professor:  I’m sorry I missed class last Tuesday and Thursday as my grandmother died.  I misplaced my copy of the syllabus.  Can you tell me what we did in class so I can make up the work?  Thanks, Kayla

[2017]  I missed class last week would you tell me what I missed

Neither one of these qualifies as polished, professional communication–especially from a writing student–but I think you’ll agree that the former has a few lingering virtues to recommend it, which have gone glimmering in the latter.  In fact, were I to delve deeper into my records of the past, we’d find that the students of the 1990s had bothered to include my actual name; that the excuses were often more inventive and frequently included such touches as offers of doctor’s notes; that a request to meet in office hours was not unheard of upon missing a week’s worth of training; that the student might have actually acquired class notes from a peer before writing; that the student would bother to identify which of the four classes I teach she was enrolled in.

I’m not sure that the degradation of the language–as slow and inevitable as the abuse of the atmosphere that has summoned the effects of global warming–will contribute materially to the collapse of the society, the culture, or possibly even our civilization.  But I don’t fancy it helping.  It’s perhaps predictable that as our planet becomes more overpopulated, as more wealth becomes concentrated into fewer hands, and as such factors demand a parallel dynamic of information becoming the province of fewer people (collectors), the rest of us will not find encouragement to strengthen our language skills beyond the consumer sphere (that is, you and I only need to know how to communicate well enough to work and buy and perhaps sell a bit).

As for writing, a culture’s written language is the primary repository of its history.  Without a sense of history, it cannot evolve.

The solution?  Same as it’s always been, and the advice is good not just for writers, but for anyone who wishes to grow their brain and live up to something approaching their potential: READ.  Read anything.  Comic books, advertisements, editorials, romance novels, cereal boxes, movie credits.  Some are better than others, obviously.  Personally, I recommend Hawthorne, Hemingway, and Wharton, along with Carl Sagan for those whose tastes require something a little more contemporary–here was a man who knew a bit about large-scale collapse–but that’s just me.


Evil Archetypes of Pop Culture: the Crusader


By Shawn StJean

It seems as if Universal Studios’ inauguration of their “Dark Universe” franchise, beginning with The Mummy, should have monster-genre fans everywhere uncovering Easter eggs and salivating, in werewolf fashion, for future installments.  What’s next?  Creature from the Black Lagoon?  Dracula?  Frankenstein?  Given the success of Marvel Studios and its web of interconnected sagas, and the generosity of audiences even toward the far less compelling DC Comics movie adaptations, this seems a logical gamble in the Hollywood and Pinewood of 2017 and beyond.  More interesting to me, as I combine a sort of hit-and-run mini-review here with a broader, deconstructive cultural analysis, is how the real villains of The Mummy are not the title character and the soulless zombies she creates from humans by draining their life-force (souls) to revivify herself.

The film itself makes a dubious beginning to re-introducing the Dark Universe to the 21st century, relying as it often does on relentlessly flat jokes and, worse, convoluted exposition in place of any attempt at plot or characterization.  One could call Tom Cruise merely miscast, if his part weren’t so deplorably underwritten: supposedly a profiteering soldier who steals Mideastern relics, he mostly blinks his eyes and shakes his head through the bulk of the narrative, eventually alerting the audience for the dozenth time that the Mummy has telepathic access “inside his head.”  Rather, it’s Russell Crowe, playing Cruise’s antagonist Dr. Henry Jekyll, and his minions, who warrant our serious attention here.

Jekyll, keeping his nefarious Hyde persona barely at bay with regular injections, leads a secret society of monster-hunters with the self-appointed mission to rid the world of evil.  He’s willing to go so far as to facilitate the Mummy’s obsession with her “Chosen,” Cruise, allowing him to be killed with a sacrificial dagger and incarnate the Egyptian god of Death, Set, so that he can then be “obliterated” under controlled conditions, “a sacrifice for the greater good.”  I’ve discussed human sacrifice at length previously in this series, but here it points us back to the deep motivations of Jekyll, our modern-day crusader.  Edward Hyde, grappling with Cruise, points out “It’s Jekyll who wants to kill you,” whereas he wants a partnership with the sergeant.

The archetype of the crusader, not one that springs to mind immediately, nevertheless forms part of the canon of recurring iconic figures in myth.  Self-adorned in the garb and accoutrements of a White Knight, the Crusader’s single-minded pursuit–a holy mission–brings him/her repeatedly into a death-struggle with what s/he imagines to be incarnated evil (but which is only a projection of the knight’s own private sin), and s/he may even suffer from a savior/God complex, as here.  Think of Ahab: the white whale represents, to him, the sum total of all evil–not coincidentally having deprived the Captain of his own leg in a previous encounter.  In order to slay Moby-Dick, Ahab will sacrifice his ship and the lives of his entire crew, yet rationalizes this insane quest just as Jekyll does here: keep an eye on the big picture, fellas.

The film has great potential in this regard, but squanders several opportunities to fully realize its themes.  Cruise is called a thief and mercenary on the surface, with the soul of a good man attempting to emerge, whereas Crowe is a respectable doctor and leader hiding a soul of evil and “chaos” just beneath his respectable exterior.  His three-piece suit is the shield and cloak and sword-oath of a crusader, 900 years later, working ostensibly in a righteous cause while committing atrocities along the way.  The capture and subsequent torture of the Mummy should help us realize this, but the film has buried any sympathy we might have had for the title character under its unnecessary agenda of portraying her as wholly evil.  And the contrast between Crowe and Cruise is never made direct enough; the yin and yang never bleed into each other.  Crowe’s Jekyll does indeed have the final words–“it takes a monster to fight a monster”–but the line echoes too much like sequel-pimping.  We haven’t been shown the knights of the second Crusade desecrating Mesopotamian and Egyptian crypts, only told.  The crusader-knights turned to zombies, a refreshing change from the usual T-virus, should help: the mummy has reanimated their corpses to continue the mission they had in life–mindlessly carrying out someone else’s political agenda.  But the film is neither so subtle as to emphasize this impressionistically, nor so obvious as to have someone shout it out.  And honestly, as for sequels, I’d be happy never to see Cruise’s character again.

A crusader, like any soldier drafted into a foreign war, has to believe in the worth of the cause.  And yet, the deep disillusionment in the face of true horror in and around battlefields transforms the idealistic campaigner into a monster.  The cycle of post-Vietnam movies imported the process to American cinemas.  The best of the protagonists become world-weary and learn to hate the crusade itself; of course, since Universal is hoping to kick off a franchise, Jekyll can experience no such awakening.  We saw it most explicitly in Oliver Stone’s Platoon, but it may have been Sean Connery in 1976’s British Robin and Marian who expressed it most succinctly:  “I keep thinking of all the death I’ve seen. I’ve hardly lost a battle, and I don’t know what I’ve won. ‘The day is ours, Robin,’ you used to say, and then it was tomorrow. But where did the day go?”  Perhaps only coincidentally premiering during the year of America’s bicentennial and following the final withdrawal from Vietnam, the film nevertheless carried the message of counterculture from an empire whose sun had finally set on ocean slicks of blood.

And thus Cruise and his army cronies import the crusade against “insurgents” into modern-day Iraq.  The ruins and crypts and mass graves and the walking dead are what empires leave behind in their quest of manifest destiny.  See, Egypt hadn’t much taste for expansion–the film stretches noticeably to bury the mummy in the Persian Gulf, ancient Mesopotamia, seeming to want us to make the connection to the global political stage of the modern day, to function as social criticism against American Empire–yet it can’t resist the weight, or rather lack of it, of its special effects, star power, and declared identity as a traditional, if updated, “monster movie.”  Essentially, it sells out.

What’s missing from the narrative of The Mummy, failing a major rewrite, is for Cruise to have a genuine epiphany, whether accomplished through his psychic connection or whatever silly device: it’s men exactly like Jekyll, in complicity with men like himself, who create the mummies of the world in the first place.  The civilians, the displaced farmers, the maimed and burned children, the revengers all haunting the wasted landscape.  But this would bring him and Crowe’s character into an irreconcilable conflict.  Instead, Universal seems to want to form them into some half-assed Scooby Gang.

One final note: it’s perhaps a curious feature that the Mummy made her way to London, but watch for that motif as the series progresses: Stoker’s Dracula and Shelley’s Frankenstein both relied upon the device, which goes back at least as far as Beowulf.  A curse earned abroad must always, whether in the diseased persons of returning soldiers, or in boxes of stolen treasure, or in the more amorphous forms of displaced, refugee souls, make its way home.



The T-Virus: (p)Resident Evil Makes Sure It Doesn’t Miss You

by Ed Anger, opportunistic occasional contributor

Trivia.  Americans can’t look away, like a car wreck.  Since when did a tweet–ANY tweet–become newsworthy?  This is an avenue specifically designed to carry information Too Trivial for Traditional media.  If you missed it the first time, by definition it wasn’t important!

Twitter, or as I like to refer to this bottom of the social media barrel, Twerper.  After all, who but a Twit or Twerp would exchange insults through a means that cannot possibly have repercussions other than a 140-character counter-insult?  You think anyone’s gonna stand toe-to-toe with Arnold Schwarzenegger and tell him he was a lousy governor, no matter how many secret service agents he’s got at his back?

Apparently TV and Taxpayer money (the two dominant consonants of “TriVia” not coincidentally lurking there as well) aren’t hip enough anymore.  Just what we all needed: the Maury Povich show with semi-literate politicians!

The entire culture, having apparently run out of real topics–the trials and heroics of mere mortals don’t generate enough interest–has been inoculated (like that free flu-shot they give you at the supermarket that gives you the flu) and thus addicted to trivia.  Imagine yourself as the Texas Ranger who gets assigned to the case of Tom Brady’s missing Super Bowl jersey.  “Hell, it’s not bad enough this fella makes 300 times my salary for playing ball, now I get to track down his dirty laundry.  So this is what my career has come to. . .”  Let’s hope nobody robs a bank or kills somebody while he’s not at his post.

Newspapers have already become as thin as Target flyers. Once they start reprinting tweets, we’ll have hit the Trifecta of redundant, useless information that distracts us from the latest global warming evidence or how the debt-ceiling got hiked today.

So T-Rump (how does one type with hands like a T-Rex?) goes after Schwarzenegger, and the Governator shoots back a better one.  What a pair of Trumps.  Ad nauseam.  Then we tune in to the 6-o’clock news to witness how the leaders of our nation have devolved to antics that most of us outgrew as 11-year-old children.  Quite a Trip.

I’ve had enough of this Twaddle.  I think I have a case of the D.T.s.  Need a drink. . .