The Collapse of American English

By Shawn StJean

Perhaps my accompanying photos are a trifle hyperbolic.  Perhaps.  It’s a truism among our global neighbors that Americans (by which I mean U.S. citizens) expect everyone, everywhere, to speak English.  The corollary, of course, is that most refuse to learn other languages, such as Spanish, even when the utility of doing so is abundantly clear.  But a looming problem for our culture in the 21st century seems to be that Americans increasingly decline even to learn English–at least beyond the 3rd or 4th grade level.

This level, supported by weak resources in the slang of the moment, proves sufficient for basic writing and speaking, but does not carry us far into the realm of critical thought and communication.

I choose the word “collapse” for my title, rather than “decline,” because I mean just that–what used to be a language with hundreds of thousands of specific, nuanced and descriptive choices has converged, and continues to converge and implode, into fewer and fewer.  With the recession of traditional print media in the face of digital dissemination of what can charitably be called information, even simple affirmations like “Yes,” “certainly,” “definitely,” “acknowledged,” and “no doubt,” in the most extreme example of private text messaging, have all been replaced by a single letter: “K.”

Need this be a bad thing?  After all, what’s more efficient than “K”?  Doesn’t that free us up for more important, or at least more, activity?  Before answering, let’s look at some other casualties in this war for the space in our brains.

Examine the following short list of commonly used expressions, and you’ll realize that either they are purposefully and even defiantly vague, or that one word takes the place of many–indicative of the digital age we live in (compression, homogenization, and subtle loss of nuanced information):

“Do” replaces the verbs/actions “try” “give” “accept” “participate in” “contribute to” “tolerate” “clean.”  As in “I don’t do Christmas.”

“Go” travel/venture/explore/pedal/fly/walk/hike/swim/jog and even “communicate something uncomfortable,” as in “Don’t go there.”

“huge” /big/large/important/significant/influential/knowledgeable/enthusiastic.  “I’m a huge fan.”  In my ear, this sounds ridiculous even on the face of it.  We all speak in metaphors of one degree or another all the time (“collapse” is a minor metaphor when not speaking of a physical structure), but the above expression equates to saying the gushing adorer is an abnormally large person (or ventilating device).  One might as well offer to wave oversized palm leaves, ancient-Egyptian style, at the object of worship.

“way” very/much/far/long (“This license is way out of date.” “This sauce has way more garlic than the recipe calls for.”)  This one in particular disturbs me because it demonstrates that we aren’t just discussing slang here.  “Way” has been adopted not just in common speech, but by professional writers.  It has infiltrated the language in a permanent, um, way–ahem–manner.

“You’re all set.”

“It’s all good.”

“basically”

“it’s all about”

“comes into play”

“deals with”

“back in the day”

Of course, words are invented, repurposed, and recombined all the time.  I must be overreacting.  Aren’t these replacing archaic usages?  We’ve got “tweet.”  And “text.”  “Sick,” “diesel.” Oh, and “literally” can apparently now mean just the opposite, “metaphorically”–I mean, does it really matter?

“[   ] is a thing.”  Ah, yes, thing–the one catch-all noun when people grasp for a word and cannot find it, the very expression of inarticulateness, has become an official word to describe a fad, trend, icon, object of buzz or gossip, popular occurrence or consumer good, news item of the day, or week.  We had all those expressions, and they all relied upon small distinctions.  At this stage in human (d)evolution, we needed “thing”?

Okay.  Let’s say I’m right.  So the language is imploding.  What’s at stake here?

Many will not miss the subtleties that have dispersed into ether, I imagine.  Then again, it’s difficult to miss something you never knew you had.  What about the millions of unborn youngsters who will grow up with effective working vocabularies of a mere few thousand words?  Will they write poetry that amounts to more than a colorful tag on a railroad bridge?  Will they read it?  Will they understand the U.S. Constitution, even as they are increasingly called upon to “defend” it?  Will the historical records of the 19th and 20th centuries begin to sound as impenetrable as Shakespearean soliloquies do to us?  And I’m not talking about the kind of missing material in a contraction: to anyone but a fiction-writer or screenwriter, the distinction between “I’ve” and “I have” is not great.  One might use it to distinguish among characters who are high-born or low-born, for example.  For the rest of us, it’s merely a convenience.

George Orwell warned writers not to compose in clichés.  He claimed, essentially, that writing in the pre-digested shorthand phrases of others leads to thinking in the pre-digested shorthand phrases of others.  Other signs that your thinking has been compromised: Do you find yourself regularly Googling information that you could remember with just a bit of effort?  Are you trusting that information (or that from Wikipedia, Mapquest, Siri, or the CBS Evening News) enough to act upon it or pass it on to another human being without double-checking it?  Are you cutting-and-pasting that information (either in written or verbal form) without rephrasing it?  My overall point here is that there exist vital differences among raw data, information (processed data), and intelligence (interpreted information).  And yet many of us are not bothering to recognize them.  Not because we lack the cognitive ability, but because we lack the critical tools and the will to use them.

A brief [mostly harmless] experiment should serve here.  Raise your hand if you like music.

That should include most of you, one hopes. If you like music, you have probably in your time looked up some song lyrics.  In the old days, we read them out of LP album covers–which meant the source was the band’s record label, presumably direct from the songwriters themselves, which meant little chance of transmissional error.  Nowadays, we all know where song lyrics get found.  Dozens of websites cater to this need; even Google has gotten directly into the act through their search engine.  Look up a song or two that you know intimately, but the performed and recorded lyrics of which are not 100% crystal-clear by listening. I can guarantee you that, as transcribed onto your website of choice, you will not be long in discovering blatant errors in those lyrics which materially alter their meaning.  Furthermore, and more appallingly to me, you will discover upon cross-checking that most, if not all, of the alternative websites repeat that same error.  Which means, of course, that they are all “borrowing” from each other, and profiting off both you and the songwriters with little regard for the truth.  Now, if the stakes here seem low to you, import your experiment to the television news programs.  Jon Stewart had a running bit on his incarnation of The Daily Show dedicated to proving that not only do major news outlets shamelessly plagiarize from each other, but they do so in unedited clichés.  Again, in the old days, we might double-check their intelligence in what used to be called printed newspapers.  Umm.  Except. . .

One of the great virtues of written language is its precision, yet increasingly written English begins to resemble spoken English, even in widely disseminated and professionally published print media.  And spoken English begins to resemble colloquial English.  Don’t think so?  Ask an octogenarian (someone born roughly during the Great Depression, as of 2017) if their parents would use the word “cool” as part of their everyday discourse.  Nowadays, try to find someone who doesn’t.  Not that I think “cool” has done the language any great harm.  As far as I can tell, it was first used in America, in its modern sense, by Emerson in the 1840s–which probably means it dates back even farther and derives from the British.  But this word may prove the exception rather than the rule.  As it is, it conflates a much more typically detailed appraisal of a person, event, or object.  A girl who might once have been variously described as “tolerant,” “forgiving,” “loose,” “free-thinking,” “substance-abusing,” or “not a nag” is now simply “cool.”

Of course, one might argue that simple is better; the fewer moving parts in a machine, the more reliable it is likely to be (read “mousetrap.”)

I doubt the sustainability of that argument.  Another, more insidious example: “fewer” vs. “less.”  Almost no one but your English teacher bothers with this one anymore.  Here’s why: who cares if your supermarket checkout line reads (correctly) “fewer than 12 items” or (incorrectly) “less than 12 items”?  Can’t we just dispense with one of these?  Well, we could.  Except one of them refers predominantly to individual items and people, and the other refers to objects in bulk or concepts.  That is, “fewer people are finding jobs their college degrees prepared them for.”  NOT “less people.”  Because those people are individuals, not some vague statistic.  There’s less forest, which means fewer trees.  There may be “less opportunity.”  There may be “less rain this year” or even “less cod in these waters.”  But if there are unaccountably “less people,” we had better start looking for them.  And reevaluating the value we place on human life.

I’d like to conclude with a different, and more familiar, example; possibly the most commonly transmitted text message in English:

Where R U

It (or some variant) is quick, serviceable, doesn’t cost much effort to send, or–hypothetically–to answer.  And yet this message has probably caused more misunderstandings and needless arguments than most.  Why?  It’s laden with ambiguity (or even what deconstructors call “undecidability”).  In the absence of voice intonation, facial expression, pronunciation, linguistic context, primary and/or secondary punctuation, and so on, the receiver must interpolate those for herself.  Here’s how that might go, in response:

“None of your damn business.”

“Uh oh, he’s saying I’m late again.”

“Did I promise to be somewhere right now?”

“I’m at Main Street and Vine.”

“She really wants to know Who am I with, and What am I doing?”

“I left an hour ago.”

Texts and tweets may count portability and quickness among their virtues, but they certainly cannot include clarity in that list.  Even among intimates, this message is as likely to lead to a dispute as to an informative reply.  Another aspect that’s missing, and increasingly missing from written communication especially, is any sense of formality, professionalism, or what used to be called politeness.  Now, you may say, “Well, that’s just a text message.”  Sure.  But ask yourself how many e-mails you have received without a greeting, a signature, an identification of the sender or introduction, or even so much as a rudimentary spell-check?  Did you answer them?  If you did, you, like all of us, are complicit in the process of collapse.  Compare these two e-mails, typical of what I, as a college professor, have received from freshman students:

[2007]  Dear Professor:  I’m sorry I missed class last Tuesday and Thursday as my grandmother died.  I misplaced my copy of the syllabus.  Can you tell me what we did in class so I can make up the work?  Thanks, Kayla

[2017]  I missed class last week would you tell me what I missed

Neither one of these qualifies as polished, professional communication–especially from a writing student–but I think you’ll agree that the former has a few lingering virtues to recommend it, which have gone glimmering in the latter.  In fact, were I to delve deeper into my records of the past, we’d find that the students of the 1990s had bothered to include my actual name; that the excuses were often more inventive and frequently included such touches as offers of doctor’s notes; that a request to meet in office hours was not unheard of upon missing a week’s worth of training; that the student might have actually acquired class notes from a peer before writing; that the student would bother to identify which of the four classes I teach she was enrolled in.

I’m not sure that the degradation of the language–as slow and inevitable as the abuse of the atmosphere that has summoned the effects of global warming–will contribute materially to the collapse of the society, the culture, or possibly even our civilization.  But I don’t fancy it helping.  It’s perhaps predictable that as our planet becomes more overpopulated, as more wealth becomes concentrated into fewer hands, and as such factors demand a parallel dynamic of information becoming the province of fewer people (collectors), the rest of us will not find encouragement to strengthen our language skills beyond the consumer sphere (that is, you and I need only know how to communicate well enough to work and buy and perhaps sell a bit).

As for writing, a culture’s written language is the primary repository of its history.  Without a sense of history, it cannot evolve.

The solution?  Same as it’s always been, and the advice is good not just for writers, but for anyone who wishes to grow their brain and live up to something approaching their potential: READ.  Read anything.  Comic books, advertisements, editorials, romance novels, cereal boxes, movie credits.  Some are better than others, obviously.  Personally, I recommend Hawthorne, Hemingway, and Wharton, along with Carl Sagan for those whose tastes require something a little more contemporary–here was a man who knew a bit about large-scale collapse–but that’s just me.


Issues for Indie Authors: Revising The Script, One Strong Verb at a Time


By Shawn StJean

Ever catch yourself substituting a wrong word for the right one, on purpose?  Dumbing your language down?  No?

Liar.  (As Emerson once opined, sometimes only one word works.  Like “Damn.”)

The collapse and convergence or shrinking of our language should be apparent to anyone who’s listening and reading: just observe how the word “way” (original meaning: path) has begun to permanently replace at least three other words in modern English: much, far, very.

“It’s way too easy” / “Mine’s way better”

“We go way back”/ “This happens way too often”

“That skirt’s way cool”

These in addition to its now-standard colloquial uses: “There’s no way I’m going there.” [slang for “possibility”]

We, as a literate culture, have somehow managed to lose our way [ahem.]

No, it isn’t just kids.  Watch your own language.  Authors and editors of published books and even what’s left of our newspapers have accepted such sentences as correct for twenty years.  But there’s something not so obvious here, like a complicating infection from an original illness.  And what we do once on purpose–to fit in, to seem up-to-date, for verisimilitude in dialogue, and so on, we repeat out of habit.

It’s all about verbs–weak ones–like the one in this sentence.  Go ahead–I’ll wait while you seek it.

You unearthed the problem, apprehended it, discovered it.   You found it.  Got it.  Yeah, my bad.

The verb IS (the infinitive “to be,” conjugated further as “was,” “were,” “are,” “being”), lurking underneath those apostrophes and contractions, stands low as the base of a problematic pyramid, and the issue goes very deep, to the foundations of illiteracy.  “To be,” as the weakest verb in our language, gets the most use.  It serves slave-duty.  Other third-tier infinitives: “To go.” “To do.” “To say.” “To see.”

Next come hundreds of second-tier verbs, and even people who read frequently can get mired at this level, for their entire lives.   “I see what you mean.”  “I get it.” “I said so.”  “I went there.”

Crucial point: I remind my students, ad nauseam, that we don’t just desire better-sounding verbs–we require more efficient verbs–ones that do more work.  “Attempt” may work no better than “try,” depending on context.  This advice runs counter to everything they assume–because everybody knows, the longer your essay, and the fancier the vocabulary, the higher the grade, right? (or the more pages in the book, the more money you can charge.)

One-dimensionality needs vigilant guarding against.

Now, among young folks I often like to point to pop culture for my examples, along with occasional pedantic references to Shakespeare and Milton.  Pop music functions well–great, thoughtful artists struggle right alongside horribly mediocre ones. Take:

Rush–the band’s name itself is a multi-signifying verb–though it’s also a noun.  Like the members themselves, the name works hard.  Check them out–and pay attention to the lyrics.

But let’s examine a more current example.  As I attended high school sporting events and practices this summer, I heard much motivational music blaring from loudspeakers.  Here’s a YouTube link, for the uninitiated, to The Script’s excellent video for “Hall of Fame”: https://www.youtube.com/watch?v=mk48xRzuNvA

The music video reaches a potential that the song itself does not.  As good and catchy and emotionally stirring as this tune featuring will.i.am is, and as genuinely great as it aspires to be, it ironically relies on some of the weakest language available in English.  And remember, student-athletes hearing it are absorbed in the act, at that very moment, of pushing themselves to become better.

Be students
Be teachers
Be politicians
Be preachers

There’s a certain limited value in the repetition, or parallelism here.   And there’s the musical issue–the stanza requires one-syllable verbs.  HOWEVER, take a look at what just a little more thought can accomplish:

VS.

Be students
Seek teachers
Hear politicians
Heed preachers

The revision emphasizes the process of becoming over the state of being, as every kid jock (to say nothing of ballerinas and scrawny boxers) in the grind of rehearsal, workout, or practice knows at a gut level–you have to work hard at it.  And real students–of life–require more than simple classroom attendance, or book learning.  One must venture out and interact with others–listen, try, do, fail, succeed, fail again, try again, work harder.  I’d argue that a great deal of resonance has been added by these revisions: rather than substituting meaning, they multiply it.

Third-tier verbs function merely as connectors (“Jack was happy.”).  Second-tier [vague and nondescriptive] verbs communicate the basic idea and no more (“I said it”), and first-tier [the best word for the job] verbs ennoble us: make us think, challenge us, inspire us, reward our effort.

Here’s a better verse from the same song:

You can move a mountain
You can break rocks
You can be a master
Don’t wait for luck
Dedicate yourself and you can find yourself

“Way” better.  And it so happens that a one-syllable word like “wait” can be the exact, perfect one.

POST-TEST.  Some might object that pop music makes an easy target.  Fair enough.  For you writers out there, here’s another example of how commercial success does not require anything like the higher standards I’ve described above.  Tune in on that frequency as you read.  Perhaps these opening paragraphs of Stieg Larsson’s The Girl With the Dragon Tattoo could be improved?

It happened every year, was almost a ritual. And this was his eighty-second birthday. When, as usual, the flower was delivered, he took off the wrapping paper and then picked up the telephone to call Detective Superintendent Morell who, when he retired, had moved to Lake Siljan in Dalarna. They were not only the same age, they had been born on the same day—which was something of an irony under the circumstances. The old policeman was sitting with his coffee, waiting, expecting the call.

“It arrived.”

“What is it this year?”

“I don’t know what kind it is. I’ll have to get someone to tell me what it is. It’s white.”

“No letter, I suppose.”

“Just the flower. The frame is the same kind as last year. One of those do-it-yourself ones.”

“Postmark?”

“Stockholm.”

“Handwriting?”

“Same as always, all in capitals. Upright, neat lettering.”

With that, the subject was exhausted, and not another word was exchanged for almost a minute. The retired policeman leaned back in his kitchen chair and drew on his pipe. He knew he was no longer expected to come up with a pithy comment or any sharp question which would shed a new light on the case. Those days had long since passed, and the exchange between the two men seemed like a ritual attaching to a mystery which no-one else in the whole world had the least interest in unravelling.

The Latin name was Leptospermum (Myrtaceae) rubinette. It was a plant about four inches high with small, heather-like foliage and a white flower with five petals about one inch across. The plant was native to the Australian bush and uplands, where it was to be found among tussocks of grass. There it was called Desert Snow. . .

Now, in America we all seem to believe that one can’t argue with success.  Yet, if this remains the best we professionals can do, I’m a little concerned about the future of the amateurs.  Because let us not forget: reading and writing remain the best activities for promoting critical thinking and growing the human brain.  Students have been taught to write in the passive voice (sentences with no actor in them, like this one.  Who taught the students to do it?), which solves a few problems (overuse of “I”), but the cure becomes worse than the disease.  It leads to clichéd and passive thinking.

I have no opinion on the plotting, characterization, attention-getting ability, expositional effectiveness, or any other aspect of Larsson’s work here.  He may well be a genius beyond my ken.  My example only applies to his use of language, which, by the standards described in this article, scores “mediocre” at best.  His characters certainly should be forgiven for their terseness and inarticulateness, designed in by the author as part of a shorthand between intimates.  In fact, in many ways, they speak better than the narrator (who, in two cases, uses “to be” forms three times in one sentence).  This may sound pompous of me and hopelessly outmoded, but I would never let one of my own students get away with that.

Now lest anyone object that these can’t be improved–that sometimes one must use a lesser word–you are correct.  It’s true.  But, most of the time, it only takes another pass.  And some sweat of the brow.

REVISION:

The plant, native to the Australian bush and uplands, grew [hid, nestled, awaited discovery] among tussocks of grass.

Much more efficient–AND the emphasis shifts to the important element under discussion–the plant itself.  But then again, not the best way to get paid by the word, fill up more pages, consume people’s time, or, much like the rare flower of the book, encourage the growth of readers among the stagnant masses.

Perhaps the world does know Larsson’s name–for the moment–and I’m sure he’s made his money.  Will he, or The Script and will.i.am, ever share company with that other famous William, of the 16th century, master of i.am.bic pentameter?  I wonder: After all, no one ever rode into the Hall of Fame on their third- and second-best.

Most sink to the master standard of our time–“Good enough”–or tread water with the Good they were born with.  Only the few rise to Greatness.  Because they’re willing to earn it.  Learn.  Sweat.  Think.  Work harder, smarter, and better.

But hey, as the Most like to say: it is what it is.

Right?
