The Disintegration of Language
March 1, 2012
It occurred to me that it may not be difficult to understand how our society has come to so blithely consider corporations people and people little more than human resources, or things amenable to exploitation.
In his book of essays Standing By Words (1984), Wendell Berry, America’s premier wordsmith and agrarian philosopher, wrote:
“Two epidemic illnesses of our time – upon both of which virtual industries of cures have been founded – are the disintegration of communities and the disintegration of persons. That these two are related (that private loneliness, for example, will necessarily accompany public confusion) is clear enough. What seems not so well understood, because not so much examined, is the relation between these disintegrations and the disintegration of language. My impression is that we have seen, for perhaps a hundred and fifty years, a gradual increase in language that is either meaningless or destructive of meaning. And I believe that this increasing unreliability of language parallels the increasing disintegration, over the same period, of persons and communities.”
The Chinese character on the cover of this volume depicts a man beside the sign for “word”. It is the written form of xin, which Ezra Pound defined as: “Fidelity to the given word. The man here is standing by his word”.
Berry writes: “Such fidelity to the word, as evidenced by clarity of meaning and intent, would go far to reconnect language to life. Without a renewed sense of language we cannot hope to restore balance, harmony, and coherence to our lives, our land, and our communities…”
The year of publication, 1984, of course, also recalls George Orwell’s classic dystopian novel, published in 1949, in which he coined, among others, two new terms. Doublethink: the act of simultaneously accepting two mutually contradictory beliefs as correct. Newspeak: a fictional, deliberately impoverished language promoted by the authoritarian state to prevent alternative thinking.
What would be immeasurably worse than an imposed restrictive language would be the deliberate or negligent debasing of our common language.
In early American society, we would judge a man by whether he was “a man of his word”, by whether he stood by his word and could be trusted to both say what he meant and mean what he said. Such a man was also called “plain-spoken”.
Today, the careless use of language mirrors our careless use of both the earth’s natural heritage and the uniquely human tools that we’ve created to enhance our abilities to make a life. It may also mirror our careless, if not reckless, use of one another – as low-valued producers of our “goods”, as consumers of our products and services, and as members of what was once valued as the core of social life: community.
Cause and effect are easily confused. Perhaps, living in a world in which the law has long considered corporations as “persons” would lead one to unconsciously think of such a human artifice as a “who”. But I suspect, as I believe does Wendell Berry, that much of the malaise and cognitive confusion of our society is due to a widespread lack of diligence in the use of one of the most important and signature qualities of humanity: spoken and written language.
I was fortunate to have been born and come of age in a time before the digitalization of language and the transmission of words at the speed of light. I learned to write in longhand with pencil on paper and used nothing more complex than a manual typewriter in college to transcribe my hand-written theses into finished form. It is telling that I wrote all those college papers in a single draft, and so did not need the wizardry of digital cut-and-paste to re-order my thoughts or of automatic spell-checkers to correct my words and syntax. Writing by hand allowed me the time to think through, not only each thought, but the context of each thought within the organization of the whole thesis.
I will admit to enjoying the luxury of a word processor when, some years later, I again attended college classes which required a great amount of writing. But I had already forged the ability to think as I wrote, without my writing getting ahead of my ability to comprehend the entirety of what I intended to convey. I should also state that, far from being a technological Luddite, I was programming mainframe computers in 11th grade in 1968, worked for my father as a statistical analyst on a computer terminal in 1977, and taught my son to program PCs in 1992, while writing hundreds of batch-file tertiary programs for my own use on a DOS-based computer. To this day, I am grateful for the ease of editing on a word processor and I do all my own engineering calculations on self-generated spreadsheets. But I am neither dependent upon them nor limited by them. They enhance the skills I’ve developed independently of them.
Today’s youth, however, have grown up with machines and computers affixed to their bodies: first Walkmans, then iPods, and now iPads, “smart” phones, and laptops. They know little else, are almost completely dependent upon personal technologies, and are in large measure defined by them. This has had a dramatic effect on language.
When email became a dominant mode of communication, largely replacing the sacrament of a written letter, I communicated no differently electronically than I would in longhand. Email (or electronic mail) was a convenience – both less expensive and faster than postal mail – but it encouraged and trained succeeding generations in speed writing to match the inherent speed of the medium and the consequent acceleration of our social world. The later developments of text (instant) messaging and Tweets only exacerbated an already diminished capacity to communicate carefully with the written (or key-punched) word. And it encouraged the dispatch of messages that one might not have sent into the un-erasable “cloud” of cyberspace had one taken more time, thoughtfulness, and care.
A consequence, not only of this latest generation of technological gadgets, but also of a generalized diminution of care in our use of both things and each other (which is, itself, a consequence of the accelerating pace and complexity of life) is that both written and spoken language have deteriorated in quality and clarity.
As a long-time user and lover of tools – both mechanical and electronic – I’ve developed the habit of using them with care, both so that they will last and so that their function remains intact. For instance, I would never use a good chisel as a screwdriver or a prybar – that is not only an inappropriate and disrespectful usage but would also tend to diminish the capacity of the tool to perform its intended function. The same is true of language. Use it with respect and care, and it will not only perform its intended function – clear and cogent communication – but also convey respect, along with meaning, to the recipient. If I use sloppy language, it suggests that I don’t care about my ideas and also that I don’t care about you.
The rise of the initial “so” (which seems to have come out of the Silicon Valley programming culture) is, according to Michael Erard, the author of Um…: Slips, Stumbles, and Verbal Blunders, and What They Mean, “another symptom that our communication and conversational lives are chopped up and discontinuous in actual fact, but that we try in several ways to sew them together – or ‘so’ them together, as it were – in order to create a continuous experience”.
This, I believe, is a direct result of both the speed of life and communication and the consequent fragmentation of thought that is required in a Tweeting culture. I’m hearing this linguistic habit more commonly among interviewees on radio programming, who answer almost every question with a sentence beginning with the conjunction “so” – a word intended to connect a previous thought, often in a causal capacity, with a subsequent one: “If this, then that”. It seems also to be intended to give logical weight and credibility to a statement by syntax where such veracity may not be inherent in its meaning. There needs to be a new logical fallacy named after this: perhaps argumentum ad syntacticum.
Another increasingly common usage is an introductory clause followed by a secondary, redundant verb: “The thing is”, “the fact is”, “the point is”, etc. followed by “is that…”. The double copula is the usage of two successive subject-predicate connectives when only one is necessary. For example: “My point is, is that…”. This usage also shows up with a hidden double “is”: “The problem being, is that…”.
The problem is, is that this is becoming almost epidemic in its proportions in modern spoken English (irony implied).
Until these latter two had become endemic, my primary peeve had been the misplaced “only”, which is almost universal in today’s spoken and written language – even among professional journalists and other allegedly educated and intelligent people. Grant Barrett, lexicographer and co-host of the NPR show A Way With Words, insists that this is now normative and hence OK.
Example: “I only went to the mall” (in answer, perhaps, to the question “Did you disobey me?”), which means the speaker did nothing other than go to the mall, rather than the intended meaning that the speaker didn’t go anywhere other than to the mall. This muddles the meaning by suggesting that “I” didn’t disobey you in any other way, such as by hanging out with the wrong friends or smoking or drinking. The correct syntax, “I went only to the mall”, can be followed up with “Yes, but with whom did you go and what else did you do at the mall?” (though almost no one today uses “with whom” in place of the dangling preposition – “who did you go to the mall with?” – and I’m less concerned about that as it doesn’t reduce or confuse meaning).
While the argument for this usage is that the intended meaning is conveyed, I find that simply an excuse for linguistic laziness. It’s a laziness which has the effect of diminishing the range of meaning that such a modifier as “only” allows. As an adjectival modifier, “only” isolates a single thing and as an adverbial modifier it isolates a single action, but placing it immediately before the verb when it’s intended to limit the predicate object eliminates its potential to express a limit to the action.
A similar usage problem occurs with “just”. “You just have absolute tree carnage with this heavy snow just straining the branches”, said National Weather Service spokesman Chris Vaccaro (AP news story 10/30/2011). What this sentence actually states is that there is nothing other than “absolute tree carnage” and “this heavy snow” was doing nothing other than “straining the branches”. In both cases, the word “just” as a synonym for “only” is both unnecessary and misleading.
For those with a sufficiently critical eye to have noticed the placement of the period and comma outside of the quotation marks (in the preceding paragraph), rather than the accepted inside position, I will mention a major peeve that is based on simple syntactical logic.
If an entire sentence is quoted as a free-standing sentence, such as – “You just have absolute tree carnage with this heavy snow just straining the branches.” – then it’s perfectly appropriate to include the sentence punctuation within the quotation marks. But, if the quotation is embedded within another sentence, such as – “You just have absolute tree carnage with this heavy snow just straining the branches”, said National Weather Service spokesman Chris Vaccaro – then the comma (in this case) should separate the quotation from the rest of the embedding sentence.
This is even more obvious with quoted words, phrases or clauses, such as above: A similar usage problem occurs with “just”. It would make no grammatical sense to include the period within the quotes as the word “just” is not a sentence requiring punctuation and the full sentence concludes following the closing quotation mark.
An earlier sentence above demonstrates the use of quoted clauses:
Another increasingly common usage is an introductory clause followed by a secondary, redundant verb: “The thing is”, “the fact is”, “the point is”, etc. followed by “is that…”.
Since the quoted clauses are part of a series, the separating commas should logically be between the quotations, not within them, and the final punctuating period of the encompassing sentence should be isolated from the concluding quotation and its own punctuating ellipsis.
The only plausible exception to this logically evident (but non-standard) rule is when the included quotation is a question: The correct syntax, “I went only to the mall”, can be followed up with “Yes, but with whom did you go and what else did you do at the mall?”. Yet this syntax requires a secondary concluding period for the encompassing sentence (even though my spell/syntax checker doesn’t like it).
A lesser, but still annoying, peeve is the confusion between “if” and “whether” and the redundant use of the negation with “whether”.
The term “if” is used for a conditional idea, “whether” for an alternative or possibility. Thus, “let me know if you’ll be coming” means that I want to hear from you only if you’re coming (but not that “I only want to hear from you if you’re coming”), while “let me know whether you’ll be coming” means that I want to hear from you one way or the other. And since “whether” already implies the alternative “or not”, it’s unnecessary to state it.
Other common language abuses include these:
| misuse | in place of | when referring to | rather than |
|---|---|---|---|
| over | more than | quantity or amount | vertically above |
| under | less than or fewer than | quantity or amount | vertically below |
| less than | fewer than | discrete numerical units | mass quantity |
| incidences | incidents | events | rates of occurrence |
May be reproduced only with attribution for non-commercial purposes