In language, as in so many areas of life, what once were vices are now habits.
“She was not always popular – and I was so shame!” said Prime, who is of Ngāpuhi descent. It was a new usage – in this case, one common in Māori-English dialect – to enter Hansard: not shamed, or ashamed, but shame, a noun used as a descriptor. As they say on social media, it’s a thing. And there’s more where that came from, such as the use of bias as an adjective, as in “you shouldn’t comment on this because you’re bias”.
Defenders of orthodox grammar – the technical term is prescriptivists – bridle at such liberties, but more and more of them are coming down the pike, and there are fewer and fewer (as distinct from less and less) gatekeepers. Straitened media outlets are cutting back on subeditors, so copy often goes to print or to air with no grammatical checks. Television reporters long ago stopped using sentences, deciding that in a typical 80-second report, prepositions waste valuable time.
Ways of teaching language are constantly being questioned by educational researchers, and in 2015, schoolchildren in the land of the Queen’s English returned a particularly poor overall national exam result in the subject. The most recent horror for the pearl-clutchers came when the Economist, a beacon of clear language, used the cover line “Who Cyril Ramaphosa should fire”, referring to the new South African President. Readers complained, but the editor had decided that “whom”, though correct, was “unacceptably stilted”.
The magazine’s Johnson language column had earlier predicted the demise of the perennially confusing objective case. “For whom the bell tolls”, the columnist concluded, saying using “who” instead rarely caused confusion. Johnson further baited the magazine’s exacting readership with a headline, “OMG, the internet is ruining language, amirite? Wrong.” The columnist concluded that online argot, fast-developing and impenetrable to many older people, was at worst “meh” (meaning ho-hum), and on the bright side, rather good fun.
Indignation at shortcuts and vulgarities entering the lexicon is not new. Take the quaint-seeming exclamations “gadzooks!” and “zounds!”, for instance – deplored in their heydays, the late 17th and late 16th centuries respectively, because they derived from “God’s hooks” and “God’s wounds”.
There seems to be a growing new orthodoxy: fluidity of grammar and usage is healthy and to be embraced, provided the communication is still clear. Resistance is futile, say the progressives, known as descriptivists, who hold that a language is defined by what people do with it.
The style arbiter for the sparky popular culture website BuzzFeed, Emmy Favilla (a descriptivist), takes the view that “being resistant to change is impractical – and it can make you seem stodgy and miserable and irrelevant”. Her book, A World Without Whom, chronicles intensive and thoughtful discussion at BuzzFeed about correct modern usage and punctuation – not, as arch-prescriptivists might think, smart-alecky young things vandalising the lingo just to show off. The staff debated long and hard about the serviceability, flow and, above all, clarity of the site’s ever-zanier vocabulary and grammar rules.
Now, even sticklers for correct grammar are coming to conclusions similar to the Economist’s: change is inevitable and a lot of it is justified. For instance, we need new terms for new social nuances. Where would #MeToo be without the dead-eyed neologism mansplaining (not forgetting its offshoot, manspreading)? As Christchurch author and former English teacher Joe Bennett says, “Every word started as a neologism; something that someone coined.”
An enthusiastic coiner himself, Bennett says he takes care to apply a sort of “reverse engineering” structure so readers can tell the meaning from the context. He says the rules for what is and isn’t a proper word or construction have never been clear-cut and never will be; the main thing is whether people understand and use them.
Bennett says a key function of neologism and grammar-bending is as a demographic privacy shield. The young coin new words and phrases to keep the old at a distance. By the time parents get to grips with what on earth “chur”, “woke” or “bite me” might mean, they’ll be up against a whole different set of private code words. Elder disapproval or bewilderment is a mark of a coinage’s success.
Exclusionary neologising is also a social-class code. Bennett instances “backslang” like “yob” (originally non-derogatory code for boy, spelled backwards), coined by porters in the middle of the 19th century so the toffs they lugged gear for couldn’t tell what they were on about. Cockney rhyming slang is another example of one social group devising argot to exclude another.
Bennett suspects that as English usage has evolved, grammar’s trickiest elements may now serve only as a social signifier for snobs like him, who get a kick out of understanding them where others don’t. “And don’t we hate it when someone points out our grammatical errors!”
Te reo Māori, too, is evolving. Whanau Ora and Youth Minister Peeni Henare fields messages from his mother about phrasing on Te Karere and Te Kaea. “One was the use of ‘tumeke’ as ‘thumbs-up’ or ‘choice’, which was very different to its traditional meaning of a shock or a surprise. She was onto me: ‘Hey, boy, what’s that about?’”
Henare, whose father, Erima, headed the Māori Language Commission, grew up bilingual, but now finds himself at a loss in te reo for terms like “app”, knowing his children’s cohort is right up to date with it all. “Once upon a time, we would, say, for computer, take ‘roro’ for brain and ‘hiko’ for electricity and get ‘rorohiko’. But now … well, there’s a lot to keep up with.”
Henare says some Māori frown on the colloquialising of Māori terms, and others on the mixing of te reo and English in conversation. But he believes it’s all inevitable – and healthy. The important thing is for everyone to use and enjoy te reo, while allowing that not everyone feels comfortable with change.
British research following the 2015 exams shock has found fault not with the tendency of children’s families to use non-standard grammar – such as “ain’t” and “I didn’t do nothing” – which is what prescriptivists blamed, but with the rote teaching of dry grammar rules.
In the ensuing debates, British linguist David Crystal defended grammar as the system of building blocks by which we understand both what’s being said and what’s not, and by which we compel people to take note of what we say. But he’s among those who say that demanding children be able to identify which bit is a fronted adverbial or a subordinating conjunction does nothing for literacy or communication.
In an article for the TES (formerly known as the Times Educational Supplement), he recommended instead the study of such things as the use of passive language (“The cat was chased by the dog”, rather than “The dog chased the cat”). Why not get kids thinking about why the English language allows us to say both when they mean the same thing, he asked. What if, instead, we said, “The cat was chased”? This instantly sets up a degree of curiosity, even suspense. Who chased the cat? Why? This could quickly get children exploring the reasons for different grammatical constructions. Phrases such as “entry prohibited”, he says, can get children asking why it is prohibited, by whom and why the prohibiters are unidentified.
Academics also responded to the public panic over the low exam results with research showing people switch in and out of different argots – including the “ain’t” one – with relative ease. Furthermore, such registers of English had strict grammar rules and coherent structures of their own, which had to be learnt and observed. It wasn’t that kids weren’t speaking “properly”; they were speaking properly and in a perfectly comprehensible register and could switch to orthodox usage when it was appropriate. The same was true in other languages that had a standard version and dialectal variants, such as English and French creoles.
A Manchester Metropolitan University regional dialect study found a fierce pride in distinctions between districts quite close geographically. Some northerners said “hospickle” and “lickle” for hospital and little; others “areet?” instead of “how are you?” The list goes on, but it is enough to ask whether Coronation Street would still be going today had its characters been permitted to say things like “eh, chook, Ahm stood standeen ’ere waitin’ for a brrrew” – would standard English-speakers have understood them?
Modes of speech
Bennett says we all have different modes of speech for different occasions and strict governing rods in our brains that stop us using the wrong mode for the wrong occasion. “We’re social chameleons. There’s no danger of me saying f--- if I’m talking to the person at the supermarket checkout, but if I see a friend, I’m highly likely to say, ‘How the f--- are you, where the f--- have you been?’”
In that vein, it’s telling that in Parliament, despite MPs getting passionate and even choleric, there has been only one recorded instance of the F-word – in an aside by New Zealand First’s Ron Mark.
Inevitably, the prescriptivists point to vulgarity as more grease on the slippery slope to barbaric illiteracy. But since learning to string a few ughs together, humans have devoted boundless ingenuity to coming up with new profanities, most of which lose their shock value – to wit, gadzooks and zounds. Today, the online Urban Dictionary, a combination of encyclopaedia and swap-meet, demystifies all manner of old and new vulgarities from all round the world. It’s enough to curl the hair of even a die-hard descriptivist.
In fact, the only fit response for much of the Urban cornucopia is another atrocity against the orthodoxy: WTF? That’s right, prescriptivists: initialisms and acronyms are gathering speed in that handcart to hell. WTF is indeed vulgar, but snappier and more evocative for many people’s purposes than any phrase. LOL.
This article was first published in the March 24, 2018 issue of the New Zealand Listener.