Sometimes what I read tells me what to write about. Other times the hints come from what I watch. This time it’s both. First I read a line in Richard Pryor’s autobiography Pryor Convictions with this mighty stack of intensifying negatives:
Don’t never tell nobody not to use no double negatives (February 27, 2023)
Continual vs continuous – what’s the difference? (September 15, 2022)
Introduction and origins
What’s the difference between continual and continuous? There’s a short answer, but it’s misleading, so – surprise! – I’m going with the long and complicated one.
Some people make a firm distinction between the two adjectives, but others don’t, or only sometimes do. The distinction has merit, but it’s not categorical; it’s more the codification of a general but lopsided pattern.
Because the words are so close in sense and use, they’re often used interchangeably (the adverbs continually and continuously even more so). This seldom leads to confusion or difficulty, but it’s also true that each word has domains it specializes in and others it’s less suited to.
Both words come from Latin continuus ‘hanging together, uninterrupted’, continual arriving via Old French continuel. Their endings, –al and –ous, are common adjective-forming suffixes. The words’ more recent history sheds light on their use, but first let’s look at how they’re defined, since this reflects how they’re used and gets to the centre of the problem.
How to accept language change, with David Cronenberg (June 24, 2022)
Language change is something I watch closely, both as a copy-editor and as someone broadly interested in how we communicate. I read usage dictionaries for fun; I also read a lot of fiction, and sometimes, as a treat, it throws up explicit commentary on shifts or variation in usage.*
This happened most recently in Consumed (Scribner, 2014) by David Cronenberg (whose thoughts on language invention I covered earlier this year). Nathan, a young photojournalist, is visiting Roiphe, an elderly doctor, who calls Nathan ‘son’ just before the passage below, emphasizing the generational gap. They’re sitting in Roiphe’s kitchen:
“Want some ice water? Maybe coffee? Anything?”
“No, thanks. I’m good.”
“ ‘I’m good’ is funny. Sounds funny to me. We never used to say that. We’d say ‘I’m fine. I’m all right.’ But they do say ‘I’m good’ these days. So what are we looking at here?”
We ourself can use this pronoun (March 25, 2022)
On a recent rewatch of the 1979 film The Warriors, I noticed an unusual pronoun spoken by Cleon, played by Dorsey Wright:*
Ourself, once in regular use, is now scarce outside of certain dialects, and many (maybe most) people would question its validity. I’ve seen it followed by a cautious editorial [sic] even in linguistic contexts. The Cambridge Grammar of the English Language (2002), describing it as the reflexive form of singular we – ‘an honorific pronoun used by monarchs, popes, and the like’ – says it is ‘hardly current’ in present-day English.
But that’s not the whole story, and it belies the word’s surprising versatility and stubborn survival outside of mainstream Englishes, which this post will outline. There are graphs and data further down, but let’s start with usage.
Four types of language prescriptivism (July 25, 2021)
Prescriptivism is an approach to language centred on how it should be used. It contrasts with descriptivism, which is about describing how language is used. Prescriptivism has a bad reputation among linguists and the descriptively minded. I’m in the latter group, but I routinely apply prescriptive rules in my work as a copy-editor. It’s a more nuanced picture than is generally supposed.
I’m selective about the rules I enforce, dismissing the myths that bedevil English usage. I may apply a rule one day and not the next, adjusting to house style or other factors. I also edit texts to make them more inclusive – less ableist and more gender-neutral, for example. That too is prescriptivism, though it’s not usually categorized as such.
When people use language, they’re often influenced or guided by prescriptive advice, instruction, traditions, and norms. That influence, whether overt or subtle, conscious or not, must be part of how we describe language and its history. So in some ways descriptivism encompasses prescriptivism, or at least it should.
The complexity and apparent conflicts here derive in large part from the tendency to lump prescriptivism into a single category. I do this myself sometimes, for convenience. But by oversimplifying the nature and aims of prescriptivism, we invite confusion, category errors, and semantic muddles.
So how might we bring this fuzzy picture into better focus? One attractive option is proposed by linguist Anne Curzan in her book Fixing English: Prescriptivism and Language History (Cambridge University Press, 2014), which seeks to clarify the heterogeneous nature of prescriptivism and to give it its historical due:
Literal decimation (March 20, 2020)
Talk to any committed language peever,* and sooner or later you’ll hear about decimate: that it properly means ‘kill one in ten’ and should not be used to mean ‘destroy a large proportion of’ or ‘inflict great harm or damage on’. This is because decimate originally referred to a practice in the Roman army of executing one in ten men in mutinous groups.
It’s the etymological fallacy: the belief that a word’s older or original meaning is the only correct one or is automatically more correct than newer, conventionally accepted ones. Words that repeatedly elicit the fallacy include aggravate, alternative, dilemma, fulsome, refute, and transpire. It’s often a vehicle for pedantic or snobbish triumphalism: I acquired this knowledge, and you didn’t, so I must display it.
Decimate is infamous in editorial circles for this reason. My rule, featured in the A–Z of English usage myths, is that if you say decimate can only mean ‘kill one in ten’, you must also call October ‘December’. (See also: quarantine for any period other than 40 days, etc.) For authoritative discussion, browse the usage notes in a few good dictionaries, starting with AHD.